Semantic Scholar
Open Access
2022
559 citations
A General Survey on Attention Mechanisms in Deep Learning
Gianni Brauwers
Flavius Frasincar
Abstract
Attention is an important mechanism that can be employed for a variety of deep learning models across many different domains and tasks. This survey provides an overview of the most important attention mechanisms proposed in the literature. The various attention mechanisms are explained by means of a framework consisting of a general attention model, uniform notation, and a comprehensive taxonomy of attention mechanisms. Furthermore, the various measures for evaluating attention models are reviewed, and methods to characterize the structure of attention models based on the proposed framework are discussed. Last, future work in the field of attention models is considered.
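The general attention model the survey describes scores a query against a set of keys, normalizes the scores into weights, and returns a weighted sum of the values. As a minimal sketch of that idea (not code from the survey; the scaled dot-product scoring function and all variable names here are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, K, V):
    """General attention: score the query against the keys, turn the
    scores into weights that sum to 1, and return the weighted sum
    of the values (the "context" vector)."""
    scores = q @ K.T / np.sqrt(K.shape[-1])  # scaled dot-product scoring
    weights = softmax(scores)                # attention distribution
    return weights @ V, weights

# Toy example: one query attending over three key/value pairs.
q = np.array([1.0, 0.0])
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[1.0], [2.0], [3.0]])
context, weights = attention(q, K, V)
```

Other mechanisms in the survey's taxonomy mainly vary the scoring function (additive, multiplicative, etc.) and how queries, keys, and values are produced; the weight-then-sum structure stays the same.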
Journal Information
- Publication Year: 2022
- Language: en
- Total Citations: 559
- Source Database: Semantic Scholar
- DOI: 10.1109/TKDE.2021.3126456
- Access: Open Access ✓