arXiv Open Access 2021

Introduce the Result Into Self-Attention

Chengcheng Ye

Abstract

Traditional self-attention mechanisms in convolutional networks, such as SENet and CBAM, tend to use only the output of the previous layer as input to the attention network. In this paper, we propose a new attention modification that obtains the output of the classification network in advance and uses it as part of the input to the attention network. We use the auxiliary classifier proposed in GoogLeNet to obtain the result in advance and pass it into the attention network. We added this mechanism to SE-ResNet for our experiments and achieved a classification accuracy improvement of up to 1.94% on CIFAR-100.
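The abstract does not spell out exactly how the early result enters the attention network. Below is a minimal PyTorch sketch of one plausible wiring, assuming the auxiliary classifier's logits are concatenated with the SE block's squeezed channel descriptor before the excitation MLP; the class AuxSEBlock, the aux_logits argument, and the concatenation point are illustrative assumptions, not details confirmed by the paper.

import torch
import torch.nn as nn


class AuxSEBlock(nn.Module):
    """SE-style channel attention that also sees an early prediction.

    Sketch only: logits from a GoogLeNet-style auxiliary classifier are
    concatenated with the squeezed channel descriptor before the
    excitation MLP. The exact wiring in the paper may differ.
    """

    def __init__(self, channels: int, num_classes: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels + num_classes, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor, aux_logits: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        squeezed = self.pool(x).view(b, c)                # (B, C) global context
        joint = torch.cat([squeezed, aux_logits], dim=1)  # append early prediction
        scale = self.fc(joint).view(b, c, 1, 1)           # per-channel weights in (0, 1)
        return x * scale


# Example: reweight a 256-channel feature map with 100-way auxiliary logits
block = AuxSEBlock(channels=256, num_classes=100)
x = torch.randn(8, 256, 16, 16)
aux_logits = torch.randn(8, 100)  # stand-in for an auxiliary classifier head
out = block(x, aux_logits)        # same shape as x: (8, 256, 16, 16)

In practice one might detach aux_logits before the concatenation so the attention path does not back-propagate into the auxiliary head; whether the paper does so is not stated.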


Author (1)


Chengcheng Ye

Citation Format

Ye, C. (2021). Introduce the Result Into Self-Attention. arXiv preprint arXiv:2109.13860. https://arxiv.org/abs/2109.13860

Journal Information
Publication Year
2021
Language
en
Source Database
arXiv
Access
Open Access ✓