
NAM: Normalization-based Attention Module

Yichao Liu, Zongru Shao, Yueyang Teng, Nico Hoffmann

Abstract

Recognizing less salient features is key to model compression. However, this has not been investigated by existing attention mechanisms. In this work, we propose a novel normalization-based attention module (NAM), which suppresses less salient weights. It applies a weight sparsity penalty to the attention modules, making them more computationally efficient while retaining similar performance. A comparison with three other attention mechanisms on both ResNet and MobileNet indicates that our method yields higher accuracy. Code for this paper is publicly available at https://github.com/Christian-lyc/NAM.
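To make the abstract's core idea concrete, the sketch below is a minimal, hypothetical PyTorch rendering of normalization-based channel attention: the batch-norm scale factors (gamma) are normalized across channels and used as attention weights, so channels with small gamma (less salient features) are suppressed. The class name NAMChannelAttention and all implementation details here are our assumptions drawn from the abstract, not the authors' released code; refer to the GitHub repository above for the actual implementation.

```python
import torch
import torch.nn as nn


class NAMChannelAttention(nn.Module):
    """Hypothetical sketch of normalization-based channel attention.

    Reweights each channel by its batch-norm scale factor (gamma),
    normalized across channels, so less salient channels are suppressed.
    Based on our reading of the abstract, not the authors' code.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = x
        x = self.bn(x)
        # Normalize the absolute BN scale factors across channels so they
        # sum to 1; detach so the attention weights act as fixed saliency
        # scores for this reweighting step.
        gamma = self.bn.weight.detach().abs()
        weights = gamma / gamma.sum()
        # Broadcast per-channel weights over the (N, C, H, W) feature map.
        x = x * weights.view(1, -1, 1, 1)
        # Gate the residual with a sigmoid of the reweighted features.
        return torch.sigmoid(x) * residual


# Usage: the module preserves the input shape.
attn = NAMChannelAttention(64)
y = attn(torch.randn(2, 64, 32, 32))  # shape (2, 64, 32, 32)
```

Because the attention weights come directly from batch-norm parameters the model already has, this design adds no extra fully connected or convolutional layers, which is consistent with the abstract's claim of improved computational efficiency.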


Authors (4)

Yichao Liu
Zongru Shao
Yueyang Teng
Nico Hoffmann

Citation Format

Liu, Y., Shao, Z., Teng, Y., & Hoffmann, N. (2021). NAM: Normalization-based Attention Module. arXiv preprint arXiv:2111.12419. https://arxiv.org/abs/2111.12419

Journal Information
Publication Year
2021
Language
English
Source Database
arXiv
Access
Open Access ✓