
CBAM: Convolutional Block Attention Module

Sanghyun Woo Jongchan Park Joon-Young Lee In-So Kweon

Abstract

We propose Convolutional Block Attention Module (CBAM), a simple yet effective attention module for feed-forward convolutional neural networks. Given an intermediate feature map, our module sequentially infers attention maps along two separate dimensions, channel and spatial; the attention maps are then multiplied with the input feature map for adaptive feature refinement. Because CBAM is a lightweight and general module, it can be integrated into any CNN architecture seamlessly with negligible overhead and is end-to-end trainable along with the base CNN. We validate CBAM through extensive experiments on the ImageNet-1K, MS COCO detection, and VOC 2007 detection datasets. Our experiments show consistent improvements in classification and detection performance across various models, demonstrating the wide applicability of CBAM. The code and models will be made publicly available.
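The abstract describes CBAM's mechanism: a channel-attention step followed by a spatial-attention step, each producing a map that multiplicatively rescales the feature tensor. The sketch below is a minimal PyTorch rendering of that description, not the authors' released implementation; the class names are illustrative, and the channel-reduction ratio of 16 and the 7×7 spatial convolution are assumed defaults commonly cited for CBAM.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    # Squeeze spatial dims with both average- and max-pooling,
    # pass each descriptor through a shared MLP, and sum the results.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale                     # rescale each channel


class SpatialAttention(nn.Module):
    # Pool along the channel axis, then convolve the 2-channel map
    # into a single spatial attention map.
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # channel-wise average
        mx, _ = x.max(dim=1, keepdim=True)   # channel-wise max
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale                     # rescale each spatial position


class CBAM(nn.Module):
    # Sequential order as described in the abstract: channel, then spatial.
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.spatial(self.channel(x))
```

As a usage sketch, the module preserves the input shape, so it can be dropped between existing convolutional blocks: `CBAM(64)(torch.randn(2, 64, 32, 32))` returns a tensor of the same (B, C, H, W) size.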

Authors (4)

Sanghyun Woo
Jongchan Park
Joon-Young Lee
In-So Kweon

Citation Format

Woo, S., Park, J., Lee, J.-Y., & Kweon, I.-S. (2018). CBAM: Convolutional Block Attention Module. In Proceedings of the European Conference on Computer Vision (ECCV). https://doi.org/10.1007/978-3-030-01234-2_1

Publication Information
Year Published
2018
Language
English (en)
Total Citations
22,701
Source Database
Semantic Scholar
DOI
10.1007/978-3-030-01234-2_1
Access
Open Access ✓