Semantic Scholar · Open Access · 2022 · 952 citations

Multimodal Learning With Transformers: A Survey

P. Xu Xiatian Zhu D. Clifton

Abstract

The Transformer is a promising neural network learner that has achieved great success in various machine learning tasks. Thanks to the recent prevalence of multimodal applications and big data, Transformer-based multimodal learning has become a hot topic in AI research. This paper presents a comprehensive survey of Transformer techniques oriented at multimodal data. The main contents of this survey include: (1) a background of multimodal learning, the Transformer ecosystem, and the multimodal big-data era; (2) a systematic review of the Vanilla Transformer, the Vision Transformer, and multimodal Transformers from a geometrically topological perspective; (3) a review of multimodal Transformer applications via two important paradigms, i.e., multimodal pretraining and specific multimodal tasks; (4) a summary of the common challenges and designs shared by multimodal Transformer models and applications; and (5) a discussion of open problems and potential research directions for the community.
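The core mechanism the survey covers, a Transformer attending jointly over tokens from different modalities, can be illustrated with a minimal early-fusion sketch. This is not code from the paper: the random projection weights, token counts, and embedding dimension below are illustrative assumptions, standing in for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def early_fusion_self_attention(img_tokens, txt_tokens, rng):
    """Single-head self-attention over the concatenation of two
    modality token sequences (a common early-fusion pattern)."""
    x = np.concatenate([img_tokens, txt_tokens], axis=0)  # (N_img + N_txt, d)
    d = x.shape[-1]
    # Hypothetical random projections stand in for learned Q/K/V weights.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Every token attends across both modalities at once.
    attn = softmax(q @ k.T / np.sqrt(d))
    return attn @ v

rng = np.random.default_rng(0)
img = rng.standard_normal((4, 8))  # e.g., 4 image-patch embeddings, dim 8
txt = rng.standard_normal((6, 8))  # e.g., 6 text-token embeddings, dim 8
out = early_fusion_self_attention(img, txt, rng)
print(out.shape)  # (10, 8): one fused output per input token
```

Early fusion via token concatenation is only one of the fusion topologies the survey reviews; cross-attention variants keep the modality streams separate and let one stream query the other.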

Authors (3)

P. Xu

Xiatian Zhu

D. Clifton

Citation Format

Xu, P., Zhu, X., Clifton, D. (2022). Multimodal Learning With Transformers: A Survey. https://doi.org/10.1109/TPAMI.2023.3275156

Quick Access

View at Source: doi.org/10.1109/TPAMI.2023.3275156
Journal Information
Publication Year
2022
Language
en
Total Citations
952×
Source Database
Semantic Scholar
DOI
10.1109/TPAMI.2023.3275156
Access
Open Access ✓