
RealFormer: Transformer Likes Residual Attention

Ruining He Anirudh Ravula Bhargav Kanagal Joshua Ainslie

Abstract

Transformer is the backbone of modern NLP models. In this paper, we propose RealFormer, a simple and generic technique to create Residual Attention Layer Transformer networks that significantly outperform the canonical Transformer and its variants (BERT, ETC, etc.) on a wide spectrum of tasks including Masked Language Modeling, GLUE, SQuAD, Neural Machine Translation, WikiHop, HotpotQA, Natural Questions, and OpenKP. We also observe empirically that RealFormer stabilizes training and leads to models with sparser attention. Source code and pre-trained checkpoints for RealFormer can be found at https://github.com/google-research/google-research/tree/master/realformer.
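The "residual attention" in the title refers to a skip connection over attention scores: each layer adds the previous layer's raw (pre-softmax) attention logits to its own before applying softmax, leaving the rest of the Transformer unchanged. The sketch below is a minimal single-head illustration in PyTorch, not the authors' implementation (which is linked above); the function name residual_attention and the prev_scores argument are hypothetical.

import torch
import torch.nn.functional as F

def residual_attention(q, k, v, prev_scores=None):
    # Raw scaled dot-product attention logits: [batch, seq, seq].
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5
    # RealFormer's residual edge: add the previous layer's
    # pre-softmax scores before normalizing.
    if prev_scores is not None:
        scores = scores + prev_scores
    weights = F.softmax(scores, dim=-1)
    # Return the post-residual scores so the next layer can
    # pass them in as its prev_scores.
    return weights @ v, scores

# Two stacked attention calls sharing the residual edge.
x = torch.randn(2, 16, 64)                # [batch, seq_len, d_model]
out1, s1 = residual_attention(x, x, x)    # first layer: no residual yet
out2, s2 = residual_attention(out1, out1, out1, prev_scores=s1)

Note that the post-residual (summed) scores, not the fresh logits, are carried forward, so the skip edge runs through the entire stack of layers.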

Authors (4)

Ruining He
Anirudh Ravula
Bhargav Kanagal
Joshua Ainslie

Citation Format

He, R., Ravula, A., Kanagal, B., & Ainslie, J. (2020). RealFormer: Transformer Likes Residual Attention. arXiv:2012.11747. https://arxiv.org/abs/2012.11747

Journal Information

Year Published: 2020
Language: en
Source Database: arXiv
Access: Open Access ✓