mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer

Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, +3 others

Abstract

The recent “Text-to-Text Transfer Transformer” (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, we introduce mT5, a multilingual variant of T5 that was pre-trained on a new Common Crawl-based dataset covering 101 languages. We detail the design and modified training of mT5 and demonstrate its state-of-the-art performance on many multilingual benchmarks. We also describe a simple technique to prevent “accidental translation” in the zero-shot setting, where a generative model chooses to (partially) translate its prediction into the wrong language. All of the code and model checkpoints used in this work are publicly available.
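
Since the abstract notes that the model checkpoints are publicly available, here is a minimal sketch of loading one and using the text-to-text interface. It assumes the Hugging Face transformers library and the "google/mt5-small" checkpoint (neither is named in the abstract itself), and note that the raw pre-trained model still needs fine-tuning before it is useful on a downstream task:

```python
# Minimal sketch: load a public mT5 checkpoint and run the text-to-text
# interface. Assumes the Hugging Face `transformers` library and the
# "google/mt5-small" checkpoint; the raw checkpoint is pre-trained only
# and should be fine-tuned before use on a real downstream task.
from transformers import AutoTokenizer, MT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# Every task is cast as text-to-text: feed an input string and generate
# an output string, regardless of language.
inputs = tokenizer("Berita hari ini sangat menarik.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```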

Authors (8)

Linting Xue
Noah Constant
Adam Roberts
Mihir Kale
Rami Al-Rfou
Aditya Siddhant
Aditya Barua
Colin Raffel

Citation Format

Xue, L., Constant, N., Roberts, A., Kale, M., Al-Rfou, R., Siddhant, A., et al. (2020). mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer. https://doi.org/10.18653/V1/2021.NAACL-MAIN.41

Journal Information
Publication Year: 2020
Language: English (en)
Total Citations: 3093
Source Database: Semantic Scholar
DOI: 10.18653/V1/2021.NAACL-MAIN.41
Access: Open Access ✓