
Cross-lingual Language Model Pretraining

Guillaume Lample Alexis Conneau

Abstract

Recent studies have demonstrated the efficiency of generative pretraining for English natural language understanding. In this work, we extend this approach to multiple languages and show the effectiveness of cross-lingual pretraining. We propose two methods to learn cross-lingual language models (XLMs): one unsupervised that only relies on monolingual data, and one supervised that leverages parallel data with a new cross-lingual language model objective. We obtain state-of-the-art results on cross-lingual classification, unsupervised and supervised machine translation. On XNLI, our approach pushes the state of the art by an absolute gain of 4.9% accuracy. On unsupervised machine translation, we obtain 34.3 BLEU on WMT’16 German-English, improving the previous state of the art by more than 9 BLEU. On supervised machine translation, we obtain a new state of the art of 38.5 BLEU on WMT’16 Romanian-English, outperforming the previous best approach by more than 4 BLEU. Our code and pretrained models will be made publicly available.
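The abstract's supervised method leverages parallel data with a new cross-lingual language model objective (translation language modeling): a parallel sentence pair is concatenated and tokens are masked on both sides, so the model can attend to either language when recovering them. Below is a minimal, illustrative Python sketch of that idea; the [MASK]/[SEP] symbols, the masking rate, and the helper name are assumptions for demonstration, not the paper's exact preprocessing.

```python
import random

MASK, SEP = "[MASK]", "[SEP]"

def make_tlm_example(src_tokens, tgt_tokens, mask_prob=0.15, seed=0):
    """Concatenate a parallel sentence pair and randomly mask tokens on both sides.

    Returns (inputs, targets): inputs is the corrupted token sequence fed to the
    model; targets holds the original token at masked positions and None elsewhere.
    """
    rng = random.Random(seed)
    tokens = src_tokens + [SEP] + tgt_tokens
    inputs, targets = [], []
    for tok in tokens:
        if tok != SEP and rng.random() < mask_prob:
            inputs.append(MASK)
            targets.append(tok)   # model is trained to predict this token
        else:
            inputs.append(tok)
            targets.append(None)  # no prediction loss at unmasked positions
    return inputs, targets

if __name__ == "__main__":
    en = "the cat sat on the mat".split()
    de = "die Katze sass auf der Matte".split()
    inp, tgt = make_tlm_example(en, de, mask_prob=0.3)
    print(inp)
    print(tgt)
```

Because source and target sentences share one input sequence, a token masked in one language can often be predicted from its translation in the other, which is what encourages cross-lingual alignment in this objective.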


Authors (2)


Guillaume Lample


Alexis Conneau

Citation Format

Lample, G., & Conneau, A. (2019). Cross-lingual Language Model Pretraining. https://www.semanticscholar.org/paper/ec4eba83f6b3266d9ae7cabb2b2cb1518f727edc

Quick Access

PDF not directly available

Journal Information

Publication Year: 2019
Language: en
Total Citations: 2959
Source Database: Semantic Scholar
Access: Open Access ✓