arXiv
Open Access
2022
Adapting BigScience Multilingual Model to Unseen Languages
Zheng-Xin Yong
Vassilina Nikoulina
Abstract
We benchmark different strategies for adding new languages (German and Korean) to BigScience's pretrained multilingual language model, which has 1.3 billion parameters and currently supports 13 languages. We investigate the factors that affect the model's language adaptability and the trade-offs between computational cost and expected performance.
Journal Information
- Publication Year: 2022
- Language: en
- Source Database: arXiv
- Access: Open Access ✓