arXiv Open Access 2024

LOLA -- An Open-Source Massively Multilingual Large Language Model

Nikit Srivastava, Denis Kuchelev, Tatiana Moteu Ngoli, Kshitij Shetty, Michael Röder, +3 more

Abstract

This paper presents LOLA, a massively multilingual large language model trained on more than 160 languages using a sparse Mixture-of-Experts Transformer architecture. Our architectural and implementation choices address the challenge of harnessing linguistic diversity while maintaining efficiency and avoiding the common pitfalls of multilinguality. Our analysis of the evaluation results shows competitive performance in natural language generation and understanding tasks. Additionally, we demonstrate how the learned expert-routing mechanism exploits implicit phylogenetic linguistic patterns to potentially alleviate the curse of multilinguality. We provide an in-depth look at the training process, an analysis of the datasets, and a balanced exploration of the model's strengths and limitations. As an open-source model, LOLA promotes reproducibility and serves as a robust foundation for future research. Our findings enable the development of compute-efficient multilingual models with strong, scalable performance across languages.
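The abstract's central architectural idea, sparse expert routing, is easy to miss in prose. Below is a minimal, hypothetical sketch of top-1 token-to-expert routing in a Mixture-of-Experts feed-forward layer, written in PyTorch. It illustrates the general mechanism only; the class, parameter names, and the top-1 gating choice (`SparseMoELayer`, `num_experts`, etc.) are illustrative assumptions, not LOLA's actual implementation or hyperparameters.

```python
# Hypothetical sketch of sparse Mixture-of-Experts routing (top-1 gating).
# NOT LOLA's actual code; all names and sizes here are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    def __init__(self, d_model: int, d_ff: int, num_experts: int):
        super().__init__()
        # One feed-forward "expert" per slot; only one is active per token.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        # Learned router: scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten tokens for per-token routing.
        tokens = x.reshape(-1, x.size(-1))
        gate_probs = F.softmax(self.router(tokens), dim=-1)  # (tokens, experts)
        top_prob, top_idx = gate_probs.max(dim=-1)           # top-1 expert per token
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e                              # tokens routed to expert e
            if mask.any():
                # Scale by the gate probability so routing stays differentiable.
                out[mask] = top_prob[mask].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)


# Usage: route a batch of token embeddings through the sparse layer.
layer = SparseMoELayer(d_model=64, d_ff=256, num_experts=8)
y = layer(torch.randn(2, 10, 64))
print(y.shape)  # torch.Size([2, 10, 64])
```

Because each token activates only one expert's feed-forward block, per-token compute stays roughly constant as the number of experts grows, which is what allows a sparse model to harness linguistic diversity while maintaining efficiency; the paper's analysis suggests the learned router groups tokens along implicit phylogenetic language lines.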

Authors (8)

Nikit Srivastava
Denis Kuchelev
Tatiana Moteu Ngoli
Kshitij Shetty
Michael Röder
Hamada Zahera
Diego Moussallem
Axel-Cyrille Ngonga Ngomo

Citation Format

Srivastava, N., Kuchelev, D., Ngoli, T. M., Shetty, K., Röder, M., Zahera, H., et al. (2024). LOLA -- An Open-Source Massively Multilingual Large Language Model. arXiv:2409.11272. https://arxiv.org/abs/2409.11272

Journal Information
Year Published: 2024
Language: en
Database Source: arXiv
Access: Open Access ✓