
Flexible and Effective Mixing of Large Language Models into a Mixture of Domain Experts

Rhui Dih Lee, Laura Wynter, Raghu Kiran Ganti

Abstract

We present a toolkit for creating low-cost Mixture-of-Domain-Experts (MOE) from trained models. The toolkit can be used for creating a mixture from models or from adapters. We perform extensive tests and offer guidance on defining the architecture of the resulting MOE using the toolkit. A public repository is available.
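The abstract describes combining already-trained models or adapters into a Mixture-of-Domain-Experts. The sketch below illustrates the general idea only, not the authors' toolkit or its API: feed-forward blocks taken from trained domain models act as experts behind a small router that mixes their outputs per token. All module names, shapes, and the top-k routing choice are illustrative assumptions.

# Illustrative sketch (not the authors' toolkit): feed-forward blocks from
# several trained "domain expert" models are combined into one
# Mixture-of-Experts layer with a small learned router. Names and shapes
# are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainExpertMoE(nn.Module):
    """Routes each token to its top-k feed-forward experts and mixes their outputs."""

    def __init__(self, experts, hidden_size: int, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(experts)   # FFN blocks lifted from trained models
        self.router = nn.Linear(hidden_size, len(self.experts), bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_size)
        logits = self.router(x)                                  # (B, T, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)           # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                                    # tokens routed to expert e
            if mask.any():
                w = (weights * mask).sum(dim=-1, keepdim=True)   # combined weight for expert e
                out = out + w * expert(x)
        return out


if __name__ == "__main__":
    hidden = 64
    # Stand-ins for FFN blocks copied from two trained domain models.
    def make_ffn():
        return nn.Sequential(nn.Linear(hidden, 4 * hidden), nn.GELU(), nn.Linear(4 * hidden, hidden))

    moe = DomainExpertMoE([make_ffn(), make_ffn()], hidden_size=hidden, top_k=1)
    tokens = torch.randn(2, 8, hidden)
    print(moe(tokens).shape)  # torch.Size([2, 8, 64])

For the toolkit itself, its supported mixing modes, and the evaluation results, see the paper and its public repository linked in the citation below.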


Authors (3)

Rhui Dih Lee
Laura Wynter
Raghu Kiran Ganti

Citation Format

Lee, R.D., Wynter, L., Ganti, R.K. (2024). Flexible and Effective Mixing of Large Language Models into a Mixture of Domain Experts. https://arxiv.org/abs/2408.17280

Publication Information

Year: 2024
Language: English
Source: arXiv
Access: Open Access ✓