arXiv
Open Access
2024
Flexible and Effective Mixing of Large Language Models into a Mixture of Domain Experts
Rhui Dih Lee
Laura Wynter
Raghu Kiran Ganti
Abstract
We present a toolkit for creating low-cost Mixture-of-Domain-Experts (MOE) from trained models. The toolkit can be used for creating a mixture from models or from adapters. We perform extensive tests and offer guidance on defining the architecture of the resulting MOE using the toolkit. A public repository is available.
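The abstract describes combining already-trained models (or adapters) into a Mixture-of-Domain-Experts. As a rough illustration of the idea, the sketch below shows domain-expert FFN blocks placed behind a token-level top-k router; it is a minimal, hypothetical example and does not reflect the toolkit's actual API, class names, or routing scheme (DomainExpertFFN, MixtureOfDomainExperts, and all sizes are assumptions made here for illustration).

```python
# Hypothetical sketch (NOT the paper's toolkit API): mixing the feed-forward
# blocks of several trained domain models into one MoE layer behind a router.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainExpertFFN(nn.Module):
    """One expert: an FFN block conceptually copied from a trained domain model."""

    def __init__(self, hidden_size: int, intermediate_size: int):
        super().__init__()
        self.up = nn.Linear(hidden_size, intermediate_size)
        self.down = nn.Linear(intermediate_size, hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(F.gelu(self.up(x)))


class MixtureOfDomainExperts(nn.Module):
    """Routes each token to its top-k domain experts and mixes their outputs."""

    def __init__(self, experts: list, hidden_size: int, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(experts)
        self.router = nn.Linear(hidden_size, len(experts), bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, hidden)
        logits = self.router(x)                                  # (batch, seq, n_experts)
        weights, indices = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                # Select tokens whose slot-th choice is expert e and add its
                # weighted contribution (dense loop kept simple for clarity).
                mask = (indices[..., slot] == e).unsqueeze(-1)
                out = out + mask * weights[..., slot:slot + 1] * expert(x)
        return out


# Usage: pretend each expert's weights were loaded from a separate fine-tuned model.
experts = [DomainExpertFFN(hidden_size=64, intermediate_size=256) for _ in range(4)]
moe = MixtureOfDomainExperts(experts, hidden_size=64, top_k=2)
tokens = torch.randn(1, 8, 64)
print(moe(tokens).shape)  # torch.Size([1, 8, 64])
```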
Journal Information
- Publication Year: 2024
- Language: English
- Source Database: arXiv
- Access: Open Access ✓