
Domain Adaptation of Foundation LLMs for e-Commerce

Christian Herold, Michael Kozielski, Tala Bazazo, Pavel Petrushkov, Patrycja Cieplicka, and 4 others

Abstract

We present the e-Llama models: 8-billion- and 70-billion-parameter large language models adapted to the e-commerce domain. These models are meant as foundation models with deep knowledge of e-commerce that form a base for instruction- and fine-tuning. The e-Llama models are obtained by continuously pretraining the Llama 3.1 base models on 1 trillion tokens of domain-specific data. We discuss our approach and motivate our choice of hyperparameters with a series of ablation studies. To quantify how well the models have been adapted to the e-commerce domain, we define and implement a set of multilingual, e-commerce-specific evaluation tasks. We show that, when the training setup is chosen carefully, the Llama 3.1 models can be adapted to the new domain without sacrificing significant performance on general-domain tasks. We also explore merging the adapted model and the base model for better control of the performance trade-off between domains.
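
The abstract mentions merging the adapted model with the base model to control the trade-off between e-commerce and general-domain performance. The sketch below shows one common way such a merge can be done, namely linear interpolation of the two models' weights; the interpolation scheme, the model identifiers, and the alpha value are illustrative assumptions, not the paper's actual recipe.

```python
# Minimal sketch: merging a domain-adapted model with its base model by
# linearly interpolating their weights. Model names/paths and alpha are
# hypothetical; the paper does not specify its merging method here.
import torch
from transformers import AutoModelForCausalLM

def merge_models(base_id: str, adapted_id: str, alpha: float = 0.5):
    """Return a model whose weights are (1 - alpha) * base + alpha * adapted."""
    base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
    adapted = AutoModelForCausalLM.from_pretrained(adapted_id, torch_dtype=torch.bfloat16)

    adapted_state = adapted.state_dict()
    merged_state = {
        name: (1.0 - alpha) * param + alpha * adapted_state[name]
        for name, param in base.state_dict().items()
    }
    base.load_state_dict(merged_state)
    return base

# alpha closer to 1 favours the domain-adapted behaviour, alpha closer to 0
# favours the general-domain base model (paths below are placeholders).
# merged = merge_models("meta-llama/Llama-3.1-8B", "path/to/e-llama-8b", alpha=0.7)
```

Sweeping alpha between 0 and 1 gives a one-parameter handle on the domain-versus-general trade-off the abstract refers to.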

Authors (9)

Christian Herold
Michael Kozielski
Tala Bazazo
Pavel Petrushkov
Patrycja Cieplicka
Dominika Basaj
Yannick Versley
Seyyed Hadi Hashemi
Shahram Khadivi

Citation Format

Herold, C., Kozielski, M., Bazazo, T., Petrushkov, P., Cieplicka, P., Basaj, D., et al. (2025). Domain Adaptation of Foundation LLMs for e-Commerce. arXiv preprint arXiv:2501.09706. https://arxiv.org/abs/2501.09706

Journal Information

Publication Year: 2025
Language: English (en)
Source Database: arXiv
Access: Open Access ✓