DOAJ Open Access 2026

FeynTune: large language models for high-energy theory

Paul Richmond, Constantinos Papageorgakis, Vasilis Niarchos, Borun Chowdhury, Prarit Agarwal

Abstract

We present specialized large language models (LLMs) for theoretical high-energy physics, obtained as 20 fine-tuned variants of the 8-billion-parameter Llama-3.1 model. Each variant was trained on arXiv abstracts (through August 2024) from different combinations of the hep-th, hep-ph, and gr-qc categories. For a comparative study, we also trained models on datasets containing abstracts from disparate fields such as the q-bio and cs categories. All models were fine-tuned using two distinct low-rank adaptation (LoRA) approaches and varying dataset sizes, and all outperformed the base model on hep-th abstract-completion tasks. We compare performance against leading commercial LLMs (ChatGPT, Claude, Gemini, DeepSeek) and derive insights for further developing specialized language models for high-energy theoretical physics.
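As context for the recipe described in the abstract, the sketch below shows one plausible way to LoRA-fine-tune an 8B Llama-3.1 base model on a corpus of arXiv abstracts, assuming the Hugging Face transformers/peft/datasets stack. The model identifier, the LoRA hyperparameters (rank, alpha, target modules), the training arguments, and the JSONL abstracts file are illustrative assumptions, not the authors' reported configuration.

import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# 8B base model named in the abstract; exact checkpoint is an assumption.
base = "meta-llama/Llama-3.1-8B"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # Llama tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16)

# Low-rank adapters on the attention projections; rank/alpha/dropout are placeholders.
lora = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)

# Hypothetical JSONL file with one "abstract" field per arXiv record,
# standing in for a hep-th / hep-ph / gr-qc category mix.
ds = load_dataset("json", data_files="hep_th_abstracts.jsonl", split="train")

def tokenize(batch):
    return tokenizer(batch["abstract"], truncation=True, max_length=512)

ds = ds.map(tokenize, batched=True, remove_columns=ds.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="feyntune-hep-th",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        bf16=True,
    ),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

The paper's 20 variants would then correspond to repeating a loop of this kind while varying the adapter method, the arXiv category mix, and the dataset size; the comparison against the base model amounts to evaluating completion quality on held-out hep-th abstracts.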

Citation Format

Richmond, P., Papageorgakis, C., Niarchos, V., Chowdhury, B., & Agarwal, P. (2026). FeynTune: large language models for high-energy theory. Machine Learning: Science and Technology. https://doi.org/10.1088/2632-2153/ae47bb

Journal Information

Publication Year: 2026
Source Database: DOAJ
DOI: 10.1088/2632-2153/ae47bb
Access: Open Access ✓