arXiv Open Access 2024

The Large Language Model GreekLegalRoBERTa

Vasileios Saketos, Despina-Athanasia Pantazi, Manolis Koubarakis

Abstract

We develop four versions of GreekLegalRoBERTa, a family of large language models trained on Greek legal and non-legal text. We show that our models surpass the performance of GreekLegalBERT, GreekLegalBERT-v2, and GreekBERT on two tasks involving Greek legal documents: named entity recognition and multi-class legal topic classification. We view our work as a contribution to the study of domain-specific NLP tasks in low-resource languages, such as Greek, using modern NLP techniques and methodologies.


Authors (3)

Vasileios Saketos

Despina-Athanasia Pantazi

Manolis Koubarakis

Citation Format

Saketos, V., Pantazi, D.-A., & Koubarakis, M. (2024). The Large Language Model GreekLegalRoBERTa. arXiv. https://arxiv.org/abs/2410.12852

Journal Information
Publication Year
2024
Language
en
Source Database
arXiv
Access
Open Access ✓