arXiv Open Access 2024

On the Suitability of pre-trained foundational LLMs for Analysis in German Legal Education

Lorenz Wendlinger, Christian Braun, Abdullah Al Zubaer, Simon Alexander Nonn, Sarah Großkopf, Christofer Fellicious, Michael Granitzer

Abstract

We show that current open-source foundational LLMs possess instruction-following capability and German legal background knowledge that is sufficient for some legal analysis in an educational context. However, model capability breaks down on very specific tasks, such as the classification of "Gutachtenstil" appraisal-style components, or with complex contexts, such as complete legal opinions. Even with extended context and effective prompting strategies, they cannot match the Bag-of-Words baseline. To combat this, we introduce a Retrieval Augmented Generation based prompt example selection method that substantially improves predictions in high data availability scenarios. We further evaluate the performance of pre-trained LLMs on two standard tasks for argument mining and automated essay scoring and find it to be more adequate. Throughout, pre-trained LLMs improve upon the baseline in scenarios with little or no labeled data, with Chain-of-Thought prompting further helping in the zero-shot case.
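The abstract's core idea of Retrieval Augmented Generation based prompt example selection — retrieving the labeled examples most similar to the query text and using them as few-shot demonstrations — can be sketched in a few lines. The paper's actual retrieval model and prompt format are not given here, so the bag-of-words cosine retriever, the function names, and the labels below are all illustrative assumptions, not the authors' implementation.

```python
from collections import Counter
import math


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def select_examples(query: str, pool: list[tuple[str, str]], k: int = 2):
    """Pick the k labeled (text, label) examples most similar to the
    query, to serve as few-shot demonstrations in the prompt."""
    q = Counter(query.lower().split())
    return sorted(
        pool,
        key=lambda ex: cosine(q, Counter(ex[0].lower().split())),
        reverse=True,
    )[:k]


def build_prompt(query: str, examples: list[tuple[str, str]]) -> str:
    # Hypothetical prompt layout: retrieved demonstrations first,
    # then the unlabeled query for the LLM to complete.
    demos = "\n".join(f"Text: {t}\nLabel: {l}" for t, l in examples)
    return f"{demos}\nText: {query}\nLabel:"
```

In a real setup the bag-of-words retriever would typically be replaced by a dense sentence embedding, but the selection logic stays the same: more labeled data means closer retrieved neighbors, which matches the paper's observation that the method helps most in high data availability scenarios.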

Authors (7)

Lorenz Wendlinger
Christian Braun
Abdullah Al Zubaer
Simon Alexander Nonn
Sarah Großkopf
Christofer Fellicious
Michael Granitzer

Citation

Wendlinger, L., Braun, C., Al Zubaer, A., Nonn, S.A., Großkopf, S., Fellicious, C., & Granitzer, M. (2024). On the Suitability of pre-trained foundational LLMs for Analysis in German Legal Education. https://arxiv.org/abs/2412.15902

Journal Information
Publication Year
2024
Language
en
Source Database
arXiv
Access
Open Access ✓