arXiv Open Access 2025

EuroGEST: Investigating gender stereotypes in multilingual language models

Jacqueline Rowe Mateusz Klimaszewski Liane Guillou Shannon Vallor Alexandra Birch

Abstract

Large language models increasingly support multiple languages, yet most benchmarks for gender bias remain English-centric. We introduce EuroGEST, a dataset designed to measure gender-stereotypical reasoning in LLMs across English and 29 European languages. EuroGEST builds on an existing expert-informed benchmark covering 16 gender stereotypes, expanded in this work using translation tools, quality estimation metrics, and morphological heuristics. Human evaluations confirm that our data generation method results in high accuracy of both translations and gender labels across languages. We use EuroGEST to evaluate 24 multilingual language models from six model families, demonstrating that the strongest stereotypes in all models across all languages are that women are 'beautiful', 'empathetic' and 'neat' and men are 'leaders', 'strong, tough' and 'professional'. We also show that larger models encode gendered stereotypes more strongly and that instruction finetuning does not consistently reduce gendered stereotypes. Our work highlights the need for more multilingual studies of fairness in LLMs and offers scalable methods and resources to audit gender bias across languages.
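The expansion pipeline described above (machine translation, quality-estimation filtering, and morphological gender labelling) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the QE threshold, the scoring triples, and the Spanish suffix heuristic are all assumptions for demonstration.

```python
# Illustrative sketch of a benchmark-expansion step: keep only machine
# translations whose quality-estimation (QE) score clears a threshold,
# then assign gender labels with a simple morphological heuristic.

QE_THRESHOLD = 0.85  # assumed cutoff; a real pipeline would tune this per language


def filter_by_qe(candidates, threshold=QE_THRESHOLD):
    """Keep (source, translation) pairs whose QE score meets the threshold.

    `candidates` is a list of (source, translation, qe_score) triples,
    where qe_score comes from an external QE metric (not modelled here).
    """
    return [(src, tgt) for src, tgt, score in candidates if score >= threshold]


def gender_label_es(word):
    """Toy morphological heuristic for Spanish adjectives: '-a' feminine, '-o' masculine."""
    if word.endswith("a"):
        return "feminine"
    if word.endswith("o"):
        return "masculine"
    return "unknown"


candidates = [
    ("She is empathetic.", "Ella es empática.", 0.93),
    ("She is empathetic.", "Ella es simpático.", 0.41),  # low-quality translation, filtered out
]
kept = filter_by_qe(candidates)
# kept == [("She is empathetic.", "Ella es empática.")]
```

In practice the paper reports human evaluations confirming the accuracy of both translations and gender labels, which a fixed threshold and suffix rule alone could not guarantee; this sketch only shows the shape of the automated filtering stage.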


Citation

Rowe, J., Klimaszewski, M., Guillou, L., Vallor, S., Birch, A. (2025). EuroGEST: Investigating gender stereotypes in multilingual language models. https://arxiv.org/abs/2506.03867

Publication Information
Year: 2025
Language: en
Source Database: arXiv
Access: Open Access ✓