arXiv Open Access 2025

Sacred or Synthetic? Evaluating LLM Reliability and Abstention for Religious Questions

Farah Atif, Nursultan Askarbekuly, Kareem Darwish, Monojit Choudhury

Abstract

Despite the increasing use of Large Language Models (LLMs) to answer questions across a variety of domains, their reliability and accuracy remain unexamined in many of them, including religion. In this paper, we introduce FiqhQA, a novel benchmark of LLM-generated Islamic rulings explicitly categorized by the four major Sunni schools of thought, in both Arabic and English. Unlike prior work, which either overlooks the distinctions between religious schools of thought or fails to evaluate abstention behavior, we assess LLMs not only on their accuracy but also on their ability to recognize when not to answer. Our zero-shot and abstention experiments reveal significant variation across LLMs, languages, and legal schools of thought. While GPT-4o outperforms all other models in accuracy, Gemini and Fanar demonstrate superior abstention behavior, which is critical for minimizing confidently incorrect answers. Notably, all models exhibit a performance drop in Arabic, highlighting the limitations of religious reasoning in languages other than English. To the best of our knowledge, this is the first study to benchmark the efficacy of LLMs for fine-grained, school-of-thought-specific Islamic ruling generation and to evaluate abstention for Islamic jurisprudence queries. Our findings underscore the need for task-specific evaluation and cautious deployment of LLMs in religious applications.

Topics & Keywords

Authors (4)

Farah Atif

Nursultan Askarbekuly

Kareem Darwish

Monojit Choudhury

Citation Format

Atif, F., Askarbekuly, N., Darwish, K., & Choudhury, M. (2025). Sacred or Synthetic? Evaluating LLM Reliability and Abstention for Religious Questions. arXiv. https://arxiv.org/abs/2508.08287

Journal Information
Year Published
2025
Language
en
Source Database
arXiv
Access
Open Access ✓