arXiv Open Access 2025

HypER: Literature-grounded Hypothesis Generation and Distillation with Provenance

Rosni Vasu, Chandrayee Basu, Bhavana Dalvi Mishra, Cristina Sarasua, Peter Clark, Abraham Bernstein

Abstract

Large language models have demonstrated promising performance in research ideation across scientific domains. Hypothesis development, the process of generating a highly specific declarative statement connecting a research idea with empirical validation, has received relatively less attention. Existing approaches trivially deploy retrieval augmentation and focus only on the quality of the final output, ignoring the underlying reasoning process behind ideation. We present $\texttt{HypER}$ ($\textbf{Hyp}$othesis Generation with $\textbf{E}$xplanation and $\textbf{R}$easoning), a small language model (SLM) trained for literature-guided reasoning and evidence-based hypothesis generation. $\texttt{HypER}$ is trained in a multi-task setting to discriminate between valid and invalid scientific reasoning chains in the presence of controlled distractions. We find that $\texttt{HypER}$ outperforms the base model at distinguishing valid from invalid reasoning chains (+22\% average absolute F1) and generates better evidence-grounded hypotheses (0.327 vs. 0.305 base model) with high feasibility and impact as judged by human experts ($>$3.5 on a 5-point Likert scale).


Authors (6)

Rosni Vasu

Chandrayee Basu

Bhavana Dalvi Mishra

Cristina Sarasua

Peter Clark

Abraham Bernstein

Citation Format

Vasu, R., Basu, C., Dalvi Mishra, B., Sarasua, C., Clark, P., & Bernstein, A. (2025). HypER: Literature-grounded Hypothesis Generation and Distillation with Provenance. arXiv. https://arxiv.org/abs/2506.12937

Journal Information
Publication Year
2025
Language
en
Source Database
arXiv
Access
Open Access ✓