arXiv Open Access 2025

SciEvent: Benchmarking Multi-domain Scientific Event Extraction

Bofu Dong, Pritesh Shah, Sumedh Sonawane, Tiyasha Banerjee, Erin Brady, +2 others

Abstract

Scientific information extraction (SciIE) has primarily relied on entity-relation extraction in narrow domains, limiting its applicability to interdisciplinary research and struggling to capture the necessary context of scientific information, often resulting in fragmented or conflicting statements. In this paper, we introduce SciEvent, a novel multi-domain benchmark of scientific abstracts annotated via a unified event extraction (EE) schema designed to enable structured and context-aware understanding of scientific content. It includes 500 abstracts across five research domains, with manual annotations of event segments, triggers, and fine-grained arguments. We define SciIE as a multi-stage EE pipeline: (1) segmenting abstracts into core scientific activities--Background, Method, Result, and Conclusion; and (2) extracting the corresponding triggers and arguments. Experiments with fine-tuned EE models, large language models (LLMs), and human annotators reveal a performance gap, with current models struggling in domains such as sociology and humanities. SciEvent serves as a challenging benchmark and a step toward generalizable, multi-domain SciIE.


Authors (7)

Bofu Dong, Pritesh Shah, Sumedh Sonawane, Tiyasha Banerjee, Erin Brady, Xinya Du, Ming Jiang

Citation Format

Dong, B., Shah, P., Sonawane, S., Banerjee, T., Brady, E., Du, X., & Jiang, M. (2025). SciEvent: Benchmarking Multi-domain Scientific Event Extraction. https://arxiv.org/abs/2509.15620

Journal Information
Publication Year
2025
Language
en
Source Database
arXiv
Access
Open Access ✓