SciEvent: Benchmarking Multi-domain Scientific Event Extraction

📅 2025-09-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing scientific information extraction (SciIE) methods are largely confined to single-domain entity-relation identification; they fail to model interdisciplinary scientific events and their contextual dependencies, leaving extracted knowledge fragmented and inconsistent. Method: SciEvent is the first benchmark for interdisciplinary scientific event extraction, comprising 500 paper abstracts across five domains annotated under a unified, fine-grained event schema with triggers and multi-role arguments. The authors propose a multi-stage event extraction framework covering abstract segmentation, trigger identification, and argument extraction, and evaluate it systematically with fine-tuned models, large language models, and human annotators. Contribution/Results: Experiments show marked performance degradation in sociology and the humanities, exposing bottlenecks in semantic understanding and cross-domain generalization. By quantifying the gap between model predictions and human annotations, SciEvent establishes a standardized, challenging evaluation platform for structuring interdisciplinary scientific knowledge.

📝 Abstract
Scientific information extraction (SciIE) has primarily relied on entity-relation extraction in narrow domains, limiting its applicability to interdisciplinary research and struggling to capture the necessary context of scientific information, often resulting in fragmented or conflicting statements. In this paper, we introduce SciEvent, a novel multi-domain benchmark of scientific abstracts annotated via a unified event extraction (EE) schema designed to enable structured and context-aware understanding of scientific content. It includes 500 abstracts across five research domains, with manual annotations of event segments, triggers, and fine-grained arguments. We define SciIE as a multi-stage EE pipeline: (1) segmenting abstracts into core scientific activities--Background, Method, Result, and Conclusion; and (2) extracting the corresponding triggers and arguments. Experiments with fine-tuned EE models, large language models (LLMs), and human annotators reveal a performance gap, with current models struggling in domains such as sociology and humanities. SciEvent serves as a challenging benchmark and a step toward generalizable, multi-domain SciIE.
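The two-stage pipeline described in the abstract can be sketched in minimal form below. This is an illustrative skeleton only: the segment cue phrases, the trivial trigger heuristic, and the `EventRecord` fields are assumptions for demonstration, not the paper's actual schema or models (which use fine-tuned EE models and LLMs).

```python
from dataclasses import dataclass, field

# Hypothetical cue phrases for stage 1 (segmentation into core scientific
# activities); the real system learns this rather than using keyword rules.
SEGMENT_CUES = {
    "Background": ("has relied on", "limiting"),
    "Method": ("we introduce", "we define", "we propose"),
    "Result": ("experiments", "reveal"),
    "Conclusion": ("serves as", "step toward"),
}

@dataclass
class EventRecord:
    segment: str    # Background / Method / Result / Conclusion
    trigger: str    # event-anchoring word
    arguments: dict = field(default_factory=dict)  # role -> text span

def segment_sentences(sentences):
    """Stage 1: label each sentence with the first matching activity."""
    labeled = []
    for sent in sentences:
        low = sent.lower()
        label = next(
            (seg for seg, cues in SEGMENT_CUES.items()
             if any(cue in low for cue in cues)),
            "Background",  # fallback when no cue matches
        )
        labeled.append((label, sent))
    return labeled

def extract_event(label, sentence):
    """Stage 2 placeholder: a real system runs trigger and argument models."""
    trigger = sentence.split()[0]  # trivial stand-in for a trigger classifier
    return EventRecord(segment=label, trigger=trigger)

abstract = [
    "Scientific IE has relied on entity-relation extraction in narrow domains.",
    "We introduce SciEvent, a multi-domain benchmark of annotated abstracts.",
    "Experiments reveal a gap between models and human annotators.",
]
events = [extract_event(lbl, s) for lbl, s in segment_sentences(abstract)]
for ev in events:
    print(ev.segment, "->", ev.trigger)
```

The point of the sketch is the data flow, not the rules: segmentation first narrows each sentence to one scientific activity, so the downstream trigger and argument extractors operate on a constrained, context-labeled span rather than the whole abstract.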
Problem

Research questions and friction points this paper is trying to address.

Addressing fragmented scientific information extraction across domains
Creating multi-domain benchmark for structured event extraction
Overcoming limitations of entity-relation models in interdisciplinary research
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-domain event extraction benchmark
Unified schema for scientific abstracts
Multi-stage pipeline for context-aware extraction