+VeriRel: Verification Feedback to Enhance Document Retrieval for Scientific Fact Checking

📅 2025-08-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing scientific fact-checking retrieval methods rank documents solely by textual relevance, ignoring whether a document actually supports or refutes the claim being checked, which leads to suboptimal evidence selection. To address this, the paper proposes +VeriRel, a fine-grained evidence assessment framework for scientific fact checking that explicitly models verification success, i.e. a document's capacity to substantiate or refute the claim, as a core signal for relevance ranking. The approach integrates information retrieval with natural language inference, using the outputs of a verification model to re-rank retrieved documents rather than relying on surface-level matching alone. Empirical evaluation demonstrates state-of-the-art evidence retrieval on three benchmark datasets (SciFact, SciFact-Open, and Check-Covid) and significant improvements in downstream fact-checking accuracy.

📝 Abstract
Identification of appropriate supporting evidence is critical to the success of scientific fact checking. However, existing approaches rely on off-the-shelf Information Retrieval algorithms that rank documents by relevance rather than by the evidence they provide to support or refute the claim being checked. This paper proposes +VeriRel, which incorporates verification success into document ranking. Experimental results on three scientific fact checking datasets (SciFact, SciFact-Open and Check-Covid) demonstrate consistently leading performance by +VeriRel on document evidence retrieval and a positive impact on downstream verification. This study highlights the potential of integrating verification feedback into document relevance assessment for effective scientific fact checking systems, and points to promising future work on evaluating fine-grained relevance when examining complex documents for advanced scientific fact checking.
Problem

Research questions and friction points this paper is trying to address.

Enhancing document retrieval for scientific fact checking
Integrating verification feedback into document ranking
Improving evidence identification over relevance ranking
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates verification feedback into document ranking
Enhances evidence retrieval for scientific claims
Improves downstream fact verification performance
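The idea summarized above, feeding a verification model's judgment back into document ranking, can be sketched as a simple score fusion: a retriever's relevance score is blended with the verification model's probability that a document supports or refutes the claim. The linear fusion rule, the weight `alpha`, and all function names below are illustrative assumptions, not the paper's actual formulation.

```python
def rerank(claim, docs, relevance, verify, alpha=0.5):
    """Re-rank documents by fusing retrieval relevance with verification
    success. Hypothetical linear fusion; +VeriRel's exact combination
    rule may differ."""
    scored = []
    for doc in docs:
        rel = relevance[doc]      # e.g. a retriever score normalized to [0, 1]
        ver = verify(claim, doc)  # P(doc supports or refutes the claim)
        scored.append((alpha * rel + (1 - alpha) * ver, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored]


# Toy example with made-up scores: d2 is slightly less "relevant" than d1,
# but the verifier finds it far more useful as evidence, so it ranks first.
REL = {"d1": 0.9, "d2": 0.8, "d3": 0.1}

def toy_verify(claim, doc):
    # Stand-in for an NLI-based verification model.
    return {"d1": 0.1, "d2": 0.95, "d3": 0.2}[doc]
```

With these numbers, `rerank("claim", ["d1", "d2", "d3"], REL, toy_verify)` promotes `d2` above the lexically better-matching `d1`, which is the behavior the paper's verification feedback is meant to produce.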