FACTS&EVIDENCE: An Interactive Tool for Transparent Fine-Grained Factual Verification of Machine-Generated Text

📅 2025-03-19
📈 Citations: 1
Influential: 1
🤖 AI Summary
Problem: Existing AI-generated content (AIGC) fact-checking tools predominantly rely on black-box binary classification or regression models, suffering from poor interpretability, limited evidence diversity, and minimal user interactivity.
Method: We propose the first user-driven, fine-grained fact verification framework that decomposes long texts into atomic claims, integrates heterogeneous multi-source evidence (e.g., knowledge bases, web pages, documents), and employs cross-source evidence fusion with an interpretable reasoning model to produce claim-level confidence scores and natural-language explanations, supporting multi-hop provenance tracing and dynamic user feedback.
Contribution/Results: Our framework breaks from conventional paradigms by enabling transparent, traceable, evidence-diverse, and human-AI collaborative verification. Experiments demonstrate significant improvements in user verification efficiency (+37%) and trust (+42%), establishing a novel paradigm for trustworthy AIGC interaction.
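The pipeline described in the summary (decompose into atomic claims, gather per-source evidence, fuse into claim-level confidence) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the class names, the sentence-level decomposition stand-in, and the averaging fusion rule are all assumptions for clarity.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of claim-level verification; names and scoring
# rules are illustrative, not the paper's actual API.

@dataclass
class Claim:
    text: str
    # evidence source name -> support score in [0, 1]
    evidence: dict = field(default_factory=dict)

def decompose(text: str) -> list[Claim]:
    """Naive stand-in for atomic-claim decomposition: one claim per sentence."""
    return [Claim(s.strip() + ".") for s in text.split(".") if s.strip()]

def fuse_confidence(claim: Claim) -> float:
    """Toy cross-source fusion: average support across available sources."""
    if not claim.evidence:
        return 0.0  # no evidence retrieved -> treat as unverifiable
    return sum(claim.evidence.values()) / len(claim.evidence)

claims = decompose("The Eiffel Tower is in Paris. It was completed in 1889")
claims[0].evidence = {"knowledge_base": 0.95, "web": 0.90}
claims[1].evidence = {"web": 0.85}
for c in claims:
    print(f"{fuse_confidence(c):.2f}  {c.text}")
```

In the actual system, decomposition is model-driven rather than sentence splitting, and the fusion step also produces a natural-language explanation with provenance links back to each evidence source.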

📝 Abstract
With the widespread consumption of AI-generated content, there has been an increased focus on developing automated tools to verify the factual accuracy of such content. However, prior research and tools developed for fact verification treat it as a binary classification or a linear regression problem. Although this is a useful mechanism as part of automatic guardrails in systems, we argue that such tools lack transparency in the prediction reasoning and diversity in source evidence to provide a trustworthy user experience. We develop Facts&Evidence - an interactive and transparent tool for user-driven verification of complex text. The tool facilitates the intricate decision-making involved in fact verification, presenting its users with a breakdown of complex input texts that visualizes the credibility of individual claims, along with an explanation of model decisions and attribution to multiple, diverse evidence sources. Facts&Evidence aims to empower consumers of machine-generated text and give them agency to understand, verify, selectively trust, and use such text.
Problem

Research questions and friction points this paper is trying to address.

Develops tool for transparent factual verification of AI-generated text
Addresses lack of transparency in existing fact verification tools
Enables user-driven verification with diverse evidence sources
Innovation

Methods, ideas, or system contributions that make the work stand out.

Interactive tool for transparent factual verification
Visualizes credibility breakdown of individual claims
Attributes decisions to diverse evidence sources