Request a Note: How the Request Function Shapes X's Community Notes System

📅 2025-09-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
X's Community Notes introduced a "Request Community Note" feature to improve the scalability of crowdsourced fact-checking, yet its effects on what gets checked, by whom, and with what quality remained unclear. Method: Using a dataset of 98,685 requested posts and their associated notes, the authors conduct quantitative analysis and statistical modeling to assess the mechanism's real-world effects. Contribution/Results: (1) Contributors prioritize posts with higher misleadingness and from authors with greater misinformation exposure, but neglect the political content emphasized by requestors; (2) Selection diverges along partisan lines: contributors more often annotate posts from Republicans, while requestors surface more posts from Democrats; (3) Only 12% of requested posts receive notes from top contributors, but these notes are rated as more helpful and less polarized than others, partly reflecting top contributors' selective fact-checking of misleading posts. The findings indicate that contributor incentives and selective participation, rather than request volume alone, are the critical levers for improving the efficacy and credibility of platform-mediated fact-checking, providing empirical evidence for designing scalable, trustworthy crowdsourced verification systems.

📝 Abstract
X's Community Notes is a crowdsourced fact-checking system. To improve its scalability, X recently introduced the "Request Community Note" feature, enabling users to solicit fact-checks from contributors on specific posts. Yet its implications for the system -- what gets checked, by whom, and with what quality -- remain unclear. Using 98,685 requested posts and their associated notes, we evaluate how requests shape the Community Notes system. We find that contributors prioritize posts with higher misleadingness and from authors with greater misinformation exposure, but neglect political content emphasized by requestors. Selection also diverges along partisan lines: contributors more often annotate posts from Republicans, while requestors surface more from Democrats. Although only 12% of posts receive request-fostered notes from top contributors, these notes are rated as more helpful and less polarized than others, partly reflecting top contributors' selective fact-checking of misleading posts. Our findings highlight both the limitations and promise of requests for scaling high-quality community-based fact-checking.
Problem

Research questions and friction points this paper is trying to address.

Evaluates how user requests influence fact-checking selection and quality
Examines partisan bias in contributor annotation versus requestor emphasis
Assesses scalability and effectiveness of community-driven misinformation mitigation
Innovation

Methods, ideas, or system contributions that make the work stand out.

User-solicited fact-checks via request function
Prioritization based on misleadingness and author exposure
Top contributors produce higher-quality, less polarized notes