Timeliness, Consensus, and Composition of the Crowd: Community Notes on X

📅 2025-10-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the lack of large-scale empirical evaluation of X's Community Notes, its crowdsourced content moderation system. Using 1.8 million Community Notes records, we quantitatively assess system efficiency along three dimensions: participation inequality, consensus formation, and response timeliness. Methodologically, we apply Gini coefficient analysis, odds ratios (OR), regression modeling, and time-series analysis. Results reveal severe structural imbalances: 1.2% of contributors generate over 50% of annotations; only 11.5% of notes achieve consensus, while 69% exhibit classification conflicts; and the median publication delay is 65.7 hours, with longer delays strongly associated with lower consensus rates. Paradoxically, the "Note Not Needed" label, intended for low-controversy cases, emerges as the most efficient resolution pathway. Critically, we identify Community Notes not as egalitarian crowdsourcing but as an elite-dominated deliberative structure. This work delivers the first large-scale empirical benchmark and a structural-critique framework for platform governance.

📝 Abstract
This study presents the first large-scale quantitative analysis of the efficiency of X's Community Notes, a crowdsourced moderation system for identifying and contextualising potentially misleading content. Drawing on over 1.8 million notes, we examine three key dimensions of crowdsourced moderation: participation inequality, consensus formation, and timeliness. Despite the system's goal of collective moderation, we find a substantial concentration effect, with the top 10% of contributors producing 58% of all notes (Gini coefficient = 0.68). Consensus is rare: only 11.5% of notes reach agreement on publication, while 69% of posts receive conflicting classifications. A majority of noted posts (approximately 68%) are annotated as "Note Not Needed", reflecting the repurposing of the platform for debate rather than moderation. We find that such posts are paradoxically more likely to yield published notes (OR = 3.12). Temporal analyses show that notes are published, on average, 65.7 hours after the original post, with longer delays significantly reducing the likelihood of consensus. These results portray Community Notes as a stratified, deliberative system dominated by a small contributor elite, marked by persistent dissensus, and constrained by slow response times. We conclude by outlining design strategies to promote equity, faster consensus, and epistemic reliability in community-based moderation.
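The Gini coefficient cited in the abstract summarises how unevenly note-writing is spread across contributors (0 = perfectly equal, 1 = one contributor writes everything). A minimal sketch of the standard computation, using hypothetical per-contributor counts chosen only to illustrate the kind of skew the paper reports (the figures are not the paper's data):

```python
def gini(counts):
    """Gini coefficient of per-contributor note counts.

    0 = perfectly equal participation; 1 = maximal concentration.
    Uses the rank-weighted sum form of the standard formula.
    """
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Hypothetical, highly skewed distribution: 10 prolific contributors
# write 50 notes each while 90 casual contributors write 1 each.
counts = [1] * 90 + [50] * 10
print(round(gini(counts), 2))
```

With these illustrative counts the top 10% of contributors account for roughly 85% of notes, so the coefficient lands well above the 0.68 the paper reports for the real data.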
Problem

Research questions and friction points this paper is trying to address.

Analyzing participation inequality in crowdsourced moderation systems
Examining consensus formation challenges in content classification
Investigating timeliness impact on misinformation annotation effectiveness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzed participation inequality using Gini coefficient
Measured consensus formation through note agreement rates
Evaluated timeliness impact on moderation effectiveness
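The odds ratio mentioned in the abstract (OR = 3.12 for "Note Not Needed" posts yielding published notes) comes from a 2x2 contingency table. A minimal sketch of the computation, with hypothetical cell counts that are not taken from the paper's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 contingency table:

                      published   not published
    "Note Not Needed"     a             b
    other labels          c             d

    OR > 1 means "Note Not Needed" posts have higher odds
    of yielding a published note than other posts.
    """
    return (a / b) / (c / d)

# Hypothetical counts, chosen only to illustrate the calculation.
print(round(odds_ratio(300, 700, 120, 880), 2))
```

In practice one would also report a confidence interval (e.g. via the log-odds standard error or Fisher's exact test) rather than the point estimate alone.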
Olesya Razuvayevskaya
The University of Sheffield
Adel Tayebi
CY Cergy Paris University, NOVA University Lisbon
Ulrikke Dybdal Sørensen
Aalborg Universitet
Kalina Bontcheva
Professor of Text Analytics, University of Sheffield
Natural Language Processing
Richard Rogers
Professor of New Media & Digital Culture, University of Amsterdam
new media · internet studies · digital methods · media studies · issue mapping