Prediction Laundering: The Illusion of Neutrality, Transparency, and Governance in Polymarket

📅 2026-02-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study examines how prediction markets such as Polymarket algorithmically aggregate information in ways that obscure subjectivity, uncertainty, and capital inequality, thereby producing an illusion of “objective consensus.” Through a sociotechnical audit employing digital ethnography, interpretive walkthroughs, and semi-structured interviews, and drawing on MacFarlane’s framework of epistemic transmission, the research introduces the concept of “prediction laundering” and articulates a four-stage lifecycle model. This model elucidates the mechanisms of epistemic stratification and governance deficits underlying synthetic truths. Findings reveal that platforms repackage highly uncertain events and whale-driven capital maneuvers as credible probabilities, inducing epistemic vertigo and accountability vacuums. The study argues that this dynamic exacerbates the cognitive divide between technical elites and the public, and calls for friction-positive design to reconfigure platform governance.

📝 Abstract
The growing reliance on prediction markets as epistemic infrastructures has positioned platforms like Polymarket as providers of objective, real-time probabilistic truth, yet the signals they produce often obscure uncertainty, strategic manipulation, and capital asymmetries, encouraging misplaced epistemic trust. This paper presents a qualitative sociotechnical audit of Polymarket (N = 27), combining digital ethnography, interpretive walkthroughs, and semi-structured interviews to examine how probabilistic authority is produced and contested. We introduce the concept of Prediction Laundering, drawing on MacFarlane's framework of knowledge transmission, to describe how subjective, high-uncertainty bets, strategic hedges, and capital-heavy whale activity are stripped of their original noise through algorithmic aggregation. We trace a four-stage laundering lifecycle: Structural Sanitization, where a centralized ontology scripts the bet-able future; Probabilistic Flattening, which collapses heterogeneous motives into a single signal; Architectural Masking, which conceals capital-driven influence behind apparent consensus; and Epistemic Hardening, which erases governance disputes to produce an objective historical fact. We show that this process induces epistemic vertigo and accountability gaps by offloading truth-resolution to off-platform communities such as Discord. Challenging narratives of frictionless collective intelligence, we demonstrate Epistemic Stratification, in which technical elites audit underlying mechanisms while the broader public consumes a sanitized, capital-weighted signal, and we conclude by advocating Friction-Positive Design that surfaces the social and financial frictions inherent in synthetic truth production.
Problem

Research questions and friction points this paper is trying to address.

prediction markets
epistemic trust
algorithmic aggregation
capital asymmetry
governance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prediction Laundering
Epistemic Stratification
Friction-Positive Design
Probabilistic Flattening
Sociotechnical Audit
Yasaman Rohanifar
Computer Science, University of Toronto, Canada
Syed Ishtiaque Ahmed
University of Toronto
Responsible AI · Human-AI Interaction · AI Policy · HCI
S. Sultana
University of Illinois Urbana-Champaign, USA