A Year of the DSA Transparency Database: What it (Does Not) Reveal About Platform Moderation During the 2024 European Parliament Election

📅 2025-04-09
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates whether social media platforms dynamically adjusted their content moderation practices during the 2024 European Parliament elections in compliance with the Digital Services Act (DSA), and whether the DSA’s Transparency Database effectively reveals such adjustments. Leveraging the first large-scale empirical analysis of 1.58 billion reported moderation actions from eight platforms over an eight-month period spanning pre- and post-election phases, we employ time-series comparison, cross-platform statistical modeling, and transparency-gap attribution. Results show no statistically significant change in moderation intensity or strategy across platforms. Critically, the Transparency Database exhibits systemic deficiencies—including reporting delays, insufficient granularity, and inconsistent categorization—that substantially obscure actual platform responsiveness. The study thus exposes substantive limitations in the DSA’s initial accountability mechanisms and establishes the first quantitative assessment framework for evaluating the democratic efficacy of the DSA’s Transparency Database.

Technology Category

Application Category

📝 Abstract
Social media platforms face heightened risks during major political events; yet, how platforms adapt their moderation practices in response remains unclear. The Digital Services Act Transparency Database offers an unprecedented opportunity to systematically study content moderation at scale, enabling researchers and policymakers to assess platforms' compliance and effectiveness. Herein, we analyze 1.58 billion self-reported moderation actions taken by eight large social media platforms during an extended period of eight months surrounding the 2024 European Parliament elections. Our findings reveal a lack of adaptation in moderation strategies, as platforms did not exhibit significant changes in their enforcement behaviors surrounding the elections. This raises concerns about whether platforms adapted their moderation practices at all, or if structural limitations of the database concealed possible adjustments. Moreover, we found that noted transparency and accountability issues persist nearly a year after initial concerns were raised. These results highlight the limitations of current self-regulatory approaches and underscore the need for stronger enforcement and data access mechanisms to ensure that online platforms uphold their responsibility in safeguarding democratic processes.
Problem

Research questions and friction points this paper is trying to address.

Assessing platform moderation adaptation during 2024 EU elections
Evaluating transparency and accountability in self-reported moderation data
Identifying limitations in current regulatory enforcement for democratic safeguards
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzed 1.58 billion platform moderation actions
Used Digital Services Act Transparency Database
Assessed lack of moderation strategy adaptation
🔎 Similar Papers
No similar papers found.