🤖 AI Summary
This study investigates whether social media platforms dynamically adjusted their content moderation practices during the 2024 European Parliament elections in compliance with the Digital Services Act (DSA), and whether the DSA’s Transparency Database effectively reveals such adjustments. In the first large-scale empirical analysis of the database, the authors examine 1.58 billion reported moderation actions from eight platforms over an eight-month period spanning the pre- and post-election phases, employing time-series comparison, cross-platform statistical modeling, and transparency-gap attribution. Results show no statistically significant change in moderation intensity or strategy across platforms. Critically, the Transparency Database exhibits systemic deficiencies—including reporting delays, insufficient granularity, and inconsistent categorization—that substantially obscure actual platform responsiveness. The study thus exposes substantive limitations in the DSA’s initial accountability mechanisms and establishes the first quantitative framework for evaluating the democratic efficacy of the Transparency Database.
📝 Abstract
Social media platforms face heightened risks during major political events, yet how platforms adapt their moderation practices in response remains unclear. The Digital Services Act Transparency Database offers an unprecedented opportunity to systematically study content moderation at scale, enabling researchers and policymakers to assess platforms' compliance and effectiveness. Herein, we analyze 1.58 billion self-reported moderation actions taken by eight large social media platforms over the eight months surrounding the 2024 European Parliament elections. Our findings reveal a lack of adaptation in moderation strategies: platforms did not exhibit significant changes in their enforcement behavior around the elections. This raises the question of whether platforms adapted their moderation practices at all, or whether structural limitations of the database concealed possible adjustments. Moreover, we found that previously noted transparency and accountability issues persist nearly a year after initial concerns were raised. These results highlight the limitations of current self-regulatory approaches and underscore the need for stronger enforcement and data-access mechanisms to ensure that online platforms uphold their responsibility in safeguarding democratic processes.
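The time-series comparison described in the abstract can be illustrated with a minimal pre-/post-election difference test on daily moderation counts. This is a hedged sketch, not the authors' actual methodology: the daily counts below are synthetic placeholders (not figures from the study), and Welch's t-statistic is just one simple choice of test for a shift in moderation intensity.

```python
import math
import random
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two samples with possibly unequal variances."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# Hypothetical daily moderation counts (in millions) for the four months
# before and after the election date -- synthetic data, NOT the study's.
random.seed(0)
pre_election = [random.gauss(6.5, 0.4) for _ in range(120)]
post_election = [random.gauss(6.5, 0.4) for _ in range(120)]

t = welch_t(pre_election, post_election)
# A small |t| (no detectable shift in mean daily volume) is consistent
# with the paper's finding of no significant change around the elections.
print(f"Welch t = {t:.2f}")
```

In the actual analysis, one would compare such statistics per platform and per moderation category, which is where the database's coarse categorization and reporting delays become limiting.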