YouTube Recommendations Reinforce Negative Emotions: Auditing Algorithmic Bias with Emotionally-Agentic Sock Puppets

📅 2025-01-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates whether YouTube’s recommendation system amplifies users’ exposure to negatively valenced content (e.g., anger, resentment), thereby reinforcing affective preferences and fostering “affective filter bubbles.” Method: We introduce a novel affectively programmable bot-account framework, integrating automated session simulation, emotion-labeled video selection, cross-session recommendation trajectory tracking, and longitudinal affective distribution analysis. Contribution/Results: We find that contextual (non-personalized) recommendations amplify negative affective bias more than personalized ones—challenging the assumption of algorithmic neutrality. YouTube significantly increases both the frequency and ranking priority of negatively valenced content; this reinforcement effect accumulates over time and persists across diverse topics. To our knowledge, this is the first empirical demonstration of how platform recommendation mechanisms systematically shape user affective preferences. The work establishes a new paradigm for algorithmic auditing and emotion-health–oriented platform governance.

📝 Abstract
Personalized recommendation algorithms, like those on YouTube, significantly shape online content consumption. These systems aim to maximize engagement by learning users' preferences and aligning content accordingly, but they may unintentionally reinforce impulsive and emotional biases. Using a sock-puppet audit methodology, this study examines YouTube's capacity to recognize and reinforce emotional preferences. Simulated user accounts with assigned emotional preferences navigate the platform, selecting videos that align with those preferences and recording subsequent recommendations. Our findings reveal that YouTube amplifies negative emotions, such as anger and grievance, by increasing their prevalence and prominence in recommendations. This reinforcement intensifies over time and persists across contexts. Surprisingly, contextual recommendations often exceed personalized ones in reinforcing emotional alignment. These findings suggest the algorithm amplifies user biases, contributing to emotional filter bubbles and raising concerns about user well-being and societal impacts. The study emphasizes the need to balance personalization with content diversity and user agency.
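The audit loop described above (puppets with assigned emotional preferences watching matching videos and logging each recommendation slate) can be sketched as a toy simulation. This is a hypothetical illustration, not the authors' code: the catalog, the biased `recommend` stand-in for YouTube's black-box recommender, and all parameters (`bias`, `sessions`, emotion labels) are assumptions for demonstration.

```python
import random

EMOTIONS = ["anger", "grievance", "joy", "neutral"]

def make_catalog(n=500, seed=0):
    # Hypothetical emotion-labeled video catalog.
    rng = random.Random(seed)
    return [{"id": i, "emotion": rng.choice(EMOTIONS)} for i in range(n)]

def recommend(catalog, history, k=20, bias=0.5, rng=None):
    """Toy recommender: with probability `bias`, draw a video matching the
    emotion of the puppet's most recent watch; otherwise draw uniformly.
    Stands in for the platform's opaque recommendation system."""
    rng = rng or random.Random()
    last = history[-1]["emotion"] if history else None
    matching = [v for v in catalog if v["emotion"] == last]
    slate = []
    for _ in range(k):
        if matching and rng.random() < bias:
            slate.append(rng.choice(matching))
        else:
            slate.append(rng.choice(catalog))
    return slate

def run_puppet(preference, sessions=10, seed=1):
    """One sock puppet: in each session it receives a slate, watches a video
    matching its assigned emotional preference when one is offered, and logs
    the slate for later affective-distribution analysis."""
    rng = rng_session = random.Random(seed)
    catalog = make_catalog()
    history, slates = [], []
    for _ in range(sessions):
        slate = recommend(catalog, history, rng=rng_session)
        slates.append(slate)
        preferred = [v for v in slate if v["emotion"] == preference]
        history.append(rng.choice(preferred or slate))
    return slates

slates = run_puppet("anger")
# Longitudinal measure: share of preference-matching videos per slate.
share = [sum(v["emotion"] == "anger" for v in s) / len(s) for s in slates]
print(share)
```

In a real audit, `recommend` would be replaced by browser automation against the live platform, and the per-session `share` trajectories (compared against a no-history contextual baseline) would quantify the reinforcement effect.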
Problem

Research questions and friction points this paper is trying to address.

YouTube recommendation system
emotional polarization
negative content
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sock-Puppet Audit Method
YouTube Recommendation Algorithm
Emotional Impact