🤖 AI Summary
This study presents the first systematic investigation of how cross-site tracking influences YouTube's recommendation of political and misinformation content, addressing a critical gap in prior research, which focused solely on within-platform viewing history. Using a "sock puppet" experimental framework, the authors simulate news-browsing behavior in browser environments with cross-site tracking either enabled or disabled, then collect and categorize the recommended videos. The findings reveal that cross-site tracking significantly amplifies exposure to political and false content, and that mainstream privacy-focused browsers offer limited protection against this effect. The work expands the scope of algorithmic auditing to include cross-site tracking and exposes substantial shortcomings in current privacy protection mechanisms.
📝 Abstract
YouTube has become the primary news source for many users today, raising concerns about the role its recommendation algorithm can play in the spread of misinformation and political polarization. Prior work in this area has mainly analyzed how recommendations evolve based on users' watch history within the platform. However, recommendations can also depend on off-platform browsing activity that Google collects via trackers embedded in news websites, a factor that has not been considered so far. To fill this gap, we propose a sock-puppet-based experimental framework that automatically interacts with news media articles and then collects YouTube recommendations, measuring how cross-site tracking affects the political and misinformation content users see. Moreover, by running our audits in both tracking-permissive and tracking-restrictive browser environments, we assess whether common privacy-focused browsers can protect users from tracking-driven political and misinformation bubbles on YouTube.
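The two-condition audit design described above can be sketched in outline as follows. This is an illustrative sketch only, not the authors' actual implementation: the condition names, the content labels, and the `exposure_rate` / `tracking_effect` helpers are all hypothetical, and the real pipeline would drive actual browser instances and scrape live recommendations.

```python
from dataclasses import dataclass
from collections import Counter


@dataclass(frozen=True)
class Condition:
    """One sock-puppet browser environment (hypothetical naming)."""
    name: str
    cross_site_tracking: bool  # True = tracking-permissive, False = restrictive


# Each sock puppet browses the same fixed set of news articles under
# one of these conditions before its YouTube recommendations are collected.
CONDITIONS = [
    Condition("tracking-permissive", True),
    Condition("tracking-restrictive", False),
]


def exposure_rate(labels: list[str], target: str) -> float:
    """Fraction of collected recommendations labeled as `target`
    (e.g. 'political' or 'misinformation')."""
    counts = Counter(labels)
    total = sum(counts.values())
    return counts[target] / total if total else 0.0


def tracking_effect(permissive_labels: list[str],
                    restrictive_labels: list[str],
                    target: str) -> float:
    """Difference in exposure between the two conditions: a positive
    value means tracking increased exposure to `target` content."""
    return (exposure_rate(permissive_labels, target)
            - exposure_rate(restrictive_labels, target))
```

In a full audit, steps (1) browsing news sites, (2) scraping recommended videos, and (3) labeling each video would replace the hand-supplied label lists; the comparison logic between conditions would remain as above.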