Examining Racial Stereotypes in YouTube Autocomplete Suggestions

📅 2024-10-04
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates how YouTube's search autocomplete system reinforces racial stereotypes and discrimination. Using a five-dimensional sociocultural framework (Appearance, Ability, Culture, Social Equity, and Manner), the research introduces the concept of "aggregated discrimination" to characterize how the algorithm compounds multiple biases across four racial groups (Asian, Black, Latinx, and White). Combining an algorithm output audit with critical discourse analysis (CDA), the study shows that autocomplete intensifies racial othering, exposes interracial tensions, and perpetuates structural inequities: query suggestions disproportionately associate minority groups with negative or reductive attributes. The work contributes a practical bias-detection approach for platform governance along with concrete policy intervention pathways, and the authors present it as the first empirical study to systematically deconstruct racialized autocomplete mechanisms on a major video-sharing platform.

📝 Abstract
Autocomplete is a popular search feature that predicts queries based on user input and guides users to a set of potentially relevant suggestions. In this study, we examine how YouTube autocompletes serve as an information source for users exploring information about race. We perform an algorithm output audit of autocomplete suggestions for input queries about four racial groups and examine the stereotypes they embody. Using critical discourse analysis, we identify five major sociocultural contexts in which racial biases manifest -- Appearance, Ability, Culture, Social Equity, and Manner. Our results show evidence of aggregated discrimination and interracial tensions in the autocompletes we collected and highlight their potential risks in othering racial minorities. We call for urgent innovations in content moderation policy design and enforcement to address these biases in search outputs.
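The algorithm output audit described above can be sketched programmatically: collect autocomplete suggestions for prefixes naming each racial group, then tally the terms that co-occur with each group as a first quantitative pass before qualitative (CDA) coding. The sketch below is not the authors' code; the endpoint URL and its parameters are assumptions based on the widely used, unofficial Google/YouTube suggest API, and the sample suggestion strings are invented placeholders, not real outputs.

```python
# Minimal sketch of an autocomplete output audit.
# ASSUMPTION: the unofficial suggest endpoint and its parameters below
# are not documented by Google and may change or be rate-limited.
import json
import urllib.parse
import urllib.request
from collections import Counter

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def fetch_suggestions(prefix: str) -> list[str]:
    """Fetch autocomplete suggestions for a query prefix (network call)."""
    params = urllib.parse.urlencode(
        {"client": "firefox", "ds": "yt", "q": prefix}
    )
    with urllib.request.urlopen(f"{SUGGEST_URL}?{params}") as resp:
        # Expected response shape: [prefix, [suggestion, ...], ...]
        return json.loads(resp.read().decode("utf-8"))[1]

def audit(suggestions_by_group: dict[str, list[str]]) -> dict[str, Counter]:
    """Tally terms co-occurring with each group's query prefix."""
    tallies: dict[str, Counter] = {}
    for group, suggestions in suggestions_by_group.items():
        terms: Counter = Counter()
        for suggestion in suggestions:
            for token in suggestion.lower().split():
                if token != group.lower():  # drop the group name itself
                    terms[token] += 1
        tallies[group] = terms
    return tallies

# Offline example with made-up placeholder suggestions:
sample = {
    "groupA": ["groupA food", "groupA music", "groupA food recipes"],
    "groupB": ["groupB music", "groupB history"],
}
result = audit(sample)
print(result["groupA"]["food"])  # "food" appears in 2 of groupA's suggestions
```

Frequency tallies like these only surface candidate terms; assigning them to the five sociocultural contexts still requires the paper's qualitative discourse analysis.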
Problem

Research questions and friction points this paper is trying to address.

Analyzing racial stereotypes in YouTube autocomplete
Identifying biases in autocomplete suggestions by race
Highlighting risks of discrimination in search outputs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Algorithm output audit
Critical discourse analysis
Content moderation policy