🤖 AI Summary
This study identifies a severe WEIRD (Western, Educated, Industrialized, Rich, Democratic) bias in algorithmic auditing research: over 60% of studies focus exclusively on the U.S., English-language contexts, and a narrow set of platforms, while 85% examine only reductive demographic attributes (e.g., race, gender), neglecting structural discrimination and non-Western sociotechnical settings. Through a systematic literature review (SLR) of 176 peer-reviewed papers, the authors perform metadata coding, geolinguistic distribution analysis, thematic clustering, and bias mapping, quantitatively confirming systemic imbalances in platform selection, linguistic coverage, geographic representation, and the operationalization of group attributes. The key contribution is a proposed "Inclusive Algorithmic Auditing" framework that advocates multilingual, multicentric, and multidimensional approaches to auditing, including structural and intersectional attributes, and provides an actionable roadmap for transnational collaboration. This work advances a paradigm shift toward globally representative, contextually embedded, and socially accountable algorithmic fairness research.
📝 Abstract
The increasing reliance of online platforms on complex algorithmic systems has sparked a growing need for algorithm auditing, a methodology for evaluating these systems' functionality and impact. In this paper, we systematically review 176 peer-reviewed algorithm auditing studies focused on online platforms and identify trends in their methodological approaches, the geographic distribution of authors, and the platforms, languages, geographies, and group-based attributes examined by the reviewed research. We find a significant skew of research focus towards a few online platforms, Western contexts (particularly the US), and English-language data. Additionally, our analysis indicates a tendency to focus on a narrow set of group-based attributes, often operationalized in simplified ways that may obscure more nuanced aspects of algorithmic bias and discrimination. We provide a clearer understanding of the current state of online platform-focused algorithm auditing and identify gaps that must be addressed to achieve a more inclusive and representative research landscape.