🤖 AI Summary
This study examines how "folk algorithmic theories" (informal, practitioner-derived understandings of opaque platform algorithms) become institutionalized as governance tools by intermediary organizations in platform labor. Drawing on nine months of ethnographic fieldwork and 37 in-depth interviews with livestreamers and MCN (Multi-Channel Network) staff in China's livestreaming industry, and grounded in CSCW and labor sociology frameworks, we identify a systematic appropriation of grassroots algorithmic knowledge: MCNs internally operationalize probabilistic risk-management strategies while externally disseminating simplified, standardized algorithmic discourses that foster self-discipline and intensified resource investment among livestreamers. Our work is the first to demonstrate how folk algorithmic knowledge can be formalized by intermediaries into a soft infrastructural control mechanism, enabling responsibility displacement and the moralization of labor, and thereby transforming informal cognition into structural governance. This advances theoretical paradigms in platform labor studies and algorithmic governance.
📝 Abstract
As algorithmic systems increasingly structure platform labor, workers often rely on informal "folk theories", experience-based beliefs about how algorithms work, to navigate opaque and unstable algorithmic environments. Prior research has largely treated these theories as bottom-up, peer-driven strategies for coping with algorithmic opacity and uncertainty. In this study, we shift analytical attention to intermediary organizations and examine how folk theories of algorithms can be institutionally constructed and operationalized by those organizations as tools of labor management. Drawing on nine months of ethnographic fieldwork and 37 interviews with live-streamers and staff at Multi-Channel Networks (MCNs) in China, we show that MCNs develop and circulate dual algorithmic theories: internally, they acknowledge the volatility of platform systems and adopt probabilistic strategies to manage risk; externally, they promote simplified, prescriptive theories portraying the algorithm as transparent, fair, and responsive to individual effort. They further operationalize these folk theories for labor management, encouraging streamers to self-discipline and to invest in equipment, training, and routines, while absolving MCNs of accountability. We contribute to the CSCW and platform labor literature by demonstrating how informal algorithmic knowledge, once institutionalized, can become an infrastructure of soft control, shaping not only how workers interpret platform algorithms, but also how their labor is structured, moralized, and governed.