🤖 AI Summary
This study conducts a systematic platform audit of TikTok’s “Kids Mode” (the “Under 13 Experience”) for U.S. children under 13, exposing critical deficiencies in age-appropriateness, safety safeguards, and privacy compliance. Employing a hybrid methodology that couples automated data collection with expert human annotation, the authors perform fine-grained content classification and functional-completeness evaluation of recommended videos against COPPA requirements. The empirical analysis reveals that 83% of recommended videos are not child-directed, with widespread exposure to non-age-appropriate and potentially harmful material. Critically, the mode lacks parental controls and accessibility features, increasing the risk that children revert to TikTok’s unrestricted regular mode. The paper introduces a reusable auditing framework for children’s digital products, grounded in regulatory standards and empirical validation, providing actionable, evidence-based insights to inform policymaking, enforcement of child online safety regulations (e.g., COPPA), and platform accountability mechanisms.
📝 Abstract
TikTok, a social media platform popular among children and adolescents, offers a more restrictive "Under 13 Experience" exclusively for young users in the US, also known as TikTok's "Kids Mode". While prior research has studied various aspects of TikTok's regular mode, including privacy and personalization, Kids Mode remains understudied, and there is little transparency regarding its content curation or its safety and privacy protections for children. In this paper, (i) we propose an auditing methodology to comprehensively investigate TikTok's Kids Mode, and (ii) we apply it to characterize the platform's content curation and determine the prevalence of child-directed content, based on regulations in the Children's Online Privacy Protection Act (COPPA). We find that 83% of videos observed on the "For You" page in Kids Mode are not actually child-directed, and some are even inappropriate. The platform also lacks critical features, namely parental controls and accessibility settings. Our findings have important design and regulatory implications: children may be incentivized to use TikTok's regular mode instead of Kids Mode, where they are known to face further safety and privacy risks.