The DSA's Blind Spot: Algorithmic Audit of Advertising and Minor Profiling on TikTok

📅 2026-03-05
🤖 AI Summary
This study addresses a regulatory gap in the EU Digital Services Act (DSA) Article 28(2), which prohibits profiling-based advertising targeting minors but adopts a narrow definition of “advertising” that excludes undisclosed influencer marketing. Through an algorithmic audit on TikTok, the authors deployed simulated minor and adult accounts, combined with automated content annotation and statistical analysis, to empirically demonstrate that—despite nominal compliance—minors are exposed to substantial volumes of undisclosed commercial content driven by high-intensity interest-based profiling. The intensity of such profiling reaches five to eight times that of formal advertisements shown to adults. The findings underscore the need to broaden the legal definition of “advertising” to encompass emerging forms of commercial content, thereby closing critical enforcement loopholes in digital platform regulation.

📝 Abstract
Adolescents spend an increasing amount of their time in digital environments where their still-developing cognitive capacities leave them unable to recognize or resist commercial persuasion. Article 28(2) of the Digital Services Act (DSA) responds to this vulnerability by prohibiting profiling-based advertising to minors. However, the regulation's narrow definition of "advertisement" excludes current advertising practices, including influencer marketing and promotional content, that serve functionally equivalent commercial purposes. We provide the first empirical evidence of how this definitional gap operates in practice through an algorithmic audit of TikTok. Our approach deploys sock-puppet accounts simulating paired minor and adult users with distinct interest profiles. The content recommended to these users is automatically annotated, enabling systematic statistical analysis across four video categories (formal, disclosed, undisclosed, and no advertisement), as well as each advertisement's topical relevance to the user's interests. Our findings reveal a stark regulatory paradox. TikTok demonstrates formal compliance with Article 28(2) by shielding minors from profiled formal advertisements, yet both disclosed and undisclosed ads exhibit significant profiling aligned with user interests (5-8 times stronger than for adult formal advertising). The strongest profiling emerges within undisclosed commercial content, where brands and creators fail to label promotional content or paid partnerships, and the platform neither corrects this omission nor prevents its personalized delivery to minors. We argue that protecting minors requires expanding the regulatory definition of advertisement to encompass brand and influencer marketing and extending the Article 28(2) prohibition accordingly, ensuring that commercial content cannot circumvent protections merely by operating outside formal advertising channels.
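The abstract's core comparison (how strongly recommended commercial content aligns with a user's seeded interest, for a minor versus an adult account) can be sketched as a two-proportion test over annotated video counts. This is a minimal illustration only: the counts, the function names, and the choice of a z-test are assumptions for the sketch, not the paper's actual data or statistical procedure.

```python
# Hypothetical sketch: compare the rate of interest-aligned commercial videos
# recommended to a minor account vs. formal ads shown to an adult account.
# All counts below are illustrative placeholders, NOT the paper's data.
from math import sqrt

def alignment_rate(aligned: int, total: int) -> float:
    """Fraction of annotated commercial videos matching the seeded interest."""
    return aligned / total

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Standard two-proportion z-statistic for comparing alignment rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Illustrative counts (placeholders): 80/200 undisclosed ads aligned for the
# minor profile, 10/200 formal ads aligned for the adult profile.
minor_undisclosed = alignment_rate(80, 200)
adult_formal = alignment_rate(10, 200)

ratio = minor_undisclosed / adult_formal  # "profiling intensity" ratio
z = two_proportion_z(minor_undisclosed, 200, adult_formal, 200)
print(f"intensity ratio: {ratio:.1f}x, z = {z:.2f}")
```

Under these placeholder counts the ratio lands at 8x, i.e. the upper end of the 5-8x range the study reports; the real analysis would repeat such comparisons across all four video categories and both account types.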
Problem

Research questions and friction points this paper is trying to address.

Digital Services Act
minor profiling
advertising regulation
TikTok
influencer marketing
Innovation

Methods, ideas, or system contributions that make the work stand out.

algorithmic audit
minor profiling
advertising regulation
TikTok
Digital Services Act
Sara Solarova
Kempelen Institute of Intelligent Technologies
Matej Mosnar
Kempelen Institute of Intelligent Technologies
Matus Tibensky
Kempelen Institute of Intelligent Technologies
Jan Jakubcik
Kempelen Institute of Intelligent Technologies
Adrian Bindas
Kempelen Institute of Intelligent Technologies
Simon Liska
Kempelen Institute of Intelligent Technologies
Filip Hossner
Kempelen Institute of Intelligent Technologies
Matúš Mesarčík
Docent, Univerzita Komenského v Bratislave
law, technology, privacy protection, personal data protection
Ivan Srba
Kempelen Institute of Intelligent Technologies
AI, Machine Learning, Natural Language Processing, Social Computing, Disinformation