How to Stop Playing Whack-a-Mole: Mapping the Ecosystem of Technologies Facilitating AI-Generated Non-Consensual Intimate Images

📅 2026-02-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the growing misuse of AI-generated non-consensual intimate imagery (AIG-NCII), a problem whose governance remains fragmented and reactive in the absence of a unified analytical framework. It constructs the first comprehensive technical ecosystem of AIG-NCII, systematically identifying eleven key technologies spanning generation, dissemination, infrastructure, and monetization. Through an interdisciplinary synthesis of academic literature, policy documents, media reports, and technical community insights, complemented by case studies and ecosystem modeling, the research operationalizes this framework in real-world contexts, including the Grok case. It advances three actionable recommendations: mapping the legal landscape, establishing a dynamic technical database, and adopting a relational research perspective. Together, these measures aim to shift governance from ad hoc responses toward systematic, anticipatory intervention.

📝 Abstract
The last decade has witnessed a rapid advancement of generative AI technology that significantly scaled the accessibility of AI-generated non-consensual intimate images (AIG-NCII), a form of image-based sexual abuse that disproportionately harms women and girls. There is a patchwork of commendable efforts across industry, policy, academia, and civil society to address AIG-NCII. However, these efforts lack a shared, consistent mental model that situates the technologies they target within the context of a large, interconnected, and ever-evolving technological ecosystem. As a result, interventions remain siloed and are difficult to evaluate and compare, leading to a reactive cycle of whack-a-mole. We contribute the first comprehensive AIG-NCII technological ecosystem that maps and taxonomizes 11 categories of technologies facilitating the creation, distribution, proliferation and discovery, infrastructural support, and monetization of AIG-NCII. First, we build and visualize the ecosystem through a synthesis of over a hundred primary sources from researchers, journalists, advocates, policymakers, and technologists. Next, we demonstrate how stakeholders can use the ecosystem as a tool to 1) understand new incidents of harm via a case study of Grok and 2) evaluate existing interventions via three more case studies. We conclude with three actionable recommendations, namely that stakeholders should 1) use the ecosystem to map out state, federal, and international laws to produce a clearer policy landscape, 2) collectively develop a database that dynamically tracks the 11 technologies in the ecosystem to better evaluate interventions, and 3) adopt a relational approach to researching AIG-NCII to better understand how the ecosystem technologies interact.
Problem

Research questions and friction points this paper is trying to address.

AI-generated non-consensual intimate images, technological ecosystem, image-based sexual abuse, generative AI, intervention evaluation
Innovation

Methods, ideas, or system contributions that make the work stand out.

AI-generated non-consensual intimate images, technological ecosystem, taxonomy, systemic intervention, relational approach