Stop the Nonconsensual Use of Nude Images in Research

📅 2025-10-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper exposes widespread ethical violations in machine learning research involving the nonconsensual collection, processing, and dissemination of nude imagery, practices that exacerbate the risk of image-based sexual abuse. Through a systematic literature review of nudity detection papers published at top-tier computer vision conferences (CVPR, ICCV, ECCV), the authors identify high-risk practices, including the distribution of datasets without informed consent, failure to anonymize faces, and misuse of authentic victim imagery. They propose a data ethics paradigm centered on user autonomy, advocating for mandatory image-use review mechanisms and enforceable academic publishing ethics standards. The findings have catalyzed scholarly reflection on training data provenance and directly influenced ACM and IEEE to initiate revisions of their data ethics guidelines. The work bridges technical practice and normative accountability, establishing foundational principles for ethically grounded computer vision research.

📝 Abstract
In order to train, test, and evaluate nudity detection models, machine learning researchers typically rely on nude images scraped from the Internet. Our research finds that this content is collected and, in some cases, subsequently distributed by researchers without consent, leading to potential misuse and exacerbating harm against the subjects depicted. This position paper argues that the distribution of nonconsensually collected nude images by researchers perpetuates image-based sexual abuse and that the machine learning community should stop the nonconsensual use of nude images in research. To characterize the scope and nature of this problem, we conducted a systematic review of papers published in computing venues that collect and use nude images. Our results paint a grim reality: norms around the usage of nude images are sparse, leading to a litany of problematic practices like distributing and publishing nude images with uncensored faces, and intentionally collecting and sharing abusive content. We conclude with a call-to-action for publishing venues and a vision for research in nudity detection that balances user agency with concrete research objectives.
Problem

Research questions and friction points this paper addresses.

- Researchers collect nude images without consent to train, test, and evaluate machine learning models
- Nonconsensual distribution of nude images perpetuates image-based sexual abuse
- Current practices lack ethical norms and enable the distribution of harmful content
Innovation

Methods, ideas, or system contributions that make the work stand out.

- A call to action to stop the nonconsensual use of nude images in research
- A systematic review of papers in computing venues that collect and use nude images
- A vision for nudity detection research that balances user agency with concrete research objectives