
VICTIMIZATION


Deepfake Nudes & Young People Navigating a new frontier in technology-facilitated nonconsensual sexual abuse and exploitation

By Thorn in partnership with Burson Insights, Data & Intelligence 

Since 2019, Thorn has focused on amplifying youth voices to better understand their digital lives, with particular attention to how they encounter and navigate technology-facilitated forms of sexual abuse and exploitation. Previous youth-centered research has explored topics such as child sexual abuse material (CSAM), including self-generated CSAM ("SG-CSAM"), nonconsensual resharing, online grooming, and the barriers young people face in disclosing or reporting negative experiences. Thorn's Emerging Threats to Young People research series aims to examine emergent online risks to better understand how current technologies create and/or exacerbate child safety vulnerabilities, and to identify areas where solutions are needed. This report, the first in the series, sheds light specifically on young people's perceptions of and experiences with deepfake nudes. Future reports in this initiative will address other pressing issues, including sextortion and online solicitations.

Drawing on responses from a survey of 1,200 young people aged 13-20, this report explores their awareness of deepfake nudes, their lived experiences with them, and their involvement in creating such content. Three key findings emerged from this research:

1. Young people overwhelmingly recognize deepfake nudes as a form of technology-facilitated abuse that harms the person depicted. Eighty-four percent of those surveyed believe that deepfake nudes cause harm, attributing this largely to the emotional and psychological impacts on victims, the potential for reputational damage, and the increasingly photorealistic quality of the imagery, which leads viewers to perceive, and consume, it as authentic.

2. Deepfake nudes already represent real experiences that young people have to navigate. Not only are many young people familiar with the concept, but a significant number report personal connections to this harm, either knowing someone targeted or having experienced it themselves. Forty-one percent of young people surveyed indicated they had heard the term "deepfake nudes," including 1 in 3 (31%) teens. Additionally, among teens, 1 in 10 (10%) reported personally knowing someone who had deepfake nude imagery created of them, and 1 in 17 (6%) disclosed having been a direct victim of this form of abuse.

3. Among the limited sample of young people who admitted to creating deepfake nudes of others, creators described easy access to deepfake technologies through their devices' app stores, as well as via general search engines and social media.

El Segundo, CA: Thorn, 2025. 32p.