About one in 10 children said they had friends who used generative artificial intelligence (AI) systems to create nudes of other kids.
That is according to a new report from Thorn, a non-profit organization that fights child sexual abuse. Thorn surveyed 1,040 children between the ages of nine and 17 in the United States, and 10%, or 104, of those surveyed said they knew of friends or classmates who had used generative AI to create deepfake nudes of their peers.
Broken down by age group, 12% of children ages nine to 12 knew of friends who had used AI to generate nudes of other kids, compared with 10% of children ages 13 to 17. However, it is worth noting that 8% of kids ages nine to 12 and 11% of those ages 13 to 17 said they would rather not answer the question.
Children in the US Share Self-Generated Nudes
Beyond deepfake nudes, the survey also found that 23% of minors believe it is normal for people their age to share nudes with each other. Among those who had shared self-generated child sexual abuse material (SG-CSAM), 83% said they shared it with someone they knew offline, while 46% said they sent it to a person they had met and knew only online, per the report.
Among minors who shared SG-CSAM, 69% said they were more likely to share nudes with a person they believed to be under the age of 18, while 34% said they were more likely to send them to an adult.
Children at Risk of Sextortion
Children who share SG-CSAM are at higher risk of falling victim to sextortion, in which a predator typically threatens to expose the victim's sexually explicit photos or videos unless the victim complies with demands, such as sending more nude photos or videos.
Victims of sextortion often suffer emotional and psychological trauma. Between October 2021 and March 2023, the FBI reported at least 12,600 victims of sextortion, primarily boys; at least 20 of them died by suicide.