Nude deepfakes created using artificial intelligence are becoming an increasingly prevalent issue in schools across the country.
Students are using social media apps to share fake, sexually explicit photos of their classmates. These children, usually boys, are taking ordinary photos of girls from social media and using AI “nudify” tools to transform them into sexual images, according to a WIRED investigation.
Incidents have been reported at almost 90 schools across 28 countries, affecting at least 600 students.
The findings show that since 2023, schoolchildren—most often boys in high schools—in at least 28 countries have been accused of using generative AI to target their classmates with sexualized deepfakes. The explicit imagery, containing minors, is considered to be child sexual abuse material (CSAM). This analysis is believed to be the first to review real-world cases of AI deepfake abuse taking place at schools globally.
Taken together, the analysis shows the worldwide reach of harmful AI nudification technology, which can earn its creators millions of dollars per year, and that in many incidents schools and law enforcement officials were unprepared to respond to serious sexual abuse.
Across North America, there have been nearly 30 reported deepfake sexual abuse cases since 2023—including one with more than 60 alleged victims, one where the victim was temporarily expelled from school, and others where pupils at multiple schools have allegedly been targeted simultaneously. More than 10 cases have been publicly reported in South America, more than 20 across Europe, and another dozen in Australia and East Asia combined.
Students can spread these images quickly online and among their classmates, leaving victims feeling humiliated and afraid that they could follow them even into adulthood.
WIRED noted that many schools and police departments are not prepared to deal with this type of abuse, which is becoming increasingly common. It warned that the emotional damage can be severe and that the images are considered child sexual abuse material when minors are involved.
The advent of artificial intelligence has made these tools faster and easier to use. Their growing availability means students can create harmful deepfakes with very little effort. Some schools have responded by changing their policies on sharing student photos.
The National Center for Missing and Exploited Children said reports to its CyberTipline involving AI-generated child sexual abuse images surged from 4,700 in 2023 to 67,000 in 2024, and then 440,000 in just the first half of 2025. Many of these cases involve students whose ordinary social media photos were turned into fake nudes without their consent.
The trauma these photos inflict can persist indefinitely, because they can be saved, reshared, and rediscovered long after adults believe the problem has abated. The National Education Association issued a report suggesting that 40 to 50 percent of students are aware of deepfakes circulating at their schools.
State lawmakers are playing catch-up, updating criminal statutes and school policies. But they are often forced to take reactive measures rather than proactive ones. By 2025, more than half of U.S. states had passed laws aimed at punishing those who use AI to create realistic fake images and audio.
In Iowa, four boys were charged in juvenile court for using AI to create fake nude images of 44 girls from social media photos. The incident prompted the district to work on new policies.