States confront surge in AI deepfake child abuse
An Associated Press report details how U.S. schools and law enforcement are confronting a rapid rise in students using AI tools to turn classmates' photos into sexually explicit deepfakes, prompting new state laws and criminal prosecutions. The article highlights a Louisiana middle-school case believed to be the first charged under the state's new deepfake statute and notes that at least half of U.S. states passed deepfake-related laws in 2025. It also cites National Center for Missing and Exploited Children data showing that reports of AI-generated child sexual abuse images exploded from 4,700 in 2023 to 440,000 in just the first six months of 2025.
📌 Key Facts
- In fall 2025, AI-generated nude images swept through a Louisiana middle school, leading to charges against two boys and the expulsion of a girl who fought a boy she believed had created the images.
- Louisiana’s prosecution is believed to be the first under a new state law authored by Republican state Sen. Patrick Connick targeting AI deepfakes, one of many such statutes passed as at least half of U.S. states enacted generative-AI laws in 2025.
- The National Center for Missing and Exploited Children reports AI-generated child sexual abuse images reported to its cyber tipline jumped from 4,700 in 2023 to 440,000 in the first six months of 2025.
- Students have faced criminal charges in Florida and Pennsylvania and expulsions in California for similar deepfake cases, and a fifth-grade teacher in Texas was charged with using AI to create child pornography of his students.
- Experts quoted in the article warn that deepfake tools are now easily accessible via apps with no technical expertise, and say many schools and parents are unprepared, lacking clear policies and education around AI-generated abuse.
📊 Relevant Data
- In a 2024 survey, 6% of U.S. teens aged 13-17 reported having deepfake nudes created of them by someone else.
- Creators of deepfake nudes targeted females in 74% of cases, according to a 2024 survey of young people.
- LGBTQ+ teens were more likely to know someone targeted by deepfake nudes (18%) than non-LGBTQ+ peers, based on 2024 data.
- Less than a quarter of teachers reported that their schools had released policies addressing deepfakes depicting sexually explicit imagery without consent, according to a 2025 survey.
- 36% of surveyed high school students reported an issue in their school involving deepfakes during the 2024-25 school year, per K-12 Dive ("Risks from AI use are growing alongside its popularity in schools").