The Deepfake Nudes Crisis in Schools Is Worse Than You Thought
- #Digital Abuse
- #AI Deepfakes
- #Child Safety
- Teenage boys globally are using 'nudify' apps to create and share fake nude images of female classmates, causing severe emotional distress.
- Since 2023, AI-generated deepfake sexual abuse has been reported in at least 28 countries, affecting over 600 students, though many cases go unreported.
- Incidents often involve boys sharing the content via social media, leaving victims facing humiliation, avoiding school, and fearing the images will follow them for life.
- Schools and law enforcement are frequently unprepared, leading to inconsistent responses, while some victims and families take legal or advocacy actions.
- Perpetrators' motivations range from sexual gratification to deliberate humiliation; countermeasures include policy changes, school staff training, and bans on nudification apps.