Hasty Briefs (beta)

Elon Musk's xAI sued for turning three girls' real photos into AI CSAM

4 hours ago
  • #CSAM
  • #Grok
  • #Elon Musk
  • An anonymous Discord tip led to the discovery of Grok-generated child sexual abuse material (CSAM).
  • Elon Musk had previously denied that Grok generated CSAM, despite evidence from researchers.
  • Researchers found roughly 23,000 sexualized images of children among 3 million images generated by Grok.
  • xAI restricted Grok access to paying subscribers to limit circulation of harmful outputs.
  • A researcher found nearly 10% of 800 Grok Imagine outputs appeared to include CSAM.
  • Musk dismissed the reports, claiming he was unaware of any underage nude images produced by Grok.
  • A Discord user alerted one of the victims, prompting law enforcement involvement.
  • Three Tennessee girls filed a class-action lawsuit against Musk, accusing him of profiting from child exploitation.
  • The lawsuit seeks an injunction to stop Grok's harmful outputs and damages for affected minors.