Elon Musk's Grok 'Undressing' Problem Isn't Fixed
4 months ago
- #AI Ethics
- #Nonconsensual Imagery
- #Content Moderation
- X (formerly Twitter) has introduced new restrictions on Grok's image generation to prevent it from generating or editing images of real people in revealing clothing, following global outrage over nonconsensual 'undressing' photos.
- Despite X's restrictions, the standalone Grok app and website still allow the generation of 'undress'-style images and pornographic content, as confirmed by researchers and journalists.
- Elon Musk's companies, including xAI, X, and Grok, have faced international condemnation and investigations for enabling the creation of nonconsensual intimate imagery and sexualized content of minors.
- X claims to have implemented geoblocking that prevents generating images of real people in bikinis or similar attire in jurisdictions where doing so is illegal, alongside efforts to remove violative content such as child sexual abuse material (CSAM).
- Since January 9, only verified subscribers on X can generate images using Grok, a move criticized as monetizing abuse, while the Grok website and app remain less restricted.
- Elon Musk stated that Grok allows upper body nudity of imaginary adults (not real people) in regions where it's legal, aligning with R-rated movie standards.
- Users have attempted to bypass Grok's moderation to create explicit content, with mixed success: on a pornography forum dedicated to generating explicit content with Grok Imagine, some users report stricter moderation recently, while others still find ways to generate nudity.