X to stop Grok AI from undressing images of real people after backlash
4 months ago
- #Deepfake Regulation
- #AI Ethics
- #Social Media Policy
- Following the backlash, Grok AI will no longer edit photos of real people to show them in revealing clothing.
- X has implemented technical measures to block such edits; the restriction applies to all users, including paid subscribers.
- California's top prosecutor announced a probe into the spread of sexualized AI deepfakes, including those of children, generated by Grok.
- Separately, only paid subscribers will be able to edit images with Grok on X, which X describes as an extra layer of protection against misuse.
- Grok will block users from generating images of real people in bikinis, underwear, and similar clothing, depending on the laws of each jurisdiction.
- With NSFW settings enabled, Grok still permits upper-body nudity of fictional adult characters, consistent with R-rated films, with availability varying by region.
- Elon Musk defended X, accusing critics of suppressing free speech, while himself sharing AI-generated images of the UK Prime Minister in a bikini.
- Malaysia and Indonesia banned Grok AI after users reported non-consensual, sexually explicit alterations of their images.
- The UK regulator Ofcom is investigating whether X violated UK law over the sexualized images, and some UK MPs have left the platform.
- California's Attorney General highlighted the misuse of AI-generated explicit material to harass women and children online.
- Policy researcher Riana Pfefferkorn criticized X's delayed response and Musk's handling of the situation.