Tech firms will have 48 hours to remove abusive images under new law
6 days ago
- #UK-law
- #tech-regulation
- #online-safety
- UK proposes law requiring tech platforms to remove non-consensual intimate images within 48 hours.
- Failure to comply could result in fines of up to 10% of global revenue, or the service being blocked in the UK.
- Victims need to flag an image only once; tech companies must then prevent it from being re-uploaded.
- The law aims to treat intimate image abuse with the same severity as child sexual abuse and terrorist content.
- Women, girls, and LGBT people are disproportionately affected by intimate image abuse.
- Reports show a 20.9% increase in intimate image abuse cases in 2024 compared with the previous year.
- Tech companies will be held accountable, with fines and oversight bodies enforcing the law.
- The legislation also targets AI-generated deepfake intimate images, making them illegal in the UK.