West Virginia sues Apple over child sex abuse material stored on iCloud

3 days ago
  • #Child Safety
  • #Apple
  • #iCloud
  • West Virginia's attorney general sued Apple for allegedly allowing iCloud to distribute child sexual abuse material (CSAM).
  • Apple is accused of declining to deploy tools that scan for and detect CSAM in iCloud, allegedly prioritizing privacy over child safety.
  • In a 2020 internal message, an Apple executive described the company as 'the greatest platform for distributing child porn.'
  • In 2024, victims of child sexual exploitation sued Apple seeking $1.2 billion in damages; that case is ongoing.
  • The UK's NSPCC accused Apple of underreporting CSAM in its products, with police data tying Apple to more cases in England and Wales alone than the company reported globally.
  • Apple files far fewer CSAM reports with the US National Center for Missing & Exploited Children (NCMEC) than Google or Meta do.
  • Apple considered scanning iCloud images but abandoned the plan, citing privacy concerns and the risk that governments could repurpose the scanning capability.
  • Apple announced NeuralHash, a perceptual-hash system for matching photos against known CSAM, in 2021, but scrapped it in 2022 after criticism from privacy and security researchers (a rough sketch of this style of hash matching follows this list).
  • Apple did implement Communication Safety, which blurs nude images in messages sent to or from children's devices, but it does not scan iCloud uploads.
  • Apple made only 267 CSAM reports in 2023, compared to millions by Google and Meta.
  • Apple is seeking dismissal of a related lawsuit, citing protections under Section 230 of the Communications Decency Act.
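
NeuralHash itself is proprietary and the brief does not describe its internals. Purely as an illustration of the general hash-matching approach such scanning systems use, here is a minimal sketch built on the classic average-hash algorithm rather than Apple's; the KNOWN_HASHES set, the matches_known helper, and the distance threshold are hypothetical stand-ins for a real database of vetted image fingerprints.

```python
from PIL import Image  # Pillow is assumed to be installed


def average_hash(path: str, size: int = 8) -> int:
    """Classic 'average hash': shrink to size x size, grayscale,
    then set one bit per pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # a 64-bit fingerprint for the default 8x8 size


def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits between two fingerprints.
    return bin(a ^ b).count("1")


# Hypothetical set of fingerprints of known flagged images.
KNOWN_HASHES = {0x81C3E7FF7EE3C381}


def matches_known(path: str, threshold: int = 5) -> bool:
    """Flag an image if its fingerprint is within `threshold` bits
    of any fingerprint in the known set (hypothetical threshold)."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

The only point of the sketch is that matching compares compact image fingerprints within a small bit-distance tolerance, never the images themselves; production systems match against millions of vetted hashes and use far more robust perceptual hashes than average-hash.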