Court filings allege Meta downplayed risks to children and misled the public
- #Meta
- #Social Media
- #Child Safety
- Meta had a '17x' strike policy for accounts flagged for sex trafficking, permitting up to 16 violations before an account was suspended.
- Meta allegedly downplayed risks to young users, including mental health issues and inappropriate adult interactions.
- Internal research suggested Meta's platforms were addictive, but the company publicly minimized these findings.
- Meta resisted safety changes like default private accounts for teens due to concerns over engagement metrics.
- The company allegedly targeted young users aggressively, including children under 13, despite legal restrictions.
- Meta executives shelved initiatives intended to reduce toxicity, such as hiding like counts and banning beauty filters, because of their projected negative impact on engagement and ad revenue.
- Harmful content, including self-harm material and child sexual abuse material, was not automatically removed even after Meta's systems detected it.
- Meta's internal communications revealed awareness of platform addiction, likening Instagram to a 'drug.'