TikTok 'directs child accounts to pornographic content within a few clicks'
- #TikTok
- #Child Safety
- #Online Safety Act
- TikTok directed children's accounts to pornographic content within a few clicks, according to a Global Witness report.
- Fake accounts set up with a 13-year-old's birth date and with 'restricted mode' enabled still received explicit search suggestions.
- Suggested terms included 'very very rude skimpy outfits' and escalated to 'hardcore pawn [sic] clips'.
- Pornographic content, including footage of women flashing and of penetrative sex, was reached after only a few clicks.
- Some content evaded moderation by embedding explicit clips within otherwise innocuous videos or images.
- Two videos featured individuals who appeared to be under 16 and were reported to the Internet Watch Foundation.
- Global Witness claims TikTok breached the UK's Online Safety Act by failing to protect children from harmful content.
- Under the Online Safety Act, Ofcom requires tech companies to filter harmful content from children's feeds.
- TikTok removed the offending videos and updated its search recommendations after being notified.