Hasty Briefs


Character.ai bans users under 18 after being sued over child's suicide

6 months ago
  • #Mental Health
  • #Teen Safety
  • #AI Regulation
  • Character.AI will ban users under 18 from conversing with its virtual companions starting in late November.
  • The decision follows legal scrutiny, including lawsuits over teen mental health impacts and a child's suicide linked to the platform.
  • Character.AI is introducing an 'age assurance' feature to ensure users receive age-appropriate experiences.
  • The company faces multiple lawsuits, including one from the family of a 14-year-old who died by suicide after forming an emotional attachment to a chatbot.
  • OpenAI also faces scrutiny, with reports of users displaying suicidal intent and psychosis during ChatGPT interactions.
  • California passed an AI law with safety guidelines for minors, banning sexual content and requiring chatbots to remind users that they are interacting with an AI.
  • A new federal bill proposes banning minors from using AI companions and mandates age verification.
  • Senator Josh Hawley emphasized the need for regulations to prevent harm from AI chatbots, citing the risks of fake empathy and suicide encouragement.