Hasty Briefs

Tesla Is Urging Drowsy Drivers to Use 'Full Self-Driving'. That Could Go Wrong

10 hours ago
  • #Driver Safety
  • #Tesla
  • #Autonomous Vehicles
  • Tesla's Full Self-Driving (FSD) feature, despite its name, requires constant driver supervision and attention to the road.
  • New in-car messages encourage drivers to activate FSD when drowsiness or lane drift is detected, contradicting safety guidelines and potentially promoting unsafe use.
  • Experts warn that urging drivers to lean on FSD at the very moments their attention is failing could lead to crashes, calling Tesla's messaging conflicting and counterproductive.
  • Research shows humans are poor passive supervisors of automated systems: vigilance degrades over time, and drivers respond more slowly when they need to retake control in critical situations.
  • Tesla has implemented safeguards such as driver-monitoring cameras and a strike system to curb inattention, but it faces criticism over the recent FSD prompts.
  • Tesla is under legal and regulatory scrutiny, including a $243 million liability ruling in a fatal crash case and accusations of misleading advertising about FSD capabilities.
  • CEO Elon Musk's ambitious goals for FSD, including a transition to unsupervised autonomy, face skepticism due to past unfulfilled promises and slow scaling of robotaxi services.
  • The broader automotive industry struggles to balance increasingly capable Level 2 driver-assistance systems against the risk of driver distraction and complacency, highlighting an industry-wide challenge.