Hasty Briefs (beta)


My Self-Driving Car Crash – The Tesla was driving perfectly, until it wasn't

3 hours ago
  • #self-driving cars
  • #human oversight
  • #AI accountability
  • A Tesla Model X in Full Self-Driving mode crashed into a wall; the driver and his kids were unharmed, but the car was totaled.
  • The driver, a former head of Uber's self-driving-car division, reflects on the "moral crumple zone": the pattern in which humans absorb blame for automated-system failures.
  • Tesla logs driver data (hand position, reaction time, eye tracking) and has used it after accidents to shift blame onto drivers.
  • Legal responsibility falls on drivers, as Tesla’s Full Self-Driving is classified as Level 2 automation, requiring human control.
  • A 2025 verdict held Tesla partly liable in a wrongful-death case, awarding $243 million, marking rare corporate accountability.
  • The vigilance decrement: monitoring a near-perfect system breeds complacency, so mental reengagement is delayed precisely when an emergency demands it.
  • AI and self-driving systems train users to trust them, eroding vigilance until a failure occurs, at which point humans bear the blame.
  • BYD's 2025 policy of covering damages from self-parking crashes shows corporate accountability is possible, and the author urges other companies to follow suit.
  • The article warns against the ‘almost autonomy’ trap, where systems appear reliable but shift risk and blame onto users.