Hasty Briefs (beta)


AI got the blame for the Iran school bombing. The truth is more worrying

15 hours ago
  • #AI-ethics
  • #bureaucratic-failure
  • #military-technology
  • American forces struck the Shajareh Tayyebeh primary school in Minab, Iran, killing 175-180 people, mostly young girls.
  • The system that selected the target was Maven, not Claude (an AI chatbot), despite media coverage fixating on chatbot involvement.
  • Maven, developed by Palantir, consolidates intelligence data to speed up military targeting decisions.
  • The school had been misclassified as a military facility in outdated intelligence databases, and the classification was never re-verified.
  • The 'kill chain' concept refers to the bureaucratic process from target detection to destruction, which Maven aims to compress.
  • Historical military targeting errors, like in Vietnam and Iraq, highlight systemic issues with relying on unverified data.
  • Maven reduces human deliberation in targeting, increasing speed but risking errors like the Minab strike.
  • The debate around AI (Claude) overshadowed deeper bureaucratic and human failures in the Minab tragedy.
  • Palantir's CEO, Alex Karp, promotes automation as a way to eliminate bureaucracy, but in practice it also removes critical human judgment from the process.
  • The Minab strike reflects a broader pattern of technological fanaticism in military decision-making.