Algorithm-based tool for home support funding is 'cruel' and 'inhumane'
- #government-policy
- #algorithm-controversy
- #aged-care
- Aged care clinicians criticize the algorithm-based Integrated Assessment Tool (IAT) as 'cruel' and 'inhumane', saying it sidelines clinical expertise.
- The IAT determines federal home support funding packages, often overriding human assessors' judgments, leading to inadequate support for elderly individuals.
- Mark Aitken, a nurse with 39 years of experience, quit his job over the tool's inflexibility and its inaccurate classification of clients' needs.
- Assessors are rarely allowed to override the IAT's decisions, even when they disagree with the classification of need.
- Reported misclassifications include an elderly woman with strong existing support being deemed high priority, while another with advanced dementia was classified as low need.
- The IAT is opaque about how it weighs risk, need, and complexity, leaving both assessors and the public in the dark.
- Independent MP Dr Monique Ryan criticizes the IAT for removing clinical judgment and nuance, calling it 'robo-aged-care'.
- Concerns about the IAT mirror past controversies like robodebt, with fears of unethical outcomes and lack of accountability.
- Aged care assessors have resorted to 'gaming' the system by inputting false information to secure necessary care levels.
- A spokesperson defended the IAT, stating it provides a 'holistic view' of an older person's needs, but critics remain unconvinced.