He Couldn't Land a Job Interview. Was AI to Blame?
17 hours ago
- #AI in Hiring
- #Algorithmic Bias
- #Medical Residency
- Chad Markey, a medical student with strong credentials, received rejections and no interview offers from residency programs, and came to suspect that an AI screening tool called Cortex was to blame.
- Markey's Medical Student Performance Evaluation (MSPE) described his leaves of absence as 'voluntary' and for personal reasons, which he felt misrepresented medically necessary absences caused by ankylosing spondylitis and could have triggered algorithmic bias against his application.
- He spent six months investigating how AI might interpret his application, writing code to analyze it, building synthetic datasets, and reverse-engineering a patent for a similar AI screening tool.
- After manually emailing programs about a new research publication, Markey received immediate interview offers, leading him to believe his application was initially overlooked by AI-driven screening.
- Reported issues with Cortex included inaccurate grade displays; Thalamus, the tool's maker, said the errors were minimal and had been corrected, and clarified that Cortex does not use AI to score or rank applicants.
- Markey successfully matched with Columbia University's psychiatry program but continued his research into AI bias, highlighting the lack of transparency and regulation in AI hiring tools.
- Background-check AI tools are regulated under the Fair Credit Reporting Act, which gives individuals some recourse; many AI screening systems used in hiring lack comparable oversight or a channel for individual complaints.