AI doubted a female developer's work until she switched profile to a white man
11 days ago
- #Machine Learning
- #AI Bias
- #Gender Discrimination
- A developer named Cookie experienced bias from Perplexity's AI when it doubted her expertise in quantum algorithms due to her gender.
- AI researchers warn that models can exhibit bias due to biased training data and annotation practices.
- Examples of AI bias include gender-based assumptions in professions and storytelling, such as portraying professors as men and students as women.
- AI models can infer user demographics like gender or race based on language patterns, leading to implicit biases.
- Studies show AI can discriminate based on dialects, assigning lesser job titles to speakers of African American Vernacular English (AAVE).
- OpenAI and other organizations are working to reduce bias in AI models through research and improved training practices.
- Researchers emphasize that AI models are not sentient and should be used with awareness of their limitations and potential biases.