AI-generated 'poverty porn' fake images being used by aid agencies
- #ethical representation
- #AI-generated imagery
- #poverty porn
- AI-generated images of extreme poverty, children, and sexual violence survivors are increasingly used by health NGOs, raising ethical concerns.
- These images often replicate and exaggerate stereotypes, such as children in muddy water or racialized depictions of poverty.
- The shift to AI-generated images is driven by cost savings and the desire to sidestep consent issues, especially amid NGO budget cuts.
- Stock photo sites like Adobe Stock and Freepik host dozens of such images, some with racialized and stereotypical captions.
- Freepik's CEO argues that responsibility lies with media consumers rather than platforms, though he says the company has made efforts to curb bias.
- Leading charities, including Plan International and the UN, have used AI-generated images in campaigns, sparking backlash.
- Critics warn that biased AI images could perpetuate stereotypes and amplify prejudice in future AI models.
- NGOs like Plan International have since adopted guidelines against using AI to depict individual children.