India's female workers watching hours of abusive content to train AI
- #AI labor
- #content moderation
- #mental health
- Monsumi Murmu, a 26-year-old content moderator in India, works from her village, classifying violent and disturbing content flagged by AI systems.
- The work involves viewing up to 800 videos and images daily, leading to emotional numbing and psychological trauma.
- Studies show content moderators experience traumatic stress, intrusive thoughts, and sleep disturbances, with lasting cognitive impacts.
- India's data-annotation sector employs roughly 70,000 workers, most from rural or marginalized backgrounds; about 80% are women.
- Women are preferred for their perceived reliability and willingness to accept home-based work, but this preference reinforces their marginalization.
- Workers often face unexpected exposure to graphic content (e.g., child abuse, pornography), causing personal distress and dissociation.
- Job listings are vague about the nature of the content, and NDAs prevent workers from discussing their trauma, isolating them further.
- Psychological support is rare, and India's labor laws offer no protections against mental-health harms in this sector.
- Workers like Murmu cope through solitary activities such as walks and painting, but fear unemployment more than the job's toll.