How Do You Find an Illegal Image Without Looking at It?
3 days ago
- #child safety
- #perceptual hashing
- #CSAM detection
- In 2025, NCMEC received 21.3 million reports containing 61.8 million CSAM files.
- CSAM detection relies on perceptual hashing (e.g., PDQ, PhotoDNA) to match known images without viewing content.
- Machine learning classifiers detect new and AI-generated CSAM, but have higher false-positive rates than hash matching.
- Video detection uses TMK+PDQF, which averages per-frame PDQF features over time to fingerprint a video's visual content and temporal structure.
- Industry pattern: Hasher-Matcher-Actioner (HMA) separates fingerprinting, matching against known hashes, and enforcement actions.
- The trade-off between false positives and false negatives is a critical ethical and technical decision.
- Generative AI creates new CSAM and overwhelms human review pipelines.
- Tools like Cloudflare CSAM Scanning, Thorn Safer, and Google Content Safety API are available.
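The perceptual-hash matching in the bullets above can be sketched with a toy difference hash (dHash). This is not PDQ or PhotoDNA — production hashes are 256-bit and far more robust to edits — but the matching step, Hamming distance against a threshold, is the same idea:

```python
def dhash(pixels):
    """pixels: 8 rows x 9 columns of grayscale values -> 64-bit hash.
    Each bit records whether a pixel is brighter than its right neighbour."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(a, b, threshold=10):
    """Two images 'match' if their hashes differ in at most `threshold` bits."""
    return hamming(a, b) <= threshold

# Toy 8x9 grayscale "images": the second is the first with slight noise,
# as a re-encode or mild filter would produce.
img = [[(r * 9 + c) * 3 for c in range(9)] for r in range(8)]
noisy = [[v + 2 for v in row] for row in img]

h1, h2 = dhash(img), dhash(noisy)
```

Because the hash encodes relative brightness rather than exact pixel values, the noisy copy produces an identical or near-identical hash, while an unrelated image lands far away in Hamming space — which is what lets a platform match known files without a human viewing them.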
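The TMK+PDQF approach mentioned above can be illustrated in a much-simplified form: compute a float feature vector per frame (standing in for PDQF), average over time, and compare videos by cosine similarity. Real TMK also keeps several periodically weighted averages to capture temporal ordering, not just the plain mean, so treat this as a sketch of the averaging idea only:

```python
import math

def frame_feature(frame):
    """Stand-in per-frame features; real TMK+PDQF uses 256-dim PDQF vectors."""
    return [float(p) for p in frame]

def video_fingerprint(frames):
    """Average the per-frame feature vectors over time."""
    n = len(frames)
    feats = [frame_feature(f) for f in frames]
    return [sum(col) / n for col in zip(*feats)]

def cosine(a, b):
    """Cosine similarity between two fingerprints (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Two "videos" of 4-pixel frames: the second re-encodes the first with
# small pixel noise; the third is unrelated content.
video = [[10, 20, 30, 40], [12, 22, 32, 42], [14, 24, 34, 44]]
reenc = [[p + 1 for p in frame] for frame in video]
other = [[200, 5, 190, 3]] * 3

fp = video_fingerprint(video)
```

The re-encoded copy scores near 1.0 against the original fingerprint, while the unrelated clip scores much lower, so a single threshold on similarity can flag known video material.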
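The Hasher-Matcher-Actioner separation can be sketched as three swappable components. The names and interfaces below are illustrative, not the actual API of Meta's open-source HMA project, which uses PDQ hashes and a production match index:

```python
from dataclasses import dataclass, field
from typing import Callable

def toy_hash(content: bytes) -> int:
    """Stand-in for the Hasher stage; a real deployment would run PDQ."""
    return sum(content) % 1024

@dataclass
class Matcher:
    """Holds fingerprints of known material and answers match queries."""
    known: set = field(default_factory=set)

    def add(self, h: int) -> None:
        self.known.add(h)

    def is_match(self, h: int) -> bool:
        return h in self.known

@dataclass
class Actioner:
    """Decides what to do on a match: queue for review, report, block."""
    on_match: Callable[[int], None]

    def act(self, h: int, matched: bool) -> None:
        if matched:
            self.on_match(h)

# Wire the stages together; each can be replaced independently, which is
# the point of the pattern.
matcher = Matcher()
matcher.add(toy_hash(b"known-bad-content"))

flagged = []
actioner = Actioner(on_match=flagged.append)

for upload in (b"known-bad-content", b"benign-content"):
    h = toy_hash(upload)
    actioner.act(h, matcher.is_match(h))
```

Keeping the three stages decoupled means a platform can swap the hash algorithm, source its match lists from NCMEC or industry databases, and change enforcement policy without touching the other stages.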
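The false-positive/false-negative trade-off comes down to one knob: the Hamming-distance threshold. The toy experiment below uses synthetic 64-bit hashes (not real PDQ data) to show the direction of the trade-off — a loose threshold misses fewer altered copies but flags more unrelated content:

```python
import random

random.seed(0)

def hamming(a, b):
    return bin(a ^ b).count("1")

def flip_bits(h, k):
    """Simulate benign edits (crops, re-encodes) by flipping k random bits."""
    for pos in random.sample(range(64), k):
        h ^= 1 << pos
    return h

known = random.getrandbits(64)
# Altered copies of known material: a few bits flipped each.
copies = [flip_bits(known, random.randint(1, 8)) for _ in range(200)]
# Unrelated images: independent random hashes.
unrelated = [random.getrandbits(64) for _ in range(200)]

results = {}
for threshold in (4, 12, 24):
    fn = sum(hamming(known, c) > threshold for c in copies)      # missed copies
    fp = sum(hamming(known, u) <= threshold for u in unrelated)  # false alarms
    results[threshold] = (fn, fp)
```

Missed copies shrink and false alarms grow as the threshold rises; where to sit on that curve is the ethical decision the bullet above refers to, since a false negative leaves abuse material up while a false positive can trigger a report against an innocent user.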