How OpenAI, the US government and Persona built an identity surveillance machine
- #privacy
- #OpenAI
- #surveillance
- OpenAI and Persona built a surveillance system that screens users against government watchlists using facial recognition and other biometric data.
- The system files Suspicious Activity Reports (SARs) directly to FinCEN and Suspicious Transaction Reports (STRs) to FINTRAC, tagged with intelligence program codenames.
- Users' selfies are compared against politically exposed persons (PEPs) using facial-similarity scoring, and the system maintains biometric face databases with a 3-year retention policy.
- The platform performs 269 distinct verification checks, including experimental ML models on biometric data and crypto address screening via Chainalysis.
- A government deployment named 'onyx.withpersona-gov.com' appeared recently, possibly linked to ICE's $4.2M AI surveillance tool, Fivecast ONYX.
- 53 MB of unprotected source code was exposed on a FedRAMP-authorized endpoint, revealing the platform's inner workings.
- OpenAI blocks users from Ukraine even though no legal sanctions require it, raising questions about policy transparency and fairness.
- The system offers users no recourse: denials come without explanation, and stated data retention conflicts between OpenAI's disclosures (1 year) and the code (3 years).
- The investigation was conducted via passive reconnaissance (Shodan, DNS lookups, HTTP headers) without breaching any systems or using credentials.
- A list of individuals associated with the surveillance system's development was published for transparency.
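The PEP screening described in the bullets above typically works by comparing a face embedding of the user's selfie against embeddings of known PEPs and flagging matches above a similarity threshold. The source article does not disclose Persona's actual model or threshold, so the following is a minimal sketch of the general technique, with made-up embeddings and a hypothetical 0.8 cutoff:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def screen_against_peps(selfie_embedding, pep_embeddings, threshold=0.8):
    """Return (name, score) pairs for PEPs whose stored face embedding
    exceeds the similarity threshold, highest score first.

    The threshold and the embeddings themselves are illustrative
    assumptions, not values taken from Persona's system."""
    hits = []
    for name, emb in pep_embeddings.items():
        score = cosine_similarity(selfie_embedding, emb)
        if score >= threshold:
            hits.append((name, round(score, 3)))
    return sorted(hits, key=lambda h: -h[1])

# Toy 3-dimensional embeddings; real face embeddings are hundreds of
# dimensions produced by a neural network.
peps = {"pep_a": [1.0, 0.0, 0.0], "pep_b": [0.0, 1.0, 0.0]}
print(screen_against_peps([0.9, 0.1, 0.0], peps))  # → [('pep_a', 0.994)]
```

The privacy concern raised in the article follows directly from this design: any similarity scorer produces false positives near the threshold, and a flagged user has no visibility into the score or the comparison set.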
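The passive-recon methodology in the bullets above relies in part on inspecting HTTP response headers that servers volunteer to any client. As a rough illustration of that kind of header fingerprinting (the header names and the inferences drawn from them here are illustrative assumptions, not the investigator's actual toolchain):

```python
# Map of response headers to the infrastructure clue each one suggests.
# These associations are common conventions, assumed for illustration.
CLUES = {
    "server": "web server software",
    "x-amz-cf-id": "served via Amazon CloudFront",
    "x-vercel-id": "hosted on Vercel",
    "strict-transport-security": "HSTS enforced",
}

def fingerprint(headers):
    """Return the clues recoverable from a captured set of response
    headers. Purely passive: operates on headers the server already
    sent, with no authentication or exploitation involved."""
    lowered = {k.lower(): v for k, v in headers.items()}
    return {CLUES[k]: lowered[k] for k in CLUES if k in lowered}

# Example: headers as they might be captured from a single GET request.
sample = {"Server": "nginx", "Strict-Transport-Security": "max-age=63072000"}
print(fingerprint(sample))
# → {'web server software': 'nginx', 'HSTS enforced': 'max-age=63072000'}
```

Combined with Shodan queries and DNS records, this is how a deployment like `onyx.withpersona-gov.com` can surface without ever touching a protected system.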