Amazonbot is finally respecting robots.txt
- #anti-scraping
- #web security
- #proof of work
- Anubis protects websites from AI scraping by implementing a Proof-of-Work scheme.
- The scheme, similar to Hashcash, raises the cost of mass scraping while adding only a negligible delay for individual visitors.
- It aims to reduce server downtime caused by aggressive scraping.
- Anubis is a stopgap while work continues on fingerprinting headless browsers for more reliable detection.
- It requires a modern browser with JavaScript enabled; privacy plugins such as JShelter must be disabled for it to work.
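The Hashcash-style idea behind the scheme can be sketched as follows. This is an illustrative toy, not Anubis's actual implementation (Anubis is written in Go with the client-side solver in JavaScript); the function names, the use of SHA-256, and the hex-digit difficulty measure are all assumptions for the example. The server hands the client a random challenge string; the client must find a nonce whose hash meets a difficulty target before it is allowed through, and verification on the server is a single cheap hash:

```python
import hashlib
from itertools import count

def solve(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so SHA-256(challenge + nonce) starts with
    `difficulty` zero hex digits. Cost grows ~16x per difficulty step."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: one hash to check the client's claimed solution."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve("example-challenge", 4)
print(verify("example-challenge", nonce, 4))  # True
```

The asymmetry is the point: a real visitor pays the solve cost once per session, while a scraper hitting millions of pages pays it millions of times.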