Looks cool. But please help me understand: what's to stop AI companies from solving the challenge, completing the proof of work, and scraping websites anyway?
Because it isn't much CPU, you only have to solve it once per website, and the default policy's difficulty of 16 for bots is worthless: just change your user agent and you get a difficulty of 4 instead.
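To make the cost difference concrete, here's a minimal sketch of a hash-prefix proof of work, assuming (as in many such schemes) that "difficulty" counts leading zero hex digits of a SHA-256 digest; Anubis's exact encoding may differ. The `solve` function and the `challenge` string are illustrative, not Anubis's actual API.

```python
import hashlib
import itertools

def solve(challenge: str, difficulty: int) -> int:
    """Find a nonce so sha256(challenge + nonce) starts with
    `difficulty` zero hex digits. Expected work: ~16**difficulty hashes."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

# Under this model, difficulty 4 costs ~16**4 = 65,536 hashes on average
# (milliseconds on one core), while difficulty 16 would cost ~16**16,
# which is why a scraper would rather masquerade as a browser.
```

The asymmetry the comment describes falls out of the exponent: lowering the difficulty from 16 to 4 cuts the expected work by a factor of 16^12.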
Anubis sets a cookie named `within.website-x-cmd-anubis-auth`, which a scraper can reuse to avoid solving the challenge more than once. Just run a fleet of servers whose sole purpose is to solve challenges, extract the cookies, and keep them all valid. It's not a big deal.
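The reuse scheme above amounts to a per-site cookie cache: solve once, then replay the token on every request. A minimal sketch, where `solve_challenge` is a hypothetical stand-in for the real PoW solver, not an Anubis API:

```python
# Scraper-side cache: one expensive solve per host, then replay.
cookie_cache: dict[str, str] = {}  # host -> cached auth cookie value

def solve_challenge(host: str) -> str:
    # Hypothetical stand-in for the actual PoW solve; returns a
    # placeholder token for demonstration purposes only.
    return f"token-for-{host}"

def auth_header(host: str) -> dict[str, str]:
    """Return a Cookie header for `host`, solving the PoW only on a cache miss."""
    if host not in cookie_cache:
        cookie_cache[host] = solve_challenge(host)  # paid once per host
    return {"Cookie": f"within.website-x-cmd-anubis-auth={cookie_cache[host]}"}
```

Every request after the first is free for the scraper, which is exactly the objection being raised.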
Requests are associated with the cookie, meaning the site can trace, block, or rate-limit them as necessary. The cost of solving the PoW is the cost of establishing a new session; if you get blocked, you have to solve again.