
Anubis Works

(xeiaso.net)
313 points | evacchi | 1 comment
throwaway150 ◴[] No.43668638[source]
Looks cool. But please help me understand: what's to stop AI companies from solving the challenge, completing the proof of work, and scraping websites anyway?
replies(6): >>43668690 #>>43668774 #>>43668823 #>>43668857 #>>43669150 #>>43670014 #
1. crq-yml ◴[] No.43668774[source]
It's a strategy to redefine the doctrine of information warfare on the public Internet from maneuver (leveraged, coordinated use of resources to create relatively greater effects) toward attrition (resources poured in indiscriminately until one side capitulates).

Individual humans don't care about a proof-of-work challenge if the information is valuable to them; many websites already load slowly thanks to a combination of poor coding and spyware ad-tech. But companies care, because proof of work turns scraping from a modest cost of doing business into a money pit.

In the earlier periods of the web, scraping wasn't necessarily adversarial, because search engines and aggregators served some public good. In the AI era it has become belligerent: a form of raiding, repackaging others' work and taking the credit. Proof of work as a deterrent was proposed to fight spam decades ago (Hashcash), but only now does it really need to be weaponized.
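
To make the economics concrete, here is a minimal Hashcash-style sketch in Go: the server hands out a challenge string and a difficulty, and the client must find a nonce such that SHA-256(challenge || nonce) has at least that many leading zero bits. The challenge token, difficulty value, and function names here are illustrative assumptions, not Anubis's actual protocol.

    // Minimal Hashcash-style proof-of-work sketch (illustrative, not Anubis's real API).
    package main

    import (
    	"crypto/sha256"
    	"encoding/binary"
    	"fmt"
    	"math/bits"
    	"time"
    )

    // leadingZeroBits counts the leading zero bits of a SHA-256 digest.
    func leadingZeroBits(sum [32]byte) int {
    	n := 0
    	for _, b := range sum {
    		if b == 0 {
    			n += 8
    			continue
    		}
    		n += bits.LeadingZeros8(b)
    		break
    	}
    	return n
    }

    // solve searches for a nonce such that SHA-256(challenge || nonce) has at
    // least `difficulty` leading zero bits. Expected work doubles per extra bit.
    func solve(challenge string, difficulty int) (uint64, time.Duration) {
    	start := time.Now()
    	buf := make([]byte, len(challenge)+8)
    	copy(buf, challenge)
    	for nonce := uint64(0); ; nonce++ {
    		binary.BigEndian.PutUint64(buf[len(challenge):], nonce)
    		if leadingZeroBits(sha256.Sum256(buf)) >= difficulty {
    			return nonce, time.Since(start)
    		}
    	}
    }

    func main() {
    	// A single visitor pays this cost once per challenge; a crawler pays it
    	// again for every page of every site it hits.
    	nonce, elapsed := solve("example-challenge-token", 20)
    	fmt.Printf("nonce=%d found in %v\n", nonce, elapsed)
    }

Each additional bit of difficulty doubles the expected hashing work, so a site can keep the per-visit cost negligible for a human while making millions of automated requests add up to real compute spend.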