Anubis Works

(xeiaso.net)
313 points by evacchi | 2 comments
gyomu ◴[] No.43668594[source]
If you’re confused about what this is - it’s to prevent AI scraping.

> Anubis uses a proof-of-work challenge to ensure that clients are using a modern browser and are able to calculate SHA-256 checksums

https://anubis.techaro.lol/docs/design/how-anubis-works

This is pretty cool, I have a project or two that might benefit from it.
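
A minimal sketch of the idea described in the quoted design doc (not Anubis's actual implementation - the real challenge runs in the visitor's browser, and the challenge string and difficulty below are made up for illustration): the server hands out a challenge and a difficulty, and the client grinds nonces until SHA-256(challenge + nonce) meets the target.

    // Minimal proof-of-work solver sketch (Go). The real Anubis challenge runs
    // in the browser; this just shows the core loop: find a nonce such that
    // SHA-256(challenge + nonce) starts with a required number of zero hex digits.
    package main

    import (
        "crypto/sha256"
        "encoding/hex"
        "fmt"
        "strconv"
        "strings"
    )

    // solve grinds nonces until the hash meets the difficulty target.
    // Checking a submitted nonce is a single hash, so verification stays cheap
    // while solving is tunably expensive.
    func solve(challenge string, difficulty int) (int, string) {
        prefix := strings.Repeat("0", difficulty)
        for nonce := 0; ; nonce++ {
            sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
            hash := hex.EncodeToString(sum[:])
            if strings.HasPrefix(hash, prefix) {
                return nonce, hash
            }
        }
    }

    func main() {
        // "example-challenge" and difficulty 4 are made-up values for illustration.
        nonce, hash := solve("example-challenge", 4)
        fmt.Printf("nonce=%d hash=%s\n", nonce, hash)
    }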

replies(2): >>43669511 #>>43671745 #
x3haloed ◴[] No.43669511[source]
I’ve been wondering for many years now whether the web is for humans or machines. I personally can’t think of a good reason to specifically gate bots when it comes to serving content. Letting them post content or trigger actions could obviously be problematic under many circumstances.

But when it comes to simply serving content, human vs. bot is not usually what you actually want to filter or block on. As long as a given client is not abusing your systems, why do you care whether it’s a human?

replies(8): >>43669544 #>>43669558 #>>43669572 #>>43670108 #>>43670208 #>>43670880 #>>43671272 #>>43676454 #
1. praptak ◴[] No.43670208[source]
The good thing about proof of work is that it doesn't specifically gate bots.

It may have other downsides - for example, I don't think Google is possible in a world where everyone requires proof of work (some may argue that's a good thing) - but it doesn't gate bots specifically; it gates mass scraping.
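
To make "it gates mass scraping" concrete, here is a back-of-the-envelope comparison with assumed numbers (roughly one second per solved challenge, traffic volumes invented for illustration): the compute cost is invisible to a person but piles up for anyone fetching the web at scale.

    // Back-of-the-envelope cost comparison; every number here is an assumption
    // for illustration, not a measurement of Anubis.
    package main

    import "fmt"

    func main() {
        const solveSeconds = 1.0 // assumed average time to pass one challenge

        humanChallenges := 20.0           // a person hitting a handful of protected pages
        crawlerChallenges := 10_000_000.0 // a crawler hitting challenges all over the web

        fmt.Printf("human:   %.0f s of compute (~%.1f minutes)\n",
            humanChallenges*solveSeconds, humanChallenges*solveSeconds/60)
        fmt.Printf("crawler: %.0f s of compute (~%.1f CPU-days)\n",
            crawlerChallenges*solveSeconds, crawlerChallenges*solveSeconds/86400)
    }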

replies(1): >>43678070 #
2. fc417fc802 ◴[] No.43678070[source]
Things like Google are still possible; operators would just need to whitelist crawler services (sketched below).

Alternatively, shared resources similar in spirit to Common Crawl but scaled up could be used. That would have the benefit of democratizing the ability to create and operate large-scale search indexes.
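
A minimal sketch of the whitelisting idea mentioned above, assuming a plain user-agent allowlist; the names here are hypothetical rather than Anubis's actual configuration, and a real setup would verify crawlers by reverse DNS or published IP ranges instead of trusting the user-agent alone.

    // Minimal allowlist sketch; trustedCrawlers, userAgentAllowed and
    // withChallenge are hypothetical names, not Anubis's API. The user-agent
    // string is trivially spoofed, so this is only the skeleton of the idea.
    package main

    import (
        "log"
        "net/http"
        "strings"
    )

    var trustedCrawlers = []string{"Googlebot", "bingbot"} // example entries

    func userAgentAllowed(ua string) bool {
        for _, name := range trustedCrawlers {
            if strings.Contains(ua, name) {
                return true
            }
        }
        return false
    }

    // withChallenge lets allowlisted crawlers straight through and sends
    // everyone else to the proof-of-work challenge flow first.
    func withChallenge(next, challenge http.Handler) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            if userAgentAllowed(r.UserAgent()) {
                next.ServeHTTP(w, r)
                return
            }
            challenge.ServeHTTP(w, r)
        })
    }

    func main() {
        content := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            w.Write([]byte("content\n"))
        })
        challenge := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            http.Error(w, "solve the proof-of-work challenge first", http.StatusForbidden)
        })
        log.Fatal(http.ListenAndServe(":8080", withChallenge(content, challenge)))
    }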