
473 points edent | 8 comments
1. eichin ◴[] No.43773527[source]
With a catchphrase like "reclaim the public internet" I expected they were funding Anubis https://xeiaso.net/blog/2025/anubis/ (don't get me wrong, they've got a neat list of projects, I'm just quibbling with what's "public internet" about solar powered motherboards or Ada bootstrapping.)
replies(2): >>43773544 #>>43773653 #
2. eadmund ◴[] No.43773544[source]
Anubis isn’t really about reclaiming the public internet, though: it’s about excluding some internet users. It has its reasons, of course, but it’s fundamentally about making the internet not a commons.
replies(3): >>43773700 #>>43773754 #>>43834420 #
3. xena ◴[] No.43773653[source]
I have NLNet on the TODO list, I just haven't had time for it yet :(
4. eichin ◴[] No.43773700[source]
From my perspective, anubis (and iocaine etc) is about keeping misbehaving load generators from suppressing small-scale "classic internet" sites. So yeah, it's exclusionary, like keeping semi trucks from taking shortcuts through a schoolyard.
replies(1): >>43780731 #
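Anubis gates requests behind a browser proof-of-work challenge, which is cheap for one human visit but costly for a crawler making millions of requests. Below is a minimal sketch of that idea, assuming a SHA-256 leading-zero scheme; the function names and difficulty encoding are illustrative, not Anubis's actual implementation.

```python
import hashlib


def solve_challenge(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce) starts
    with `difficulty` hex zeros. This is the work the client pays."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server-side check: one hash, regardless of how hard solving was."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The asymmetry is the point: solving costs the client many hash attempts on average, while verifying costs the server exactly one.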
5. freddydumont ◴[] No.43773754[source]
While I get your point about Anubis excluding some users, its purpose is to protect the "commons" from those who would abuse and destroy them: big tech crawlers that do not respect robots.txt files.
6. pabs3 ◴[] No.43780731{3}[source]
It would probably be better if it excluded problematic behavior rather than excluding whole classes of clients.
replies(1): >>43834385 #
7. hooverd ◴[] No.43834385{4}[source]
Sure, but there's a very high correlation between classes of client and problematic behavior, and said classes of clients go to great lengths to mask their problematic behavior.
8. hooverd ◴[] No.43834420[source]
For the purposes of analogy, imagine the commons as a park where you can have a picnic or play frisbee, and AI crawlers as people tearing it up on their dirt bikes.