
454 points | positiveblue | 1 comment
jaredcwhite | No.45067994
This article can easily be dismissed when, hardly a moment in, you see the headline "Agents Are Inevitable".

I'm sorry, but the "agents" of "agentic AI" are completely different from the original purpose of the World-Wide Web, which was to support user agents. User agents are used directly by users, i.e. browsers. API access came later, but even then it was often directed by user activity, and otherwise quite normally rate-limited or paywalled.

The idea that every web server must now service an insane number of automated bots doing god-knows-what, often without users even understanding what's happening, or without the consent of content owners to have all their IP scraped into massive training datasets, is, well, asinine.

That's not the web we built, that's not the web we signed up for; and yes, we will take drastic measures to block your ass.

1gn15 | No.45072543
Speak for yourself. This is just the semantic web: a web built not only for humans, but also for robots or any other type of agent that may wish to build upon the data. User agents never meant only web browsers, and operators blocking on the User-Agent header is precisely what made hiding your identity necessary in the first place.
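To make that concrete, here is a minimal sketch of what User-Agent blocking amounts to (the deny list and function name are illustrative assumptions, not any specific server's config): the check only ever sees a self-reported string, so a blocked client defeats it by simply not announcing itself.

```python
# Hypothetical User-Agent deny list; operators block substrings like these.
BLOCKED_UA_SUBSTRINGS = ["GPTBot", "CCBot", "my-scraper"]

def is_blocked(user_agent: str) -> bool:
    """Return True if the self-reported User-Agent matches the deny list."""
    ua = user_agent.lower()
    return any(marker.lower() in ua for marker in BLOCKED_UA_SUBSTRINGS)

print(is_blocked("GPTBot/1.2"))                     # True: the honest bot is blocked
print(is_blocked("Mozilla/5.0 (Windows NT 10.0)"))  # False: the same bot, now claiming to be a browser
```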

Blocking bots is an absurd and unwinnable proposition, just like DRM; there's always the final, nuclear option of the analog hole: a literal video camera pointed at a monitor, plus real input through a keyboard and mouse.

If you really need to, deploy a proof-of-work shield that doesn't discriminate by user agent, just as Tor onion services do.
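The rough shape of such a shield: the server hands out a random challenge, the client burns CPU finding a nonce whose hash clears a difficulty threshold, and the server verifies with one cheap hash, no User-Agent inspection involved. A minimal sketch assuming a SHA-256 leading-zero-bits scheme; the difficulty and function names are illustrative, not the API of Tor, Anubis, or any other project.

```python
import hashlib
import secrets

DIFFICULTY_BITS = 18  # illustrative: client does ~2**18 hashes on average

def issue_challenge() -> str:
    """Server: give every visitor the same kind of random challenge."""
    return secrets.token_hex(16)

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def solve(challenge: str) -> int:
    """Client: brute-force a nonce whose hash clears the difficulty bar."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= DIFFICULTY_BITS:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int) -> bool:
    """Server: a single hash to check, cheap regardless of who the client is."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= DIFFICULTY_BITS

if __name__ == "__main__":
    challenge = issue_challenge()
    nonce = solve(challenge)
    print("nonce:", nonce, "valid:", verify(challenge, nonce))
```

The cost is asymmetric: the client spends noticeable CPU time per request, the server spends microseconds verifying, and nothing in the check depends on what the client claims to be.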