454 points by positiveblue | 1 comment
impure No.45066528
Well, if you have a better way to solve this that's open, I'm all ears. What Cloudflare is doing addresses the real problem of AI bots. We've tried to solve it with IP blocking and user-agent filtering, but neither works. And this is actually how other, similar problems have been solved: certificate authorities aren't open, yet they work just fine. Attestation providers aren't open either, and they also work fine.
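
To illustrate the user-agent point (a minimal sketch in stdlib Python; the URL is a placeholder): the User-Agent header is set entirely by the client, so any crawler can claim to be an ordinary browser, which is why filtering on it doesn't work.

    import urllib.request

    # The client chooses its own User-Agent, so a crawler can impersonate
    # a regular browser with a single header. Placeholder URL for the sketch.
    req = urllib.request.Request(
        "https://example.com/",
        headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
    )
    with urllib.request.urlopen(req) as resp:
        html = resp.read()  # served as if to a normal browser
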
replies(6): >>45066914 #>>45067091 #>>45067829 #>>45072492 #>>45072740 #>>45072778 #
Voultapher No.45072778
> Well, if you have a better way to solve this that’s open I’m all ears.

Regulation.

Make it illegal for a crawler to request the content of a webpage unless the website operator explicitly allows it via robots.txt. Institute a government agency tasked with enforcement: if you as a website operator can show that traffic came from bots, you open a complaint with the agency and they take care of shaking painful fines out of the offending companies. Force cloud hosts to keep books on who was using which IP addresses. Will it be a 100% fix? No. Will it have a massive chilling effect if done well? Absolutely.
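
For reference, robots.txt can already express exactly this kind of explicit allow-list; what's missing is enforcement. A minimal sketch (Python standard library; the bot names and policy are made up for illustration) of how a compliant crawler checks it:

    import urllib.robotparser

    # Hypothetical policy: one named crawler is allowed, everyone else is not.
    robots_lines = [
        "User-agent: GoodBot",
        "Allow: /",
        "",
        "User-agent: *",
        "Disallow: /",
    ]

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_lines)
    print(rp.can_fetch("GoodBot", "/article"))   # True  -- explicitly allowed
    print(rp.can_fetch("ScraperX", "/article"))  # False -- everyone else opted out

The point of the proposal is that today nothing happens when can_fetch would say False; the crawler simply ignores it.
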

replies(4): >>45072849 #>>45073524 #>>45075933 #>>45078127 #
zimmund No.45078127
> Institute a government agency that is tasked with enforcement.

You're forgetting about the first W in WWW...

replies(1): >>45091669 #
account42 No.45091669
So what you're saying is that if I were to host a BitTorrent tracker in Sweden, the US can't do anything about it?