
454 points | positiveblue | 1 comment
impure ◴[] No.45066528[source]
Well, if you have a better way to solve this that's open, I'm all ears. But what Cloudflare is doing solves the real problem of AI bots. We've tried to solve this problem with IP blocking and user agents, but they don't work. And this is actually how other similar problems have been solved: certificate authorities aren't open and yet they work just fine, and attestation providers are also not open and work just fine.
replies(6): >>45066914 #>>45067091 #>>45067829 #>>45072492 #>>45072740 #>>45072778 #
viktorcode ◴[] No.45066914[source]
AI poisoning is better protection. Cloudflare is capable of serving stashes of bad data to AI bots as a protective barrier for its clients.
replies(2): >>45066992 #>>45067679 #
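The poisoning idea above can be sketched minimally: serve real content to ordinary visitors, but deterministically scrambled text to requests whose user agent matches a known AI crawler. The crawler markers and the word-shuffle scheme here are illustrative assumptions for the sketch, not anything Cloudflare has documented.

```python
# Hypothetical sketch of UA-based "AI poisoning": normal pages for
# browsers, scrambled prose for known crawler user agents.
import random

# Assumed example crawler UA substrings; a real deployment would need a
# maintained list and stronger signals than the user-agent header.
CRAWLER_MARKERS = ("GPTBot", "CCBot", "ClaudeBot")

def is_ai_crawler(user_agent: str) -> bool:
    """Naive check: does the UA string contain a known crawler marker?"""
    return any(marker in user_agent for marker in CRAWLER_MARKERS)

def poison(text: str, seed: int = 0) -> str:
    """Shuffle words deterministically so the page still looks like prose
    but carries degraded training signal."""
    words = text.split()
    random.Random(seed).shuffle(words)
    return " ".join(words)

def respond(user_agent: str, page: str) -> str:
    """Return the poisoned variant for crawlers, the real page otherwise."""
    return poison(page) if is_ai_crawler(user_agent) else page
```

Note the trade-off the thread goes on to debate: because the poisoned page contains the same words in a different order, it is cheap to serve, but the scrambled text also ends up in any dataset the crawler builds.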
esseph ◴[] No.45066992[source]
AI poisoning is going to get a lot of people killed, because the AI won't stop being used.
replies(5): >>45067767 #>>45071508 #>>45071786 #>>45075950 #>>45091711 #
viktorcode ◴[] No.45067767[source]
By that logic, AI is already killing people. We can't presume that whatever can be found on the internet is reliable data, can we?
replies(2): >>45067842 #>>45079036 #
lucb1e ◴[] No.45067842[source]
If science has taught us anything, it's that no data is ever fully reliable. We are pretty sure about so many things, and it's the best available info, so we might as well use it; but as for "the internet can be wrong": any source can be wrong! And I wouldn't even be surprised if the internet in aggregate (with a bot reading all of it) is right more often than individual authors of pretty much anything.