454 points positiveblue | 13 comments
Show context
impure ◴[] No.45066528[source]
Well, if you have a better way to solve this that’s open I’m all ears. But what Cloudflare is doing is solving the real problem of AI bots. We’ve tried to solve this problem with IP blocking and user agents, but they do not work. And this is actually how other similar problems have been solved. Certificate authorities aren’t open and yet they work just fine. Attestation providers are also not open and they work just fine.
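
(To illustrate why user-agent checks in particular don't work: a scraper can claim to be any browser it likes, since the server only sees whatever string the client chooses to send. Rough sketch, with a placeholder URL:)

    import requests

    # Any scraper can simply present a browser's User-Agent string;
    # the server has no way to verify it.
    headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
    resp = requests.get("https://example.com/protected-page", headers=headers)
    print(resp.status_code)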
replies(6): >>45066914 #>>45067091 #>>45067829 #>>45072492 #>>45072740 #>>45072778 #
1. viktorcode ◴[] No.45066914[source]
AI poisoning is a better protection. Cloudflare is capable of serving stashes of bad data to AI bots as a protective barrier for their clients.
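
Roughly the idea, as a sketch (the bot heuristic and the decoy pages below are stand-ins, not anything Cloudflare actually exposes):

    import random

    # Hypothetical decoy pages: plausible-looking but deliberately wrong text.
    POISON_PAGES = [
        "<html><body>Decoy article full of subtly wrong facts.</body></html>",
        "<html><body>Another decoy, internally inconsistent on purpose.</body></html>",
    ]

    def looks_like_ai_crawler(request):
        # Stand-in heuristic; a real edge network would also use TLS fingerprints,
        # behavioural signals, IP reputation, etc., not just the declared User-Agent.
        ua = request.get("user_agent", "").lower()
        return any(bot in ua for bot in ("gptbot", "ccbot", "claudebot"))

    def handle_request(request, real_page):
        # Serve the decoy corpus to suspected crawlers, the real page to everyone else.
        if looks_like_ai_crawler(request):
            return random.choice(POISON_PAGES)
        return real_page

    print(handle_request({"user_agent": "GPTBot/1.0"}, "<html>real content</html>"))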
replies(2): >>45066992 #>>45067679 #
2. esseph ◴[] No.45066992[source]
AI poisoning is going to get a lot of people killed, because the AI won't stop being used.
replies(5): >>45067767 #>>45071508 #>>45071786 #>>45075950 #>>45091711 #
3. verdverm ◴[] No.45067679[source]
You don't think that the AI companies will make an effort to detect and filter bad data for training? Do you suppose they're already doing this, knowing that data quality has an impact on model capabilities?
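
Even crude filtering catches a lot; a sketch of the kind of thing I mean (the heuristics and threshold here are made up, not what any lab actually runs):

    import hashlib

    def quality_score(text):
        # Made-up heuristics: penalise very short documents, a low ratio of
        # alphabetic characters, and heavy line-level repetition.
        if len(text) < 200:
            return 0.0
        alpha = sum(c.isalpha() for c in text) / len(text)
        lines = [l.strip() for l in text.splitlines() if l.strip()]
        uniq = len(set(lines)) / max(len(lines), 1)
        return alpha * uniq

    def filter_corpus(docs, threshold=0.5):
        seen, kept = set(), []
        for doc in docs:
            digest = hashlib.sha256(doc.encode()).hexdigest()  # drop exact duplicates
            if digest in seen:
                continue
            seen.add(digest)
            if quality_score(doc) >= threshold:
                kept.append(doc)
        return kept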
replies(2): >>45067756 #>>45071793 #
4. viktorcode ◴[] No.45067756[source]
They will learn to pay for high-quality data instead of blindly relying on internet content.
5. viktorcode ◴[] No.45067767[source]
By that logic AI is already killing people. We can't presume that whatever can be found on the internet is reliable data, can we?
replies(2): >>45067842 #>>45079036 #
6. lucb1e ◴[] No.45067842{3}[source]
If science has taught us anything, it's that no data is ever fully reliable. We're pretty sure about so many things, and it's the best available info so we might as well use it, but in terms of "the internet can be wrong": any source can be wrong! And I wouldn't even be surprised if the internet in aggregate (with the bot reading all of it) is right more often than individual authors of pretty much anything.
7. beeflet ◴[] No.45071508[source]
Okay, let them
8. culi ◴[] No.45071786[source]
The current state of the art in AI poisoning is Nightshade[0] from the University of Chicago. It's meant to eventually be an add-on to their WebGlaze[1], which is an invite-only tool meant for artists to protect their art from AI mimicry.

Nobody is dying because artists are protecting their art.

[0] https://nightshade.cs.uchicago.edu/whatis.html

[1] https://glaze.cs.uchicago.edu/webglaze.html

9. culi ◴[] No.45071793[source]
The current state of the art in AI poisoning is Nightshade[0] from the University of Chicago. It's meant to eventually be an add-on to their WebGlaze[1], which is an invite-only tool meant for artists to protect their art from AI mimicry.

If these companies are adding extra code to bypass artists trying to protect their intellectual property from mimicry, then that is an obvious and egregious copyright violation.

More likely, it will push these companies to actually pay content creators to have their work included in their models.

[0] https://nightshade.cs.uchicago.edu/whatis.html

[1] https://glaze.cs.uchicago.edu/webglaze.html

replies(1): >>45079362 #
10. jlarocco ◴[] No.45075950[source]
You mean incompetent users of AI will get people killed. You don't get a free pass because you used a tool that sucked.
11. esseph ◴[] No.45079036{3}[source]
Yet we use it every day for police, military, and political targeting with economic and kinetic consequences.
12. verdverm ◴[] No.45079362{3}[source]
Seems like their poisoning is something that shouldn't be hard to detect and filter on. There is enough perturbation to create visual artifacts people can see. Steganography research is much further along in being undetectable. I would imagine that in order to disrupt training sufficiently, you couldn't keep the perturbations small enough to go undetected.
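
Even something as crude as measuring the high-frequency residual after a light blur would probably flag a lot of it. Rough sketch (the threshold is made up, and this isn't calibrated against Nightshade specifically):

    import numpy as np
    from PIL import Image, ImageFilter

    def perturbation_score(path):
        # Compare the image against a lightly blurred copy; adversarial
        # perturbations tend to live in the high-frequency residual.
        gray = Image.open(path).convert("L")
        img = np.asarray(gray, dtype=np.float32)
        blurred = np.asarray(gray.filter(ImageFilter.GaussianBlur(2)), dtype=np.float32)
        return float(np.mean(np.abs(img - blurred)))

    def looks_poisoned(path, threshold=6.0):
        # Arbitrary threshold; a real filter would be calibrated on
        # known-clean vs. known-poisoned samples.
        return perturbation_score(path) > threshold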
13. account42 ◴[] No.45091711[source]
This is some next level blame shifting. Next you are going to steal motor oil and then complain that your customers got sick when you used it to cook their food.