
646 points blendergeek | 2 comments | | HN request time: 0.689s | source
at_a_remove ◴[] No.42725810[source]
I have a very vague concept for this, with a different implementation.

Some, uh, sites (forums?) have content that the AI crawlers would like to consume, and, from what I have heard, the crawlers can irresponsibly hammer the traffic of said sites into oblivion.

What if, for paywalled sites, the signup (which invariably comes with a long click-through EULA) included a legal trap forbidding ingestion by AI models, on pain of, say, forfeiting ten percent of the violating company? Make sure there is some kind of token payment to get to the content.

Then seed the site with a few hapax legomena (strings that occur nowhere else). Trace the crawler back and get the resulting model to regurgitate the originating info, as proof.

This should result in either crawlers being more respectful or the end of the hated click-through EULA. We win either way.

replies(4): >>42725992 #>>42726182 #>>42726319 #>>42726701 #
1. 9283409232 ◴[] No.42726182[source]
This doesn't work the way you think it does, but even if it did, do you have the money to sustain a years-long legal battle against OpenAI?
replies(1): >>42726331 #
2. grajaganDev ◴[] No.42726331[source]
Exactly, the lawyers would be the only winners (as usual).