at_a_remove:
I have a very vague concept for this, with a different implementation.

Some, uh, sites (forums?) have content that the AI crawlers would like to consume, and, from what I have heard, the crawlers can irresponsibly hammer said sites into oblivion with traffic.

What if, for the sites which are paywalled, the signup, which invariably comes with a long click-through EULA, had a legal trap within it, forbidding ingestion by AI models on pain of, say, the violator forfeiting ten percent of their company. Make sure there is some kind of token payment to get to the content, so the contract is backed by consideration.

Then seed the site with a few hapax legomena: unique, invented strings that occur nowhere else. Trace the crawler back and get the resulting model to vomit up one of them, as proof of ingestion.
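
A minimal sketch of what the seeding-and-detection half could look like, in Python. Every name here is hypothetical (make_canary, plant_canaries, find_leaks, the registry format); it only illustrates planting unique tokens and checking a model's output against them, not how you would obtain that output in the first place.

    import secrets

    def make_canary(prefix: str = "zx") -> str:
        # Build a pronounceable nonsense token that is vanishingly
        # unlikely to occur anywhere else on the web.
        consonants = "bcdfghjklmnpqrstvwz"
        vowels = "aeiou"
        body = "".join(secrets.choice(consonants) + secrets.choice(vowels)
                       for _ in range(4))
        return prefix + body  # e.g. "zxkamodipu"

    def plant_canaries(pages: dict[str, str],
                       registry: dict[str, str]) -> dict[str, str]:
        # Append one unique canary to each page; record which page got which.
        seeded = {}
        for url, text in pages.items():
            canary = make_canary()
            registry[canary] = url
            seeded[url] = text + "\n" + canary
        return seeded

    def find_leaks(model_output: str,
                   registry: dict[str, str]) -> list[tuple[str, str]]:
        # Return (canary, source_url) for every canary the model reproduced.
        return [(c, url) for c, url in registry.items() if c in model_output]

    registry: dict[str, str] = {}
    pages = {"https://example.com/post/1": "Members-only forum post..."}
    plant_canaries(pages, registry)
    canary = next(iter(registry))  # the token we just planted
    output = "...as the forum put it, " + canary + " was the key insight..."
    print(find_leaks(output, registry))  # [(canary, "https://example.com/post/1")]

In practice you would presumably issue different canaries to each subscriber account, so a leak identifies not just your site but the specific signup, and thus the specific EULA acceptance, the crawler came in through.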

This should result in either crawlers being more respectful or the end of the hated click-through EULA. We win either way.

slavik81:
In Canada and the United States, damages for breach of contract are determined by the actual harm caused. Penalty clauses are generally not enforceable, so the courts would ignore your clause and award a dollar amount based on whatever actual damages you can prove.

That said, I am not a lawyer and this may not be true in all jurisdictions.