"Non-consensually", as if you had to ask for permission to perform a GET request to an open HTTP server.
Yes, I know about weev. That was a travesty.
robots.txt is a polite request to please not scrape these pages because it's probably not going to be productive. It was never meant to be a binding agreement, otherwise there would be a stricter protocol around it.
It's kind of like leaving a note for the deliveryman saying please don't leave packages on the porch. That's fine for low-stakes situations, but if package security is of utmost importance to you, you should require a signature on delivery or arrange to pick it up at the delivery center. Likewise, if enforcing a rule of no scraping is of utmost importance, you need to require an API token or some other form of authentication before you serve the pages.
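To make the "polite request" point concrete: robots.txt is only advisory, in the sense that a well-behaved client has to *choose* to consult it before fetching. A minimal sketch using Python's standard-library parser (the bot name and URLs are placeholders, and the policy is fed in directly rather than fetched, so the example is self-contained):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() accepts the file's lines, so we can feed a sample policy
# directly instead of fetching a live robots.txt.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyBot", "https://example.com/public/page"))   # True
```

Nothing stops a scraper from simply never calling `can_fetch` at all, which is exactly why the file is etiquette rather than enforcement.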
"Taking" versus "giving" is neither here nor there for this discussion. The question is are you expressing a preference on etiquette versus a hard rule that must be followed. I personally believe robots.txt is the former, and I say that as someone who serves more pages than they scrape.
No one is calling for the criminalization of door-to-door sales and no one is worried about how much door-to-door sales increases water consumption.
Anybody may watch the demo screen of an arcade game for free, but you have to insert a quarter to play — and you can have even greater access with a key.
> and you’ve explicitly left a sign saying ‘you are not welcome here’
And the sign said, "Long-haired freaky people need not apply"
So I tucked my hair up under my hat and I went in to ask him why
He said, "You look like a fine upstandin' young man, I think you'll do"
So I took off my hat and said, "Imagine that, huh, me workin' for you"
Let's say a large art hosting site realizes how damaging AI training on their data can be - should they respond by adding a paywall before any of their data is visible? If that paywall is added (let's just say $5/mo), can most of the artists currently on their site afford to stay there? Can they afford it if their potential future patrons are limited to just those folks who can pay $5/mo? Would the scraper be able to afford a one-time cost of $5 to scrape all of that data?
I think, as deeply flawed a concept as EULAs are, this is a case where they are required: a default assumption of no access for training unless explicitly granted, actually enforced through the legal system. There are a lot of small businesses and side projects dying because of these models, and I think that creative outlet has societal value we would benefit from preserving.
This "legal concept" is enforceable through legacy systems of police and violence. The internet does not recognize it. How much more obvious can this get?
If we stumble down the path of attempting to apply this legal framework, won't some jurisdiction arise with no IP protections whatsoever and just come to completely dominate the entire economy of the internet?
If I can spin up a server in copyleftistan with a complete copy of every album and film ever made, available for free download, why would users in copyrightistan use the locked down services of their domestic economy?
So what's the solution? How do I host a website that welcomes human visitors, but rejects all scrapers?
There is no mechanism! The best I can do is a cat-and-mouse arms race where I try to detect the traffic I don't want, and block it, while the people generating the traffic keep getting more sophisticated about hiding from my detection.
No, putting up a paywall is not a reasonable response to this.
> The question is are you expressing a preference on etiquette versus a hard rule that must be followed.
Well, there really aren't any hard rules that must be followed, because there are no enforcement mechanisms outside of going nuclear (requiring login). Everything is etiquette. And I agree that robots.txt is also etiquette, and it is super messed up that we tolerate "AI" companies stomping all over that etiquette.
Do we maybe want laws that say everyone must respect robots.txt? Maybe? But then people will just move their scrapers to a jurisdiction without those laws. And I'm sure someone could make the argument that robots.txt doesn't apply to them because they spoofed a browser user-agent (or another user-agent that a site explicitly allows). So perhaps we have a new mechanism, or new laws, or new... something.
But this all just highlights the point I'm making here: there is no reasonable mechanism (no, login pages and http auth don't count) for site owners to restrict access to their site based on these sorts of criteria. And that's a problem.
You use "legacy" as if these systems are obsolete and on their way out. They're not. They're here to stay, and will remain dominant, for better or worse. Calling them "legacy" feels a bit childish, as if you're trying to ignore reality and base arguments on your preferred vision of how things should be.
> The internet does not recognize it.
Sure it does. Not universally, but there are a lot of things governments and law enforcement can do to control what people see and do on the internet.
> If we stumble down the path of attempting to apply this legal framework, won't some jurisdiction arise with no IP protections whatsoever and just come to completely dominate the entire economy of the internet?
No, of course not, that's silly. That only really works on the margins. Any other country would immediately slap economic sanctions on that free-for-all jurisdiction and cripple them. If that fails, there's always a military response they can resort to.
> If I can spin up a server in copyleftistan with a complete copy of every album and film ever made, available for free download, why would users in copyrightistan use the locked down services of their domestic economy?
Because the governments of all the copyrightistans will block all traffic going in and out of copyleftistan. While this may not stop determined, technically-adept people, it will work for the most part. As I said, this sort of thing only really works on the margins.
You are free to say "well, there is no mechanism to do that", and I would agree with you. That's the problem!
Fence off your yard if you don't want people coming by and dumping a mountain of garbage on it every day.
You can certainly choose to live in a society that thinks these are acceptable solutions. I think it's bullshit, and we'd all be better off if anyone doing these things would be breaking rocks with their teeth in a re-education camp, until they learn how to be a decent human being.
> robots.txt. This is not the law
In Germany, it is the law. § 44b UrhG says (translated):
(1) Text and data mining is the automated analysis of one or more digital or digitized works to obtain information, in particular about patterns, trends, and correlations.
(2) Reproductions of lawfully accessible works for text and data mining are permitted. These reproductions must be deleted when they are no longer needed for text and data mining.
(3) Uses pursuant to paragraph 2, sentence 1, are only permitted if the rights holder has not reserved these rights. A reservation of rights for works accessible online is only effective if it is in machine-readable form.
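For illustration, a machine-readable reservation under paragraph 3 might look like the following (hypothetical site; note that what exactly counts as "machine-readable form" is still legally unsettled, and the `tdm-reservation` header comes from the draft TDM Reservation Protocol rather than the statute itself):

```
# robots.txt -- advisory opt-out for a specific crawler
User-agent: GPTBot
Disallow: /

# HTTP response header (TDM Reservation Protocol draft convention)
tdm-reservation: 1
```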
> You use "legacy" as if these systems are obsolete and on their way out. They're not.
I have serious doubts that nation states will still exist in 500 years. I feel quite certain that they'll be gone in 10,000. And I think it's generally good to build an internet for those time scales.
> base arguments on your preferred vision of how things should be.
I hope we all build toward our moral compass; I don't mean for arguments to rest on wishful thinking, but yeah, I think our internet needs to be resilient against the waxing and waning of the affairs of state. I don't know if that's childish... Maybe we need to have a more child-like view of things? The internet _is_ a child in the sense of its maturation timeframe.
> there are a lot of things governments and law enforcement can do to control what people see and do on the internet.
Of course there are things that governments do. But are they effective? I just returned from a throatsinging retreat in Tuva - a fairly remote part of Siberia. The Russian government has apparently quietly begun to censor quite a few resources on the internet, and it has caused difficulty in accessing the traditional music of the Tuvan people. And I was very happily astonished to find that everybody I ran into, including a shaman grandmother, was fairly adept at routing around this censorship using a VPN and/or SSH tunnel.
I think the internet is doing a wonderful job at routing around censorship - better than any innovation ever discovered by humans so far.
> Any other country would immediately slap economic sanctions on that free-for-all jurisdiction and cripple them. If that fails, there's always a military response they can resort to.
Again, maybe I'm just more optimistic, but I think that on longer time frames, the sober elder statesmen/women will prevail and realize that violence is not an appropriate response to bytes transiting the wire that they wish weren't.
And at the end of the day, I don't think governments even have the power here - the content creators do. I distribute my music via free channels because that's the easiest way to reach my audience, and because, given the high availability of compelling free content, there's just no way I can make enough money on publishing to even concern myself with silly restrictions.
It seems to me that I'm ahead of the curve in this area, not behind it. But I'm certainly open to being convinced otherwise.
A curious position. There isn't a secondary species using the internet. There are only humans. Unless you foresee some kind of alien invasion or earthworm uprising, nothing other than humans will ever access your content. Rejecting the tools humans use to bridge their biological gaps is rather nonsensical.
> You are free to say "well, there is no mechanism to do that", and I would agree with you. That's the problem!
I suppose it would be pretty neat if humans were born with some kind of internet-like telepathy ability, but lacking that mechanism isn't any kind of real problem. Humans are well adept at using tools and have successfully used tools for millennia. The internet itself is a tool! Which, like before, makes rejecting the human use of tools nonsensical.
Ok, I am, right now.
It seems like there are two sides here that are talking past one another: "people will do X and you accept it if you do not actively prevent it, if you can" and "X is bad behavior that should be stopped and shouldn't be the burden of individuals to stop". As someone who leans to the latter, the former just sounds like restating the problem being complained about.
Your framing is off because this notion of fairness or morality isn't something they concern themselves with. They're using violence because if they didn't, they would be allowing other entities to gain wealth and power at their expense. I don't think it's much more complex than that.
See how differently these same bytes are treated in the hands of Aaron Swartz vs OpenAI. One threatened to empower humanity at the expense of reducing profits for a few rich men, so he got crucified for it. The other is hoping to make humans redundant, concentrate the distribution of wealth even further, and strengthen the US world dominance, so all of the right wheels get greased for them and they get a license to kill - figuratively and literally.
I suppose the ultimate solution would be browsers and operating systems and hardware manufacturers co-operating to implement some system that somehow cryptographically signs HTTP requests which attests that it was triggered by an actual, physical interaction with a computing device by a human.
Though you don't have to think for very long to come up with all kinds of collateral damage that would cause and how bad actors could circumvent it anyway.
All in all, this whole issue seems more like a legal problem than a technical one.
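A toy sketch of the attestation idea (a hypothetical protocol, not any real standard): a secret provisioned into the device signs each request, and the server verifies the signature before serving the page. A real scheme would use asymmetric keys held in secure hardware rather than a shared secret, which is one place the collateral-damage problems start.

```python
import hashlib
import hmac

# Placeholder: in the imagined scheme this would live in tamper-resistant
# hardware, released only on physical human interaction.
DEVICE_KEY = b"secret-provisioned-into-hardware"

def sign_request(method: str, path: str) -> str:
    """Client side: produce an HMAC over the request line."""
    msg = f"{method} {path}".encode()
    return hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()

def verify(method: str, path: str, signature: str) -> bool:
    """Server side: recompute and compare in constant time."""
    return hmac.compare_digest(sign_request(method, path), signature)

sig = sign_request("GET", "/article")
print(verify("GET", "/article", sig))     # True: signature matches the request
print(verify("GET", "/scrape-all", sig))  # False: signature doesn't transfer
```

Even granting the cryptography, the hard part is the attestation claim itself: nothing in the math distinguishes a keypress from a script driving the same device, which is why this reads more like a legal problem than a technical one.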
There are some states where the considerations for self-defense do not include a duty to retreat if possible, either in general ("stand your ground" laws) or specifically in the home ("castle doctrine"), but all the other requirements for self-defense (imminent threat of certain kinds of serious harm, proportional force) remain part of the law in those states, and trespassing by/while disregarding a "no soliciting" sign would not, by itself, satisfy those requirements.
He just doesn't want tools humans use to access content to be used in association with his content.
What he failed to realize is that if you eliminate the tools, the human cannot access the content anyway. They don't have the proper biological interfaces. Had he realized that, he'd have noticed that simply turning off his server fully satisfies the constraints.