
398 points pyman | 1 comment
pyman ◴[] No.44488332[source]
Anthropic's cofounder, Ben Mann, downloaded millions of books from Library Genesis in 2021, fully aware that the material was pirated.

Stealing is stealing. Let's stop with the double standards.

replies(8): >>44488391 #>>44488540 #>>44488816 #>>44490720 #>>44491032 #>>44491583 #>>44492035 #>>44493242 #
originalvichy ◴[] No.44488540[source]
At least most pirates just consume for personal use. Profiting from piracy is a whole other level beyond just pirating a book.
replies(4): >>44488621 #>>44488853 #>>44489003 #>>44490718 #
KoolKat23 ◴[] No.44489003[source]
This isn't really profiting from piracy. They don't make money off the raw input data. It's no different to consuming for personal use.

They make money off the model weights, and producing those weights from the books is fair use (as confirmed by recent case law).

replies(1): >>44489216 #
j_w ◴[] No.44489216[source]
This is absurd. Remove all of the pirated content from the training data, and what is the quality of the end product now?
replies(2): >>44489279 #>>44489283 #
KoolKat23 ◴[] No.44489283[source]
That's the law.

Please keep in mind that copyright is intended as a compromise between the benefit to society and the benefit to the individual.

A thought experiment: what about students who pirate textbooks and then apply that knowledge later in their work?

replies(2): >>44489587 #>>44495512 #
nwienert ◴[] No.44495512[source]
It’s the law (for now; we’re very early in the process of deciding this law, and the ruling is untested, appealable, and likely to be appealed and tested many times in many ways).

Meanwhile, other cases have been less friendly to the fair-use argument, and AI companies are already paying vast sums to publishers, which they presumably wouldn’t do if they felt confident it was “the law”, and so on.

I don’t like arguing from “it’s the law”. A lot of law is terrible. What’s right? It’s clear to me that if AI gets good enough, as it nearly is now, it sucks a lot of profit away from creators. That is unbalanced. The AI doesn’t exist without the creators, and the creators need to exist for our society to be great (we want new creative works, more if anything). Law tends to start conservatively, based on historical precedent, and when a new technology comes along it often errs on the side of letting it do some damage to avoid setting a bad precedent. In time it catches up as society gets a better view of things.

The right thing is likely not to let our creative class be decimated so that a few tech companies can become fantastically wealthy. In the long run, that’s the right thing even for the techies.