
390 points by pyman | 1 comment
guywithahat No.44491931
If you own a book, it should be legal for your computer to take a picture of it. I honestly feel bad for some of these AI companies, because the rules around copyright are being changed just to target them. I don't owe anything under copyright to the author of every book I read just because I may subconsciously incorporate their ideas into my future work.
organsnyder No.44492019
The difference here is that an LLM is a mechanical process. It may not be deterministic (at least, not in any way my brain understands determinism), but it's still a machine.

What you're proposing is considering LLMs to be equal to humans when considering how original works are created. You could make the argument that LLM training data is no different from a human "training" themself over a lifetime of consuming content, but that's a philosophical argument that is at odds with our current legal understanding of copyright law.
