
393 points pyman | 3 comments
1. codedokode No.44492305
> "Like any reader aspiring to be a writer, Anthropic's LLMs trained upon works not to race ahead and replicate or supplant them — but to turn a hard corner and create something different," he wrote.

But this analogy seems wrong. First, an LLM is not a human and cannot "learn" or "train"; only a human can do that. And LLM developers are not aspiring to become writers, and they do not learn anything themselves; they just want to profit by building software on top of copyrighted material. Also, people do not read millions of books to become writers.

replies(1): >>44493617 #
2. CaptainFever No.44493617
> But this analogy seems wrong. First, LLM is not a human and cannot "learn" or "train" - only human can do it.

The analogy refers to humans using machines to do what would already be legal if they did it manually.

> And LLM developers are not aspiring to become writers and do not learn anything, they just want to profit by making software using copyrighted material.

[Citation needed], and not a legal argument.

> Also people do not read millions of books to become a writer.

But people do hear millions of words as children.

replies(1): >>44496306 #
3. codedokode No.44496306
> But people do hear millions of words as children.

At a rate of 1,000 words/day, it takes about 3 years to hear a million words. Also, "a million words" is not the same as "a million books". Humans are ridiculously efficient learners compared to LLMs.
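
A quick back-of-envelope check of that arithmetic (a sketch; the ~10^12-token figure is an assumption, roughly the order of magnitude reported for recent large models, not a number from this thread):

    # Back-of-envelope: words a child hears vs. tokens an LLM trains on.
    words_per_day = 1_000
    years = 3
    child_words = words_per_day * 365 * years  # ~1.1 million words
    llm_tokens = 10**12  # assumed order of magnitude for a recent large model
    print(f"child: ~{child_words:,} words in {years} years")
    print(f"LLM/child ratio: ~{llm_tokens / child_words:,.0f}x")

Under those assumptions the gap is roughly six orders of magnitude.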