1503 points | participant3 | 4 comments
CamperBob2:
And? What's the model supposed to do? It's just doing what many human artists would do, if they're not explicitly being paid to create new IP.

If infringement is happening, it arguably doesn't happen when an infringing work product is generated (or regurgitated, or whatever you want to call it), much less when the model is trained. It's when the output is used commercially -- by a human -- that the liability should rightfully attach.

And it should attach to the human, not the tool.

4ndrewl:
Assuming you can identify it's someone else's IP. Clearly these are hugely contrived examples, but what about text or code that you might not be as familiar with?
CamperBob2:
It doesn't matter. Sue whoever uses it commercially.

If you insist on making it about the model, you will wreck something wonderful.

4ndrewl:
Ah, so don't use the outputs of an LLM commercially?
fxtentacle:
That, or get sued.
IAmBroom:
If it "may" violate copyright, correct!