
451 points croes | 1 comment
mattxxx ◴[] No.43962976[source]
Well, firing someone for this is super weird. It seems like an attempt to censor an interpretation of the law that:

1. Criticizes a highly useful technology
2. Matches a potentially outdated, strict interpretation of copyright law

My opinion: I think using copyrighted data to train models certainly seems illegal under a classical reading of copyright law. Despite that, humans can read a book, get inspiration, and write a new book without being sued. When I look at the litany of derivative fantasy novels, it's obvious they're not all fully independent works.

Since AI is and will continue to be so useful and transformative, I think we just need to acknowledge that our laws did not accommodate this use case, and then we should change them.

replies(19): >>43963017 #>>43963125 #>>43963168 #>>43963214 #>>43963243 #>>43963311 #>>43963423 #>>43963517 #>>43963612 #>>43963721 #>>43963943 #>>43964079 #>>43964280 #>>43964365 #>>43964448 #>>43964562 #>>43965792 #>>43965920 #>>43976732 #
palmotea[dead post] ◴[] No.43963168[source]
[flagged]
ulbu ◴[] No.43963480[source]
these comparisons of llms with human artists copying are just ridiculous. it’s like saying “well humans are allowed to break twigs and damage the planet in various ways, so why not allow building a fucking DEATH STAR”.

abstracting llms from their operators and owners and possible (and probable) ends and the territories they trample upon is nothing short of eye-popping to me. how utterly negligent and disrespectful of fellow people must one be at the heart to give any credence to such arguments

replies(3): >>43964105 #>>43964159 #>>43964449 #
temporalparts ◴[] No.43964105[source]
The problem isn't that people aren't aware that the scale and magnitude differences are large and significant.

It's that intellectual property LAW does not handle the robust capabilities of LLMs. Legislators NEED to pass laws to reflect the new realities; otherwise all prior case law rests on human analogies, which fail in the obvious ways you alluded to.

If there were no law governing the use of death stars and mass murder, and the only legal analogy were to environmental damage, then the only crime the legal system could ascribe would be mass environmental damage.

replies(1): >>43964252 #
Intralexical ◴[] No.43964252[source]
Why do you think the obvious analogy is LLM=Human, and not LLM=JPEG or LLM=database?

I think you're overstating the legal uniqueness of LLMs. They're covered just fine by the existing legal precedents around copyrighted and derivative works, just as building a death star would be covered by existing rules around outer space use and WMDs. Pretending they should be treated differently is, IMO, the entire lie told by the "AI" companies about copyright.

replies(2): >>43964507 #>>43968544 #
sdenton4 ◴[] No.43964507[source]
LLMs are certainly not a jpeg or a database...

The Google News snippets case is, in my non-lawyer opinion, the most obvious touch point. In that case, it was decided that providing large numbers of snippets in search results was non-infringing, despite being a case of copying text from other people at scale... and the reasons it was decided that way are worth reading and internalizing.

There is not an obvious right answer here. Copyright rules are, in fact, Calvinball, and we're deep in uncharted territory.

replies(1): >>43964597 #
Intralexical ◴[] No.43964597[source]
> LLMs are certainly not a jpeg or a database...

Their weights are derived from copyrighted works. Evaluating them preserves the semantic meaning and character of the source material. And the output directly competes against the copyrighted source materials.

The fact they're smudgy and non-deterministic doesn't change how they relate to the rights of authors and artists.

replies(3): >>43964975 #>>43967423 #>>43967466 #
Suppafly ◴[] No.43967466[source]
>Their weights are derived from copyrighted works. Evaluating them preserves the semantic meaning and character of the source material.

That sounds like you're arguing that they should be legal. Copyright law protects specific expressions, not handwavy "smudgy and non-deterministic" things.

replies(1): >>43969125 #
johnnyanmac ◴[] No.43969125[source]
LLMs can't express; that's the primary issue. You can't just make a collage of copyrighted works and shield yourself from copyright with "expression".
replies(2): >>43976226 #>>43976269 #
Suppafly ◴[] No.43976269[source]
>You can't just make a collage of copyrighted works and shield yourself from copyright with "expression".

And yet collage artists do that all the time.

replies(1): >>43982167 #
johnnyanmac ◴[] No.43982167[source]
I'll remind you that all fanart is technically in a gray area of copyright infringement. Legally speaking, companies can issue takedowns and claim infringement for anything using their IP that's not under fair use. Collages don't really pass that benchmark.

Yoinking their IP and mass-producing slop sure is a line to cross, though.

replies(1): >>43984849 #
temporalparts ◴[] No.43984849[source]
I'm not an expert, but I thought fan art that people try to monetize in some form is explicitly illegal unless it's protected as parody, while any non-commercial "violations" of copyright are totally legal. Disney can't stop me from drawing Mickey in the privacy of my own house, just from monetizing/getting famous off of it.