452 points croes | 13 comments
mattxxx ◴[] No.43962976[source]
Well, firing someone for this is super weird. It seems like an attempt to censor an interpretation of the law that:

1. Criticizes a highly useful technology

2. Matches a potentially outdated, strict interpretation of copyright law

My opinion: I think using copyrighted data to train models seems illegal under a classical reading of copyright law. Despite that, humans can read a book, get inspiration, and write a new book without being litigated against. When I look at the litany of derivative fantasy novels, it's obvious they're not all fully independent works.

Since AI is and will continue to be so useful and transformative, I think we just need to acknowledge that our laws did not accommodate this use-case, and then we should change them.

replies(19): >>43963017 #>>43963125 #>>43963168 #>>43963214 #>>43963243 #>>43963311 #>>43963423 #>>43963517 #>>43963612 #>>43963721 #>>43963943 #>>43964079 #>>43964280 #>>43964365 #>>43964448 #>>43964562 #>>43965792 #>>43965920 #>>43976732 #
1. stevenAthompson ◴[] No.43963243[source]
Doing a cover song requires permission, and doing it without that permission can be illegal. Being inspired by a song to write your own is very legal.

AI is fine as long as the work it generates is substantially new and transformative. If it breaks and starts spitting out other peoples work verbatim (or nearly verbatim) there is a problem.

Yes, I'm aware that machines aren't people and can't be "inspired", but if the functional results are the same the law should be the same. Vaguely defined ideas like your soul or "inspiration" aren't real. The output is real, measurable, and quantifiable and that's how it should be judged.

replies(3): >>43963561 #>>43963629 #>>43964441 #
2. toast0 ◴[] No.43963561[source]
> Doing a cover song requires permission, and doing it without that permission can be illegal.

I believe cover song licensing is available mechanically; you don't need permission, you just need to follow the procedures including sending the licensing fees to a rights clearing house. Music has a lot of mechanical licenses and clearing houses, as opposed to other categories of works.

replies(1): >>43965692 #
3. mjburgess ◴[] No.43963629[source]
I fear the lack of our ability to measure your mind might render you without many of the legal or moral protections you imagine you have. But go ahead, tear down the law to whatever inanity can be described by the trivial machines of the world's current popular charlatans. Presumably you weren't using society's presumption of your agency anyway.
replies(1): >>43965409 #
4. datavirtue ◴[] No.43964441[source]
"If it breaks and starts spitting out other peoples work verbatim (or nearly verbatim) there is a problem."

Why is that? Seems all logic gets thrown out the window when invoking AI around here. References are given. If the user publishes the output without attribution, NOW you have a problem. People are being so rabid and unreasonable here. Totally bat shit.

replies(1): >>43965672 #
5. stevenAthompson ◴[] No.43965409[source]
> I fear the lack of our ability to measure your mind might render you without many of the legal or moral protections you imagine you have.

Society doesn't need to measure my mind, they need to measure the output of it. If I behave like a conscious being, I am a conscious being. Alternatively you might phrase it such that "Anything that claims to be conscious must be assumed to be conscious."

It's the only answer to the p-zombie problem that makes sense. None of this is new, philosophers have been debating it for ages. See: https://en.wikipedia.org/wiki/Philosophical_zombie

However, for copyright purposes we can make it even simpler. If the work is new, it's not covered by the original copyright. If it is substantially the same, it is. Forget the arguments about the ghost in the machine and the philosophical mumbo-jumbo. It's the output that matters.

replies(1): >>43965699 #
6. stevenAthompson ◴[] No.43965672[source]
> If the user publishes the output without attribution, NOW you have a problem.

I didn't mean to imply that the AI can't quote Shakespeare in context, just that it shouldn't try to pass off Shakespeare as its own or plagiarize huge swathes of the source text.

> People are being so rabid and unreasonable here.

People here are more reasonable than average. Wait until mainstream society starts to really feel the impact of all this.

7. stevenAthompson ◴[] No.43965692[source]
> you don't need permission, you just need to follow the procedures

Those procedures are how you ask for permission. As you say, it usually involves a fee but doesn't have to.

replies(1): >>43966650 #
8. mjburgess ◴[] No.43965699{3}[source]
In your case, it isn't the output that matters. Your saying "I'm conscious" isn't why we attribute consciousness to you. We would do so regardless of your ability to verbalise anything in particular.

Your radical behaviourism seems an advantage to you when you want to delete one disfavoured part of copyright law, but I assure you, it isn't in your interest. It doesn't universalise well at all. You do not want to be defined by how you happen to verbalise anything, unmoored from your intentions, goals, and so on.

The law, and society, imparts much to you that is never measured and much that is unmeasurable. What can be measured is, at least, extremely ambiguous with respect to those mental states which are being attributed. Because we do not attribute mental states by what people say -- this plays very little role (consider what a mess this would make of watching movies). And none of course in the large number of animals which share relevant mental states.

Nothing of relevance is measured by an LLM's output. It is highly unambiguous: the LLM has no mental states, and thus is irrelevant to the law, morality, society and everything else.

It's an obscene sort of self-injury to assume that whatever kind of radical behaviourism is necessary to hype the LLM is the right sort. Hype for LLMs does not lead to a credible theory of minds.

replies(1): >>43966504 #
9. stevenAthompson ◴[] No.43966504{4}[source]
> We would do so regardless of your ability to verbalise anything in particular

I don't mean to say that they literally have to speak the words by using their meat to make the air vibrate. Just that, presuming it has some physical means, it be capable (and willing) to express it in some way.

> It's an obscene sort of self-injury to assume that whatever kind of radical behaviourism is necessary to hype the LLM is the right sort.

I appreciate why you might feel that way. However, I feel it's far worse to pretend we have some undetectable magic within us that allows us to perceive the "realness" of other people's consciousness by other than physical means.

Fundamentally, you seem to be arguing that something with outputs identical to a human is not human (or even human like), and should not be viewed within the same framework. Do you see how dangerous an idea that is? It is only a short hop from "Humans are different than robots, because of subjective magic" to "Humans are different than <insert race you don't like>, because of subjective magic."

10. toast0 ◴[] No.43966650{3}[source]
(in the US) Mechanical licenses are compulsory; you don't need permission, you can just follow the forms and pay the fees set by the Copyright Royalty Board (appointed by the Librarian of Congress). You can ask the rightsholder to negotiate a lower fee, but there's no need for consent of the rightsholder if you notify as required (within 30 days of recording and before distribution) and pay the set fees.
replies(1): >>43967107 #
11. stevenAthompson ◴[] No.43967107{4}[source]
Thanks for clarifying. Sometimes I forget that HN has a lot of experts floating around who take things in a very literal and legalistic way. I was speaking in more general terms, and missed that you were being very precise with your language.

Compulsory licenses are interesting, aren't they? It just feels wrong. If Metallica doesn't want me to butcher their songs, why should they be forced to allow it?

replies(2): >>43967433 #>>43967596 #
12. skolskoly ◴[] No.43967433{5}[source]
Any live band performing a song is subject to mechanical licensing as much as a recording artist. Typically the venue pays it, just like how radio stations pay royalties. This system exists because historically, that's how music reproduction worked. You hire some musicians to play the music you want to hear. Copyright applied to the score, the lyrics, and so on. The 'mechanical' rights had to come later, because recording hadn't been invented yet!
13. toast0 ◴[] No.43967596{5}[source]
They are very interesting. IMHO, it's a nice compromise between making sure the artists are paid for their work and giving them complete control over their work. Licensing for radio-style play is also compulsory, and terrestrial radio used to not even have to pay the recording artists (I think this changed?), but did have to track plays and pay ASCAP.

As a consumer, it would be amazing if there were compulsory licenses for film and TV; then we wouldn't have to subscribe to 70 different services to get to the things we want to see. And there would likely be services that spring up to redistribute media where the rightsholders aren't able to or don't care to; it might be pulled from VHS tapes that fans recorded off of TV in the old days, but at least it'd be something.