
553 points by bookofjoe | 3 comments
adzm (No.43654878)
Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care. The AI features in Photoshop are the best around in my experience and come in handy constantly for all sorts of touchup work.

Anyway, I don't really think they deserve a lot of the hate they get, but I do hope this encourages development of viable alternatives to their products. Photoshop is still pretty much peerless. Illustrator has a ton of competitors catching up. After Effects and Premiere are being overtaken by DaVinci Resolve for video editing, though for motion graphics After Effects is still hard to beat. And I do love that Adobe simply uses JavaScript for its expression and scripting language.
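On that last point: After Effects expressions are small JavaScript snippets evaluated per frame on a single property, with the property's keyframed value and helpers like `wiggle()` supplied by the host application. A minimal sketch of a Position expression, assuming it is pasted onto a 2D layer's Position property (it will not run outside After Effects, since `wiggle`, `time`, and `value` are AE-provided globals):

```javascript
// After Effects position expression (JavaScript, host-evaluated per frame).
// wiggle(), time, and value are supplied by AE; this is not standalone JS.
var ramp = Math.min(time, 1);   // fade the effect in over the first second
var wiggled = wiggle(2, 50);    // ~2 wiggles/sec, up to 50 px of offset
[value[0] + (wiggled[0] - value[0]) * ramp,
 value[1] + (wiggled[1] - value[1]) * ramp];
```

The value of the last evaluated expression becomes the property's value for that frame, which is why the snippet ends with a bare two-element array rather than a `return`.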

AnthonyMouse (No.43659810)
> Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care.

It's because nobody actually wants that.

Artists don't like AI image generators because they have to compete with them, not because of how they were trained. How they were trained is just the most plausible claim they can make against them if they want to sue OpenAI et al. over it, or to make a moral argument that some kind of misappropriation is occurring.

From the perspective of an artist, a corporation training an AI image generator in a way that isn't susceptible to moral or legal assault is worse, because then it exists and they have to compete with it and there is no visible path for them to make it go away.

Sir_Twist (No.43660487)
I'd say that is a bit of an ungenerous characterization. Is it possible that it could be both? That while artists may well feel under attack in terms of competition, there is also a genuine ethical dilemma at hand?

If I were an artist, and I made a painting and published it to a site which was then used to train an LLM, I would feel the AI company had treated me disingenuously, regardless of any competition. Intellectual property laws aside, I think a social contract is broken when a publicly shared work is used without the artist's direct, explicit permission.

AnthonyMouse (No.43660970)
> Is it possible that it could be both? That while artists may well feel under attack in terms of competition, there is also a genuine ethical dilemma at hand?

The rights artists have over their work are economic rights. The most important fair use factor is how the use affects the market for the original work. If Disney is lobbying for copyright term extensions and you want to make art showing Mickey Mouse in a cage with the CEO of Disney as the jailer, that's allowed even though you're not allowed to open a movie theater and show Fantasia without paying for it, and even though (even because!) Disney would not approve of you using Mickey to oppose their lobbying position. And once the copyright expires you can do as you like.

So the ethical argument against AI training is that the AI is going to compete with them and make it harder for them to make a living. But substantially the same thing happens if the AI is trained on some other artist's work instead. Whose work it was has minimal impact on the economic consequences for artists in general. And being one of the artists who got a pittance for the training data is little consolation either.

The real ethical question is whether it's okay to put artists out of business by providing AI-generated images at negligible cost. If the answer is no, it doesn't really matter which artists were in the training data. If the answer is yes, it doesn't really matter which artists were in the training data.

pastage (No.43662059)
Actually, moral rights are what allow you to say no to AI. They are also a big part of copyright, and more important in places where fair use does not exist to the extent it does in the US.

Further, making a variant of a famous art piece under copyright might very well be a derivative work. There were court cases here just a few years before the AI boom where a format shift from photo to painting was deemed a derivative. A picture generated with "Painting of an archaeologist with a whip" would almost certainly be deemed a derivative if it went through the same court.

AnthonyMouse (No.43665639)
> Actually, moral rights are what allow you to say no to AI.

The US doesn't really have moral rights, and it's not clear they would even be constitutional here, since the copyright clause explicitly requires "promote the progress" and "limited times", and many aspects of moral rights would conflict with the First Amendment. Whether they exist in some other country doesn't really help you when it's US companies doing the training in the US.

> Further, making a variant of a famous art piece under copyright might very well be a derivative work.

Well, of course it is. That's what derivative works are. You can also produce derivative works with Photoshop or MS Paint, but that doesn't mean the purpose of MS Paint is to produce derivative works, or that Microsoft, rather than the user purposely creating the derivative work, should be responsible for it.

fc417fc802 (No.43667546)
Well, one could argue that this ought to be a discussion of morality and social acceptability rather than legality; after all, the former can eventually lead to the latter. But if you make that argument, you immediately run into the issue that there clearly isn't broad consensus on this topic.

Personally I'm inclined to liken ML tools to backhoes. I don't want the law to force ditches to be dug by hand. I'm not a fan of busywork.