
553 points by bookofjoe | 4 comments
adzm No.43654878
Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care. The AI features in Photoshop are the best around in my experience and come in handy constantly for all sorts of touchup work.

Anyway I don't really think they deserve a lot of the hate they get, but I do hope this encourages development of viable alternatives to their products. Photoshop is still pretty much peerless. Illustrator has a ton of competitors catching up. After Effects and Premiere are getting overtaken by DaVinci Resolve for video editing -- though for motion graphics it is still hard to beat After Effects. And I do love that Adobe simply uses JavaScript for its expression and scripting language.

AnthonyMouse No.43659810
> Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care.

It's because nobody actually wants that.

Artists don't like AI image generators because they have to compete with them, not because of how they were trained. How they were trained is just the most plausible claim they can make against them if they want to sue OpenAI et al over it, or to make a moral argument that some kind of misappropriation is occurring.

From the perspective of an artist, a corporation training an AI image generator in a way that isn't susceptible to moral or legal assault is worse, because then it exists and they have to compete with it and there is no visible path for them to make it go away.

Sir_Twist No.43660487
I'd say that is a bit of an ungenerous characterization. Is it possible that it could be both? That while artists maybe do feel under attack in terms of competition, there is also a genuine ethical dilemma at hand?

If I were an artist, and I made a painting and published it to a site which was then used to train an AI model, I would feel as though the AI company had treated me in bad faith, competition or not. Intellectual property laws aside, I think there is a social contract being broken when a publicly shared work is then used without the artist's direct, explicit permission.

AnthonyMouse No.43660970
> Is it possible that it could be both? That while artists maybe do feel under attack in terms of competition, there is also a genuine ethical dilemma at hand?

The rights artists have over their work are economic rights. The most important fair use factor is how the use affects the market for the original work. If Disney is lobbying for copyright term extensions and you want to make art showing Mickey Mouse in a cage with the CEO of Disney as the jailer, that's allowed even though you're not allowed to open a movie theater and show Fantasia without paying for it, and even though (even because!) Disney would not approve of you using Mickey to oppose their lobbying position. And once the copyright expires you can do as you like.

So the ethical argument against AI training is that the AI is going to compete with them and make it harder for them to make a living. But substantially the same thing happens if the AI is trained on some other artist's work instead. Whose work it was has minimal impact on the economic consequences for artists in general. And being one of the artists who got a pittance for the training data is little consolation either.

The real ethical question is whether it's okay to put artists out of business by providing AI-generated images at negligible cost. If the answer is no, it doesn't really matter which artists were in the training data. If the answer is yes, it doesn't really matter which artists were in the training data.

card_zero No.43661497
> But substantially the same thing happens if the AI is trained on some other artist's work instead.

You could take that further and say that "substantially the same thing" happens if the AI is trained on music instead. It's just another kind of artwork, right? Somebody who was going to have an illustration by [illustrator with distinctive style] might choose to have music instead, so the music is in competition, so all that illustrator's art might as well be in the training data, and that doesn't matter because the artist would get competed with either way. Says you.

AnthonyMouse No.43665550
If you type "street art" as part of an image generation prompt, the results are quite similar to typing "in the style of Banksy". They're direct substitutes for each other: neither is actually going to produce Banksy-quality output, and it's not even obvious which one will produce better results for a given prompt.

You still get images in a particular style by specifying the name of the style instead of the name of the artist. Do you really think this is no different than being able to produce only music when you want an image?

card_zero No.43667749
This hinges on denying that artists have distinctive personal styles. Instead your theory seems to be that styles are genres, and that the AI only needs to be trained on the genre, not the specific artist's output, in order to produce that artist's style -- which, under this theory, is equivalent to the generic style.

My counter-argument is "no". Ideally I'd elaborate on that. So ummm ... no, that's not the way things are. Is it?

int_19h No.43669069
The argument here isn't so much that individual artists don't have their own specific styles, but rather a question of whether the AI actually tracks them, or whether "in the style of ..." is effectively just a substitute for naming the more general style category to which the artist belongs.