
553 points bookofjoe | 1 comments
adzm No.43654878
Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care. The AI features in Photoshop are the best around in my experience and come in handy constantly for all sorts of touchup work.

Anyway, I don't really think they deserve a lot of the hate they get, but I do hope this encourages development of viable alternatives to their products. Photoshop is still pretty much peerless. Illustrator has a ton of competitors catching up. After Effects and Premiere are being overtaken by DaVinci Resolve for video editing, though for motion graphics it is still hard to beat After Effects. That said, I do love that Adobe simply uses JavaScript as its expression and scripting language.

f33d5173 No.43655907
Adobe isn't trying to be ethical; they are trying to be more legally compliant, because they see that as a market opportunity. On the other hand, artists complain about the legal compliance of AIs not because that is what they care about, but because they see it as their only possible redress against a phenomenon they find distasteful. A legal reality where you can only train AI on content you've licensed would be the worst outcome for everybody bar massive companies, legacy artists included.
_bin_ No.43658034
Right, but "distaste" isn't grounds for trying to ban something. There are all kinds of things people and companies do which I dislike but for which there's no just basis for regulating. If Adobe properly licenses all their training data artists don't have a right to say "well i think this is bad for creativity and puts my job at risk, ban it!!!" Or more precisely, they have a right to say that, but no moral justification for trying to ban/regulate/sue over it.

I hate Adobe's subscription model as much as the next guy and that's a good reason to get annoyed at them. Adobe building AI features is not.

TeMPOraL No.43658454
> Right, but "distaste" isn't grounds for trying to ban something.

It isn't, but that doesn't stop people from trying and hoping for a miracle. That's pretty much all there is to the argument that image models, as well as LLMs, are trained in violation of copyright - it's distaste and greed[0], with a slice of basic legalese on top to confuse people into believing the law says what it doesn't (at least not yet).

> If Adobe properly licenses all their training data artists don't have a right to say "well i think this is bad for creativity and puts my job at risk, ban it!!!" Or more precisely, they have a right to say that, but no moral justification for trying to ban/regulate/sue over it.

I'd say they have plenty of moral/ethical justification for trying to ban/regulate/sue over it; they just don't have much of a legal one at this point. But that's why they should be trying[1] - they have a legitimate argument that this is an unexpected, undeserved, unfair calamity for them, threatening to derail their lives and the lives of their dependents across the entire sector - and therefore that laws should be changed to shield them, or to compensate them for the loss. After all, that's what laws are for.

(Let's not forget that the entire legal edifice around recognizing and protecting "intellectual property" is an artificial construct that goes against the nature of information and knowledge, forcing information to behave like physical goods so that it isn't unfair to creators in an economy built around trading physical goods. IP laws were built on moral arguments, so it's only fair to change them on moral grounds too.)

--

[0] - Greed is more visible in the LLM theatre of this conflict, because with textual content there are vastly more people who believe they're entitled to compensation just because some comments they wrote on the Internet may have been part of the training dataset, and who are appalled to see LLM providers get paid for the service while they are not. This dog-in-the-manger mentality is distinct from that of people whose output was used to train a model that now directly competes with them for their job; the latter have legitimate ethical reasons to complain.

[1] - Even though I myself am for treating training datasets for generative AI as exempt from copyright. I think it'll be better for society in general - but I recognize it's easy for me to say that, because I'm not the one being rugpulled out of a career path by GenAI, watching it go from zero to halfway toward automating away visual arts in just ~5 years.

fc417fc802 No.43667684
> I'm not the one being rugpulled out of a career path by GenAI,

That's quite a bold assumption. Betting that logic and reasoning ability will plateau before reaching "full stack developer" seems like a very risky gamble.

TeMPOraL No.43671087
I meant right now. I acknowledge elsewhere that software development is near the top of the list, but it isn't affecting us yet in the way it affects artists today.