553 points bookofjoe | 22 comments
adzm ◴[] No.43654878[source]
Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care. The AI features in Photoshop are the best around in my experience and come in handy constantly for all sorts of touchup work.

Anyway I don't really think they deserve a lot of the hate they get, but I do hope this encourages development of viable alternatives to their products. Photoshop is still pretty much peerless. Illustrator has a ton of competitors catching up. After Effects and Premiere for video editing are getting overtaken by Davinci Resolve -- though for motion graphics it is still hard to beat After Effects. Though I do love that Adobe simply uses JavaScript for its expression and scripting language.

replies(36): >>43654900 #>>43655311 #>>43655626 #>>43655700 #>>43655747 #>>43655859 #>>43655907 #>>43657271 #>>43657436 #>>43658069 #>>43658095 #>>43658187 #>>43658412 #>>43658496 #>>43658624 #>>43659012 #>>43659378 #>>43659401 #>>43659469 #>>43659478 #>>43659507 #>>43659546 #>>43659648 #>>43659715 #>>43659810 #>>43660283 #>>43661100 #>>43661103 #>>43661122 #>>43661755 #>>43664378 #>>43664554 #>>43665148 #>>43667578 #>>43674357 #>>43674455 #
f33d5173 ◴[] No.43655907[source]
Adobe isn't trying to be ethical, they are trying to be more legally compliant, because they see that as a market opportunity. Otoh, artists complain about legal compliance of AIs not because that is what they care about, but because they see that as their only possible redress against a phenomenon they find distasteful. A legal reality where you can only train AI on content you've licensed would be the worst for everybody bar massive companies, legacy artists included.
replies(7): >>43658034 #>>43658253 #>>43659203 #>>43659245 #>>43659443 #>>43659929 #>>43661258 #
1. _bin_ ◴[] No.43658034[source]
Right, but "distaste" isn't grounds for trying to ban something. There are all kinds of things people and companies do which I dislike but for which there's no just basis for regulating. If Adobe properly licenses all their training data artists don't have a right to say "well i think this is bad for creativity and puts my job at risk, ban it!!!" Or more precisely, they have a right to say that, but no moral justification for trying to ban/regulate/sue over it.

I hate Adobe's subscription model as much as the next guy and that's a good reason to get annoyed at them. Adobe building AI features is not.

replies(5): >>43658454 #>>43659616 #>>43660867 #>>43663988 #>>43667492 #
2. TeMPOraL ◴[] No.43658454[source]
> Right, but "distaste" isn't grounds for trying to ban something.

It isn't, but it doesn't stop people from trying and hoping for a miracle. That's pretty much all there is to the arguments about image models, as well as LLMs, being trained in violation of copyright - it's distaste and greed[0], with a slice of basic legalese on top to confuse people into believing the law says what it doesn't (at least yet).

> If Adobe properly licenses all their training data artists don't have a right to say "well i think this is bad for creativity and puts my job at risk, ban it!!!" Or more precisely, they have a right to say that, but no moral justification for trying to ban/regulate/sue over it.

I'd say they have plenty of moral / ethical justification for trying to ban/regulate/sue over it, they just don't have much of a legal one at this point. But that's why they should be trying[1] - they have a legitimate argument that this is an unexpected, undeserved, unfair calamity for them, threatening to derail their lives, and lives of their dependents, across the entire sector - and therefore that laws should be changed to shield them, or compensate them for the loss. After all, that's what laws are for.

(Let's not forget that the entire legal edifice around recognizing and protecting "intellectual property" is an entirely artificial construct that goes against the nature of information and knowledge, forcing information to behave like physical goods so that creators aren't treated unfairly in an economy built around trading physical goods. IP laws were built on moral arguments, so it's only fair to change them on moral grounds too.)

--

[0] - Greed is more visible in the LLM theatre of this conflict, because with textual content there's vastly more people who believe that they're entitled to compensation just because some comments they wrote on the Internet may have been part of the training dataset, and are appalled to see LLM providers get paid for the service while they are not. This Dog in the Manger mentality is distinct from that of people whose output was used in training a model that now directly competes with them for their job; the latter have legitimate ethical reasons to complain.

[1] - Even though myself I am for treating training datasets to generative AI as exempt from copyright. I think it'll be better for society in general - but I recognize it's easy for me to say it, because I'm not the one being rugpulled out of a career path by GenAI, watching it going from 0 to being half of the way towards automating away visual arts, in just ~5 years.

replies(3): >>43659609 #>>43663932 #>>43667684 #
3. skissane ◴[] No.43659609[source]
> they have a legitimate argument that this is an unexpected, undeserved, unfair calamity for them, threatening to derail their lives, and lives of their dependents, across the entire sector - and therefore that laws should be changed to shield them, or compensate them for the loss. After all, that's what laws are for.

Lots of people have had their lives disrupted by technological and economic changes before - entire careers which existed a century ago are now gone. Given society provided little or no compensation for prior such cases of disruption, what’s the argument for doing differently here?

replies(4): >>43659678 #>>43660270 #>>43661387 #>>43669091 #
4. skywhopper ◴[] No.43659616[source]
In the context of encouraging art, it totally is! Copyright and patents are 100% artificial and invented legal concepts that are based solely on the distaste for others profiting off a creator’s ideas. The reason for them is to encourage creativity by allowing creators to profit off new ideas.

So there’s no reason why “distaste” about AI abuse of human artists’ work shouldn’t be a valid reason to regulate or ban it. If society values the creation of new art and inventions, then it will create artificial barriers to encourage their creation.

replies(2): >>43661607 #>>43666848 #
5. TeMPOraL ◴[] No.43659678{3}[source]
Moral growth and learning from history?
replies(1): >>43659856 #
6. skissane ◴[] No.43659856{4}[source]
There’s a big risk that you end up creating a scheme to compensate for technological disruption in one industry and then fail to do so in another, based on the political clout / mindshare / media attention each has - and then there are many people in even worse personal situations (through no fault of their own) who would also miss out.

Wouldn’t a better alternative be to work on improving social safety nets for everybody, as opposed to providing a bespoke one for a single industry?

replies(1): >>43668694 #
7. CamperBob2 ◴[] No.43660270{3}[source]
> Given society provided little or no compensation for prior such cases of disruption

That's going to be hard for you to justify in the long run, I think. Virtually everybody who ever lost a job to technology ended up better off for it.

replies(2): >>43660652 #>>43664103 #
8. disconcision ◴[] No.43660652{4}[source]
> Virtually everybody who ever lost a job to technology ended up better off for it.

this feels like a much stronger claim than is typically made about the benefits of technological progress

replies(1): >>43661389 #
9. cratermoon ◴[] No.43660867[source]
> Right, but "distaste" isn't grounds for trying to ban something

I disagree. There are many laws on the books codifying social distastes. They keep your local vice squad busy.

replies(1): >>43666817 #
10. petre ◴[] No.43661387{3}[source]
You're only going to get "AI art" in the future, because artists will have to get a second job at McDonald's to survive. The same old themes all over again. It would be like the only music being Richard Clayderman tunes.
11. CamperBob2 ◴[] No.43661389{5}[source]
Certainly no stronger than the claim I was responding to. They are essentially pining for the return of careers that haven't existed for a century.
12. bmacho ◴[] No.43661607[source]
Yup, banning AI for the sake of artists would be exactly the same as the current copyright laws. (Also, they are attacking AI not purely out of fear for their jobs, but because it is already illegal.)
13. anileated ◴[] No.43663932[source]
> The entire legal edifice around recognizing and protecting intellectual property is an entirely artificial construct

The “natural” vs. “artificial” framing is a placeholder for nonexistent substantiation. It never does anything but lend a disguise of objectivity to a wild opinion.

Artificial as opposed to what? Do you consider what humans do “unnatural” because humans are somehow not part of nature?

If some humans want something and other humans don’t (in the case of big tech abusing copyright, the vast majority, once the realization reaches the masses), what exactly makes one position natural and the other unnatural, other than your own belonging to one group or the other?

> that goes against the nature of information and knowledge

What is that nature of information and knowledge that you speak about?

> forcing information to behave like physical goods, so it's not unfair to the creators in an economy that's built around trading physical goods

Its point has been to encourage innovation, creativity, and open information sharing—exactly the things that gave us ML and LLMs. We would have none of these in the rosy land of IP communism you envision, where no idea or original work belongs to its author.

Recognition of intellectual ownership of original work (coming in many shapes, including control over how it is distributed, the ability to monetize it, and simply being able to say you made it) is the primary incentive for people to do truly original work. You know, the work that gave us GNU/Linux et al.—the true innovation that tends to come when people are not handing their work to an employer in return for a paycheck.

> IP laws were built on moral arguments, so it's only fair to change them on moral grounds too.

That is, perhaps, the exact point of people who argue that copyright law should be changed or at least clarified as new technology appears.

14. TeMPOraL ◴[] No.43664103{4}[source]
> Virtually everybody who ever lost a job to technology ended up better off for it.

That's plain wrong, and quite obviously so. You're demonstrating here a very common misunderstanding of the arguments people affected by (or worried about) automation taking their jobs make. In a very concise form:

- It's true that society and humanity so far always benefited from eliminating jobs through technology, in the long term.

- It's not true that society and humanity benefited in the immediate term, due to the economic and social disruption. And, most importantly:

- It's not true that people who lost jobs to technology were better off for it - those people, those specific individuals, as well as their families and local communities, were all screwed over by progress, having their lives permanently disrupted, and in many cases being thrown into poverty for generations.

(Hint: yes, there may be new jobs to replace old ones, but those jobs are there for the next generation of people, not for those who just lost theirs.)

Understanding that distinction - society vs. individual victims - will help make sense of e.g. why Luddites destroyed the new mechanized looms and weaving frames. It was not about technology, it was about capital owners pulling the rug from under them, and leaving them and their children to starve.

15. _bin_ ◴[] No.43666817[source]
I thought most people supported moving away from that and towards a more socially liberal model. If we're no longer doing that I have a whole stack of socially conservative policies I guess I'll go back to pushing.

I don't think y'all really want to go down this road; it leads straight back to nineties Republicans holding Senate hearings on what counts as acceptable content for a music album.

replies(1): >>43667753 #
16. _bin_ ◴[] No.43666848[source]
Disagree. Authority is given to Congress to establish an IP regime for the purpose of "promot[ing] the progress of science and useful arts". You would have to justify how banning gen AI is a. feasible at all, particularly with open-weight models; and b. how it "promotes the progress of useful arts." You would lose in court, because it's very difficult to argue that keeping art a skilled craftsman's trade does more for its progress than lowering the barriers to individuals expressing what they see.

I think bad AI makes bad output and so a few people are worried it will replace good human art with bad AI art. Realistically, the stuff it's replacing now is bad human art: stock photos and clipart stuff that weren't really creative expression to start with. As it improves, we'll be increasingly able to go do a targeted inpaint to create images that more closely match our creative vision. There's a path here that lowers the barriers for someone getting his ideas into a visual form and that's an unambiguous good, unless you're one of the "craftsmen" who invested time to learn the old way.

It's almost exactly the same as AI development. As an experienced dev who knows the ins and outs really well I look at AI code and say, "wow, that's garbage." But people are using it to make unimportant webshit frontends, not do "serious work". Once it can do "serious work" that will decrease the number of jobs in the field but be good for software development as a whole.

17. weregiraffe ◴[] No.43667492[source]
>Right, but "distaste" isn't grounds for trying to ban something.

https://en.wikipedia.org/wiki/United_States_obscenity_law

18. fc417fc802 ◴[] No.43667684[source]
> I'm not the one being rugpulled out of a career path by GenAI,

That's quite a bold assumption. Betting that logic and reasoning ability plateaus prior to "full stack developer" seems like a very risky gamble.

replies(1): >>43671087 #
19. fc417fc802 ◴[] No.43667753{3}[source]
Many laws come down to distaste at the root. There's usually an alternative angle about market efficiency or social stability or whatever if you want to frame it that way. The same applies in this case as well.

For but a few examples consider laws regarding gambling, many aspects of zoning, or deceptive marketing.

What's the purpose of the law if not providing stability? Why should social issues be exempted from that?

20. TeMPOraL ◴[] No.43668694{5}[source]
> Wouldn’t a better alternative be to work on improving social safety nets for everybody, as opposed to providing a bespoke one for a single industry?

Yes, but:

1) It's not really an exclusive choice; different people can pursue different angles, including all of them - one can both seek immediate support/compensation for the specific case they're the victim of and seek longer-term solution for everyone who'd face the same problem in the future.

2) A bespoke solution is much more likely to be achievable than a general one.

3) I don't believe it would be good for society for artists to succeed in curtailing generative AI! But, should they succeed, I imagine the consequences will encourage people to seek the more general solution that mitigates occupational damage of GenAI while preserving its availability, instead of having to deal with a series of bespoke stopgaps that also kills GenAI entirely.

4) Not that banning GenAI has any chance of succeeding - the most we'd get is it being unavailable in some countries, who'd then be at a disadvantage in competition with countries that embraced it.

Again, I'm not in favor of banning GenAI - on the contrary, I'm in favor of giving a blanket exception from copyright laws for purposes of training generative models. However, I recognize the plight of artists and other people who are feeling the negative economic impact on their jobs right now (and hell, my own line of work - software development - is still one of the most at risk in the near to mid-term, too); I wish for a solution that will help them (and others about to be in this situation), but in the meantime, I don't begrudge them for trying to fight it - I think they have full right to. I only have problems with people who oppose AI because they feel that Big AI is depriving them of opportunity to seek rent from society for the value AI models are creating.

21. int_19h ◴[] No.43669091{3}[source]
It depends on who was harmed. When countries were banning slavery (or serfdom in places where it was functionally equivalent, like Russia), slave owners made this very argument that depriving them of legitimately acquired workpower was an undeserved and unfair calamity for them, and were generally compensated.
22. TeMPOraL ◴[] No.43671087{3}[source]
I meant right now. I acknowledge elsewhere that software development is still near the top of the list, but it isn't affecting us just yet in the way it affects artists today.