554 points bookofjoe | 73 comments
adzm ◴[] No.43654878[source]
Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care. The AI features in Photoshop are the best around in my experience and come in handy constantly for all sorts of touchup work.

Anyway I don't really think they deserve a lot of the hate they get, but I do hope this encourages development of viable alternatives to their products. Photoshop is still pretty much peerless. Illustrator has a ton of competitors catching up. After Effects and Premiere for video editing are getting overtaken by Davinci Resolve -- though for motion graphics it is still hard to beat After Effects. Though I do love that Adobe simply uses JavaScript for its expression and scripting language.

replies(36): >>43654900 #>>43655311 #>>43655626 #>>43655700 #>>43655747 #>>43655859 #>>43655907 #>>43657271 #>>43657436 #>>43658069 #>>43658095 #>>43658187 #>>43658412 #>>43658496 #>>43658624 #>>43659012 #>>43659378 #>>43659401 #>>43659469 #>>43659478 #>>43659507 #>>43659546 #>>43659648 #>>43659715 #>>43659810 #>>43660283 #>>43661100 #>>43661103 #>>43661122 #>>43661755 #>>43664378 #>>43664554 #>>43665148 #>>43667578 #>>43674357 #>>43674455 #
1. AnthonyMouse ◴[] No.43659810[source]
> Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care.

It's because nobody actually wants that.

Artists don't like AI image generators because they have to compete with them, not because of how they were trained. How they were trained is just the most plausible claim they can make against them if they want to sue OpenAI et al over it, or to make a moral argument that some kind of misappropriation is occurring.

From the perspective of an artist, a corporation training an AI image generator in a way that isn't susceptible to moral or legal assault is worse, because then it exists and they have to compete with it and there is no visible path for them to make it go away.

replies(7): >>43659874 #>>43660487 #>>43662522 #>>43663679 #>>43668300 #>>43670381 #>>43683088 #
2. mjmsmith ◴[] No.43659874[source]
Most artists would prefer not to compete with an AI image generator that has been trained on their own artwork without their permission, for obvious reasons.
replies(2): >>43659995 #>>43660494 #
3. AnthonyMouse ◴[] No.43659995[source]
That's exactly the moral argument Adobe is taking away from them, and the same argument has minimal economic relevance because it's so rare that a customer requires a specific individual artist's style.
replies(2): >>43661174 #>>43661478 #
4. Sir_Twist ◴[] No.43660487[source]
I'd say that is a bit of an ungenerous characterization. Is it possible that it could be both? That while artists maybe do feel under attack in terms of competition, that there is a genuine ethical dilemma at hand?

If I were an artist, and I made a painting and published it to a site which was then used to train an LLM, I would feel as though the AI company treated me disingenuously, regardless of competition or not. Intellectual property laws aside, I think there is a social contract being broken when a publicly shared work is then used without the artist's direct, explicit permission.

replies(4): >>43660625 #>>43660937 #>>43660970 #>>43661337 #
5. unethical_ban ◴[] No.43660494[source]
He's arguing that artists are so scared of Adobe and AI that they actually want Adobe to be more evil so artists have more to complain about.
replies(1): >>43660739 #
6. kmeisthax ◴[] No.43660625[source]
Artists do not want to get paid micropennies for use-of-training-data licenses for something that destroys the market for new art. And that's the only claim Adobe Firefly makes for being ethical. Adobe used a EULA Roofie to make all their Adobe Stock contributors consent to getting monthly payments for images trained on in Firefly.
replies(1): >>43660897 #
7. AnthonyMouse ◴[] No.43660739{3}[source]
They want AI image generation to go away. That isn't likely to happen, but their best hope would be to make copyright claims or try to turn the public against AI companies with accusations of misappropriation. Adobe's "ethical" image generator would be immune to those claims while still doing nothing to address their primary concern, the economic consequences. It takes away their ammunition while leaving their target standing. Are they supposed to like a company doing that or does it just make them even more upset?
8. Sir_Twist ◴[] No.43660897{3}[source]
Indeed, and I agree that Adobe is in the wrong here. For an agreement between Adobe and an artist to be truly permissive, the artist should have the ability to not give their consent. Ethically, I think Adobe is in the same position as the other AI companies – if the artist doesn't directly (EULAs are not direct, in my opinion) agree to the terms, and if they don't have the option to decline, then it isn't an agreement, it is a method of coercion. If an artist, like you said, doesn't want to be paid micropennies, they shouldn't have to agree.

I believe it is completely reasonable for an artist to want to share their work publicly on the Internet without fear of it being appropriated, and I wish there was a pragmatic way they could achieve this.

9. furyofantares ◴[] No.43660937[source]
I've never seen anyone make the complaint about image classifiers or image segmentation. It's only for generative models and only once they got good enough to be useful.
replies(1): >>43663369 #
10. AnthonyMouse ◴[] No.43660970[source]
> Is it possible that it could be both? That while artists maybe do feel under attack in terms of competition, that there is a genuine ethical dilemma at hand?

The rights artists have over their work are economic rights. The most important fair use factor is how the use affects the market for the original work. If Disney is lobbying for copyright term extensions and you want to make art showing Mickey Mouse in a cage with the CEO of Disney as the jailer, that's allowed even though you're not allowed to open a movie theater and show Fantasia without paying for it, and even though (even because!) Disney would not approve of you using Mickey to oppose their lobbying position. And once the copyright expires you can do as you like.

So the ethical argument against AI training is that the AI is going to compete with them and make it harder for them to make a living. But substantially the same thing happens if the AI is trained on some other artist's work instead. Whose work it was has minimal impact on the economic consequences for artists in general. And being one of the artists who got a pittance for the training data is little consolation either.

The real ethical question is whether it's okay to put artists out of business by providing AI-generated images at negligible cost. If the answer is no, it doesn't really matter which artists were in the training data. If the answer is yes, it doesn't really matter which artists were in the training data.

replies(3): >>43661004 #>>43661497 #>>43662059 #
11. __loam ◴[] No.43661174{3}[source]
Artists don't hate Adobe just because they're making an AI art generator, they hate Adobe because it's a predatory, scummy corporation that is difficult to work with and is the gatekeeper for common industry tools. Also, Adobe didn't take away the moral arguments against AI art, they just used previously licensed imagery that existed before they started making AI art generators. There's still an argument that it's deceptive to grandfather previously licensed work into a new technology, and there's still an argument that spending resources on automating cultural expression is a shitty thing to do.
replies(2): >>43662191 #>>43665850 #
12. scarface_74 ◴[] No.43661337[source]
Adobe only trains its AI on properly licensed images that the artists have explicitly signed a contract with Adobe to train on.
13. mjmsmith ◴[] No.43661478{3}[source]
That must be why AI image prompts never reference an artist name.
replies(1): >>43665807 #
14. card_zero ◴[] No.43661497{3}[source]
> But substantially the same thing happens if the AI is trained on some other artist's work instead.

You could take that further and say that "substantially the same thing" happens if the AI is trained on music instead. It's just another kind of artwork, right? Somebody who was going to have an illustration by [illustrator with distinctive style] might choose to have music instead, so the music is in competition, so all that illustrator's art might as well be in the training data, and that doesn't matter because the artist would get competed with either way. Says you.

replies(1): >>43665550 #
15. becquerel ◴[] No.43661582{4}[source]
It crushes the orphans very quickly, and on command, and allows anyone to crush orphans from the comfort of their own home. Most people are low-taste enough that they don't really care about the difference between hand-crushed orphans and artisanal hand-crushed orphans.
replies(1): >>43668188 #
16. pastage ◴[] No.43662059{3}[source]
Actually, moral rights are what allow you to say no to AI. They are also a big part of copyright, and more important in places where fair use does not exist to the extent it does in the US.

Further, making a variant of a famous art piece under copyright might very well be a derivative. There are court cases here from just a few years before the AI boom where a format shift from photo to painting was deemed a derivative. A picture generated with "Painting of an archaeologist with a whip" would almost certainly be deemed a derivative if it went through the same court.

replies(1): >>43665639 #
17. t0bia_s ◴[] No.43662191{4}[source]
As an artist, my major complaint about Adobe is their spyware-like software design: constant calls to Adobe servers, no way to work offline in the field with their products, and no Linux support.

Also, I'm curious when they'll start censoring exports from their software. They already do that for scans of currency.

I'm not worried about image generators. They'll never generate art, by definition. AI tools are like the camera was back then - a new tool that still requires human skill and purpose for specific tasks.

18. squigz ◴[] No.43662522[source]
I don't think all artists are treating this tool as such an existential threat.
replies(3): >>43662734 #>>43663852 #>>43683137 #
19. bbarnett ◴[] No.43662734[source]
> I don't think all artists are treating this tool as such an existential threat.

You cannot find any group where "all" is true in such a context. There are always outliers.

That said, you're not really an artist if you direct someone else to paint. Imagine a scenario where you sit back, and ask someone to paint an oil painting for you. During the event, you sit in an easy chair, watch them with easel and brush, and provide direction "I want clouds", "I want a dark background". The person does so.

You're not the artist.

All this AI blather is the same. At best, you're a fashion designer. Arranging things in a pleasant way.

replies(1): >>43663049 #
20. squigz ◴[] No.43663049{3}[source]
One could say much the same thing about photographers, or digital artists. They don't use paint, or sculpt marble, so they're not real artists.
replies(1): >>43663269 #
21. Juliate ◴[] No.43663269{4}[source]
Who talked about "real" here?

Photographers do manipulate cameras, and afterwards rework the images they develop.

Digital artists do manipulate digital tools.

Their output is a large function of their informed input, experience, taste, knowledge, practice and intention, using their own specific tools in their own way.

Same with developers: the result is a function of their input (architecture, code, etc.). Garbage in, garbage out.

With AI prompters, the output is part function of the (very small) prompt, part function of the (huuuuuuuge) training set, part randomness.

If you're the director of a movie, or of a photo shoot, you're the director. Not the photographer, not the set painter, not the carpenter, not the light, etc.

If you're the producer, you're not the artist (unless you _also_ act as an artist in the production).

Do you feel the difference?

replies(3): >>43664401 #>>43667434 #>>43668211 #
22. lancebeet ◴[] No.43663369{3}[source]
I'm not entirely convinced by the artists' argument, but this argument is also unconvincing to me. If someone steals from you, but it's a negligible amount, or you don't even notice it, does that make it not stealing? If the thief then starts selling the things they stole from you, directly competing with you, are your grievances less valid now since you didn't complain about the theft before?
replies(1): >>43663875 #
23. PaulHoule ◴[] No.43663679[source]
I went through a phase of using the A.I. tools to touch up photos and thought they were helpful. If I needed to add another row of bricks to a wall or remove something they get it done. I haven’t used it in a few months because I’m taking different photos than I was back then.
replies(1): >>43663966 #
24. stafferxrr ◴[] No.43663852[source]
Of course not. People who are actually creative will use new tools creatively.

Adobe AI tools are pretty shit though if you want to use them to do something creative. Shockingly bad really.

They are probably good if you want to add a few elements to an instagram photo but terrible for actual digital art.

25. lcnPylGDnU4H9OF ◴[] No.43663875{4}[source]
Nothing was stolen from the artists; their work was used without their permission. The thing being used is an idea, not anything the artist loses access to when someone else has it. What is there to complain about? Why should others listen to the complaints (disregarding copyright law, because that would be circular reasoning)?
replies(2): >>43664184 #>>43673530 #
26. davidee ◴[] No.43663966[source]
We used that particular feature quite heavily. A lot of our clients have poorly cropped photos or images with branding that needs removal, and the context-aware generative fill was quite good.

But we decided to drop Adobe after some of their recent shenanigans and moved to a set of tools that didn't have this ability and, frankly, we didn't really miss it that much. Certainly not enough to ever give Adobe another cent.

27. ChrisPToast ◴[] No.43664184{5}[source]
So many problems with your reasoning.

"Nothing was stolen from the artists but instead used without their permission"

Yes and no. Sure, the artist didn't lose anything physical, but neither did music or movie producers when people downloaded and shared MP3s and videos. They still won in court based on the profits they determined the "theft" cost them, and the settlements were absurdly high. How is this different? An artist's work is essentially their resume. AI companies use their work without permission to create programs specifically intended to generate similar work in seconds; this substantially impacts an artist's ability to profit from their work. You seem to be suggesting that artists have no right to control the profits their work can generate - an argument I can't imagine you would extend to corporations.

"The thing being used is an idea"

This is profoundly absurd. AI companies aren't taking ideas directly from artists' heads... yet. They're not training their models on ideas. They're training them on the actual images artists create with skills honed over decades of work.

"not anything the artist loses access to when someone else has it"

Again, see point #1. The courts have long established that what's lost in IP theft is the potential for future profits, not something directly physical. By your reasoning here, there should be no such thing as patents. I should be able to take any person's or corporation's "ideas" and use them to produce my own products to sell. And this is a perfect analogy - why would any corporation invest millions or billions of dollars developing a product if anyone could just take the "ideas" they came up with and immediately undercut the corporation with clones or variants of their products? By exactly the same logic, why would an artist invest years or decades of time honing the skills needed to create imagery if massive corporations can just take that work, feed it into their programs, and generate similar work in seconds for pennies?

"What is there to complain about"

The loss of income potential, which is precisely what courts have agreed with when corporations are on the receiving end of IP theft.

"Why should others listen to the complaints"

Because what's happening is objectively wrong. You are exactly the kind of person the corporatocracy wants - someone who just say "Ehhh, I wasn't personally impacted, so I don't care". And not only don't you care, you actively argue in favor of the corporations. Is it any wonder society is what it is today?

replies(2): >>43667613 #>>43668428 #
28. luckylion ◴[] No.43664401{5}[source]
> With AI prompters, the output is part function of the (very small) prompt, part function of the (huuuuuuuge) training set, part randomness.

With photographers, the output is part function of the (very small) orientation of the camera and pressing the button, part function of the (huuuuuuuge) technical marvel that are modern cameras, part randomness.

Let's be realistic here. Without the manufactured cameras, 99.9% of photographers wouldn't be photographers, only the 10 people who'd want it enough to build their own cameras, and they wouldn't have much appeal beyond a curiosity because their cameras would suck.

replies(1): >>43666062 #
29. AnthonyMouse ◴[] No.43665550{4}[source]
If you type "street art" as part of an image generation prompt, the results are quite similar to typing "in the style of Banksy". They're direct substitutes for each other, neither of them is actually going to produce Banksy-quality output and it's not even obvious which one will produce better results for a given prompt.

You still get images in a particular style by specifying the name of the style instead of the name of the artist. Do you really think this is no different than being able to produce only music when you want an image?

replies(1): >>43667749 #
30. AnthonyMouse ◴[] No.43665639{4}[source]
> Actually moral rights is what allow you to say no to AI.

The US doesn't really have moral rights and it's not clear they're even constitutional in the US, since the copyright clause explicitly requires "promote the progress" and "limited times" and many aspects of "moral rights" would be violations of the First Amendment. Whether they exist in some other country doesn't really help you when it's US companies doing it in the US.

> Further making a variant of a famous art piece under copyright might very well be a derivative.

Well of course it is. That's what derivative works are. You can also produce derivative works with Photoshop or MS Paint, but that doesn't mean the purpose of MS Paint is to produce derivative works or that it's Microsoft rather than the user purposely creating a derivative work who should be responsible for that.

replies(1): >>43667546 #
31. AnthonyMouse ◴[] No.43665807{4}[source]
The vast majority of AI image prompts don't reference an artist name, and the ones that do are typically using it as a proxy for a given style and would generally get similar results by specifying the name of the style instead of the name of the artist.

The ones using the name of the artist/studio (e.g. Ghiblification) also seem more common than they are because they're the ones that garner negative attention. Then the media attention a) causes people to perceive it as being more common than it is and b) causes people to do it more for a short period of time, making it temporarily more common even though the long-term economic relevance is still negligible.

replies(1): >>43667389 #
32. dragonwriter ◴[] No.43665850{4}[source]
> Artists don't hate Adobe just because they're making an AI art generator, they hate Adobe because it's a predatory, scummy corporation that is difficult to work with and is the gatekeeper for common industry tools.

From what I've seen from artists, they hate Adobe for both reasons, and the AI thing is often more of a dogmatic, uncompromising hate (and is not based on any of the various rationalizations used to persuade others to act in accord with it) and less of the kind of hate that is nevertheless willing to accept products for utility.

33. Juliate ◴[] No.43666062{6}[source]
Ludicrous rebuttal.

Reducing this to "orientation of the camera" is such a dismissive take on the eye and focus of the person who decides to take a picture, and on where/when he/she is; it really reveals that you do not practice it.

And... before cameras were even electronic, back in the early 2000s, there were already thousands upon thousands of extremely gifted photographers.

Yes, cameras are marvellous tools. But they are _static_. They don't dynamically, randomly change the input.

Generative AI are not _static_. They require training sets to be anywhere near useful.

Cameras _do not_ feed on all the previous photographies taken by others.

replies(2): >>43666526 #>>43667192 #
34. luckylion ◴[] No.43666526{7}[source]
> Reducing this to "orientation of the camera" is such a dismissive take

What's more important: the person behind the camera or the camera? Show me the photos taken without the camera and then look at all the great photos taken by amateurs.

> They require training sets to be anywhere near useful.

And the camera needs assembly and R&D. But when either arrives at your door, it's "ready to go".

> Cameras _do not_ feed on all the previous photographies taken by others.

Cameras do feed on all the research of previous cameras though. The photos don't matter to the Camera. The Camera manufacturers are geniuses, the photographers are users.

It's really not far off from AI, especially when the cameras do so much, and then there's the software-tools afterwards etc etc.

Yeah, yeah, everybody wants to feel special and artsy and all that and looks down on the new people who aren't even real artists. But most people really shouldn't.

replies(1): >>43667087 #
35. Juliate ◴[] No.43667087{8}[source]
You’re confusing the tools (which are their own marvels) and the practice (which is art, using the tools).

However good the camera is or isn't, it's not the camera that dictates the inner qualities of a photograph; there is _something else_ that evades the technicalities of the tools and comes from the context and the choices of the photographer (and from accident, too, because that is the nature of photography: capturing an accident of light).

The same camera in the hands of two persons will give two totally different sets of pictures, if only because their sight, their way of looking at the world, is different; and because one knows how to use the tools, and the other does not, or not in the same way.

It’s not a matter of « feeling artsy » or special, it’s a matter of « doing art ».

Everyone is an artist, if they want to: it’s a matter of practicing and intent, not a matter of outputting.

Art is in the process (of making, and of receiving), not in the output (which is the artefact of art and which has its own set of controversial and confusing economics and markets).

Generative AI, unlike tools that stay in their specific place, steals the insight of previous artists (from the training set) and strips the prompter of their own insight, personality, and imprint (because these are engaged only through a limited text prompt at an interface).

Generative AI enthusiasts may be so. They have every right to be. But not by ignoring and denying the fundamental theft that injecting training sets without approval is, and the fundamental difference between _doing art_ and asking a computer to produce art.

Ignoring those two is a red flag that someone has no idea what art, and practice, are.

replies(2): >>43670342 #>>43674966 #
36. squigz ◴[] No.43667192{7}[source]
> Reducing this to "orientation of the camera" is such a dismissive take on the eye and focus of the person that decides to take a picture, where/when he/she is; this is really revealing you do not practice it.

Oh, the irony...

37. fc417fc802 ◴[] No.43667389{5}[source]
The latter example (Ghibli) is also somewhat misleading. Other studios sometimes use very similar styles. They might not have the same budget for fine detail throughout the entire length of the animation, and they probably don't do every production with that single art style, but when comparing still frames (which is what these tools generate after all) the style isn't really unique to a single studio.
38. fc417fc802 ◴[] No.43667434{5}[source]
So AI tools take you from "artist" to "art director". That's an interesting thought. I think I agree.
39. fc417fc802 ◴[] No.43667546{5}[source]
Well one could argue that this ought to be a discussion of morality and social acceptability rather than legality. After all the former can eventually lead to the latter. However if you make that argument you immediately run into the issue that there clearly isn't broad consensus on this topic.

Personally I'm inclined to liken ML tools to backhoes. I don't want the law to force ditches to be dug by hand. I'm not a fan of busywork.

40. fc417fc802 ◴[] No.43667613{6}[source]
It's piracy, not theft. Those aren't the same thing but they are both against the law and the court will assess damages for both.

The person you replied to derailed the conversation by misconstruing an analogy.

> what's happening is objectively wrong.

Doesn't seem like a defensible claim to me. Clearly plenty of people don't feel that way, myself included.

Aside, you appear to be banned. Just in case you aren't aware.

replies(1): >>43668523 #
41. card_zero ◴[] No.43667749{5}[source]
This hinges on denying that artists have distinctive personal styles. Instead your theory seems to be that styles are genres, and that the AI only needs to be trained on the genre, not the specific artist's output, in order to produce that artist's style. Which under this theory is equivalent to the generic style.

My counter-argument is "no". Ideally I'd elaborate on that. So ummm ... no, that's not the way things are. Is it?

replies(1): >>43669069 #
42. wizzwizz4 ◴[] No.43668188{5}[source]
You know "puréed orphan extract" is just salt, right? You can extract it from seawater in an expensive process that, nonetheless, is way cheaper than crushing orphans (not to mention the ethical implications). Sure, you have to live near the ocean, but plenty of people do, and we already have distribution networks to transport the resulting salt to your local market. Just one fist-sized container is the equivalent of, like, three or four dozen orphans; and you can get that without needing a fancy press or an expensive meat-sink.
43. jhbadger ◴[] No.43668211{5}[source]
Historically, it took a long time for traditional artists (painters and sculptors) to see photographers as fellow artists rather than mere technicians using technology to replace art. The same thing was true of early digital artists who dared to make images without paint or pencils.
replies(1): >>43670307 #
44. timewizard ◴[] No.43668300[source]
> or to make a moral argument that some kind of misappropriation is occurring.

They can also make a legal argument that the training set will fully reproduce copyrighted work. Which is an actual crime, as well as being immoral.

> because then it exists and they have to compete with it

The entire point of copyright law is: "To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."

Individual artists should not have to "compete" against a billion dollar corporation which freely engages in copyright violations that these same artists have to abide by.

45. lcnPylGDnU4H9OF ◴[] No.43668428{6}[source]
I dunno, man. Re-read your comment but change one assumption:

> They still won in court based on the profits they determined the "theft" cost them, and the settlements were absurdly high.

Such court determinations are wrong. At least hopefully you can see how perhaps there is not so much wrong with the reasoning, even if you ultimately disagree.

> They're training them on the actual images artists create with skills honed over decades of work.

This is very similar to a human studying different artists and practicing; it’s pretty inarguable that art generated by such humans is not the product of copyright infringement, unless the image copies an artist’s style. Studio Ghibli-style AI images come to mind, to be fair, which should be a liability to whoever is running the AI because they’re distributing the image after producing it.

If one doesn’t think that it’s wrong for, e.g., Meta to torrent everything they can, as I do not, then it is not inconsistent to think their ML training and LLM deployment is simply something that happened and changed market conditions.

replies(1): >>43673537 #
46. lcnPylGDnU4H9OF ◴[] No.43668523{7}[source]
> The person you replied to derailed the conversation by misconstruing an analogy.

Curious why you say this. They seem to have made the copyright infringement analogous to theft and I addressed that directly in the comment.

replies(1): >>43668808 #
47. fc417fc802 ◴[] No.43668808{8}[source]
It was an analogy, ie a comparison of the differences between pairs. The relevant bit then is the damages suffered by the party stolen from. If you fail to pursue when the damages are small or nonexistent (image classifiers, employee stealing a single apple, individual reproduction for personal use) why should that undermine a case you bring when the damages become noticeable (generative models, employee stealing 500 lbs of apples, bulk reproduction for commercial sale)?
replies(1): >>43669325 #
48. int_19h ◴[] No.43669069{6}[source]
The argument here isn't so much that individual artists don't have their specific styles, but rather whether AI actually tracks that, or whether using "in the style of ..." is effectively a substitute for identifying the more general style category to which this artist belongs.
49. furyofantares ◴[] No.43669325{9}[source]
This is precisely where the analogy breaks down. The victim suffers damages in any theft, independent of any value the perpetrator gains. Damages due to copyright infringement don't work this way. Copyright exists to motivate the creation of valuable works; damages for copyright are an invented thing meant to support this.
replies(1): >>43669730 #
50. fc417fc802 ◴[] No.43669730{10}[source]
That would only be a relevant distinction if the discussion were specifically about realized damages. It is not.

The discussion is about whether or not ignoring something that is of little consequence to you diminishes a later case you might bring when something substantially similar causes you noticeable problems. The question at hand had nothing to do with damages due to piracy (direct, perceived, hypothetical, legal fiction, or otherwise).

It's confusing because the basis for the legal claim is damages due to piracy and the size of that claim probably hasn't shifted all that much. But the motivating interest is not the damages. It is the impact of the thing on their employment. That impact was not present before so no one was inclined to pursue a protracted uphill battle.

replies(1): >>43670680 #
51. Juliate ◴[] No.43670307{6}[source]
Not the same thing again.

That comparison would be fair if the generative AI you use is trained exclusively on your own (rightfully acquired) data and work.

Existing generative AIs are feeding on the work of millions of people who did not consent.

That’s a violation of their work and of their rights.

And that should also give pause to those who expect to use or benefit from their own production out of these generators: why would it be 1/ protectable, 2/ protected at all?

It is no coincidence that these generator makers’ philosophy aligns with an autocratic political project, and with some promoters of an inhuman « masculinity ». It’s all about power and nothing about playing by the rules of a society.

replies(2): >>43672068 #>>43672461 #
52. Juliate ◴[] No.43670342{9}[source]
There is a third and fourth red flag, is it conscious or not I don’t know.

I am not even speaking of « do the users feel what it is ». Here it is:

If some people are such enthusiastic and ruthless defenders of AI generators that were trained/fed on the work of millions of unconsenting artists…

1/ what do they expect will happen to their own generated production?

2/ what do they expect will happen to their own consent, in this particular matter, or in other matters (as this will have become an additional precedent, a de facto one)?

Again, said it elsewhere: there is a power play behind this, very much related to the broligarchy pushing for some kind of twisted, « red pilled » (lol) masculinity, and related to rape as a culture, not only in sexual matters but in all of them.

replies(1): >>43678979 #
53. dbdr ◴[] No.43670381[source]
> From the perspective of an artist, a corporation training an AI image generator in a way that isn't susceptible to moral or legal assault is worse

That's ignoring the fact that an AI image generator trained without infringing on existing works would have way worse quality, because of the reduced amount and quality of the training set.

54. furyofantares ◴[] No.43670680{11}[source]
Oh, I agree with all that, I had sort of ignored the middle post in this chain.
55. squigz ◴[] No.43672068{7}[source]
> That comparison would be fair if the generative AI you use is trained exclusively on your own (rightfully acquired) data and work.

> Existing generative AIs are feeding on the work of millions of people who did not consent.

There are LLMs that are trained only on non-copyrighted work, but apparently that's irrelevant according to the comment I replied to.

56. jhbadger ◴[] No.43672461{7}[source]
As people have mentioned, people are still against legally sourced generative AI systems like Adobe's, so concern over IP rights isn't the only, or I suspect even the main, objection people have to generative AI.
replies(1): >>43673067 #
57. Juliate ◴[] No.43673067{8}[source]
It's not the only objection, but it's one of the major and blocking ones, because how do you _prove_ that you do not have unconsented copyrighted contents in your training set?

The other objections, in the economic realm (replacing/displacing artists' work for financial gain, from the producers' point of view), are totally valid too, but don't rest on the same argument.

And my point above is not really an objection; it's a reminder of what AI generators are, and what they are not (and of what AI generator promoters pretend they are, without any evidence or real argument).

Of what their output is (a rough, industrial, barely specified and barely mastered product), and what it is not (art).

replies(1): >>43674248 #
58. Juliate ◴[] No.43673530{5}[source]
> Nothing was stolen from the artists but instead used without their permission.

Which is equally illegal.

> disregarding copyright law because that is circular reasoning

This is not circular; copyright is non-negotiable.

59. Juliate ◴[] No.43673537{7}[source]
> This is very similar to a human...

A machine, software, hardware, whatever, as much as a corporation, _is not a human person_.

60. squigz ◴[] No.43674248{9}[source]
> how do you _prove_ that you do not have unconsented copyrighted contents in your training set?

And this is why I've stopped arguing with people from this crowd. Beyond the classic gatekeeping of what art is, I'm sick of the constant moving of the goalposts. Even if a company provided proof, I'm sure you'd find another issue with it.

Underlying all of it is a fundamental misunderstanding of how AI tools are used for art, and a subtle implication that it's really the amount of effort that defines what "art" really is.

replies(1): >>43675356 #
61. luckylion ◴[] No.43674966{9}[source]
> Generative AI on the contrary of tools that stay in their specific place, steals the insight from previous artists (from the training set) and strips the prompter from their own insights and personality and imprint (because it is not employed, but only through a limited text prompt at an interface).

Because every piece of generative AI looks identical, right? I mean, if the prompt had an impact, and two people using some ML-model would create different results based on what they choose to input, it sounds suspiciously like your "the same camera in two different hands", doesn't it?

> the fundamental difference there is between _doing art_ and asking a computer to produce art.

You mean doing art by asking a computer to produce a dump of sensor data by pressing a button?

You appear to be completely blind to the similarities and just retreat to "I draw the lines around art, and this is inside, and that's outside of it" without being able to explain how the AI tool is fundamentally different from the camera tool, yet obviously one negates all possibility of creating art, while the other totally is art, because that's what people say!

Needless to say that the people making those distinctions can't even tell apart a photo from an AI-generated picture.

replies(1): >>43680620 #
62. Juliate ◴[] No.43675356{10}[source]
You’re sure? How?

And what crowd? I am stating my viewpoint, from an education in humanities AND tech, from a 25-year career in software tech, and from 30 years of practice as a musician and painter.

Sorry, but who is moving the goalposts here? Who is coming in with their tech saying « Hi, we don’t care how your laws make sense, and we don’t care that we don’t know what art is, because we never studied it, nor do we have any artistic practice; we just want to have what you guys do by pressing a button. Oh, and all of your stuff is free for us to forage through; we don’t care what you say about your own work. »

Typical entitled behavior. Don’t act surprised that this is met with counter arguments and reality.

replies(1): >>43675391 #
63. squigz ◴[] No.43675391{11}[source]
Typical gatekeeping behavior. Don't act surprised when the world and artistic expression moves on without you.
replies(2): >>43678700 #>>43678744 #
64. Juliate ◴[] No.43678700{12}[source]
Laughable.

What would be gatekeeping is if someone prevented you from picking up a pencil, paper, a guitar, a brush, to make something of your own.

You’re the only one gatekeeping yourself here.

Looks like it’s the same pattern as with blockchains, NFTs, Web3 stuff and the move-fast/break-things mantra: you cannot argue for or demonstrate what your « solutions » actually solve, so you need brute force to break things and impose them.

65. Juliate ◴[] No.43678744{12}[source]
Artistic expression does not « move on » without me, or people.

Artistic expression is people in motion, alone or in groups.

You’re talking about the economics of performances and artefacts, which are _something else_ out of artistic expression.

EDIT to clarify/reinforce:

Elvis without Elvis isn’t Elvis. Discs, movies, books are captures of Elvis. Not the same thing.

Miyazaki without Miyazaki isn’t Miyazaki. It may look like it, but it is not it.

Artistic expression is someone’s expression, practice (yours, mine, theirs). It’s the definition of the originality of it (who it comes from, who it is actually made by).

A machine, a software may produce (raw) materials for artistic expression, whatever it is, but it is not artistic expression by itself.

Bowie using the Verbasizer is using a tool for artistic expression. The Verbasizer output isn’t art by itself. Bowie made Bowie stuff.

66. squigz ◴[] No.43678979{10}[source]
Can you talk more about "rape as a culture"?
replies(1): >>43680394 #
67. Juliate ◴[] No.43680394{11}[source]
Rape is fundamentally about power, control and the violation of consent.

The casual dismissal of artists' fundamental right to control their work and how it is used is part of a larger cultural problem, where might would rule over law, power over justice, lies over truth.

That may seem a charged argument, and it is, because it hits right and it is particularly uncomfortable to acknowledge.

The same tech leaders who push for this rollback of IP law are the tech leaders who fund(ed) the current dismantling of US democracy, and who have chosen their political team because it aligns precisely with their values (up to the man who got the presidential seat, a man who has (had?) quite problematic issues towards women).

This is too obvious to be an accident.

And this is also a stern warning. Because the ideology behind power does not stop at anything. It goes on until it eats itself.

replies(1): >>43682626 #
68. Juliate ◴[] No.43680620{10}[source]
> Because every piece of generative AI looks identical, right? I mean, if the prompt had an impact, and two people using some ML-model would create different results based on what they choose to input, it sounds suspiciously like your "the same camera in two different hands", doesn't it?

I feel there's something interesting to discuss here, but I'm still not convinced: a camera captures light from physical reality. AI generators "capture" something from a model trained on other people's existing artworks (most likely without their consent). There's a superficial similarity in the push of the button, but that's it. The two do not operate the same way, nor on the same domain.

> You appear to be completely blind to the similarities [...] without being able to explain how the AI-tool is fundamentally different from the camera-tool, but obviously one negates all possibility to create art, while the other totally is art, because that's what people say!

There's a vocabulary issue here. Art is a practice, not a thing, not a product. You can create a picture, however you like it.

What makes a picture cool to look at is how it looks, and that is very subjective and contextual; no issue with that. What makes it _interesting_ and catchy is not so much what it _is_ but what it says, what it means, what it triggers: from the intent of the artist (if one gets the info about it), to its techniques[1], all the way to the inspiration it creates in the onlookers (which is itself a function of a lot of things).

Anything machine-produced can be cool/beautiful/whatever.

Machines also reproduce/reprint original works. And while there are common qualities, it is not the same to look at a copy, a reproduction of a thing, as to look at the original thing made by the original artist. If you haven't experienced that, please try to (by going to a museum, for instance, or a gallery, anywhere).

[1] And there, using AI stuff, like anything else, as a _tool_ to practice/make art? Of course. But to say that what this tool makes _is_ art or a work of art? A basic no for me.

> Needless to say that the people making those distinctions can't even tell apart a photo from an AI-generated picture.

1/ It does get better and better, but it still looks AI-generated (as of April 2025).

2/ Human-wise/feeling-wise/intellectual-wise, anything that I know has been generated by AI will be a. perhaps interesting, for ideas, for randomness, but b. soulless. And it is connection, relief, soul (mine, and that of others) that I am looking for in art (as a practice, an artefact or a performance); I'm pretty sure that's what connects us humans.

3/ Market-wise, I predict that any renowned artwork will lose its value as soon as it becomes known that it is AI-made; for the very reason 2/ above.

69. squigz ◴[] No.43682626{12}[source]
Do you maybe think using 'rape' in such a casual way takes away anything from actual rape victims?
replies(1): >>43686955 #
70. Suppafly ◴[] No.43683088[source]
>Artists don't like AI image generators because they have to compete with them

I always wonder why people make statements like this. Anyone who knows more than one artist knows that artists use these tools for a variety of reasons and aren't nearly as scared as random internet concern trolls make them out to be.

71. Suppafly ◴[] No.43683137[source]
>I don't think all artists are treating this tool as such an existential threat.

Agreed, the ones I know in real life are excited by these tools and have been using them.

replies(1): >>43683198 #
72. squigz ◴[] No.43683198{3}[source]
Well no, those aren't actually artists, because they clearly don't understand what art is.

(/s)

73. Juliate ◴[] No.43686955{13}[source]
1/ It does not take anything away. The use is not casual but deliberate and analytical. The concept of « rape culture » extends beyond sexual assault to other patterns of consent violation and power dynamics.

2/ It has been discussed for decades, in academic and social contexts, how attitudes in one domain reflect and reinforce them in others.

3/ Your « actual » makes an assumption about my experience that you have no basis for.

The point remains that the non-consensual use of artists’ work reflects the same fundamental disregard for autonomy that characterizes other consent violations.