553 points bookofjoe | 248 comments
1. adzm ◴[] No.43654878[source]
Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care. The AI features in Photoshop are the best around in my experience and come in handy constantly for all sorts of touchup work.

Anyway I don't really think they deserve a lot of the hate they get, but I do hope this encourages development of viable alternatives to their products. Photoshop is still pretty much peerless. Illustrator has a ton of competitors catching up. After Effects and Premiere for video editing are getting overtaken by Davinci Resolve -- though for motion graphics it is still hard to beat After Effects. Though I do love that Adobe simply uses JavaScript for its expression and scripting language.

replies(36): >>43654900 #>>43655311 #>>43655626 #>>43655700 #>>43655747 #>>43655859 #>>43655907 #>>43657271 #>>43657436 #>>43658069 #>>43658095 #>>43658187 #>>43658412 #>>43658496 #>>43658624 #>>43659012 #>>43659378 #>>43659401 #>>43659469 #>>43659478 #>>43659507 #>>43659546 #>>43659648 #>>43659715 #>>43659810 #>>43660283 #>>43661100 #>>43661103 #>>43661122 #>>43661755 #>>43664378 #>>43664554 #>>43665148 #>>43667578 #>>43674357 #>>43674455 #
2. Angostura ◴[] No.43654900[source]
Now that would have been a really interesting thing for them to start a conversation about on Bluesky. They would have got some genuine engagement if they wanted it.

Much better than the transparently vapid marketing-speak

replies(1): >>43655117 #
3. masswerk ◴[] No.43655117[source]
I think part of the fiasco is that engagement posters are not really welcome on Bluesky. And “What’s fueling your creativity right now?” is a pure engagement post, contributing nothing on its side of the conversation. Hence, it's more like another attempt to harvest Adobe's subscribers. — For X/Twitter-bound marketing it's probably fine, at least much what we had become used to, but it totally fails the Bluesky community. (Lesson learned: not all social media are the same.)
4. jsbisviewtiful ◴[] No.43655311[source]
> Adobe is the one major company trying to be ethical

Adobe is cannibalizing their paid C-Suite artists by pumping out image generators to their enterprise customers. How is that ethical? They are double dipping and screwing over their longtime paying artists

replies(1): >>43655529 #
5. multimoon ◴[] No.43655529[source]
This is I think a narrow viewpoint that assumes the AI will ever get truly as good as a human artist. Will it get good enough for most people? Probably, but if not Adobe then four others will do the same thing, and as another commenter pointed out Adobe is the only one even attempting to make AI tools ethically. I think the hate is extremely misdirected.

AI tech and tools aren’t just going to go away, and people aren’t going to just not make a tool you don’t like, so sticking your head in the sand and pretending like it will stop if you scream loud enough is not going to help, you should instead be encouraging efforts like Adobe’s to make these tools ethically.

replies(2): >>43656220 #>>43658566 #
6. bpodgursky ◴[] No.43655626[source]
> Anyway I don't really think they deserve a lot of the hate they get

The dark lesson here is that you avoid hate and bad PR by cutting artists out of the loop entirely and just shipping whatever slop the AI puts out. Maybe you lose 20% of the quality but you don't have to deal with the screaming and dogpiles.

7. gdulli ◴[] No.43655700[source]
The problem isn't their specific practices, but more that they're in general one of the companies profiting from our slopcore future.
8. nonchalantsui ◴[] No.43655747[source]
For their pricing and subscription practices alone, they deserve far more backlash than they get.
replies(2): >>43659101 #>>43659197 #
9. cosmotic ◴[] No.43655859[source]
There are a lot of good photoshop alternatives. Most are better at individual use cases than photoshop. For example, nearly all the alternatives are better at designing website comps because they are object-based instead of layer-based.
replies(1): >>43656131 #
10. f33d5173 ◴[] No.43655907[source]
Adobe isn't trying to be ethical, they are trying to be more legally compliant, because they see that as a market opportunity. Otoh, artists complain about legal compliance of AIs not because that is what they care about, but because they see that as their only possible redress against a phenomenon they find distasteful. A legal reality where you can only train AI on content you've licensed would be the worst for everybody bar massive companies, legacy artists included.
replies(7): >>43658034 #>>43658253 #>>43659203 #>>43659245 #>>43659443 #>>43659929 #>>43661258 #
11. genevra ◴[] No.43656131[source]
There are "some" Photoshop wannabes. I still haven't found any program on Linux that can give me anywhere close to the same ease of use and powerful tools that Photoshop has. The example you provided sounds like you want to use Illustrator for your use case anyway.
replies(2): >>43660594 #>>43661732 #
12. Brian_K_White ◴[] No.43656220{3}[source]
There is no such thing as "get as good as a human artist" unless it becomes an actual human that lived the human experience. Even bad art starts with something to express and a want to express it.

Without that, it's only as good as a human artist in the way a picture of a work of art is.

Actual AI art would first require an ai that wants to express something, and then it would have to be trying to express something about the life of an ai, which could really only be understood by another ai.

The most we could get out of it is maybe by chance it might be appealing like a flower or a rock. That is, an actual flower not an artists depiction of a flower or even an actual flower that someone pointed out to you.

An actual flower, that wasn't presented but you just found growing, might be pretty but it isn't a message and has no meaning or intent and isn't art. We like them as irrelevant bystanders observing something going on between plants and pollinators. Any meaning we perceive is actually only our own meanings we apply to something that was not created for that purpose.

And I don't think you get to say the hate is misdirected. What an amazing statement. These are the paying users saying what they don't like directly. They are the final authority on that.

replies(2): >>43656434 #>>43662172 #
13. multimoon ◴[] No.43656434{4}[source]
I’m not sure where we launched into the metaphysics of whether an AI can produce an emotionally charged, meaningful work, but that wasn’t part of the debate here; I recall my stance being that the AI will never get as good as the human. Since photoshop is a tool like any other, “good enough” refers to making the barrier of entry to make a given work (in this case some image) so low that anyone could buy a photoshop license and type some words into a prompt and get a result that satisfies them instead of paying an artist to use photoshop - which is where the artist’s understandable objection comes from.

I pay for photoshop along with the rest of the adobe suite myself, so you cannot write off my comment either while saying the rest of the paying users are “the final authority” when I am in fact a paying user.

My point is simply that with or without everyone’s consent and moral feel-goods these tools are going to exist and sticking your head in the sand pretending like that isn’t true is silly. So you may as well pick the lesser evil and back the company who at least seems to give the slightest bit of a damn of the morals involved, I certainly will.

replies(2): >>43657458 #>>43660619 #
14. UtopiaPunk ◴[] No.43657271[source]
You are assuming that there is an ethical way to use AI. There are several ethical concerns around using AI, and Adobe is perhaps concerned with one of these (charitably, respecting artists, or a little more cynically, respecting copyright).

Many would argue, myself included, that the most ethical approach towards AI is to not use it. Procreate is a popular digital art program that is loudly taking that position: https://procreate.com/ai

replies(2): >>43657316 #>>43658096 #
15. rmwaite ◴[] No.43657316[source]
Procreate is also owned by Apple, who is definitely not taking that position. Not saying both can't be true, but if a strong anti-AI stance is what you seek--I would be worried.
replies(1): >>43658057 #
16. ◴[] No.43657436[source]
17. UtopiaPunk ◴[] No.43657458{5}[source]
I'm not the person who responded, but I believe it came from a place of "what is art" (and you had used the word "artist").

My own position is that "art" can only be created by a human. AI can produce text, images, and sounds, and perhaps someday soon they can even create content that is practically indistinguishable from Picasso or Mozart, but they would still fail to be "art."

So sure, an AI can create assets to pad out commercials for trucks or sugary cereal, and they will more than suffice. Commercials and other similar content can be made more cheaply. Maybe that's good?

But I would never willingly spend my time or money engaging with AI "art." By that, I mean I would never attend a concert, watch a film, visit a museum, read a book, or even scroll through an Instagram profile if what I'm viewing is largely the output of AI. What would the point be?

I'll admit that there is some middle ground, where a large project may have some smaller pieces touched by AI (say, art assets in the background of a movie scene, or certain pieces of code in a video game). I personally err on the side of avoiding that when it is known, but I currently don't have as strong of an opinion on that.

replies(3): >>43658519 #>>43658534 #>>43662529 #
18. _bin_ ◴[] No.43658034[source]
Right, but "distaste" isn't grounds for trying to ban something. There are all kinds of things people and companies do which I dislike but for which there's no just basis for regulating. If Adobe properly licenses all their training data artists don't have a right to say "well i think this is bad for creativity and puts my job at risk, ban it!!!" Or more precisely, they have a right to say that, but no moral justification for trying to ban/regulate/sue over it.

I hate Adobe's subscription model as much as the next guy and that's a good reason to get annoyed at them. Adobe building AI features is not.

replies(5): >>43658454 #>>43659616 #>>43660867 #>>43663988 #>>43667492 #
19. input_sh ◴[] No.43658057{3}[source]
Procreate is not owned by Apple, you're probably thinking of Pixelmator.
replies(1): >>43659606 #
20. giancarlostoro ◴[] No.43658069[source]
I will forever miss Fireworks. I don't do much with graphics but Fireworks was the best thing I ever used. Now I do zero with graphics.
21. cosmic_cheese ◴[] No.43658095[source]
Even if they’re “trying”, it’s moot if the result isn’t clearly more ethical, and with the proliferation of stolen imagery on their stock image service (which they use to train their models), the ethics of their models are very much not clear.

If I saw news of a huge purge of stolen content on their stock image service with continued periodic purges afterwards (and subsequent retraining of their models to exclude said content), I might take the claim more seriously.

22. _bin_ ◴[] No.43658096[source]
It's a corporation which knows that more of its users are artsy types who care about this than Adobe's, which trend a little more professional. I have no idea what position the leadership personally holds but this is very much like DEI in that corporations embrace and discard it opportunistically.
23. lawlessone ◴[] No.43658187[source]
They're making money off it.

At least Meta gives their models to the public.

24. Riverheart ◴[] No.43658253[source]
“A legal reality where you can only train AI on content you've licensed would be the worst for everybody bar massive companies, legacy artists included.”

Care to elaborate?

Also, saying artists only concern themselves with the legality of art used in AI because of distaste when there are legal cases where their art has been appropriated seems like a bold position to take.

It’s a practice founded on scooping everything up without care for origin or attribution and it’s not like it’s a transparent process. There are people that literally go out of their way to let artists know they’re training on their art and taunt them about it online. Is it unusual they would assume bad faith from those purporting to train their AI legally when participation up till now has either been involuntary or opt out? Rolling out AI features when your customers are artists is tone deaf at best and trolling at worst.

replies(1): >>43658703 #
25. m463 ◴[] No.43658412[source]
I remember pixelmator being a breath of fresh air.
replies(1): >>43658466 #
26. TeMPOraL ◴[] No.43658454{3}[source]
> Right, but "distaste" isn't grounds for trying to ban something.

It isn't, but it doesn't stop people from trying and hoping for a miracle. That's pretty much all there is to the arguments of image models, as well as LLMs, being trained in violation of copyright - it's distaste and greed[0], with a slice of basic legalese on top to confuse people into believing the law says what it doesn't (at least yet).

> If Adobe properly licenses all their training data artists don't have a right to say "well i think this is bad for creativity and puts my job at risk, ban it!!!" Or more precisely, they have a right to say that, but no moral justification for trying to ban/regulate/sue over it.

I'd say they have plenty of moral / ethical justification for trying to ban/regulate/sue over it, they just don't have much of a legal one at this point. But that's why they should be trying[1] - they have a legitimate argument that this is an unexpected, undeserved, unfair calamity for them, threatening to derail their lives, and lives of their dependents, across the entire sector - and therefore that laws should be changed to shield them, or compensate them for the loss. After all, that's what laws are for.

(Let's not forget that the entire legal edifice around recognizing and protecting "intellectual property" is an entirely artificial construct that goes against the nature of information and knowledge, forcing information to behave like physical goods, so it's not unfair to the creators in an economy that's built around trading physical goods. IP laws were built on moral arguments, so it's only fair to change them on moral grounds too.)

--

[0] - Greed is more visible in the LLM theatre of this conflict, because with textual content there's vastly more people who believe that they're entitled to compensation just because some comments they wrote on the Internet may have been part of the training dataset, and are appalled to see LLM providers get paid for the service while they are not. This Dog in the Manger mentality is distinct from that of people whose output was used in training a model that now directly competes with them for their job; the latter have legitimate ethical reasons to complain.

[1] - Even though myself I am for treating training datasets to generative AI as exempt from copyright. I think it'll be better for society in general - but I recognize it's easy for me to say it, because I'm not the one being rugpulled out of a career path by GenAI, watching it going from 0 to being half of the way towards automating away visual arts, in just ~5 years.

replies(3): >>43659609 #>>43663932 #>>43667684 #
27. pavel_lishin ◴[] No.43658466[source]
I still use it, and might upgrade to their latest version.

It's fine as a way of making shitposts, but I don't know if it's a professional-grade graphics editor - but I'm not a professional myself, so what do I know.

replies(1): >>43660587 #
28. numpad0 ◴[] No.43658496[source]
What it implies is, it's not really about ethics per se, just like it's not really about 6th digits per se. People hate AI images, cut and dry.

Law is agreeable hate, in a way. Things that get enough hate will get regulated out, sooner or later.

replies(3): >>43658963 #>>43659528 #>>43661593 #
29. spiderice ◴[] No.43658519{6}[source]
> I mean I would never...if what I'm viewing is largely the output of AI. What would the point be?

I agree with the sentiment, however..

Good luck to all of us at holding to that philosophy as AI & Non-AI become indistinguishable. You can tell now. I don't think you'll be able to tell much longer. If for no other reason than the improvements in the last 3 years alone. You'll literally have to research the production process of a painting before you can decide if you should feel bad for liking it.

replies(1): >>43667869 #
30. TheOtherHobbes ◴[] No.43658534{6}[source]
The point would be to have an interesting and novel experience in an experimental medium - which has been a major driver of art since its beginning.

Also, realistically, most people want entertainment, not art (by your definition). They want to consume experiences that are very minor variations on experiences they've already had, using familiar and unsurprising tropes/characters/imagery/twists/etc.

The idea that only humans can make that kind of work has already been disproven. I know a number of authors who are doing very well mass-producing various kinds of trashy genre fiction. Their readers not only don't care, they love the books.

I suspect future generations of AI will be better at creating compelling original art because the AI will have a more complete model of our emotional triggers - including novelty and surprise triggers - than we do ourselves.

So the work will be experienced as more emotional, soulful, insightful, deep, and so on than even the best human creators.

This may or may not be a good thing, but it seems as inevitable as machine superiority in chess and basic arithmetic.

replies(1): >>43667972 #
31. numpad0 ◴[] No.43658566{3}[source]

> AI tech and tools aren’t just going to go away, and people aren’t going to just not make a tool you don’t like

It could. Film photography effectively went away, dragging street snaps along with it. If it continues to not make artistic sense, people will eventually move on.
32. nitwit005 ◴[] No.43658624[source]
While I agree about Adobe behaving more ethically, I suspect they simply talked to their customers, and decided they didn't have much choice. CELSYS, who makes Clip Studio, suffered a backlash and pulled their initial AI features: https://www.clipstudio.net/en/news/202212/02_01/
replies(2): >>43659492 #>>43659561 #
33. Workaccount2 ◴[] No.43658703{3}[source]
There is no "scooping up", the models aren't massive archives of copied art. People either don't understand how these models work or they purposely misrepresent it (or purposely refuse to understand it).

Showing the model a picture doesn't create a copy of that picture in its "brain". It moves a bunch of vectors around that capture an "essence" of what the image is. The next image shown from a totally different artist with a totally different style may well move around many of those same vectors again. But suffice to say, there is no copy of the picture anywhere inside of it.

This is also why these models hallucinate so much: they are not drawing from a bank of copies, they are working off of a fuzzy memory.
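A back-of-envelope calculation makes the "no archive" point concrete. The numbers below are illustrative assumptions (roughly a billion float32 parameters and a couple billion training images, not any specific model's figures):

```python
# If a model's weights had to archive its training set, each image would
# get only a few bytes of storage -- far too little to hold a copy.
num_params = 1_000_000_000       # assumed model size (parameters)
bytes_per_param = 4              # float32
model_bytes = num_params * bytes_per_param

num_images = 2_000_000_000       # assumed training-set size (images)
budget_per_image = model_bytes / num_images

print(f"Model weights: {model_bytes / 1e9:.0f} GB")
print(f"Storage budget per training image: {budget_per_image:.1f} bytes")
# -> about 2 bytes per image, versus megabytes for an actual copy
```

Whatever the exact counts, the ratio is the point: the weights are orders of magnitude too small to contain the images they were trained on.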

replies(3): >>43658755 #>>43658813 #>>43658942 #
34. TeMPOraL ◴[] No.43658755{4}[source]
> People either don't understand how these models work or they purposely misrepresent it (or purposely refuse to understand it).

Not only that, they also assume or pretend that this is obviously violating copyright, when in fact this is a) not clear, and b) pending determination by courts and legislators around the world.

FWIW, I agree with your perspective on training, but I also accept that artists have legitimate moral grounds to complain and try to fight it - so I don't really like to argue about this with them; my pet peeve is on the LLM side of things, where the loudest arguments come from people who are envious and feel entitled, even though they have no personal stake in this.

replies(3): >>43658883 #>>43659263 #>>43660250 #
35. Riverheart ◴[] No.43658813{4}[source]
The collection of the training data is the “scooping up” I mentioned. I assume you acknowledge the training data doesn’t spontaneously burst out of the aether?

As for the model, it's still creating deterministic, derivative works based off its inputs, and the only thing that makes it random is the seed, so its being a database of vectors is irrelevant.
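The determinism claim can be illustrated with a toy, hypothetical sampler (a stand-in, not any real diffusion API): the output is a pure function of prompt and seed, with no other source of randomness.

```python
import random

def toy_sampler(prompt: str, seed: int) -> list:
    """Hypothetical stand-in for a diffusion sampler: the output is
    fully determined by (prompt, seed); nothing else is random."""
    rng = random.Random(f"{prompt}|{seed}")  # str seeds are deterministic
    return [round(rng.random(), 6) for _ in range(4)]

a = toy_sampler("archeologist with whip", seed=42)
b = toy_sampler("archeologist with whip", seed=42)
c = toy_sampler("archeologist with whip", seed=7)

print(a == b)  # True: same prompt + seed reproduces the exact output
print(a == c)  # False: changing only the seed changes the output
```

Same inputs, same bits out every time; the "randomness" is entirely the seed you chose.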

replies(1): >>43660474 #
36. Riverheart ◴[] No.43658883{5}[source]
“Not only that, they also assume or pretend that this is obviously violating copyright, when in fact this is a) not clear, and b) pending determination by courts and legislators around the world.”

Uh huh, so much worse than the people that assume or pretend that it’s obviously not infringing and legal. Fortunately I don’t need to wait for a lawyer to form an opinion and neither do those in favor of AI as you might’ve noticed.

You see any of them backing down and waiting for answer from a higher authority?

replies(2): >>43659104 #>>43662672 #
37. ToucanLoucan ◴[] No.43658942{4}[source]
Training data at scale unavoidably taints models with vast amounts of references to the same widespread ideas that appear repeatedly in said data. The model has "seen" probably millions of photos of Indiana Jones, so if you ask for an image of an archeologist who wears a hat and uses a whip, its weighted averages are going to lead it to create something extremely similar to Indiana Jones. Disintegrating IP into trillions of pieces and then responding to an instruction to create it with something so close to the IP as to barely be distinguishable is still infringement.

The flip-side to that is the truly "original" images where no overt references are present all look kinda similar. If you run vague enough prompts to get something new that won't land you in hot water, you end up with a sort of stock-photo adjacent looking image where the lighting doesn't make sense and is completely unmotivated, the framing is strange, and everything has this over-smoothed, over-tuned "magazine copy editor doesn't understand the concept of restraint" look.

replies(1): >>43659527 #
38. TeMPOraL ◴[] No.43658963[source]
> People hate AI images, cut and dry.

People hate bad AI images, because they hate bad images, period. They don't hate good AI images, and when they see great AI images, they don't even realize they are made by AI.

It's true, there's a deluge of bad art now, and it's almost entirely AI art. But it's not because AI models exist or how they're trained - it's because marketers[0] don't give a fuck about how people feel. AI art is cheap and takes little effort to get - it's so cheap and low-effort, that on the lower end of quality scale, there is no human competition. It makes no economic sense to commission human labor to make art this bad. But with AI, you can get it for free - and marketing loves this, because, again, they don't care about people or the commons[1], they just see an ability to get ahead by trading away quality for greater volume at lower costs.

In short: don't blame bad AI art on AI, blame it on people who spam us with it.

--

[0] - I don't mean here just marketing agencies and people with marketing-related job titles, but also generally people engaging in excessive promotion of their services, content, or themselves.

[1] - Such as population-level aesthetic sensibilities, or sanity.

replies(2): >>43664352 #>>43666527 #
39. Spooky23 ◴[] No.43659012[source]
End of the day, the hate is: “The software is great, but these jerks expect me to pay for it!”

Their sales went crazy because everyone was relentlessly pirating their software.

replies(1): >>43665645 #
40. fxtentacle ◴[] No.43659101[source]
I would describe my business relationship with Adobe as:

"hostage"

They annually harass me with licensing checks and questionnaires because they really hate you if you run Photoshop inside a VM (my daily driver is Linux), although it is explicitly allowed. Luckily, I don't need the Adobe software that often. But they hold a lot of important old company documents hostage in their proprietary file formats. So I can't cancel the subscription, no matter how much I'd like to.

replies(2): >>43661036 #>>43663302 #
41. TeMPOraL ◴[] No.43659104{6}[source]
> You see any of them backing down and waiting for answer from a higher authority?

Should they? That's generally not how things work in most places. Normally, if something isn't clearly illegal, especially when it's something too new and different for laws to clearly cover, you're free to go ahead and try it; you're not expected to first seek a go-ahead from a court.

replies(1): >>43659472 #
42. Lammy ◴[] No.43659197[source]
I am so happy that my Win32 CS3 Master Collection still works fully-offline and will continue to do so for as long as I care to keep using it :)
replies(1): >>43660605 #
43. spoaceman7777 ◴[] No.43659203[source]
> Adobe isn't trying to be ethical, they are trying to be more legally compliant

Is the implication of this statement that using AI for image editing and creation is inherently unethical?

Is that really how people feel?

replies(1): >>43659347 #
44. no_wizard ◴[] No.43659245[source]
> A legal reality where you can only train AI on content you've licensed would be the worst for everybody bar massive companies, legacy artists included.

Quite an assertion. Why exactly would this be true?

replies(1): >>43660420 #
45. Root_Denied ◴[] No.43659263{5}[source]
>Not only that, they also assume or pretend that this is obviously violating copyright, when in fact this is a) not clear, and b) pending determination by courts and legislators around the world.

Legislation always takes time to catch up with tech, that's not new.

The question I see being put forth from those with legal and IP backgrounds is about inputs vs. outputs, as in "if you didn't have access to X (which has some form of legal IP protection) as an input, would you be able to get the output of a working model?" The comparison here is with manufacturing, where you have assembly of parts made by others into some final product and you would be buying those inputs to create your product output.

That purchasing of the required inputs is not happening for AI, which pretty solidly puts AI trained on copyrighted materials in hot water. The fact that it's an imperfect analogy and doesn't really capture the way software development works is irrelevant if the courts end up agreeing with something they can understand as a comparison.

All that being said I don't think the legality is under consideration for any companies building a model - the profit margins are too high to care for now, and catching them at it is potentially difficult.

There's also a tendency for AI advocates to try and say that AI/LLMs are "special" in some way, and to compare their development process to someone "learning" the style of art (or whatever input) that they then internalize and develop into their own style. Personally I think that argument gives a lot of assumed agency to these models that they don't actually have, and weakens the overall legal case.

46. mtndew4brkfst ◴[] No.43659347{3}[source]
For creation, yes, because of the provenance of the training data that got us here. It was acquired unethically in the overwhelming majority of cases. Using models derived from that training is laundering and anonymizing the existing creativity of other humans and then still staking the claim "I made this", like the stick figure comic. It's ghoulish.
replies(2): >>43659656 #>>43661512 #
47. crest ◴[] No.43659378[source]
> Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care.

It's sad, and a little funny, that you think Adobe is motivated by ethical considerations.

replies(6): >>43659406 #>>43659537 #>>43659649 #>>43659659 #>>43659699 #>>43659924 #
48. Bluescreenbuddy ◴[] No.43659401[source]
This is Adobe. They don't care about ethics. And frankly fuck them.
49. ◴[] No.43659406[source]
50. dinkumthinkum ◴[] No.43659443[source]
I'm curious why you think it would be worse for everybody? This argument seems to depend on the assumption that if something makes AI less viable then the situation for human beings is worse overall. I don't think many actual people would accept that premise.
replies(1): >>43660637 #
51. ◴[] No.43659469[source]
52. Riverheart ◴[] No.43659472{7}[source]
You just chided people for having strong opinions about AI infringement without a court ruling to back them up but now you’re saying that creating/promoting an entire industry based on a legal grey area is a social norm that you have no strong feelings about. I would have thought the same high bar to speak on copyright for those who believe it infringes would be applied equally to those saying it does not, especially when it financially benefits them. I don’t think we’ll find consensus.
53. quitit ◴[] No.43659478[source]
I'm not pointing fingers in any specific direction, but there is a lot of importance in AI leadership, and with that you're going to see a lot of bot activity and astroturfing to hinder the advancement of competitors. We also see companies such as OpenAI publicly calling out Elon Musk for what appears to be competition-motivated harassment.

So while I think we're all pretty aware of both sides of the image gen discussion and may have differing opinions about that - I think we can all agree that the genie can't be put back in the bottle. This will naturally lead for those that do take advantage of the technology to outpace those which do not.

Also I applaud Adobe's approach to building their models "ethically"; yes they are inferior to many competitors, but they work well enough to save significant time and money. They have been very good at homing in on what AI is genuinely useful for instead of bolting a chatbot onto every app like clock radios in the 1980s.

54. mubou ◴[] No.43659492[source]
Probably didn't help that Clip Studio is predominantly used by Japanese artists, and virtually all models capable of producing anime-style images were trained on a dataset of their own, stolen pixiv art.
55. matt_heimer ◴[] No.43659507[source]
The best? I tried the Photoshop AI features to clean up an old photo for the first time this week and it crashed every time. After a bunch of searching I found a post identifying the problem - it always crashes if there are two or more faces in the photo. Guess someone forgot to test the more-than-one-person edge case.
replies(1): >>43659538 #
56. tpmoney ◴[] No.43659527{5}[source]
> if you ask for an image of an archeologist who wears a hat and uses a whip, it's weighted averages are going to lead it to create something extremely similar to Indiana Jones because it has seen Indiana Jones so much.

If you ask a human artist for an image of "an archeologist who wears a hat and uses a whip" you're also going to get something extremely similar to Indiana Jones unless you explicitly ask for something else. Let's imagine we go to deviantart and ask some folks to draw us some drawing from these prompts:

A blond haired fighter from a fantasy world that wears a green tunic and green pointy cap and used a sword and shield.

A foreboding space villain with all black armor, a cape and full face breathing apparatus that uses a laser sword.

A pudgy plumber in blue overalls and a red cap of Italian descent

I don't know about you but I would expect with nothing more than that, most of the time you're going to get something very close to Link, Darth Vader and Mario. Link might be the one with the best chance to get something different just because the number of publicly known images of "fantasy world heroes" is much more diverse than the set of "black armored space samurai" and "Italian plumbers"

> Disintegrating IP into trillions of pieces and then responding to an instruction to create it with something so close to the IP as to barely be distinguishable is still infringement.

But it's the person who causes the creation of the infringing material that is responsible for the infringement, not the machine or device itself. A xerox machine is a machine that disintegrates IP into trillions of pieces and then responds to instructions to duplicate that IP almost exactly (or to the best of its abilities). And when that functionality was challenged, the courts rightfully found that a xerox machine, regardless of its capability to be used for infringement, is not in and of itself infringing.

replies(2): >>43659716 #>>43664912 #
57. adzm ◴[] No.43659528[source]
> People hate AI images, cut and dry.

I don't know for sure about common usage, but personally my uses of AI in Photoshop are things like replacing a telephone pole with a tree, or extending a photo outside the frame, which is much different from just generating entire images. It is unfortunate that this usage of generative AI is lumped in with everything else.

58. ngcazz ◴[] No.43659537[source]
Or that generative AI is ethical at all
replies(2): >>43659650 #>>43669105 #
59. ZeroTalent ◴[] No.43659538[source]
I know 5 AI image-gen apps that are better than photoshop and cost around $10-20/month. For example, ideogram. Photoshop doesn't even come close.
replies(1): >>43663861 #
60. skywhopper ◴[] No.43659546[source]
Uh, not sure where you’ve been but Adobe is slavering over using the content its locked-in users create to train its products. It only (seemingly) backed off this approach last year when the cost in terms of subscription revenue got too high. But you’re naive if you think they aren’t desperately planning how to get back to that original plan of owning an ever-growing slice of every bit of human creativity that touches their software.
61. paulddraper ◴[] No.43659561[source]
Talking to customers is a good thing.

Let's normalize it.

62. rmwaite ◴[] No.43659606{4}[source]
Oh snap, you’re right. My mistake!
63. skissane ◴[] No.43659609{4}[source]
> they have a legitimate argument that this is an unexpected, undeserved, unfair calamity for them, threatening to derail their lives, and lives of their dependents, across the entire sector - and therefore that laws should be changed to shield them, or compensate them for the loss. After all, that's what laws are for.

Lots of people have had their lives disrupted by technological and economic changes before - entire careers which existed a century ago are now gone. Given society provided little or no compensation for prior such cases of disruption, what’s the argument for doing differently here?

replies(4): >>43659678 #>>43660270 #>>43661387 #>>43669091 #
64. skywhopper ◴[] No.43659616{3}[source]
In the context of encouraging art, it totally is! Copyright and patents are 100% artificial and invented legal concepts that are based solely on the distaste for others profiting off a creator’s ideas. The reason for them is to encourage creativity by allowing creators to profit off new ideas.

So there’s no reason why “distaste” about AI abuse of human artists’ work shouldn’t be a valid reason to regulate or ban it. If society values the creation of new art and inventions, then it will create artificial barriers to encourage their creation.

replies(2): >>43661607 #>>43666848 #
65. ilrwbwrkhv ◴[] No.43659648[source]
Yes and this is what I was worried about in my essay on AI.

They have burned so much goodwill that the community is not willing to engage even with positive things now.

This broadly is happening to tech as well.

66. ahartmetz ◴[] No.43659649[source]
Probably want to look good to their customer base: artists.
67. esalman ◴[] No.43659650{3}[source]
It's funny pg once compared hackers with painters, but given how people abuse crypto and generative AI, it seems hackers have more in common with thieves and robbers.
replies(1): >>43660403 #
68. skissane ◴[] No.43659656{4}[source]
There exist image generation models that were trained on purely licensed content, e.g. Getty’s. I don’t know about Adobe’s specifically-but if not, it seems like a problem Adobe could easily fix-either buy/license a stock image library for AI training (maybe they already have one), and use that to train their own model-or else license someone else’s model e.g. Getty’s
replies(2): >>43659970 #>>43660002 #
69. XorNot ◴[] No.43659659[source]
Where did the poster say they think Adobe is motivated by that? They said Adobe is operating that way.
70. TeMPOraL ◴[] No.43659678{5}[source]
Moral growth and learning from history?
replies(1): >>43659856 #
71. jfengel ◴[] No.43659699[source]
They don't have to be motivated by ethics. I'm fine with them grudgingly doing ethical things because their customer base is all artists, many of whom would look for an alternative product.
replies(1): >>43659964 #
72. doctorpangloss ◴[] No.43659715[source]
There’s no evidence that their generative tools are more ethical.

Even if you believe everything they say, they are lying by omission. For example, for their text-to-image technology, they never specify what their text language model is trained on. It's almost certainly CLIP or T5, which is trained on plenty of not-expressly-licensed data. And if they trained such a model from scratch (they don't have enough image bureau data to make their own CLIP; even at 400m images, CLIP only performs well at the 4-7b image-caption-pair scale), where's the paper? It's smoke and mirrors, dude.

There’s a certain personality type that is getting co-opted on social media like Hacker News to “mook” for Adobe. Something at the intersection of a certain obsessive personality and Dunning-Kruger.

73. Riverheart ◴[] No.43659716{6}[source]
You know why we put up with copyrighted info in the human brain, right? Because those are human beings; it’s unavoidable. This? Avoidable.

Also, the model isn’t a human brain. Nobody has invented a human brain.

And the model might not infringe if its inputs are licensed, but that doesn’t seem to be the case for most, and it’s not transparent that they don’t. If the inputs are bad, the intent of the user is meaningless. I can ask for a generic superhero and not mean to get Superman, but if I do, I can’t blame that on myself; I had no role in it. Heck, even the model doesn’t know what it’s doing, it’s just a function. If I Xerox Superman, my intent is clear.

replies(1): >>43659952 #
74. AnthonyMouse ◴[] No.43659810[source]
> Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care.

It's because nobody actually wants that.

Artists don't like AI image generators because they have to compete with them, not because of how they were trained. How they were trained is just the most plausible claim they can make against them if they want to sue OpenAI et al. over it, or to make a moral argument that some kind of misappropriation is occurring.

From the perspective of an artist, a corporation training an AI image generator in a way that isn't susceptible to moral or legal assault is worse, because then it exists and they have to compete with it and there is no visible path for them to make it go away.

replies(7): >>43659874 #>>43660487 #>>43662522 #>>43663679 #>>43668300 #>>43670381 #>>43683088 #
75. skissane ◴[] No.43659856{6}[source]
There’s a big risk that you end up creating a scheme to compensate for technological disruption in one industry and then fail to do so in another, based on the political clout / mindshare / media attention each has - and then there are many people in even worse personal situations (through no fault of their own) who would also miss out.

Wouldn’t a better alternative be to work on improving social safety nets for everybody, as opposed to providing a bespoke one for a single industry?

replies(1): >>43668694 #
76. mjmsmith ◴[] No.43659874[source]
Most artists would prefer not to compete with an AI image generator that has been trained on their own artwork without their permission, for obvious reasons.
replies(2): >>43659995 #>>43660494 #
77. bolognafairy ◴[] No.43659924[source]
A strawman argument so you can condescendingly and snarkily lecture someone? I can see you were among those mouthing off at Adobe on Bluesky.
replies(2): >>43660222 #>>43661311 #
78. tbrownaw ◴[] No.43659929[source]
> Adobe isn't trying to be ethical, they are trying to be more legally compliant,

Ethics (as opposed to morals) is about codified rules.

The law is a set of codified rules.

So are these really that different (beyond how the law is a hodge-podge and usually a minimum requirement rather than an ideal to reach for)?

79. tpmoney ◴[] No.43659952{7}[source]
> You know why we put up with copyrighted info in the human brain right? Because those are human beings, it’s unavoidable.

I would hope we put up with it because "copyright" is only useful to us insofar as it advances good things that we want in our society. I certainly don't want to live in a world where if we could forcibly remove copyrighted information from human brains as soon as the "license" expired that we would do so. That seems like a dystopian hell worse than even the worst possible predictions of AI's detractors.

> I can ask for a generic super hero and not mean to get superman but if I do I can’t blame that on myself, I had no role in it, heck even the model doesn’t know what it’s doing, it’s just a function.

And if you turn around and discard that output and ask for something else, then no harm has been caused. Just like when artists trace other artists' work for practice: no harm is caused, and while it might be copyright infringement in a "literal meaning of the words" sense, it's also not something that as a society we consider meaningfully infringing. If, on the other hand, said budding artist started selling copies of those traces, or making video games using assets scanned from those traces, then we do consider it infringement worth worrying about.

> If I Xerox Superman my intent is clear.

Is it? If you have a broken xerox machine and you think you have it fixed, grab the nearest papers you can find and as a result of testing the machine xerox Superman, what is your intent? I don't think it was to commit copyright infringement, even if again in the "literal meaning of the words" sense you absolutely did.

replies(1): >>43660326 #
80. djeastm ◴[] No.43659964{3}[source]
You are fine with it, of course, because you're reasonable. But OP's claim was that Adobe is "trying to be ethical with its AI training data and no one seems to even care" as if we're meant to give special consideration to a company for doing the only economically sensible thing when most of its customers are artists.
replies(2): >>43660044 #>>43665118 #
81. spookie ◴[] No.43659970{5}[source]
Well they do license the art they use, but in... let's say... "interesting" ways through their ToS.
82. AnthonyMouse ◴[] No.43659995{3}[source]
That's exactly the moral argument Adobe is taking away from them, and the same argument has minimal economic relevance because it's so rare that a customer requires a specific individual artist's style.
replies(2): >>43661174 #>>43661478 #
83. bolognafairy ◴[] No.43660002{5}[source]
They are training using licensed images! That’s the thing! There’s some sort of ridiculous brainworm infecting certain online groups that has them believing that stealing content is inherent in using generative AI.

I watch this all quite closely, and it’s chronically online, anime/fursona-profile-picture artists.

Exact same thing happened when that ‘open’ trust and safety platform was announced a few months ago, which used “AI” in its marketing material. This exact same group of people—not even remotely the target audience for this B2B T&S product—absolutely lost it on Bluesky. “We don’t want AI everywhere!” “You’re taking the humanity out of everything!” “This is so unethical!” When you tell them that machine learning has been used in content moderation for decades, they won’t have a bar of it. Nor when you explain that T&S AI isn’t generative and almost certainly isn’t using “stolen” data. I had countless people legitimately say that having humans have to sift through gore and CSAM is a Good Thing because it gives them jobs, which AI is taking away.

It’s all the same sort of online presence. Anime profile picture, Ko-fi in bio, “minors dni”, talking about not getting “commissions” anymore. It genuinely feels like a psy-op / false flag operation or something.

replies(1): >>43660243 #
84. ambicapter ◴[] No.43660044{4}[source]
The great thing about loudly painting Adobe with the brush of "ethical AI training" (regardless of why they're doing it) is that the backlash will be exponentially bigger if/when they do something that betrays that label. Potentially big enough to make them reverse course. It's not much, but it's something.
85. subjectsigma ◴[] No.43660243{6}[source]
> I had countless people legitimately say that having humans have to sift through gore and CSAM is a Good Thing because it gives them jobs, which AI is taking away.

Link even a single example of someone explicitly saying this and I would be astounded

86. jillyboel ◴[] No.43660250{5}[source]
It's unauthorized commercial use. Which part of that is confusing to you?
replies(1): >>43660459 #
87. CamperBob2 ◴[] No.43660270{5}[source]
> Given society provided little or no compensation for prior such cases of disruption

That's going to be hard for you to justify in the long run, I think. Virtually everybody who ever lost a job to technology ended up better off for it.

replies(2): >>43660652 #>>43664103 #
88. sneak ◴[] No.43660283[source]
Subscriptionware is cancer. They deserve all the hate they get.
89. Riverheart ◴[] No.43660326{8}[source]
I’m saying that retaining information is a natural, accepted part of being human and operating in society. Don’t know why it needed to be turned into an Orwell sequel.
replies(2): >>43660501 #>>43661530 #
90. labster ◴[] No.43660403{4}[source]
Hollywood was right all along about hackers being outlaws, then. Hacker News must be the very heart of the Dark Web (where “dark” is short for late-stage capitalism).
replies(1): >>43660446 #
91. drilbo ◴[] No.43660420{3}[source]
who else has would ever have a significantly large store of licensed material?
replies(1): >>43667788 #
92. jordanb ◴[] No.43660446{5}[source]
> hackers being outlaws

That gives them too much credit. "Outlaws" are folk heroes. Robin Hood was an outlaw, Bonnie and Clyde were outlaws. Luigi is an outlaw.

Nobody's going to be telling fables about the exploits of Sam Altman.

replies(1): >>43661394 #
93. rcxdude ◴[] No.43660459{6}[source]
So is Google Books, and that got ruled fair use. That it's being used commercially is not a slam-dunk case against an argument for fair use.
94. rcxdude ◴[] No.43660474{5}[source]
Determinism is neither here nor there for copyright infringement: a hash of an image is not infringing, and a slightly noisy version of it is.
replies(1): >>43661683 #
95. Sir_Twist ◴[] No.43660487[source]
I'd say that is a bit of an ungenerous characterization. Is it possible that it could be both? That while artists maybe do feel under attack in terms of competition, that there is a genuine ethical dilemma at hand?

If I were an artist, and I made a painting and published it to a site which was then used to train an LLM, I would feel as though the AI company treated me disingenuously, regardless of competition or not. Intellectual property laws aside, I think there is a social contract being broken when a publicly shared work is then used without the artist's direct, explicit permission.

replies(4): >>43660625 #>>43660937 #>>43660970 #>>43661337 #
96. unethical_ban ◴[] No.43660494{3}[source]
He's arguing that artists are so scared of Adobe and AI that they actually want Adobe to be more evil so artists have more to complain about.
replies(1): >>43660739 #
97. tpmoney ◴[] No.43660501{9}[source]
I had assumed when you said that a human retaining information was "unavoidable" and a machine retaining it was "avoidable" that the implication was we wouldn't tolerate humans retaining information if it was also "avoidable". Otherwise I'm unclear what the intent of distinguishing between "avoidable" and "unavoidable" was, and I'm unclear what it has to do with whether or not an AI model that was trained with "unlicensed" content is or isn't copyright infringing on its own.
replies(1): >>43661524 #
98. geerlingguy ◴[] No.43660587{3}[source]
It's like 95% of the way there for me—there are a few little workflow niggles that keep me from fully switching over, like the inability to do a full export-close cycle without saving, without having to use my mouse (moving the hand to the trackpad/mouse is annoying when it's not necessary).

In Photoshop, likely because it's been used by pros for decades, little conveniences are all over the place, like the ability to press 'd' for 'Don't Save' in a save dialog box.

That said, the past few versions of Photoshop, which moved away from fully-native apps to some sort of web UI engine... they are getting worse and worse. On one of my Macs, every few weeks it gets stuck on the 'Hand' tool, no matter what (even surviving a preferences nuke + restart), until I reboot the entire computer.

99. geerlingguy ◴[] No.43660594{3}[source]
Pixelmator Pro is very close... but Mac only, still. Image editing on Linux is rough.
100. dylan604 ◴[] No.43660605{3}[source]
Does it work on modern hardware running modern OS? Specifically, wondering if this was a Mac version. I could see WinX versions still running, but the Mac arch has changed significantly: 32bit -> 64bit, mactel -> AppleSI
replies(2): >>43660998 #>>43661396 #
101. Brian_K_White ◴[] No.43660619{5}[source]
The fact that you are a paying user who does not hate some thing that other users do, does not change the fact that they do, and that they are the final authority on what they hate and why they hate it.

It has nothing to do with you. You are free not to have the same priorities as them, but that's all that difference indicates, is that your priorities are different.

The "what is art?" stuff is saying why I think that "get as good as a human artist" is a fundamentally invalid concept.

Not that humans are the mostest bestest blessed by god chosen whatever. Just that it's a fundamentally meaningless sequence of words.

102. kmeisthax ◴[] No.43660625{3}[source]
Artists do not want to get paid micropennies for use-of-training-data licenses for something that destroys the market for new art. And that's the only claim Adobe Firefly makes to being ethical. Adobe used a EULA roofie to make all their Adobe Stock contributors consent to monthly payments for the images Firefly was trained on.
replies(1): >>43660897 #
103. crimony ◴[] No.43660637{3}[source]
It's worse only if AI turns out to be of high value.

In that case only large companies that can afford to license training data will be dominant.

104. disconcision ◴[] No.43660652{6}[source]
> Virtually everybody who ever lost a job to technology ended up better off for it.

this feels like a much stronger claim than is typically made about the benefits of technological progress

replies(1): >>43661389 #
105. AnthonyMouse ◴[] No.43660739{4}[source]
They want AI image generation to go away. That isn't likely to happen, but their best hope would be to make copyright claims or try to turn the public against AI companies with accusations of misappropriation. Adobe's "ethical" image generator would be immune to those claims while still doing nothing to address their primary concern, the economic consequences. It takes away their ammunition while leaving their target standing. Are they supposed to like a company doing that or does it just make them even more upset?
106. cratermoon ◴[] No.43660867{3}[source]
> Right, but "distaste" isn't grounds for trying to ban something

I disagree. There are many laws on the books codifying social distastes. They keep your local vice squad busy.

replies(1): >>43666817 #
107. Sir_Twist ◴[] No.43660897{4}[source]
Indeed, and I agree that Adobe is in the wrong here. For an agreement between Adobe and an artist to be truly permissive, the artist should have the ability to withhold their consent. Ethically, I think Adobe is in the same position as the other AI companies – if the artist doesn't directly (EULAs are not direct, in my opinion) agree to the terms, and doesn't have the option to decline, then it isn't an agreement, it is a method of coercion. If an artist, like you said, doesn't want to be paid micropennies, they shouldn't have to agree.

I believe it is completely reasonable for an artist to want to share their work publicly on the Internet without fear of it being appropriated, and I wish there was a pragmatic way they could achieve this.

108. furyofantares ◴[] No.43660937{3}[source]
I've never seen anyone make the complaint about image classifiers or image segmentation. It's only for generative models and only once they got good enough to be useful.
replies(1): >>43663369 #
109. AnthonyMouse ◴[] No.43660970{3}[source]
> Is it possible that it could be both? That while artists maybe do feel under attack in terms of competition, that there is a genuine ethical dilemma at hand?

The rights artists have over their work are economic rights. The most important fair use factor is how the use affects the market for the original work. If Disney is lobbying for copyright term extensions and you want to make art showing Mickey Mouse in a cage with the CEO of Disney as the jailer, that's allowed even though you're not allowed to open a movie theater and show Fantasia without paying for it, and even though (even because!) Disney would not approve of you using Mickey to oppose their lobbying position. And once the copyright expires you can do as you like.

So the ethical argument against AI training is that the AI is going to compete with them and make it harder for them to make a living. But substantially the same thing happens if the AI is trained on some other artist's work instead. Whose work it was has minimal impact on the economic consequences for artists in general. And being one of the artists who got a pittance for the training data is little consolation either.

The real ethical question is whether it's okay to put artists out of business by providing AI-generated images at negligible cost. If the answer is no, it doesn't really matter which artists were in the training data. If the answer is yes, it doesn't really matter which artists were in the training data.

replies(3): >>43661004 #>>43661497 #>>43662059 #
110. cosmic_cheese ◴[] No.43660998{4}[source]
I haven’t tried so I can’t say for sure but my hunch is that you’d have better luck running old CS versions on modern Macs with WINE, which can run 32-bit x86 Windows binaries on ARM just fine (via Rosetta).

Performance is obviously going to take a hit though. Depending on the machines in question one would probably get better results from a current gen x86 box running that same Windows version of CS1/CS2/CS3 running through WINE (or of course Windows 11, but then you’re stuck with Windows 11).

111. sureIy ◴[] No.43661036{3}[source]
> proprietary file formats

Gimp can't handle them?

replies(3): >>43661757 #>>43662557 #>>43666681 #
112. sdrothrock ◴[] No.43661100[source]
> Adobe is the one major company trying to be ethical with its AI training data

I was actually contacted by someone at Adobe for a chat about disability representation and sensitivity in Japan because they were doing research to gauge the atmosphere here and ensure that people with disabilities were represented, and how those representations would be appropriate for Japanese culture. It really blew my mind.

replies(1): >>43669008 #
113. devmor ◴[] No.43661103[source]
If they are trying to be ethical, all it takes is one look at their stock photo service to see that they are failing horribly.
114. Henchman21 ◴[] No.43661122[source]
SUPER ethical to try and put artists and entire industries out of business to be replaced with Adobe products.
115. __loam ◴[] No.43661174{4}[source]
Artists don't hate Adobe just because they're making an AI art generator; they hate Adobe because it's a predatory, scummy corporation that is difficult to work with and is the gatekeeper for common industry tools. Also, Adobe didn't take away the moral arguments against AI art; they just used previously licensed imagery that existed before they started making AI art generators. There's still an argument that it's deceptive to grandfather previously licensed work into a new technology, and there's still an argument that spending resources on automating cultural expression is a shitty thing to do.
replies(2): >>43662191 #>>43665850 #
116. zmmmmm ◴[] No.43661258[source]
The ship has sailed, but I can understand artists feeling that no matter how any AI is trained prospectively, it was only made possible because the methods to do so were learned through unethical means - we now know the exact model architectures, efficient training methods and types of training data needed so that companies like Adobe can recreate it with a fraction of the cost.

We obviously can never unscramble that egg, which is sad because it probably means there will never be a way to make such people feel OK about AI.

117. eloisius ◴[] No.43661311{3}[source]
“Mouthing off” is always uttered by someone with an undeserved sense of authority over the other party, like a mall cop yelling at a teenager for skateboarding.
118. scarface_74 ◴[] No.43661337{3}[source]
Adobe only trains its AI on properly licensed images that the artists have explicitly signed a contract with Adobe to train on.
119. petre ◴[] No.43661387{5}[source]
You're only going to get "AI art" in the future because artists will have to get a second job at McDonald's to survive. The same old themes all over again. It's as if the only music were Richard Clayderman tunes.
120. CamperBob2 ◴[] No.43661389{7}[source]
Certainly no stronger than the claim I was responding to. They are essentially pining for the return of careers that haven't existed for a century.
121. econ ◴[] No.43661394{6}[source]
AI could do it. Seems a good use of it.
122. Lammy ◴[] No.43661396{4}[source]
I have the offline CS3 Mac version too, but it's 32-bit Intel so you can't run it on anything after Catalina. The Win32 version works fine on Windows 10.
123. mjmsmith ◴[] No.43661478{4}[source]
That must be why AI image prompts never reference an artist name.
replies(1): >>43665807 #
124. card_zero ◴[] No.43661497{4}[source]
> But substantially the same thing happens if the AI is trained on some other artist's work instead.

You could take that further and say that "substantially the same thing" happens if the AI is trained on music instead. It's just another kind of artwork, right? Somebody who was going to have an illustration by [illustrator with distinctive style] might choose to have music instead, so the music is in competition, so all that illustrator's art might as well be in the training data, and that doesn't matter because the artist would get competed with either way. Says you.

replies(1): >>43665550 #
125. CaptainFever ◴[] No.43661512{4}[source]
The whole point is that Adobe's AI doesn't do this, yet is still hated. It reveals that some people simply hate the whole concept of generative AI, regardless of how it was made. You're never going to please them.
replies(1): >>43664719 #
126. Riverheart ◴[] No.43661524{10}[source]
I’m in the camp that believes it’s neither necessary nor desirable to hold humans and software to the same standard of law. Society exists for our collective benefit, and we make concessions with each other to ensure it functions smoothly; I don’t think those concessions should necessarily extend to automated processes, even if they do in fact mimic humans, given the myriad ways in which they differ from us.
replies(1): >>43664300 #
127. CaptainFever ◴[] No.43661530{9}[source]
Appeal to nature fallacy.

https://www.logicallyfallacious.com/logicalfallacies/Appeal-...

replies(2): >>43661581 #>>43668604 #
128. Riverheart ◴[] No.43661581{10}[source]
I’m not saying it’s better because it’s naturally occurring; the objective reality is that we live in a world of IP laws where humans have no choice but to retain copyrighted information to function in society. I don’t care that text or images have been compressed into an AI model as long as it’s done legally, but the fact that it has been has very real consequences for society, since, unlike a human, the model doesn’t need to eat, sleep, or pay taxes, nor will it ever die, which is constantly ignored in this conversation about what’s best for society.

These tools are optional whether people like to hear it or not. I’m not even against them ideologically, I just don’t think they’re being integrated into society in anything resembling a well thought out way.

129. becquerel ◴[] No.43661582{5}[source]
It crushes the orphans very quickly, and on command, and allows anyone to crush orphans from the comfort of their own home. Most people are low-taste enough that they don't really care about the difference between hand-crushed orphans and artisanal hand-crushed orphans.
replies(1): >>43668188 #
130. becquerel ◴[] No.43661593[source]
If everyone hated AI images, nobody would be creating them.
replies(1): >>43703937 #
131. bmacho ◴[] No.43661607{4}[source]
Yup, banning AI for the sake of artist would be exactly the same as the current copyright laws. (Also they are attacking AI not purely for fear of their jobs, but bc it is illegal already.)
132. Riverheart ◴[] No.43661683{6}[source]
Nobody is trying to copyright an image hash, and determinism matters because it’s why the outputs are derivative rather than inspired.
replies(1): >>43662706 #
133. dgellow ◴[] No.43661732{3}[source]
Have you tried Affinity?
134. mesh ◴[] No.43661755[source]
For reference, here is Adobe's approach to generative ai:

https://www.adobe.com/fireflyapproach/

(I work for Adobe)

135. mamonoleechi ◴[] No.43661757{4}[source]
If not, Affinity Photo or Photopea will probably do the job.
136. pastage ◴[] No.43662059{4}[source]
Actually, moral rights are what allow you to say no to AI. They are also a big part of copyright, and more important in places where fair use does not exist to the extent it does in the US.

Further, making a variant of a famous art piece under copyright might very well be a derivative. There were court cases here just a few years before the AI boom where a format shift from photo to painting was deemed a derivative. The picture generated with "Painting of an archeologist with a whip" would almost certainly be deemed a derivative if it went through the same court.

replies(1): >>43665639 #
137. visarga ◴[] No.43662172{4}[source]
> There is no such thing as "get as good as a human artist" unless it becomes an actual human that lived the human experience. Even bad art starts with something to express and a want to express it.

There is always an actual human with actual human experience in the loop; the AI doesn't need to have it. The AI doesn't intend to draw anything on its own and can't enjoy the process; there has to be a human to make it work, on either the intent (input) or the value (output) side.

138. t0bia_s ◴[] No.43662191{5}[source]
As an artist, my major complaint about Adobe is their spyware-like software design: constant calls to Adobe servers, being unable to work offline in the field with their products, and no support for Linux.

Also, I'm curious when they will start censoring exports from their software. They already do that for scans of banknotes.

I'm not worried about image generators. They'll never generate art, by definition. AI tools are the same as the camera was back then: a new tool that still requires human skill and purpose to accomplish specific tasks.

139. squigz ◴[] No.43662522[source]
I don't think all artists are treating this tool as such an existential threat.
replies(3): >>43662734 #>>43663852 #>>43683137 #
140. vladvasiliu ◴[] No.43662529{6}[source]
> But I would never willingly spend my time or money engaging with AI "art." By that, I mean I would never attend a concert, watch a film, visit a museum, read a book, or even scroll through an Instagram profile if what I'm viewing is largely the output of AI. What would the point be?

Why not? The output of AI is usually produced at the request of a human. So if the human will then alter the request such that the result suits whatever the human's goal is, why would there be no point?

This, to me, sounds like the debate over whether just pressing a button on a box to produce a photograph is actually art, compared to a painting. I wonder whether painters felt "threatened" when cameras became commonplace. AI seems just like a new, different way of producing images. Sure, it's based on prior forms of art, just like photography was heavily inspired by painting.

And just because most images are weird or soulless or whatever doesn't disqualify the whole approach. Are most photographs works of art? I don't think so. Ditto for paintings.

To your point about Instagram profiles, I actually do follow some dude who creates "AI art" and I find the images do have "soul" and I very much enjoy looking at them.

141. fxtentacle ◴[] No.43662557{4}[source]
For InDesign magazines with embedded images, for example, I'm not aware of any compatible 3rd party software
replies(1): >>43663065 #
142. bawolff ◴[] No.43662672{6}[source]
This is silly. What are you proposing? A coup to ban AI? Because that is the alternative to waiting for legislators and courts.
replies(1): >>43663851 #
143. bawolff ◴[] No.43662706{7}[source]
That is not how copyright works. "Inspired" works can still be derivative. In the US, entirely deterministic works are not considered derivative works, as they aren't considered new creative works (if anything, they are considered the same as the original). See https://en.wikipedia.org/wiki/Bridgeman_Art_Library_v._Corel...
replies(1): >>43664167 #
144. bbarnett ◴[] No.43662734{3}[source]
> I don't think all artists are treating this tool as such an existential threat.

You cannot find any group, where "all" is true in such context. There's always an element of outlier.

That said, you're not really an artist if you direct someone else to paint. Imagine a scenario where you sit back, and ask someone to paint an oil painting for you. During the event, you sit in an easy chair, watch them with easel and brush, and provide direction "I want clouds", "I want a dark background". The person does so.

You're not the artist.

All this AI blather is the same. At best, you're a fashion designer. Arranging things in a pleasant way.

replies(1): >>43663049 #
145. squigz ◴[] No.43663049{4}[source]
One could say much the same thing about photographers, or digital artists. They don't use paint, or sculpt marble, so they're not real artists.
replies(1): >>43663269 #
146. mandmandam ◴[] No.43663065{5}[source]
Here are some options which might help [0] (Bias: I love Affinity Publisher and despise Adobe).

0 - https://forum.affinity.serif.com/index.php?/topic/225143-wha...

147. Juliate ◴[] No.43663269{5}[source]
Who talked about "real" here?

Photographers do manipulate cameras, and afterwards rework the images they develop.

Digital artists do manipulate digital tools.

Their output is a large function of their informed input, experience, taste, knowledge, practice and intention, using their own specific tools in their own way.

Same with developers: the result is a function of their input (architecture, code, etc.). Garbage in, garbage out.

With AI prompters, the output is part function of the (very small) prompt, part function of the (huuuuuuuge) training set, part randomness.

If you're the director of a movie, or of a photo shoot, you're the director. Not the photographer, not the set painter, not the carpenter, not the light, etc.

If you're the producer, you're not the artist (unless you _also_ act as an artist in the production).

Do you feel the difference?

replies(3): >>43664401 #>>43667434 #>>43668211 #
148. xeonmc ◴[] No.43663302{3}[source]
Have you seen the recently posted video "For Profit (Creative) Software"?

https://www.youtube.com/watch?v=I4mdMMu-3fc

149. lancebeet ◴[] No.43663369{4}[source]
I'm not entirely convinced by the artists' argument, but this argument is also unconvincing to me. If someone steals from you, but it's a negligible amount, or you don't even notice it, does that make it not stealing? If the thief then starts selling the things they stole from you, directly competing with you, are your grievances less valid now since you didn't complain about the theft before?
replies(1): >>43663875 #
150. PaulHoule ◴[] No.43663679[source]
I went through a phase of using the A.I. tools to touch up photos and thought they were helpful. If I needed to add another row of bricks to a wall or remove something they get it done. I haven’t used it in a few months because I’m taking different photos than I was back then.
replies(1): >>43663966 #
151. Riverheart ◴[] No.43663851{7}[source]
Never proposed a ban; the issue is copyright. Use licensed inputs and I couldn’t care less.

Pro AI people need to stop behaving like it’s a foregone conclusion that anything they do is right and protected from criticism because, as was pointed out, the legality of what is being done with unlicensed inputs, which is the majority of inputs, is still up for debate.

I’m just calling attention to the double standard being applied in who is allowed to have an opinion on what the legal outcome should be prior to that verdict. TeMPOraL said people shouldn’t “pretend or assume” that lots of AI infringes on other people’s work because the law hasn’t caught up, but the same argument applies equally to them (AI proponents), and they have already made up their minds, independent of any legal authority, that using unlicensed inputs is legal.

The difference in our opinions is that if I’m wrong, no harm done, if they’re wrong, lots of harm has already been done.

I’m trying to have a nuanced conversation but this has devolved into some pro/anti AI, all or nothing thing. If you still think I want to ban AI after this wall of text I don’t know what to tell you dude. If I’ve been unclear it’s not for lack of trying.

replies(1): >>43666405 #
152. stafferxrr ◴[] No.43663852{3}[source]
Of course not. People who are actually creative will use new tools creatively.

Adobe AI tools are pretty shit though if you want to use them to do something creative. Shockingly bad really.

They are probably good if you want to add a few elements to an Instagram photo, but terrible for actual digital art.

153. stafferxrr ◴[] No.43663861{3}[source]
Thanks. I will check this out. I was shocked at how terrible the output of Photoshop's AI tools is. Not even Midjourney 4 level.
replies(1): >>43664947 #
154. lcnPylGDnU4H9OF ◴[] No.43663875{5}[source]
Nothing was stolen from the artists; their work was instead used without their permission. The thing being used is an idea, not anything the artist loses access to when someone else has it. What is there to complain about? Why should others listen to the complaints (disregarding copyright law, because that is circular reasoning)?
replies(2): >>43664184 #>>43673530 #
155. anileated ◴[] No.43663932{4}[source]
> The entire legal edifice around recognizing and protecting intellectual property is an entirely artificial construct

The presence of “natural” vs. “artificial” argument is a placeholder for nonexistent substantiation. There is never a case when it does anything else but add a disguise of objectivity to some wild opinion.

Artificial as opposed to what? Do you consider what humans do to be “unnatural” because humans are somehow not part of nature?

If some humans (in case of big tech abusing copyright, vast majority, once the realization reaches the masses) want something and other humans don’t, what exactly makes one natural and another unnatural other than your own belonging to one group or the other?

> that goes against the nature of information and knowledge

What is that nature of information and knowledge that you speak about?

> forcing information to behave like physical goods, so it's not unfair to the creators in an economy that's built around trading physical goods

Its point has been to encourage innovation, creativity, and open information sharing—exactly those things that gave us ML and LLMs. We would have none of these in the rosy land of IP communism you envision, where no idea or original work belongs to its author.

Recognition of intellectual ownership of original work (which comes in many shapes, including control over how it is distributed, the ability to monetize it, and simply being able to say you have done it) is the primary incentive for people to do truly original work. You know, the work that gave us GNU/Linux et al., true innovation that tends to come when people are not handing their work to an employer in return for a paycheck.

> IP laws were built on moral arguments, so it's only fair to change them on moral grounds too.

That is, perhaps, the exact point of people who argue that copyright law should be changed or at least clarified as new technology appears.

156. davidee ◴[] No.43663966{3}[source]
We used that particular feature quite heavily. A lot of our clients often have poorly cropped photos or something with branding that needed removal and the context-aware generative fill was quite good.

But we decided to drop Adobe after some of their recent shenanigans and moved to a set of tools that didn't have this ability and, frankly, we didn't really miss it that much. Certainly not enough to ever give Adobe another cent.

157. TeMPOraL ◴[] No.43664103{6}[source]
> Virtually everybody who ever lost a job to technology ended up better off for it.

That's plain wrong, and quite obviously so. You're demonstrating here a very common misunderstanding of the arguments people affected by (or worried about) automation taking their jobs make. In a very concise form:

- It's true that society and humanity so far always benefited from eliminating jobs through technology, in the long term.

- It's not true that society and humanity benefited in the immediate term, due to the economic and social disruption. And, most importantly:

- It's not true that people who lost jobs to technology were better off for it - those people, those specific individuals, as well as their families and local communities, were all screwed over by progress, having their lives permanently disrupted, and in many cases being thrown into poverty for generations.

(Hint: yes, there may be new jobs to replace old ones, but those jobs are there for the next generation of people, not for those who just lost theirs.)

Understanding that distinction - society vs. individual victims - will help make sense of e.g. why Luddites destroyed the new mechanized looms and weaving frames. It was not about technology, it was about capital owners pulling the rug from under them, and leaving them and their children to starve.

158. Riverheart ◴[] No.43664167{8}[source]
“In the US, entirely deterministic works are not considered derrivative works as they aren't considered new creative works (if anything they are considered the same as the original)”

Okay, so if the inputs to the model are my artwork, used to replicate my style, is the output copyrightable by you? You just said deterministic works aren’t derivative; they’re considered the same as the original. That’s not anything I’ve heard AI proponents claim, and the outputs are more original than a 1:1 photocopy, but I assume that, like the case you linked to, the answer will be no, you can’t copyright it.

replies(2): >>43666244 #>>43667703 #
159. ChrisPToast ◴[] No.43664184{6}[source]
So many problems with your reasoning.

"Nothing was stolen from the artists but instead used without their permission"

Yes and no. Sure, the artist didn't lose anything physical, but neither did music or movie producers when people downloaded and shared MP3s and videos. They still won in court based on the profits they determined the "theft" cost them, and the settlements were absurdly high. How is this different? An artist's work is essentially their resume. AI companies use their work without permission to create programs specifically intended to generate similar work in seconds; this substantially impacts an artist's ability to profit from their work. You seem to be suggesting that artists have no right to control the profits their work can generate - an argument I can't imagine you would extend to corporations.

"The thing being used is an idea"

This is profoundly absurd. AI companies aren't taking ideas directly from artists' heads... yet. They're not training their models on ideas. They're training them on the actual images artists create with skills honed over decades of work.

"not anything the artist loses access to when someone else has it"

Again, see point #1. The courts have long established that what's lost in IP theft is the potential for future profits, not something directly physical. By your reasoning here, there should be no such thing as patents. I should be able to take any person's or corporation's "ideas" and use them to produce my own products to sell. And this is a perfect analogy - why would any corporation invest millions or billions of dollars developing a product if anyone could just take the "ideas" they came up with and immediately undercut the corporation with clones or variants of their products? Exactly similarly, why would an artist invest years or decades honing the skills needed to create imagery if massive corporations can just take that work, feed it into their programs, and generate similar work in seconds for pennies?

"What is there to complain about"

The loss of income potential, which is precisely what courts have agreed with when corporations are on the receiving end of IP theft.

"Why should others listen to the complaints"

Because what's happening is objectively wrong. You are exactly the kind of person the corporatocracy wants - someone who just says "Ehhh, I wasn't personally impacted, so I don't care". And not only do you not care, you actively argue in favor of the corporations. Is it any wonder society is what it is today?

replies(2): >>43667613 #>>43668428 #
160. tpmoney ◴[] No.43664300{11}[source]
So what benefit do we derive as a society from deciding that the capability for copyright infringement is in and of itself infringement? What do we gain by overturning the protections the law (or society) currently has for technologies like xerox machines, VHS tapes, blank CDs and DVDs, media ripping tools, and site scraping tools? Open source digital media encoding, blank media, site scraping tools and bit-torrent enable copyright infringement on a massive scale, to the tune of millions of dollars or more in losses every year if you believe the media companies. And yet, I would argue as a society we would be worse off without those tools. In fact, I'd even argue that as a society we'd be worse off without some degree of tolerated copyright infringement. How many pieces of interesting media have been "saved" from the dustbin of history and preserved for future generations by people committing copyright infringement for their own purposes? Things like early seasons of Doctor Who and other TV shows that were taped over, where the only extant copies are from people's home collections taped off the TV. The "Despecialized" editions of Star Wars are probably the most high-quality and true-to-the-original cuts of the original Star Wars trilogy that exist, and they are unequivocally pure copyright infringement.

Or consider the YouTube video "Fan.tasia"[1]. That is a collection of unlicensed video clips, combined with another individual's work which is itself a collection of unlicensed audio clips, mashed together into an amalgamation of sight and sound to produce something new and, I would argue, original, but very clearly also full of copyright infringement, facilitated by a bunch of technologies that enable doing infringement at scale. It is (IMO) far more obviously copyright infringement than anything an AI model is. Yet I would argue a world in which that media and the technologies that enable it were made illegal, or heavily restricted to only the people who could afford to license all of the things that went into it from the people who created all the original works, would be a worse world for us all. The ability to easily commit copyright infringement at scale enabled the production of new and interesting art that would not have existed otherwise, and almost certainly built skills (like editing and mixing) for the people involved. That, to me, is more valuable to society than ensuring that all the artists and studios whose work went into that media got whatever fractions of a penny they lost from having their works infringed.

[1]: https://www.youtube.com/watch?v=E-6xk4W6N20&pp=ygUJZmFuLnRhc...

replies(1): >>43664710 #
161. numpad0 ◴[] No.43664352{3}[source]
I haven't seen a single AI image that was good, let alone great.

To be completely honest, I can't always tell. But when I come across images that give me inexplicable gastric discomfort I can't explain, and it's later revealed they were AI generated, that explains it all (it doesn't remove the discomfort, it just explains it).

I don't have reasons to believe that I have above-average eyes on art among HNers, but it'll be funny and painful if so. I mean, I'm no Hayao "I sense insult to life itself" Miyazaki...

replies(1): >>43666534 #
162. washadjeffmad ◴[] No.43664378[source]
What can Photoshop AI do that ipadapter / controlnets can't and haven't done for the past two years?

"Get artists to use it" is the free square :)

163. luckylion ◴[] No.43664401{6}[source]
> With AI prompters, the output is part function of the (very small) prompt, part function of the (huuuuuuuge) training set, part randomness.

With photographers, the output is part function of the (very small) orientation of the camera and pressing the button, part function of the (huuuuuuuge) technical marvel that are modern cameras, part randomness.

Let's be realistic here. Without the manufactured cameras, 99.9% of photographers wouldn't be photographers, only the 10 people who'd want it enough to build their own cameras, and they wouldn't have much appeal beyond a curiosity because their cameras would suck.

replies(1): >>43666062 #
164. SuperNinKenDo ◴[] No.43664554[source]
ACME is the one major company trying to be ethical with its orphan crushing training data and no one even seems to care!
165. Riverheart ◴[] No.43664710{12}[source]
The capability of the model to infringe isn’t the problem. Ingesting unlicensed inputs to create the model is the initial infringement before the model has even output anything and I’m saying that copyright shouldn’t be assigned to it or its outputs. If you train on licensed art and output Darth Vader that’s cool so long as you know better than to try copyrighting that. If you train on licensed art and produce something original and the law says it’s cool to copyright that or there’s just no one to challenge you, also cool.

If you want to ingest unlicensed input and produce copyright-infringing stuff for no profit, just for the love of the source material, well, that’s complicated. I’m not saying no good ever came of it, and the tolerance for infringement comes from it happening on a relatively small scale. If I take an artist’s work with a very unique style, feed it into a machine, and then mass-produce art for people based on that style, and the artist is someone who makes a living off commissions, I’m obviously doing harm to their business model. Fanfics/fanart of Nintendo characters are probably not hurting Nintendo. It’s not black or white. It’s about striking a balance, which is hard to do. I can’t just give it a pass because large corporations will weather it fine.

That Fantasia video was good. You ever see Pogo’s Disney remixes? Incredible musical creativity, but also infringing. I don’t doubt the time and effort needed to produce these works; they couldn’t just write a prompt and hit a button. I respect that. At the same time, this stuff is special partly because there aren’t a lot of things like it. If you made an AI to spit out stuff like this, it would be just another video on the internet. Stepping outside copyright, I would prefer not to see a flood of low-effort work drown out everything that feels unique, whimsical, and personal, but I can understand those who would prefer the opposite. Disney hasn’t taken it down in the last 17 years, and god I’m old. https://youtu.be/pAwR6w2TgxY?si=K8vN2epX4CyDsC96

The training of unlicensed inputs is the ultimate issue and we can just agree to disagree on how that should be handled. I think

166. blibble ◴[] No.43664719{5}[source]
> It reveals that some people simply hate the whole concept of generative AI, regardless of how it was made. You're never going to please them.

and unfortunately for adobe: these people are its customers

167. ToucanLoucan ◴[] No.43664912{6}[source]
> But it's the person that causes the creation of the infringing material that is responsible for the infringement, not the machine or device itself.

That's simply not good enough. This is not merely a machine that can be misused if a bad actor so desires; this is a machine that specializes in infringement. It's a machine that is internally biased, by the nature of how it works, towards infringement, because it is inherently "copying": it is copying the weighted averages of millions, perhaps billions, of training images, many of which depict similar things. No, it doesn't explicitly copy one Indiana Jones image or another: it copies a shit ton of Indiana Jones images, mushed together into a "new" image from a technical perspective, one that inherits all the most prominent features of those images and thus remains a copy.

And if you want to disagree with this point, it'd be most persuasive then to explain why, if this is not the case, AI images regularly end up infringing on various aspects of various popular artworks, like characters, styles, intellectual properties, when those things are not being requested by the prompt.

> If you ask a human artist for an image of "an archeologist who wears a hat and uses a whip" you're also going to get something extremely similar to Indiana Jones unless you explicitly ask for something else.

No, you aren't, because an artist is a person who doesn't want to suffer legal consequences for drawing something owned by someone else. Unless you specifically commission "Indiana Jones fanart", I, in fact, highly doubt you'll get something like him, because an artist will want to use the work to promote themselves to others, and unless you are willing to operate in the copyright gray area of fan-created works, which is inherently legally dicey, you wouldn't do that.

replies(1): >>43669615 #
168. ZeroTalent ◴[] No.43664947{4}[source]
Also check out GPT-4o image generation. It can fit up to 20 objects with correct text, and it's very good at following instructions, in my experience.
169. nearbuy ◴[] No.43665118{4}[source]
You should be. Otherwise, you're showing Adobe and other companies that ethical training is pointless, and isn't economically sensible after all.
170. therealpygon ◴[] No.43665148[source]
Ethical? You realize most of their training data was obtained through users' forced agreement to a EULA, with the intent that their art be sold on Adobe's marketplace, and it was never made explicit until much later that their art was going to be used for AI training, right?
171. AnthonyMouse ◴[] No.43665550{5}[source]
If you type "street art" as part of an image generation prompt, the results are quite similar to typing "in the style of Banksy". They're direct substitutes for each other, neither of them is actually going to produce Banksy-quality output and it's not even obvious which one will produce better results for a given prompt.

You still get images in a particular style by specifying the name of the style instead of the name of the artist. Do you really think this is no different than being able to produce only music when you want an image?

replies(1): >>43667749 #
172. AnthonyMouse ◴[] No.43665639{5}[source]
> Actually moral rights is what allow you to say no to AI.

The US doesn't really have moral rights and it's not clear they're even constitutional in the US, since the copyright clause explicitly requires "promote the progress" and "limited times" and many aspects of "moral rights" would be violations of the First Amendment. Whether they exist in some other country doesn't really help you when it's US companies doing it in the US.

> Further making a variant of a famous art piece under copyright might very well be a derivative.

Well of course it is. That's what derivative works are. You can also produce derivative works with Photoshop or MS Paint, but that doesn't mean the purpose of MS Paint is to produce derivative works or that it's Microsoft rather than the user purposely creating a derivative work who should be responsible for that.

replies(1): >>43667546 #
173. gs17 ◴[] No.43665645[source]
I've never heard anyone (at least not anyone who wasn't already using GIMP) complain about the concept of paying for it; it's always been the way Adobe tries to squeeze extra money out of you. First it was bundles where you'd have to buy software you didn't need to get what you did. Then it was a subscription. Also, each CS version seemed to add very little for the price.
174. AnthonyMouse ◴[] No.43665807{5}[source]
The vast majority of AI image prompts don't reference an artist name, and the ones that do are typically using it as a proxy for a given style and would generally get similar results by specifying the name of the style instead of the name of the artist.

The ones using the name of the artist/studio (e.g. Ghiblification) also seem more common than they are because they're the ones that garner negative attention. Then the media attention a) causes people to perceive it as being more common than it is and b) causes people to do it more for a short period of time, making it temporarily more common even though the long-term economic relevance is still negligible.

replies(1): >>43667389 #
175. dragonwriter ◴[] No.43665850{5}[source]
> Artists don't hate Adobe just because they're making an AI art generator, they hate Adobe because it's a predatory, scummy corporation that is difficult to work with and is the gatekeeper for common industry tools.

From what I've seen from artists, they hate Adobe for both reasons, and the AI thing is often more of a dogmatic, uncompromising hate (and is not based on any of the various rationalizations used to persuade others to act in accord with it) and less of the kind of hate that is nevertheless willing to accept products for utility.

176. Juliate ◴[] No.43666062{7}[source]
Ludicrous rebuttal.

Reducing this to "orientation of the camera" is such a dismissive take on the eye and focus of the person who decides to take a picture, and where/when he/she takes it; it really reveals that you do not practice it.

And... before cameras were even electronic, back in the early 2000s, there were already thousands upon thousands of extremely gifted photographers.

Yes, cameras are marvellous tools. But they are _static_. They don't dynamically, randomly change the input.

Generative AI are not _static_. They require training sets to be anywhere near useful.

Cameras _do not_ feed on all the previous photographies taken by others.

replies(2): >>43666526 #>>43667192 #
177. bawolff ◴[] No.43666244{9}[source]
That depends on how much "creativity" is in the prompt, but generally I would lean towards no: the AI-created work is not copyrightable by the person who used the model to "create" it.

I believe that is the conclusion the US Copyright Office came to as well: https://www.copyright.gov/ai/ (I didn't actually read their report, but I think that's what it says).

178. bawolff ◴[] No.43666405{8}[source]
But this is hardly limited to AI.

Copyright is full of grey areas and disagreement over its rules happen all the time. AI is not particularly special in that regard, except perhaps in scale.

Generally the way stuff moves forward is somebody tries something, gets sued and either they win or lose and we move forward from that point.

Ultimately "harm" and "legality" are very different things. Something could be legal and harmful - many things are. In this debate i think different groups are harmed depending on which side that "wins".

If you want to have a nuanced debate, the relevant issue is not whether the input works are licensed - they obviously are not - but the following principles:

- de minimis - is the amount of each individual copyrighted work too small to matter?

- is the AI just extracting "factual" information from the works, separate from their presentation? After all, each individual work only adjusts the model by a couple bytes. Is it less like copying the work, or more like writing a book about the artwork that someone could later use to make a similar work (which would not be copyright infringement if a human did it)?

- fair use - complicated, but generally the more "transformative" a work is, the more fair use it would be, and AI is extremely transformative. On the other hand it potentially competes commercially with the original work, which usually means less likely to be fair use (and maybe you could have a mixed outcome here, where the AI generators are fine, but using them to sell competing artwork is not, but other uses are ok).

[IANAL]
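The "couple bytes per work" point can be sanity-checked with a rough back-of-envelope calculation. The model and dataset sizes below are illustrative assumptions (a Stable Diffusion-class model and a LAION-5B-scale dataset), not measurements of any particular system:

```python
# Back-of-envelope: how much model capacity exists per training image?
# Assumed, illustrative figures: ~3.5 billion parameters stored as
# fp16 (2 bytes each), trained on ~5 billion images.
params = 3.5e9
bytes_per_param = 2  # fp16 weights

model_bytes = params * bytes_per_param  # total weight storage
training_images = 5e9

bytes_per_image = model_bytes / training_images

print(f"model size: {model_bytes / 1e9:.1f} GB")                    # 7.0 GB
print(f"capacity per training image: {bytes_per_image:.2f} bytes")  # 1.40 bytes
```

On these assumptions the model has only a byte or two of capacity per training image, which is the quantitative core of the de minimis argument; whether courts find that framing persuasive is a separate question.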

179. luckylion ◴[] No.43666526{8}[source]
> Reducing this to "orientation of the camera" is such a dismissive take

What's more important: the person behind the camera or the camera? Show me the photos taken without the camera and then look at all the great photos taken by amateurs.

> They require training sets to be anywhere near useful.

And the camera needs assembly and R&D. But when either arrives at your door, it's "ready to go".

> Cameras _do not_ feed on all the previous photographies taken by others.

Cameras do feed on all the research of previous cameras though. The photos don't matter to the camera. The camera manufacturers are geniuses, the photographers are users.

It's really not far off from AI, especially when the cameras do so much, and then there's the software-tools afterwards etc etc.

Yeah, yeah, everybody wants to feel special and artsy and all that and looks down on the new people who aren't even real artists. But most people really shouldn't.

replies(1): >>43667087 #
180. gs17 ◴[] No.43666527{3}[source]
> They don't hate good AI images, and when they see great AI images, they don't even realize they are made by AI.

There's a decent size group of people who have a knee-jerk negative response toward AI regardless of quality. They'd see that image, like it, and then when told it's AI, turn on it and decide it was obviously flawed from the beginning. Is there a version of "sour grapes" where the fox did eat the grapes, they were delicious, but he declared they were sour after the fact to claim moral superiority?

replies(2): >>43669256 #>>43669668 #
181. gs17 ◴[] No.43666534{4}[source]
> I'm no Hayao "I sense insult to life itself" Miyazaki

He was saying that in response to a computer-animated zombie that dragged itself along in a grotesque manner. It wasn't that it was animated by a computer, it was that he found it offensive in that it felt like it was making light of the struggles of people with disabilities. You definitely would also find it disgusting.

replies(1): >>43684587 #
182. jwitthuhn ◴[] No.43666681{4}[source]
It sort of can, but all non-Adobe software I know of, even commercial software like Affinity Photo, has spotty support for some PSD features.

Basically any given PSD will certainly load correctly in Photoshop, but you're rolling the dice if you want to load it into anything else. More so if you are using more modern features.

183. _bin_ ◴[] No.43666817{4}[source]
I thought most people supported moving away from that and towards a more socially liberal model. If we're no longer doing that I have a whole stack of socially conservative policies I guess I'll go back to pushing.

I don't think y'all really want to go down this road; it leads straight back to the nineties republicans holding senate hearings on what's acceptable content for a music album.

replies(1): >>43667753 #
184. _bin_ ◴[] No.43666848{4}[source]
Disagree. Authority is given Congress to establish an IP regime for the purpose of "promot[ing] the progress of science and useful arts". You would have to justify how banning gen AI is a. feasible at all, particularly with open-weight models; and b. how it "promotes the progress of useful arts." You would lose in court because it's very difficult to argue that keeping art as a skilled craftsman's trade is worse for its progress than lowering the barriers to individuals expressing what they see.

I think bad AI makes bad output and so a few people are worried it will replace good human art with bad AI art. Realistically, the stuff it's replacing now is bad human art: stock photos and clipart stuff that weren't really creative expression to start with. As it improves, we'll be increasingly able to go do a targeted inpaint to create images that more closely match our creative vision. There's a path here that lowers the barriers for someone getting his ideas into a visual form and that's an unambiguous good, unless you're one of the "craftsmen" who invested time to learn the old way.

It's almost exactly the same as AI development. As an experienced dev who knows the ins and outs really well I look at AI code and say, "wow, that's garbage." But people are using it to make unimportant webshit frontends, not do "serious work". Once it can do "serious work" that will decrease the number of jobs in the field but be good for software development as a whole.

185. Juliate ◴[] No.43667087{9}[source]
You’re confusing the tools (which are their own marvels) and the practice (which is art, using the tools).

However good the camera is or isn’t, it’s not the camera that dictates the inner qualities of a photograph; there is _something else_ that evades the technicalities of the tools and comes from the context and the choices of the photographer (and from accident, too, because it’s the nature of photography: capturing an accident of light).

The same camera in the hands of two people will give two totally different sets of pictures, if only because their sight, their way of looking at the world, is different; and because one knows how to use the tools, and the other doesn’t, or not in the same way.

It’s not a matter of « feeling artsy » or special, it’s a matter of « doing art ».

Everyone is an artist, if they want to: it’s a matter of practicing and intent, not a matter of outputting.

Art is in the process (of making, and of receiving), not in the output (which is the artefact of art and which has its own set of controversial and confusing economics and markets).

Generative AI, unlike tools that stay in their specific place, steals the insight of previous artists (from the training set) and strips the prompter of their own insight, personality and imprint (because the prompter is not engaged in the making, but only through a limited text prompt at an interface).

Generative AI enthusiasts may be so. They have every right to be. But not by ignoring and denying the fundamental theft that ingesting training sets without approval is, and the fundamental difference there is between _doing art_ and asking a computer to produce art.

Ignoring those two is a red flag of people having no idea what art, and practice is.

replies(2): >>43670342 #>>43674966 #
186. squigz ◴[] No.43667192{8}[source]
> Reducing this to "orientation of the camera" is such a dismissive take on the eye and focus of the person that decides to take a picture, where/when he/she is; this is really revealing you do not practice it.

Oh, the irony...

187. fc417fc802 ◴[] No.43667389{6}[source]
The latter example (Ghibli) is also somewhat misleading. Other studios sometimes use very similar styles. They might not have the same budget for fine detail throughout the entire length of the animation, and they probably don't do every production with that single art style, but when comparing still frames (which is what these tools generate after all) the style isn't really unique to a single studio.
188. fc417fc802 ◴[] No.43667434{6}[source]
So AI tools take you from "artist" to "art director". That's an interesting thought. I think I agree.
189. weregiraffe ◴[] No.43667492{3}[source]
>Right, but "distaste" isn't grounds for trying to ban something.

https://en.wikipedia.org/wiki/United_States_obscenity_law

190. fc417fc802 ◴[] No.43667546{6}[source]
Well one could argue that this ought to be a discussion of morality and social acceptability rather than legality. After all the former can eventually lead to the latter. However if you make that argument you immediately run into the issue that there clearly isn't broad consensus on this topic.

Personally I'm inclined to liken ML tools to backhoes. I don't want the law to force ditches to be dug by hand. I'm not a fan of busywork.

191. mort96 ◴[] No.43667578[source]
To people who care about ethics wrt. "AI", there is no such thing as ethical "AI".

To people who are on board with the "AI" hype train, there is no ethical problem to be solved wrt. "AI".

Neither side cares.

192. fc417fc802 ◴[] No.43667613{7}[source]
It's piracy, not theft. Those aren't the same thing but they are both against the law and the court will assess damages for both.

The person you replied to derailed the conversation by misconstruing an analogy.

> what's happening is objectively wrong.

Doesn't seem like a defensible claim to me. Clearly plenty of people don't feel that way, myself included.

Aside, you appear to be banned. Just in case you aren't aware.

replies(1): >>43668523 #
193. fc417fc802 ◴[] No.43667684{4}[source]
> I'm not the one being rugpulled out of a career path by GenAI,

That's quite a bold assumption. Betting that logic and reasoning ability plateaus prior to "full stack developer" seems like a very risky gamble.

replies(1): >>43671087 #
194. Workaccount2 ◴[] No.43667703{9}[source]
Are anime artists all in copyright violation of each other?
replies(1): >>43669222 #
195. card_zero ◴[] No.43667749{6}[source]
This hinges on denying that artists have distinctive personal styles. Instead your theory seems to be that styles are genres, and that the AI only needs to be trained on the genre, not the specific artist's output, in order to produce that artist's style. Which under this theory is equivalent to the generic style.

My counter-argument is "no". Ideally I'd elaborate on that. So ummm ... no, that's not the way things are. Is it?

replies(1): >>43669069 #
196. fc417fc802 ◴[] No.43667753{5}[source]
Many laws come down to distaste at the root. There's usually an alternative angle about market efficiency or social stability or whatever if you want to frame it that way. The same applies in this case as well.

For but a few examples consider laws regarding gambling, many aspects of zoning, or deceptive marketing.

What's the purpose of the law if not providing stability? Why should social issues be exempted from that?

197. fc417fc802 ◴[] No.43667788{4}[source]
Or alternatively, who else could afford the licensing costs?
198. UtopiaPunk ◴[] No.43667869{7}[source]
I don't want to come across as judgey, gate-keeping what is in good taste or what should make you "feel bad." The human element is just a very crucial part, at least for me. Art is a way of human beings connecting with each other. That can be high-brow stuff, but that can also be, like, pulpy action movies or cheesy romance novels. Someone might be expressing deep beautiful ideas that change my life forever, or they might think that it was totally sick to have a car jump over a chasm and through a big loud explosion. In both cases, I'm engaging with another human being, at least at some level, and that means something.

But if I see something that I think is cool and interesting, and then I discover that it was mostly the result of a few AI prompts, then I just don't care about it anymore. I don't "feel bad" that I thought it interesting, rather, I just completely lose interest.

I do fear that it will be increasingly difficult to tell what is generated by AI and what is created by humans. Just examining myself, I think that would mean I would retreat from mainstream pop-culture stuff, and it would be with sadness. It's a bleak future to imagine. It seems reminiscent of the "versificator" in George Orwell's 1984.

199. UtopiaPunk ◴[] No.43667972{7}[source]
I agree with the sentiment that "most people want entertainment, not art," or at least that they do a lot of the time. I have a pretty wide definition of what is art, in that almost anything created by a human could be appreciated as art (whether that's a novel, a building, the swinging of a baseball bat, or even a boring sidewalk).

But a lot of people, a lot of the time, engage with movies and books and the like as merely "entertainment." There's art there, but art is a two-way interaction between the creator(s) and the audience, even in the pulpiest, most corporate creations. I'm not engaging with cat food commercials as art, but one genuinely could.

I agree that AI can generate stuff that is entertaining.

"The idea that only humans can make that kind of work has already been disproven." That I disagree with, and it ultimately is a matter of "what is art." I won't pretend to offer a full, complete definition of what is art, but at least one aspect of defining what is and is not art is, in my opinion, whether it was created by a human or not. There is at least some legal precedent that in order for a copyright to be granted, the work has to be created by a human being: https://en.wikipedia.org/wiki/Monkey_selfie_copyright_disput...

"I suspect future generations of AI will be better at creating compelling original art because the AI will have a more complete model of our emotional triggers - including novelty and surprise triggers - than we do ourselves."

Again, by my definition at least, AI cannot create "original art." But I'll concede that it is conceivable that AI will generate entertainment that is more popular and arousing than the entertainment of today. That is a rather bleak future to imagine, though, isn't it? It seems reminiscent of the "versificator" of 1984.

200. wizzwizz4 ◴[] No.43668188{6}[source]
You know "puréed orphan extract" is just salt, right? You can extract it from seawater in an expensive process that, nonetheless, is way cheaper than crushing orphans (not to mention the ethical implications). Sure, you have to live near the ocean, but plenty of people do, and we already have distribution networks to transport the resulting salt to your local market. Just one fist-sized container is the equivalent of, like, three or four dozen orphans; and you can get that without needing a fancy press or an expensive meat-sink.
201. jhbadger ◴[] No.43668211{6}[source]
Historically, it took a long time for traditional artists (painters and sculptors) to see photographers as fellow artists rather than mere technicians using technology to replace art. The same thing was true of early digital artists who dared to make images without paint or pencils.
replies(1): >>43670307 #
202. timewizard ◴[] No.43668300[source]
> or to make a moral argument that some kind of misappropriation is occurring.

They can also make a legal argument that the training set will fully reproduce copyrighted work, which is an actual crime as well as being completely immoral.

> because then it exists and they have to compete with it

The entire point of copyright law is: "To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."

Individual artists should not have to "compete" against a billion dollar corporation which freely engages in copyright violations that these same artists have to abide by.

203. lcnPylGDnU4H9OF ◴[] No.43668428{7}[source]
I dunno, man. Re-read your comment but change one assumption:

> They still won in court based on the profits they determined the "theft" cost them, and the settlements were absurdly high.

Such court determinations are wrong. At least hopefully you can see how perhaps there is not so much wrong with the reasoning, even if you ultimately disagree.

> They're training them on the actual images artists create with skills honed over decades of work.

This is very similar to a human studying different artists and practicing; it’s pretty inarguable that art generated by such humans is not the product of copyright infringement, unless the image copies an artist’s style. Studio Ghibli-style AI images come to mind, to be fair, which should be a liability to whoever is running the AI because they’re distributing the image after producing it.

If one doesn’t think that it’s wrong for, e.g., Meta to torrent everything they can, as I do not, then it is not inconsistent to think their ML training and LLM deployment is simply something that happened and changed market conditions.

replies(1): >>43673537 #
204. lcnPylGDnU4H9OF ◴[] No.43668523{8}[source]
> The person you replied to derailed the conversation by misconstruing an analogy.

Curious why you say this. They seem to have made the copyright infringement analogous to theft and I addressed that directly in the comment.

replies(1): >>43668808 #
205. ToucanLoucan ◴[] No.43668604{10}[source]
Firstly, it’s not an appeal-to-nature fallacy to accurately describe how a product of nature works; secondly, it’s the peak of lazy online discussion to name a fallacy and leave as though naming it means something. Fallacies can be pinned on tons of good arguments, so along with naming the fallacy, you need to explain why the point being made is actually fallacious.

It’s a philosophical concept, not a trap card.

206. TeMPOraL ◴[] No.43668694{7}[source]
> Wouldn’t a better alternative be to work on improving social safety nets for everybody, as opposed to providing a bespoke one for a single industry?

Yes, but:

1) It's not really an exclusive choice; different people can pursue different angles, including all of them - one can both seek immediate support/compensation for the specific case they're the victim of and seek longer-term solution for everyone who'd face the same problem in the future.

2) A bespoke solution is much more likely to be achievable than a general one.

3) I don't believe it would be good for society for artists to succeed in curtailing generative AI! But, should they succeed, I imagine the consequences will encourage people to seek the more general solution that mitigates occupational damage of GenAI while preserving its availability, instead of having to deal with a series of bespoke stopgaps that also kills GenAI entirely.

4) Not that banning GenAI has any chance of succeeding - the most we'd get is it being unavailable in some countries, who'd then be at a disadvantage in competition with countries that embraced it.

Again, I'm not in favor of banning GenAI - on the contrary, I'm in favor of giving a blanket exception from copyright laws for purposes of training generative models. However, I recognize the plight of artists and other people who are feeling the negative economic impact on their jobs right now (and hell, my own line of work - software development - is still one of the most at risk in the near to mid-term, too); I wish for a solution that will help them (and others about to be in this situation), but in the meantime, I don't begrudge them for trying to fight it - I think they have full right to. I only have problems with people who oppose AI because they feel that Big AI is depriving them of opportunity to seek rent from society for the value AI models are creating.

207. fc417fc802 ◴[] No.43668808{9}[source]
It was an analogy, ie a comparison of the differences between pairs. The relevant bit then is the damages suffered by the party stolen from. If you fail to pursue when the damages are small or nonexistent (image classifiers, employee stealing a single apple, individual reproduction for personal use) why should that undermine a case you bring when the damages become noticeable (generative models, employee stealing 500 lbs of apples, bulk reproduction for commercial sale)?
replies(1): >>43669325 #
208. mesh ◴[] No.43669008[source]
This is something that the teams within Adobe take really seriously. More info here:

https://adobe.design/stories/leading-design/reducing-biased-...

(I work for Adobe)

209. int_19h ◴[] No.43669069{7}[source]
The argument here isn't so much that individual artists don't have their specific styles, but rather whether AI actually tracks that, or whether using "in the style of ..." is effectively a substitute for identifying the more general style category to which this artist belongs.
210. int_19h ◴[] No.43669091{5}[source]
It depends on who was harmed. When countries were banning slavery (or serfdom in places where it was functionally equivalent, like Russia), slave owners made this very argument that depriving them of legitimately acquired workpower was an undeserved and unfair calamity for them, and were generally compensated.
211. int_19h ◴[] No.43669105{3}[source]
What exactly is unethical about generative AI, per se?
212. Riverheart ◴[] No.43669222{10}[source]
It’s not the style itself but the use of the art to train the model that outputs the style. Anime as a style is not copyrightable. The work anime artists create is copyrightable. Specifically, if you take their copyrighted work and feed it into a machine to extract the artistic expressions that characterize anime to make new art, is your usage of their art in that process fair use?

Fair Use 4th Factor: This factor considers whether the use could harm the copyright holder’s market for the original work.

If the use is research it’s fine. If the use is providing a public non-commercial model then it is somewhat harmful as their work is devalued. If the goal is to compete with them it is very harmful. Therefore, since we’re talking about the last two use cases, I argue fair use does not apply. Others maintain it does as maybe you do.

If it’s not fair use then it would be infringing on that particular copyright holder.

As you know, anime art is a spectrum with “How to Draw Manga for Kids” at the bottom and studio quality at the top. People pick and choose the art to train on not just because of the style but also the quality and consistency of their work. That’s why you might choose a specific artist to base a model on even though their style is just “anime”.

213. jcotton42 ◴[] No.43669256{4}[source]
The issue with AI isn't quality, or at least isn't just quality. It's ethical (use of works for training without credit or compensation, potential to displace a large portion of the artistic market, etc.)
replies(1): >>43684645 #
214. furyofantares ◴[] No.43669325{10}[source]
This is precisely where the analogy breaks down. The victim suffers damages in any theft, independent of any value the perpetrator gains. Damages due to copyright infringement don't work this way. Copyright exists to motivate the creation of valuable works; damages for copyright are an invented thing meant to support this.
replies(1): >>43669730 #
215. tpmoney ◴[] No.43669615{7}[source]
> This is not merely a machine that can be misused if desired by a bad actor, this is a machine that specializes in infringement.

So is a Xerox machine. Its whole purpose is to make copies of whatever you put into it with no regard to whether you have a license to make that copy. Likewise with the record capability on your VCR. Sure you could hook it up to a camcorder and transfer your home movie from a Super-8 to a VHS with your VCR (or like one I used to own, it might even have a camera accessory and port that you could hook a camera up to directly) and yet, I would wager most recordings on most VCRs were to commit copyright infringement. BitTorrent specializes in facilitating copyright infringement, no matter how many Linux ISOs you download with it. CD ripping software and DeCSS is explicitly about copyright infringement. And let's be real, while MAME is a phenomenal piece of software that has done an amazing job of documenting legacy hardware and its quirks, the entire emulation scene as a whole is built on copyright infringement, and I would wager to a rounding error none of the folks that write MAME emulators have a license to copy the ROMs that they use to do that.

But in all of these cases, the fact that it can (and even usually is) used for copyright infringement is not in and of itself a reason to restrict or ban the technology.

> And if you want to disagree with this point, it'd be most persuasive then to explain why, if this is not the case, AI images regularly end up infringing on various aspects of various popular artworks, like characters, styles, intellectual properties, when those things are not being requested by the prompt.

Well for starters, I'd like to clarify two axioms:

1) "characters" as a subset of "intellectual properties"

2) "style" is not something you can copyright or infringe under US law. It can be part of a trademark or a design patent, and certainly you can commit fraud if you represent something in someone else's style as being a genuine item from that person, but style itself is not protected and I don't think it should be.

So then to answer the question, I would argue that AI images don't "regularly end up infringing on ... intellectual properties, when those things are not being requested by the prompt". I've generated quite a few AI images myself in exploring the various products out there and not a one of them has generated an infringing work, because none of my prompts have asked it to generate an infringing work. It is certainly possible that a given model with a sufficiently limited training set for a given set of words might be likely to generate an infringing image on a prompt, and that's because with a limited set of options to draw from, the prompt is inherently asking for an infringing image no matter how much you try to file the serial numbers off. That is, if I ask for an image of "two Italian plumbers who are brothers and battle turtles", everyone knows what that prompt is asking for. There aren't a lot of reference options for that particular set of requirements and so it is more likely to generate an infringing image. It's also partly a function of the current goals of the models. As it stands, for the most part we want a model that takes a vague description and gives us something that matches our imagined output. Give that description to most people and they're going to envision the Mario Brothers, so a "good" image generation model is one that will generate a "Mario Brothers"-inspired (or infringing) image.

As the technology improves and we get better about producing models that can take new paths without also generating body horror results, and as the users start wanting models that are more creative, we'll begin to see models that can respond to even that limited training set and generate something more unique and less likely to be infringing.

> No, you aren't, because an artist is a person that doesn't want to suffer legal consequences for drawing something owned by someone else.

Sorry, I think you're wrong. If you commission it for money from someone with enough potential visibility, you might encounter people who go out of their way to avoid anything that could be construed as Indiana Jones, but I bet even then you'd get more "Indiana Jones with the serial numbers filed off" images than not.

But if you just asked random artists to draw that prompt, you're going to get an artist's rendition of Indiana Jones. It's clear that's what you want from the prompt and that's the single and sole cultural creative reference for that prompt. Though I suppose you and I are going to have to agree to disagree on what people will do, unless you feel like actually asking a bunch of artists on Fiverr to draw the prompt for you.

And realistically what do you expect them to draw when you make that request? When that article showed up with the headline, EVERYONE reading the headline knew the article was talking about an AI generating Indiana Jones. Why did everyone know that? Because of the limited reference for that prompt that exists. "Archeologist that wears a hat and uses a whip" describes very uniquely a single character to almost every single person.

There's a reason no one is writing articles about AIs ripping off Studio Ghibli by showing the output from the prompt "raccoon with giant testicles." No one writes articles talking about how the AI spontaneously generated Garfield knockoffs when prompted to draw an "orange striped cat". There are no articles about AIs churning out truckloads of Superman images when someone asks for "super hero". And those articles don't exist because there are enough variations on those themes out there, enough different combinations of those words to describe enough different combinations of images and things, that those words don't instantly conjure the same image and character for everyone. And so it goes for the AI too. Those prompts don't ask specifically for infringing art so they don't generally generate infringing art.

216. numpad0 ◴[] No.43669668{4}[source]

> and then when told it's AI, turn on it and decide it was obviously flawed from the beginning.

Have you seen any experimental results from research in which participants were _falsely_ told something was AI-made, to prove and gauge that "moral superiority" effect? I'm not aware of any. There have to be many, because such a study would be easy to run. No?
replies(1): >>43673942 #
217. fc417fc802 ◴[] No.43669730{11}[source]
That would only be a relevant distinction if the discussion were specifically about realized damages. It is not.

The discussion is about whether or not ignoring something that is of little consequence to you diminishes a later case you might bring when something substantially similar causes you noticeable problems. The question at hand had nothing to do with damages due to piracy (direct, perceived, hypothetical, legal fiction, or otherwise).

It's confusing because the basis for the legal claim is damages due to piracy and the size of that claim probably hasn't shifted all that much. But the motivating interest is not the damages. It is the impact of the thing on their employment. That impact was not present before so no one was inclined to pursue a protracted uphill battle.

replies(1): >>43670680 #
218. Juliate ◴[] No.43670307{7}[source]
Not the same thing again.

That comparison would be fair if the generative AI you use is trained exclusively on your own (rightfully acquired) data and work.

Existing generative AIs are feeding on the work of millions of people who did not consent.

That’s a violation of their work and of their rights.

And that should also alert those that expect to use/benefit of their own production out of these generators: why would it be 1/ protectable, 2/ protected at all.

It is no coincidence that these generator makers’ philosophy aligns with an autocratic political project, and with some inhuman « masculinity » promoters. It’s all about power and nothing about playing by the rules of a society.

replies(2): >>43672068 #>>43672461 #
219. Juliate ◴[] No.43670342{10}[source]
There is a third and fourth red flag, is it conscious or not I don’t know.

I am not even speaking of « do the users feel what it is ». Here it is:

If some people are such enthusiastic and ruthless defenders of AI generators that were trained/fed on the work of millions of unconsenting artists…

1/ what do they expect will happen to their own generated production?

2/ what do they expect will happen to their own consent, in that particular matter, or in others matters (as this will have been an additional precedent, a de facto)?

Again, as said elsewhere, there is a power play behind this, one very related to the broligarchy pushing for some kind of twisted, « red pilled » (lol) masculinity, and that is related to rape as a culture, not only in sexual matters but in all of them.

replies(1): >>43678979 #
220. dbdr ◴[] No.43670381[source]
> From the perspective of an artist, a corporation training an AI image generator in a way that isn't susceptible to moral or legal assault is worse

That's ignoring the fact that an AI image generator trained without infringing on existing works would have way worse quality, because of the reduced amount and quality of the training set.

221. furyofantares ◴[] No.43670680{12}[source]
Oh, I agree with all that, I had sort of ignored the middle post in this chain.
222. TeMPOraL ◴[] No.43671087{5}[source]
I meant right now. I acknowledge elsewhere that software development is still near the top of the list, but it isn't affecting us just yet in the way it affects artists today.
223. squigz ◴[] No.43672068{8}[source]
> That comparison would be fair if the generative AI you use is trained exclusively on your own (rightfully acquired) data and work.

> Existing generative AIs are feeding on the work of millions of people who did not consent.

There are LLMs that are trained on non-copyright work, but apparently that's irrelevant according to the comment I replied to.

224. jhbadger ◴[] No.43672461{8}[source]
As people have mentioned, people are still against legally-sourced generative AI systems like Adobe's, so concern over IP rights isn't the only, or I suspect, major, objection to generative AI that people have.
replies(1): >>43673067 #
225. Juliate ◴[] No.43673067{9}[source]
It's not the only objection, but it's one of the major and blocking ones, because how do you _prove_ that you do not have unconsented copyrighted contents in your training set?

The other objections, in the economic range (replacing/displacing artists' work for financial gain, from the producers' point of view), are totally valid too, but don't rely on the same argument.

And my point above is not really an objection, it's a reminder: of what AI generators are, and what they are not (and what AI generator promoters pretend they are, without any evidence or real argument).

Of what their output is (a rough, industrial, barely specified and mastered product), and what it is not (art).

replies(1): >>43674248 #
226. Juliate ◴[] No.43673530{6}[source]
> Nothing was stolen from the artists but instead used without their permission.

Which is equally illegal.

> disregarding copyright law because that is circular reasoning

This is not circular; copyright is non-negotiable.

227. Juliate ◴[] No.43673537{8}[source]
> This is very similar to a human...

A machine, software, hardware, whatever, as much as a corporation, _is not a human person_.

228. gs17 ◴[] No.43673942{5}[source]
https://www.nature.com/articles/s41598-023-45202-3 is pretty similar to that, they randomized AI/human-made labels and participants considered the exact same piece less valuable and less creative when labeled as AI-made. It's not measuring "moral superiority", but it shows a "negative response toward AI regardless of quality". It's definitely an irrational response.
replies(1): >>43708698 #
229. squigz ◴[] No.43674248{10}[source]
> how do you _prove_ that you do not have unconsented copyrighted contents in your training set?

And this is why I've stopped arguing with people from this crowd. Beyond the classic gatekeeping of what art is, I'm sick of the constant moving of the goalposts. Even if a company provides proof, I'm sure you'd find another issue with them

Underlying all of it is a fundamental misunderstanding of how AI tools are used for art, and a subtle implication that it's really the amount of effort that defines what "art" really is.

replies(1): >>43675356 #
230. nektro ◴[] No.43674357[source]
because customers don't want generative AI in their products, ethical or not
231. arthurtully ◴[] No.43674455[source]
Step 1. Make a stock photo library for everyone to upload to.

Step 2. Use that stock photo library to train your AI without letting users opt out. You couldn't remove photos without accepting the licence.

Step 3. Allow users to put AI-generated art on said stock library, further ignoring artists by regurgitating art from other models.

Step 4. Force new licences on users that treat any file as potential training data.

Step 5. Act shocked when everyone is mad.
232. luckylion ◴[] No.43674966{10}[source]
> Generative AI on the contrary of tools that stay in their specific place, steals the insight from previous artists (from the training set) and strips the prompter from their own insights and personality and imprint (because it is not employed, but only through a limited text prompt at an interface).

Because every piece of generative AI looks identical, right? I mean, if the prompt had an impact, and two people using some ML-model would create different results based on what they choose to input, it sounds suspiciously like your "the same camera in two different hands", doesn't it?

> the fundamental difference there is between _doing art_ and asking a computer to produce art.

You mean doing art by asking a computer to produce a dump of sensor data by pressing a button?

You appear to be completely blind to the similarities and just retreat to "I draw the lines around art, and this is inside, and that's outside of it" without being able to explain how the AI tool is fundamentally different from the camera tool, yet one supposedly negates all possibility of creating art while the other totally is art, because that's what people say!

Needless to say, the people making those distinctions can't even tell a photo apart from an AI-generated picture.

replies(1): >>43680620 #
233. Juliate ◴[] No.43675356{11}[source]
You’re sure? How?

And what crowd? I am stating my viewpoint, from an education in humanities AND tech, and from 25 years of career in software tech, and 30 years of musician and painter practice.

Sorry, but who is moving the goalposts here? Who is coming along with their tech saying « hi, we don't care how your laws make sense, and we don't care that we don't know what art is, because we never studied it, nor do we have any artistic practice; we just want to have what you guys do by pressing a button. Oh, and all of your stuff is free for us to forage through; we don't care what you say about your own work. »

Typical entitled behavior. Don’t act surprised that this is met with counter arguments and reality.

replies(1): >>43675391 #
234. squigz ◴[] No.43675391{12}[source]
Typical gatekeeping behavior. Don't act surprised when the world and artistic expression moves on without you.
replies(2): >>43678700 #>>43678744 #
235. Juliate ◴[] No.43678700{13}[source]
Laughable.

What would be gatekeeping is if someone prevented you from picking up a pencil, paper, a guitar, a brush, to make something of your own.

You’re the only one gatekeeping yourself here.

Looks like it’s the same pattern as with blockchains, NFTs, Web3 stuff and the move-fast/break-things mantra: you cannot argue for or demonstrate what your « solutions » actually solve, so you need brute force to break things and impose them.

236. Juliate ◴[] No.43678744{13}[source]
Artistic expression does not « move on » without me, or people.

Artistic expression is people in motion, alone or in groups.

You’re talking about the economics of performances and artefacts, which are _something other_ than artistic expression.

EDIT to clarify/reinforce:

Elvis without Elvis isn’t Elvis. Discs, movies, books are captures of Elvis. Not the same thing.

Miyazaki without Miyazaki isn’t Miyazaki. It may look like it, but it is not it.

Artistic expression is someone’s expression, practice (yours, mine, theirs). It’s the definition of the originality of it (who it comes from, who it is actually made by).

A machine, a software may produce (raw) materials for artistic expression, whatever it is, but it is not artistic expression by itself.

Bowie using the Verbasizer is using a tool for artistic expression. The Verbasizer output isn’t art by itself. Bowie made Bowie stuff.

237. squigz ◴[] No.43678979{11}[source]
Can you talk more about "rape as a culture"?
replies(1): >>43680394 #
238. Juliate ◴[] No.43680394{12}[source]
Rape is fundamentally about power, control and the violation of consent.

The casual dismissal of artists' fundamental rights to control their works and how they are used is part of a larger cultural problem, where might would rule over law, power over justice, lies over truth.

That may seem a charged argument, and it is, because it hits right and it is particularly uncomfortable to acknowledge.

The same tech leaders that push for this move over IP law are the tech leaders that fund(ed) the current dismantling of US democracy and that have chosen their political team because it aligns precisely (up to the man that got the presidential seat, the man that has (had?) quite problematic issues towards women) with their values.

This is too obvious to be an accident.

And this is also a stern warning. Because the ideology behind power does not stop at anything. It goes on until it eats itself.

replies(1): >>43682626 #
239. Juliate ◴[] No.43680620{11}[source]
> Because every piece of generative AI looks identical, right? I mean, if the prompt had an impact, and two people using some ML-model would create different results based on what they choose to input, it sounds suspiciously like your "the same camera in two different hands", doesn't it?

I feel there's something interesting to discuss here but I'm still not convinced: a camera captures light from physical reality. AI generators "capture" something from a model trained on existing artworks by other people (most likely not consenting). There's a superficial similarity in the push of the button, but that's it. The two do not operate in the same way, or on the same domain.

> You appear to be completely blind to the similarities [...] without being able to explain how the AI-tool is fundamentally different from the camera-tool, but obviously one negates all possibility to create art, while the other totally is art, because that's what people say!

There's a vocabulary issue here. Art is a practice, not a thing, not a product. You can create a picture, however you like it.

What makes a picture cool to look at is how it looks. And that is very subjective and contextual. No issue with that. What makes it _interesting_ and catchy is not so much what it _is_ but what it says, what it means, what it triggers, from the intent of the artist (if one gets to have the info about it), to its techniques[1] all the way to the inspiration it creates in the onlookers (which is also a function of a lot of things).

Anything machine-produced can be cool/beautiful/whatever.

Machines also reproduce/reprint original works. And while there are common qualities, it is not the same to look at a copy, at a reproduction of a thing, and to look at the original thing, that was made by the original artist. If you haven't experienced that, please try (by going to a museum for instance, or a gallery, anywhere).

[1] and there, using AI stuff, like anything else, as a _tool_ to practice/make art? Of course. But to say that what this tool makes _is_ art or a work of art? Basic no for me.

> Needless to say that the people making those distinctions can't even tell apart a photo from an AI-generated picture.

1/ It does get better and better, but it still looks AI-generated (as of April 2025).

2/ Human-wise/feeling-wise/intellect-wise, anything that I know has been generated by AI will be a. interesting, perhaps, for ideas, for randomness, b. but soulless. And it is connection, relief, soul (mine, and that of others) I am looking for in art (as a practice, an artefact or a performance); I'm pretty sure that's what connects us humans.

3/ Market-wise, I predict that any renowned artwork will lose its value as soon as it becomes known to be AI-made, for the very reason 2/ above.

240. squigz ◴[] No.43682626{13}[source]
Do you maybe think using 'rape' in such a casual way takes away anything from actual rape victims?
replies(1): >>43686955 #
241. Suppafly ◴[] No.43683088[source]
>Artists don't like AI image generators because they have to compete with them

I always wonder why people make statements like this. Anyone that knows more than one artist knows that artists use these tools for a variety of reasons and aren't nearly as scared as random internet concern trolls make them out to be.

242. Suppafly ◴[] No.43683137{3}[source]
>I don't think all artists are treating this tool as such an existential threat.

Agreed, the ones I know in real life are excited by these tools and have been using them.

replies(1): >>43683198 #
243. squigz ◴[] No.43683198{4}[source]
Well no, those aren't actually artists, because they clearly don't understand what art is.

(/s)

244. ben_w ◴[] No.43684587{5}[source]
I do find the way things like this* get parroted to be mildly amusing, given that the "stochastic parrot" phrase existed.

* not only this contextually misleading quote; I've also parroted things

245. ben_w ◴[] No.43684645{5}[source]
Both are issues, for different people.

Art as nice things, vs. art as a peacock's tail where the effort is the point.

Fast fashion vs. Ned Ludd.

Queen Elizabeth I saying to William Lee, "Thou aimest high, Master Lee. Consider thou what the invention could do to my poor subjects. It would assuredly bring to them ruin by depriving them of employment, thus making them beggars."

246. Juliate ◴[] No.43686955{14}[source]
1/ It does not take anything away. The use is not casual but deliberate and analytical. The concept of « rape culture » extends beyond sexual assault to other patterns of consent violation and power dynamics.

2/ it has been discussed for, like, decades, in academic and social contexts, how attitudes in one domain reflect and reinforce those in others.

3/ Your « actual » makes an assumption about my experience that you have no basis for.

The point remains that non-consensual use of artists’ work reflects the same fundamental disregard for autonomy that characterizes other consent violations.

247. numpad0 ◴[] No.43703937{3}[source]
There are a few who don't realize it; they go lunatic.
248. numpad0 ◴[] No.43708698{6}[source]
yeah, that's measuring the effects of labeling, not discrepancies between human-made and AI-generated work.