125 points akeck | 33 comments
1. ta8645 ◴[] No.33580501[source]
Artists are no different than all the people who tried to destroy the cotton gin or the automated loom. We're all going to have to live in a world where these technologies exist, and find a way to live a fulfilling life regardless. Just as chess players today enjoy the game even though computers have surpassed our chess abilities.

It seems odd to complain that computers are using humans' artwork to inspire their own creations. Every human artist has done the exact same thing in their lifetime; it's unavoidable.

replies(10): >>33580588 #>>33580624 #>>33580644 #>>33580673 #>>33580687 #>>33580701 #>>33580722 #>>33580832 #>>33580867 #>>33582176 #
2. echelon ◴[] No.33580588[source]
> It seems odd to complain that computers are using humans' artwork to inspire their own creations. Every human artist has done the exact same thing in their lifetime; it's unavoidable.

Agree.

But you also have to treat code the same way. We shouldn't be suing OpenAI and Microsoft over Copilot being trained on open-source code. It's no different from models trained on art.

Besides, if Microsoft loses, they actually win. I expect they're one of the few companies with enough code to train the model on completely proprietary data. If they lose the case, they'll still be able to build the tool. The rest of us will be locked out of easy training data and won't be able to compete.

replies(1): >>33580668 #
3. bugfix-66 ◴[] No.33580624[source]
These systems aggregate and interpolate human work. Interpolation: https://en.m.wikipedia.org/wiki/Interpolation

It's like a very complicated form of linear interpolation:

  a*x + (1-a)*y
These systems do not "think". Today I spent all day mulling an idea, experimenting with variations, feeling frustrated or excited, imagining it, simulating it, making mistakes, following paths of reasoning, deducing facts, revisiting dead-ends with new insight, daydreaming, talking to my wife about it, etc. That's human thought.

These models do not "think" like a human, they do not dream or imagine or feel. They run a feed-forward system of linear equations (matrix multiplications).
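For concreteness, here is a minimal Python sketch of the two things named above, the lerp formula and a bare feed-forward chain of matrix multiplications. It is an illustration only, not the code of any actual model:

  import numpy as np

  def lerp(x, y, a):
      # plain linear interpolation: a*x + (1-a)*y
      return a * x + (1 - a) * y

  def feed_forward(v, weights):
      # a bare feed-forward pass: repeated matrix multiplication
      # (real models also insert non-linear activations between layers)
      for W in weights:
          v = W @ v
      return v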

They INTERPOLATE HUMAN WORK.

They don't exist without training data (huge amounts of intellectual property) aggregated and interpolated in a monstrous perversion of "fair use":

https://bugfix-66.com/7a82559a13b39c7fa404320c14f47ce0c304fa...

Starve the machine. Without your work, it's got nothing.

replies(2): >>33580684 #>>33581480 #
4. ThePadawan ◴[] No.33580644[source]
> It seems odd to complain that computers are using humans' artwork to inspire their own creations. Every human artist has done the exact same thing in their lifetime; it's unavoidable.

I don't find it odd to complain that publishing an artwork on DeviantArt has gone from "I intend humans to look at this" to "I (opt-out!) agree that a corporation may use this to generate new artwork for profit."

I would not complain if a painting of mine were exhibited in a museum and someone came in to look at it and draw something inspired by it.

I would complain if I handed over a painting of mine to that same museum to be exhibited, they scanned it in at high resolution, handed it over to a class of copy artists, who then produced artwork in order to compete with mine, before finally putting it up in a gallery.

Does that still seem odd?

5. bbarnett ◴[] No.33580668[source]
> The rest of us will be locked out of easy training data and won't be able to compete.

There's loads of BSD code, and to share it, all that is required is attribution.

It seems to me there is money to be made in getting model data co-linked with "saw first" references.

Then, for example, after a GitHub-style codebot writes the code for you, it can show a link to "where it learned to help you today!".

There is no technical reason this can't be done, only a business model reason.

That said, I find the comments in this thread strange: discussion about how tech moves on, looms and such.

That argument was lost when people could cut up songs and slap them together, or cut up 100 textbooks into one. This is settled by endless laws and case law. It isn't a new argument. Microsoft will lose.

6. greenthrow ◴[] No.33580673[source]
It's not even remotely comparable to the cotton gin or the dishwasher or any kind of normal labor that has been automated.

We are talking about creative works being shuffled together and remixed as a legal protection for theft. That's all it is. There is ample evidence that these algorithms merely regurgitate what goes in and cannot create something entirely new. Which is, of course, what you'd expect if you understand what is going on under the hood. But it is not what is being sold.

replies(1): >>33580729 #
7. jjcon ◴[] No.33580684[source]
> Starve the machine, it doesn't exist without having your work to interpolate.

But again… aren't people the same way? No one exists in isolation. The Sir Isaac Newton quote comes to mind:

“If I have seen further, it is by standing on the shoulders of giants”

Edit: to be clear - these algorithms are specifically non-linear and are a far cry from ‘linear interpolation’. Yes, they do involve matrix multiplication, but that does not make them interpolators, unless you want to water down the meaning of interpolation until it is so generic it loses all meaning. Having said all that - the sophistication of the algorithm is beside the point here, as long as what they are generating is substantially transformative (which, legally speaking, >99% of the possible outputs are).
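To make the non-linearity point concrete, a toy sketch (purely illustrative, not any particular model's architecture): each layer is a matrix multiplication followed by a non-linear activation, so the composed map is not of the form a*x + (1-a)*y in its inputs:

  import numpy as np

  def layer(v, W, b):
      # one layer: matrix multiply, then a non-linear activation (ReLU here)
      return np.maximum(0.0, W @ v + b)

  def network(v, params):
      # stacking such layers; the result is not a lerp of the inputs
      for W, b in params:
          v = layer(v, W, b)
      return v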

replies(3): >>33580690 #>>33580837 #>>33581288 #
8. heavyset_go ◴[] No.33580687[source]
> Artists are no different than all the people who tried to destroy the cotton gin or the automated loom.

I feel like this post by an HN user is pertinent[1].

> Have you ever done any reading on the Luddites? They weren't the anti technology, anti progress social force people think they were.

> They were highly skilled laborers who knew how to operate complex looms. When auto looms came along, factory owners decided they didn't want highly trained, knowledgeable workers they wanted highly disposable workers. The Luddites were happy to operate the new looms, they just wanted to realize some of the profit from the savings in labor along with the factory owners. When the factory owners said no, the Luddites smashed the new looms.

The Luddites went from middle class business owners and craftsmen to utter destitution. Many of the Luddites were tried for machine breaking and were either executed by the state, or exiled to penal colonies. They risked literally everything, because everything was at stake.

I bring this up because people like to pretend the Luddites were some cult of ignorant technophobes, but the reality is that many of us are in the same situation the Luddites were in: highly skilled workers who operate complex machinery and live comfortable middle-class lives, just as they did before the owners cut them out and their families starved in the streets.

[1] https://news.ycombinator.com/item?id=33230262

9. bugfix-66 ◴[] No.33580690{3}[source]
Are you foolishly suggesting that Sir Isaac Newton was just aggregating and interpolating others' work?

Like a feed-forward chain of matrix multiplications, trained to predict its training data?

No, of course you aren't. That would be FUCKING RIDICULOUS.

replies(1): >>33580711 #
10. varnaud ◴[] No.33580701[source]
> Artists are no different than all the people who tried to destroy the cotton gin or the automated loom.

Yes. They are angry that their labor was used to create something new and arguably more efficient, but they don't get appropriate compensation for it.

11. jjcon ◴[] No.33580711{4}[source]
Yes… we all do that every day. Humans don't exist in isolation; we build on and learn from others' accomplishments, from the wheel to the printing press to the computer. Modern impressionists don't owe royalties to Monet, but they certainly draw from and learn from his contributions to the art world. Brand new material from art algorithms (frankly, regardless of their sophistication) certainly deserves and falls under this same legal treatment.
replies(1): >>33580743 #
12. Gigachad ◴[] No.33580722[source]
Chess is competitive, so you can regain enjoyment by just banning AI from competitions. Drawing is more outcome-based; I can see it becoming somewhat obsolete, much like photography pushed portrait and landscape painters out of their jobs.

On the plus side, I can imagine this tech empowering artists to create more stuff they previously couldn't. I'm imagining a single person producing a whole animation, something previously only possible for companies and teams.

13. Gigachad ◴[] No.33580729[source]
This is basically what the majority of artists do already. They pull in a bunch of reference images and blend them together into a single piece.
replies(1): >>33581315 #
14. jjcon ◴[] No.33580773{6}[source]
> You just don't understand the math.

This is not in good faith; please read the HN rules.

Rather than attacking me (calling me foolish, swearing at me), why don't you rebut my ideas and have a conversation, if you actually have something to contribute?

I've read the papers, and I've worked personally with these systems. I understand them just fine. Notice that I said earlier: “regardless of how simple they are”. I understand you are trying to water them down to simple interpolation, which they definitely are not, but even if they were that simple it wouldn't change the legal calculus here one bit. New art is being generated (far beyond any ‘transformative’ legal test precedent), and any new art that is substantively different from its inputs is legally protectable.

15. TOMDM ◴[] No.33580794{6}[source]
That or they do understand the math and they think what's going on in our own minds may not be that special.
16. 6gvONxR4sf7o ◴[] No.33580832[source]
The automated loom was possible without the manual loom operators. Generative models are not possible without artists. It’s not remotely the same.
17. 6gvONxR4sf7o ◴[] No.33580837{3}[source]
People are the same, yes, but corporations aren’t people.
replies(1): >>33580866 #
18. jjcon ◴[] No.33580866{4}[source]
Certainly, I don't mean to imply they are (legal distinctions aside). A person can create an algorithm (or use an algorithm) and create new things, even works of art.
19. schroeding ◴[] No.33580867[source]
> Just as chess players today enjoy the game even though computers have surpassed our chess abilities.

The "product" that chess players produce is not replaceable by ML systems. The game itself, the "fight" of two minds (or one mind against the machine, in the past) is the "product". Watching two chess AIs play against each other can't replace that.

For artists, the product is their output, the art itself. An approximation of that art can also be produced by a ML system now, making artists an unnecessary cost factor[1] for e.g. simple illustrations.

They are not comparable, IMO. Chess players are not replaced by ML systems; artists will be.

> it's unavoidable.

It really isn't. Of course it would be possible to just outlaw the use of things like "The Pile", which includes gigabytes of random texts with unknown copyright status. The same goes for any training set that uses images scraped from the web, ignoring any copyright.

Yes, people would still do it, but it would have the same status that piracy has. You can't build a multi-billion-dollar US company on piracy (for long), and you wouldn't be able to do so with ML systems trained on random stuff from the internet.

I don't think this, in such broad strokes, would be a good thing, to be clear. Such datasets are great for research! But I have a really hard time understanding this defeatism that there is "nothing we can do".

[1] from the perspective of some customers e.g. magazines or ad companies - I don't agree with this

replies(2): >>33580921 #>>33581903 #
20. jjcon ◴[] No.33580921[source]
There is so much art that is Creative Commons or public domain that I'm sure a worse ‘pile' could be conjured up to start things out. Then, just as we have seen with other architectures, as they are refined their need for data can drop, and eventually we are back in the same place, maybe a few more years removed, but back in the same place nonetheless. That is my take, at least.

Personally, I don't think it is likely that copyright laws will change to protect against algorithmic usage (there is too much precedent in more general reuse cases and around what is considered transformative). Having said that, I also don't think this will be the death of artists by any stretch; some industries will need to change or evolve, but it will be just another tool in an artist's belt IMO.

replies(1): >>33582246 #
21. wiseowise ◴[] No.33581288{3}[source]
Ethical meat business (single farm, limited scope) = good, industrial meat grinder (huge factories) = bad.
replies(1): >>33581416 #
22. wiseowise ◴[] No.33581315{3}[source]
They don't do it on an industrial scale.
23. jjcon ◴[] No.33581416{4}[source]
So is any scaled-up process unethical, or is there another comparison you are going for here?

Notably, in your example both are certainly legal; they just vary in the level of controversy around them, and on that level I would agree. Scaled-up processes do tend to attract more controversy.

replies(1): >>33588481 #
24. mkaic ◴[] No.33581480[source]
Humans also interpolate human work. True originality is an illusion; all creative works are based on, inspired by, or contributed to by something else. Are you implying that human thought is required to create human-level art? Because if anything, I think AI-generated art is in the process of disproving this exact hypothesis. It is unnerving to realize that something we felt until now was fundamentally exclusive to the human experience isn't actually exclusive, but it's becoming more and more apparent that it isn't.
replies(2): >>33581994 #>>33582573 #
25. WA ◴[] No.33581903[source]
> For artists, the product is their output, the art itself.

For professional artists who do it for the money, yes, that's true.

For amateur artists, the product can be the process, the flow of creating art. Furthermore, I'd say a lot of art isn't about conveying an idea or whatever. You see something and you paint it because you like it, giving it your own spin. Maybe the end result is good, maybe not. Often enough, the art becomes "valuable" because others give it some new context.

> Chess players are not replaced by ML systems; artists will be.

Traditional artists working with real materials won't be. They might even get new interest, because digital art will be flooded with spam.

Or a magazine does what the art scene has been about forever: hire artists because of their name or their background.

The job "digital artist" was created roughly 20-30 years ago and is now being transformed into something else, or might become obsolete. A bummer for digital artists, but I'm not sure this will destroy "artists" in general.

26. FridgeSeal ◴[] No.33581994{3}[source]
Humans on the whole aren't capable of hoovering up basically every piece of artist content accessible on the web, storing all of it, and then creating near-faithful reproductions at a moment's notice.

It's a problem of scale.

> Because if anything, I think AI-generated art is in the process of disproving this exact hypothesis

But it's not creating anything; it's regurgitating its training material (through a suitably fine blender) in the way that scores best. These models are nothing without the actual art they've appropriated.

27. friend_and_foe ◴[] No.33582176[source]
While I generally agree with you, they have some good points you're missing.

It isn't "inspiration". These machine models aren't actually intelligent. There's no expressive element here, with regard to the machine producing the art.

What it really is is just a new tool for producing art. The sculptor had his chisel, the painter had a paintbrush, the photographer had a camera, the graphic designer had Photoshop or whatever, and now you can make art by being skillful in coming up with a prompt. It still requires skill, skill with the tool, just like anything else.

The difference is that this new tool (probably) doesn't enable the creation of anything truly novel.

28. schroeding ◴[] No.33582246{3}[source]
True, but make even one classification mistake (and people upload stuff they don't own, under the wrong license, all the time) and you have to retrain your whole system for each mistake you make, as people trickle in and want their stuff (wrongly classified as CC or public domain) removed from your dataset.

It would chill the whole ML space significantly for decades, IMO, as the only truly safe data would be synthetic or licensed. This can work for some applications (e.g. Microsoft used synthetic data for facial landmark recognition[1]), but it would kill DALL-E 2 et al.

[1] https://microsoft.github.io/DenseLandmarks/

replies(1): >>33586104 #
29. throw_m239339 ◴[] No.33582573{3}[source]
> Humans also interpolate human work.

Humans, artists, aren't the machines other humans created; they interpret or copy, they don't interpolate.

30. jjcon ◴[] No.33586104{4}[source]
If I use Photoshop to recreate a copyrighted work - they don't have to redistribute Photoshop or change it in any way. The originals are not being shipped in the models, but the models are capable of recreating copyrighted work. These are tools, just like Photoshop.
replies(1): >>33589050 #
31. wiseowise ◴[] No.33588481{5}[source]
> So is any scaled up process unethical

If it abuses someone - yes.

32. heavyset_go ◴[] No.33589050{5}[source]
Neural networks can and do encode data from their training sets in the models themselves. That's the reason you can make some models reproduce things like the Getty watermark in the images they produce.
replies(1): >>33590023 #
33. jjcon ◴[] No.33590023{6}[source]
Again, not directly, though, and that is all that matters - I can reproduce the Getty watermark in Photoshop, but that doesn't make Adobe liable. The fact that a tool is capable of copyright infringement does not shift the legal burden anywhere - it is totally beside the point. Technically, Photoshop's ‘content aware fill' could fill in missing regions with copyrighted content purely by chance, but the burden is still on me if I publish that content, not on Adobe. Legally speaking, these are tools just like any other algorithm or machine out there; their sophistication and particular method is not particularly relevant (again, legally speaking).