adzm:
Adobe is the one major company trying to be ethical with its AI training data and no one seems to even care. The AI features in Photoshop are the best around in my experience and come in handy constantly for all sorts of touchup work.

Anyway, I don't really think they deserve a lot of the hate they get, but I do hope this encourages development of viable alternatives to their products. Photoshop is still pretty much peerless. Illustrator has a ton of competitors catching up. After Effects and Premiere are getting overtaken by DaVinci Resolve for video editing, though for motion graphics After Effects is still hard to beat. I do love that Adobe simply uses JavaScript for its expression and scripting language.

f33d5173:
Adobe isn't trying to be ethical; they are trying to be more legally compliant, because they see that as a market opportunity. On the other hand, artists complain about the legal compliance of AI not because that is what they care about, but because they see it as their only possible redress against a phenomenon they find distasteful. A legal reality where you can only train AI on content you've licensed would be the worst for everybody bar massive companies, legacy artists included.
Riverheart:
“A legal reality where you can only train AI on content you've licensed would be the worst for everybody bar massive companies, legacy artists included.”

Care to elaborate?

Also, saying artists only concern themselves with the legality of art used in AI out of distaste, when there are legal cases where their art has been appropriated, seems like a bold position to take.

It’s a practice founded on scooping everything up without care for origin or attribution, and it’s not like it’s a transparent process. There are people who literally go out of their way to let artists know they’re training on their art and taunt them about it online. Is it unusual that they would assume bad faith from those purporting to train their AI legally, when participation up till now has been either involuntary or opt-out? Rolling out AI features when your customers are artists is tone-deaf at best and trolling at worst.

Workaccount2:
There is no "scooping up": the models aren't massive archives of copied art. People either don't understand how these models work or they purposely misrepresent it (or purposely refuse to understand it).

Showing the model a picture doesn't create a copy of that picture in its "brain". It moves a bunch of vectors around that capture an "essence" of what the image is. The next image shown, from a totally different artist with a totally different style, may well move many of those same vectors again. But suffice to say, there is no copy of the picture anywhere inside of it.

This is also why these models hallucinate so much: they are not drawing from a bank of copies, they are working from a fuzzy memory.
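
For what it's worth, here's a minimal sketch of that point (a toy model with made-up sizes, assuming PyTorch is available; nothing like a real image generator): training nudges one fixed set of shared weights over and over, it never appends the images to an archive.

    # Toy illustration only: a 1,010-parameter "model" trained on stand-ins
    # for 1,000 images. The same shared weights get nudged by every image;
    # no per-image copy is stored anywhere.
    import torch

    torch.manual_seed(0)
    model = torch.nn.Linear(100, 10)                   # 100*10 + 10 = 1,010 weights
    opt = torch.optim.SGD(model.parameters(), lr=0.01)

    images = torch.randn(1000, 100)                    # stand-ins for training images
    targets = torch.randn(1000, 10)

    for img, tgt in zip(images, targets):
        opt.zero_grad()
        loss = ((model(img) - tgt) ** 2).mean()
        loss.backward()
        opt.step()                                     # nudges the *same* 1,010 weights

    print(sum(p.numel() for p in model.parameters()))  # 1010
    # 1,000 "images" of 100 values each went in; only 1,010 shared numbers remain.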

Riverheart:
The collection of the training data is the “scooping up” I mentioned. I assume you acknowledge the training data doesn’t spontaneously burst out of the aether?

As for the model, it’s still creating deterministic, derivative works based on its inputs, and the only thing that makes it random is the seed, so its being a database of vectors is irrelevant.
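
As a side note, the seed part is easy to demonstrate with a minimal sketch (a toy "generator", again assuming PyTorch; not any real diffusion model): once the weights are fixed, the output is a pure function of the seed.

    # Toy illustration only: with the trained weights held fixed, the seed is
    # the sole source of randomness, so generation is fully reproducible.
    import torch

    WEIGHTS = torch.linspace(0.1, 1.0, 8)       # stand-in for trained weights

    def generate(seed: int) -> torch.Tensor:
        torch.manual_seed(seed)                 # the only randomness
        noise = torch.randn(8)                  # stand-in for latent noise
        return noise * WEIGHTS                  # stand-in for the "denoising" pass

    print(torch.equal(generate(42), generate(42)))  # True: same seed, identical output
    print(torch.equal(generate(42), generate(7)))   # False: different seed, different output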

rcxdude:
Determinism is neither here nor there for copyright infringement: a hash of an image is not infringing, and a slightly noisy version of it is.
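
To make the hash example concrete, here is a minimal sketch using Python's standard hashlib (the "image" is just made-up bytes): both transformations are fully deterministic, yet one retains essentially nothing of the work and the other retains nearly all of it.

    # Toy illustration only: two deterministic functions of the same image bytes.
    import hashlib

    image = bytes(range(256)) * 1024                 # stand-in for an image file

    digest = hashlib.sha256(image).hexdigest()       # fixed 64-char summary
    noisy  = bytes((b + 1) % 256 for b in image)     # "slightly noisy" copy

    print(len(digest))             # 64: the work itself is unrecoverable from this
    print(len(noisy), len(image))  # 262144 262144: nearly all of the work survives
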
Riverheart:
Nobody is trying to copyright an image hash, and determinism matters because it’s why the outputs are derivative rather than inspired.
bawolff:
That is not how copyright works. "Inspired" works can still be derivative. In the US, entirely deterministic works are not considered derivative works, as they aren't considered new creative works (if anything, they are considered the same as the original). See https://en.wikipedia.org/wiki/Bridgeman_Art_Library_v._Corel...
Riverheart:
“In the US, entirely deterministic works are not considered derivative works as they aren't considered new creative works (if anything they are considered the same as the original)”

Okay, so if the inputs to the model are my artwork, used to replicate my style, is the output copyrightable by you? You just said deterministic works aren’t derivative; they’re considered the same as the original. That’s not anything I’ve heard AI proponents claim, and the outputs are more original than a 1:1 photocopy, but I assume, as with the case you linked, that the answer will be no, you can’t copyright it.

bawolff:
That depends on how much "creativity" is in the prompt, but generally I would lean towards no: the AI-created work is not copyrightable by the person who used the model to "create" it.

I believe that is the conclusion the US Copyright Office came to as well: https://www.copyright.gov/ai/ (I didn't actually read their report, but I think that's what it says).