ankit219 ◴[] No.44608660[source]
Not just Meta: 40 EU companies urged the EU to postpone the rollout of the AI Act by two years because of its unclear nature. This code of practice is voluntary and goes beyond what is in the Act itself. The EU published it with the implication that companies who voluntarily sign up will face less scrutiny. Meta would face scrutiny on all fronts anyway, so signing something voluntary does not seem like a plausible move.

One of the key aspects of the Act is that a model provider is responsible if downstream partners misuse the model in any way. For open source, that is a very hard requirement [1].

> GPAI model providers need to establish reasonable copyright measures to mitigate the risk that a downstream system or application into which a model is integrated generates copyright-infringing outputs, including through avoiding overfitting of their GPAI model. Where a GPAI model is provided to another entity, providers are encouraged to make the conclusion or validity of the contractual provision of the model dependent upon a promise of that entity to take appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works.

[1] https://www.lw.com/en/insights/2024/11/european-commission-r...
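
For a sense of what such "appropriate measures" could mean in practice, here is a minimal sketch of a near-duplicate output filter: it flags generations that share long character n-grams with an index of protected works. Every name and threshold below is hypothetical; neither the Act nor the code of practice prescribes a particular mechanism.

    def char_ngrams(text, n=50):
        # Overlapping character n-grams; a long n catches verbatim spans
        # while ignoring short, commonplace phrases.
        return {text[i:i + n] for i in range(len(text) - n + 1)}

    def build_index(protected_works, n=50):
        # Toy in-memory index: the union of n-grams across all protected
        # texts. A real system would need hashing or Bloom filters at scale.
        index = set()
        for work in protected_works:
            index |= char_ngrams(work, n)
        return index

    def looks_infringing(output, index, n=50, threshold=0.05):
        # Flag an output when more than `threshold` of its n-grams appear
        # verbatim in the protected corpus.
        grams = char_ngrams(output, n)
        if not grams:
            return False
        overlap = sum(1 for g in grams if g in index)
        return overlap / len(grams) > threshold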

replies(7): >>44610592 #>>44610641 #>>44610669 #>>44611112 #>>44612330 #>>44613357 #>>44617228 #
zizee ◴[] No.44611112[source]
It doesn't seem unreasonable. If you train a model that can reliably reproduce thousands or millions of copyrighted works, you shouldn't be distributing it. If it were just regular software with that capability, would it be allowed? Is it OK just because it's a fancy AI model?
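
To make "reliably reproduce" concrete: the usual memorization probe is to prompt the model with the opening tokens of a known work and check whether it continues verbatim. A rough sketch against the Hugging Face transformers API; the model name and the 50-token split are arbitrary choices for illustration:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Placeholder model purely for illustration.
    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    def reproduces_verbatim(text, prefix_len=50, cont_len=50):
        # Feed the model the first `prefix_len` tokens of a known work,
        # decode greedily, and compare its continuation to the real text.
        ids = tok(text, return_tensors="pt").input_ids
        prefix = ids[:, :prefix_len]
        out = model.generate(prefix, max_new_tokens=cont_len,
                             do_sample=False)  # greedy, no sampling noise
        generated = out[0, prefix_len:]
        reference = ids[0, prefix_len:prefix_len + cont_len]
        n = min(len(generated), len(reference))
        return n > 0 and bool((generated[:n] == reference[:n]).all())
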
replies(2): >>44611371 #>>44611463 #
CamperBob2 ◴[] No.44611371[source]
I have a Xerox machine that can reliably reproduce copyrighted works. Is that a problem, too?

Blaming tools for the actions of their users is stupid.

replies(4): >>44611396 #>>44611501 #>>44612409 #>>44614295 #
zeta0134 ◴[] No.44611501[source]
Helpfully, the law already disagrees. That Xerox machine tampers with the printed result, leaving a faint signature meant to help detect forgeries. You know, for when users copy things that are actually illegal to copy. The Xerox machine (and every other printer sold today) literally leaves a paper trail tracing output back to it.

https://en.wikipedia.org/wiki/Printer_tracking_dots
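
For the curious, the scheme the EFF reverse-engineered on one Xerox DocuColor line is a small grid of faint yellow dots encoding the printer's serial number and a timestamp, with parity bits for error checking. A toy encoder in that spirit (the field layout below is invented for illustration; real layouts are vendor-specific):

    from datetime import datetime

    def encode_mic(serial, when=None):
        # Toy machine-identification-code encoder. Each column carries one
        # field (serial digits, then date/time) as 7 data bits plus a
        # parity bit; a 1 means "print a faint yellow dot".
        when = when or datetime.now()
        fields = [int(d) for d in f"{serial:06d}"]
        fields += [when.year % 100, when.month, when.day,
                   when.hour, when.minute]
        grid = []
        for value in fields:
            bits = [(value >> i) & 1 for i in range(7)]
            bits.append(sum(bits) % 2)  # even-parity check bit
            grid.append(bits)
        return grid  # grid[column][row]

    def render(grid):
        # 'o' where a dot would be printed, '.' where the page stays blank.
        for r in range(len(grid[0])):
            print("".join("o" if col[r] else "." for col in grid))

    render(encode_mic(123456))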

replies(1): >>44611509 #
ChadNauseam ◴[] No.44611509[source]
I believe only color printers are known to have this functionality, and it's typically used for detecting counterfeiting, not for enforcing copyright.
replies(1): >>44611532 #
zeta0134 ◴[] No.44611532[source]
You're quite right. Still, it's a decent example of blaming the tool for the actions of its users. The law clearly exerted enough pressure to convince the tool maker to modify that tool against the user's wishes.
replies(1): >>44611561 #
justinclift ◴[] No.44611561[source]
> Still, it's a decent example of blaming the tool for the actions of its users.

They're not really "blaming" the tool though. They're using a supply chain attack against the subset of users they're interested in.