324 points | rntn | 5 comments
ankit219 ◴[] No.44608660[source]
Not just Meta: 40 EU companies urged the EU to postpone the rollout of the AI Act by two years due to its unclear nature. This code of practice is voluntary and goes beyond what is in the act itself. The EU published it with the implication that there would be less scrutiny for those who voluntarily sign up. Meta would face scrutiny on all fronts anyway, so signing something voluntary does not seem plausible for them.

One of the key aspects of the act is that a model provider is responsible if downstream partners misuse the model in any way. For open source models, that is a very hard requirement to satisfy[1].

> GPAI model providers need to establish reasonable copyright measures to mitigate the risk that a downstream system or application into which a model is integrated generates copyright-infringing outputs, including through avoiding overfitting of their GPAI model. Where a GPAI model is provided to another entity, providers are encouraged to make the conclusion or validity of the contractual provision of the model dependent upon a promise of that entity to take appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works.

[1] https://www.lw.com/en/insights/2024/11/european-commission-r...

replies(8): >>44610592 #>>44610641 #>>44610669 #>>44611112 #>>44612330 #>>44613357 #>>44617228 #>>44620292 #
t0mas88 ◴[] No.44610641[source]
Sounds like a reasonable guideline to me. Even for open source models, you can add a license term that requires users of the open source model to take "appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works"

This is European law, not US law. Reasonable means reasonable: judges here are expected to weigh each side's interests and come to a conclusion, not just apply a literal interpretation of the statute.

replies(4): >>44613578 #>>44614324 #>>44614949 #>>44615016 #
1. gkbrk ◴[] No.44614324[source]
> Even for open source models, you can add a license term that requires users of the open source model to take appropriate measures to avoid [...]

You just made the model not open source.

replies(3): >>44614685 #>>44614721 #>>44615634 #
2. LadyCailin ◴[] No.44614685[source]
“Source available” then?
3. badsectoracula ◴[] No.44614721[source]
Instead of a license term, you can put that in your documentation - in fact, that is exactly what the code of practice suggests for open source models (see my other comment).
4. h4ck_th3_pl4n3t ◴[] No.44615634[source]
An open source cocaine production machine is still an illegal cocaine production machine. The fact that it's open source doesn't matter.

You seem not to have understood that different kinds of products need to comply with different laws. Whether or not you can call it open source changes nothing about its legal obligations.

And every law written is a compromise between two opposing parties.

replies(1): >>44629439 #
5. enedil ◴[] No.44629439[source]
I'm not sure what you mean. The statement "open source models can just add a clause to the terms of use, restricting how they can be used" is false, because in that case the model is no longer open source. Does that mean open source models need to comply with the law? Absolutely - but it might also mean that open source models are effectively illegal.