328 points by rntn | 2 comments
ankit219 ◴[] No.44608660[source]
Not just Meta: 40 EU companies urged the EU to postpone the rollout of the AI Act by two years due to its unclear nature. This code of practice is voluntary and goes beyond what is in the Act itself. The EU published it in a way that suggests there would be less scrutiny if you voluntarily sign up for the code of practice. Meta would face scrutiny on all fronts anyway, so it does not seem plausible for it to sign something voluntary.

One of the key aspects of the act is how a model provider is responsible if the downstream partners misuse it in any way. For open source, it's a very hard requirement[1].

> GPAI model providers need to establish reasonable copyright measures to mitigate the risk that a downstream system or application into which a model is integrated generates copyright-infringing outputs, including through avoiding overfitting of their GPAI model. Where a GPAI model is provided to another entity, providers are encouraged to make the conclusion or validity of the contractual provision of the model dependent upon a promise of that entity to take appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works.

[1] https://www.lw.com/en/insights/2024/11/european-commission-r...
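
Even a naive sketch of an "appropriate measure" shows how fuzzy this is in practice. Something like the following (purely hypothetical, nothing the Act or the code of practice actually prescribes) is about the simplest post-generation filter a provider could bolt on, and "recognisably similar" still reduces to an arbitrary threshold:

    # Hypothetical sketch only: flag an output that shares too many long word
    # n-grams with a corpus of protected works. Real mitigations (training-data
    # dedup, fingerprinting, licensing checks) would be far more involved.
    def ngrams(text, n=8):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def looks_infringing(output, protected_texts, n=8, threshold=0.2):
        out = ngrams(output, n)
        if not out:
            return False
        protected = set().union(*(ngrams(t, n) for t in protected_texts))
        # Block the output if a large share of its n-grams appear in protected works.
        return len(out & protected) / len(out) >= threshold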

replies(8): >>44610592 #>>44610641 #>>44610669 #>>44611112 #>>44612330 #>>44613357 #>>44617228 #>>44620292 #
t0mas88 ◴[] No.44610641[source]
Sounds like a reasonable guideline to me. Even for open source models, you can add a license term that requires users of the open source model to take "appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works"

This is European law, not US law. Reasonable means reasonable: judges here are expected to weigh each side's interests and come to a conclusion, not just apply a literal interpretation of the law.

replies(4): >>44613578 #>>44614324 #>>44614949 #>>44615016 #
deanc ◴[] No.44613578[source]
Except that it's seemingly impossible to protect against prompt injection. The cat is out of the bag. Much like a lot of other legislation (e.g. the cookie law, or being responsible for user-generated content when millions of pieces of it are posted per day), it's entirely impractical, albeit well-meaning.
replies(1): >>44613667 #
lcnielsen ◴[] No.44613667[source]
I don't think the cookie law is that impractical? It's easy to comply with by just not storing non-essential user information. It would have been completely nondisruptive if platforms agreed to respect users' defaults via browser settings, and then converged on a common config interface.
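
As a sketch of what honouring a browser-level default could look like server-side (a hypothetical Flask endpoint; Sec-GPC is the Global Privacy Control header some browsers already send):

    # Hypothetical sketch: honour the browser's privacy signal instead of a banner.
    from flask import Flask, request

    app = Flask(__name__)

    def user_opted_out():
        # "Sec-GPC: 1" means the user has opted out of tracking; treat a missing
        # header as an opt-out too, i.e. privacy by default.
        return request.headers.get("Sec-GPC", "1") == "1"

    @app.route("/")
    def index():
        analytics_allowed = not user_opted_out()
        # Only set non-essential (analytics) cookies when the signal allows it.
        return f"analytics cookies enabled: {analytics_allowed}"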

It was made impractical by ad platforms and others who decided to use dark patterns, FUD and malicious compliance to deceive users into agreeing to be tracked.

replies(3): >>44613785 #>>44613896 #>>44613989 #
deanc ◴[] No.44613785[source]
It is impractical for me as a user. I have to click through a notice on every website on the internet before interacting with it. These notices are often very obtuse and don't have a "reject all" button, just a "manage my choices" button that takes you to an even more convoluted menu.

Instead of exactly as you say: a global browser option.

As someone who has had to implement this crap repeatedly, I can't even begin to imagine the amount of global time that has been wasted: everyone implementing it, fixing mistakes related to it, and, more importantly, users having to interact with it.

replies(3): >>44613848 #>>44615071 #>>44615338 #
lcnielsen ◴[] No.44613848[source]
Yeah, but the only reason for this time wastage is that website operators refuse to accept what would become the fallback default of "minimal", for which they would not need to seek explicit consent. It's a kind of arbitrage, like those scammy websites that send you into redirect loops with enticing headlines.

If anything, the law is written to encourage such defaults; it just wasn't profitable enough, I guess.

replies(2): >>44613881 #>>44614219 #
fauigerzigerk ◴[] No.44614219[source]
Not even EU institutions themselves are falling back on defaults that don't require cookie consent.

I'm constantly clicking away cookie banners on UK government or NHS (our public healthcare system) websites. The ICO (UK privacy watchdog) requires cookie consent. The EU Data Protection Supervisor wants cookie consent. Almost everyone does.

And you know why that is? It's not because they are scammy ad funded sites or because of government surveillance. It's because the "cookie law" requires consent even for completely reasonable forms of traffic analysis with the sole purpose of improving the site for its visitors.

This is impractical, unreasonable, counterproductive and unintelligent.

replies(3): >>44614559 #>>44614784 #>>44614823 #
grues-dinner ◴[] No.44614823[source]
> completely reasonable

This is a personal decision to be made by the data "donor".

The NHS website cookie banner (which does have a correct implementation in that the "no consent" button is of equal prominence to the "mi data es su data" button) says:

> We'd also like to use analytics cookies. These collect feedback and send information about how our site is used to services called Adobe Analytics, Adobe Target, Qualtrics Feedback and Google Analytics. We use this information to improve our site.

In my opinion, it is not, as described, "completely reasonable" to consider such data hand-off to third parties as implicitly consented to. I may trust the NHS but I may not trust their partners.

If the data collected is strictly required for the delivery of the service and is used only for that purpose and destroyed when the purpose is fulfilled (say, login session management), you don't need a banner.
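
For illustration (a sketch under my reading, not legal advice), a strictly necessary first-party session cookie in a hypothetical Flask app would look something like this, with nothing handed off to third parties:

    # Hypothetical sketch: a cookie used only for login session management.
    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/login", methods=["POST"])
    def login():
        resp = make_response("logged in")
        # First-party, short-lived, HTTPS-only, not readable by scripts, and never
        # shared with analytics partners: strictly necessary, so no banner needed.
        resp.set_cookie(
            "session_id",
            "opaque-token",   # placeholder; a real app would issue a random token
            max_age=3600,     # expires once the session's purpose is fulfilled
            httponly=True,
            secure=True,
            samesite="Strict",
        )
        return resp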

The NHS website is in a slightly tricky position, because I genuinely think they will be trying to use the data for site and service improvement, at least for now, and they hopefully have done their homework to make sure Adobe, say, are also not misusing the data. Do I think the same of, say, the Daily Mail website? Absolutely not: they'll be selling every scrap of data to anyone paying before the TCP connection even closes. Now, I may know the Daily Mail is a wretched hive of villainy and can just not go there, but I do not know about every website I visit. Sadly, the scumbags are why no-one gets nice things.

replies(1): >>44615015 #
fauigerzigerk ◴[] No.44615015[source]
>This is a personal decision to be made by the data "donor".

My problem is that users cannot make this personal decision based on the cookie consent banners because all sites have to request this consent even if they do exactly what they should be doing in their users' interest. There's no useful signal in this noise.

The worst data harvesters look exactly the same as a site that does basic traffic analysis for basic usability purposes.

The law makes it easy for the worst offenders to hide behind everyone else. That's why I'm calling it counterproductive.

[Edit] Wrt NHS specifically - this is a case in point. They use some tools to analyse traffic in order to improve their website. If they honour their own privacy policy, they will have configured those tools accordingly.

I understand that this can still be criticised from various angles. But is this criticism worth destroying the effectiveness of the law and burying far more important distinctions?

The law makes the NHS and the Daily Mail look exactly the same to users as far as privacy and data protection are concerned. This is completely misleading, don't you think?

replies(2): >>44615348 #>>44615961 #
grues-dinner ◴[] No.44615961[source]
I don't think it's too misleading, because in the absence of any other information, they are the same.

What you could then add to this system is a certification scheme permitting implicit consent where all the data handling (including who you hand it off to and what they are allowed to do with it, as well as whether they have demonstrated themselves to be trustworthy) is audited to be compliant with some more stringent requirements. It could even be self-certification along the lines of CE marking. But that requires strict enforcement, and the national regulators so far have been a bunch of wet blankets.

That actually would encourage organisations to find ways to get the information they want without violating the privacy of their users and anyone else who strays into their digital properties.

replies(1): >>44618052 #
fauigerzigerk ◴[] No.44618052[source]
>I don't think it's too misleading, because in the absence of any other information, they are the same.

But that other information is not absent, and it tells us they are not the same. Just compare their privacy policies, for instance. The cookie law makes them appear similar in spite of the fact that they are very different (as of now - who knows what will happen to the NHS).

replies(1): >>44618647 #
grues-dinner ◴[] No.44618647[source]
I do understand the point, but other than an auditing process that allows a middle ground of implied consent for first-party use only, within some strictly defined boundaries, what else can you do? It's a market for lemons in terms of trustworthy data processors. 90% (a figure pulled from thin air, but it lines up with the share of websites that play silly buggers by hiding the no-consent button) of all people who want to use data will be up to no good and immediately try to bend and break every rule.

I would also be in favour of companies having to report, in their cookie banner, every negative data protection judgement against them and everyone they will share your data with, before giving you the choice as to whether you trust them.

replies(1): >>44619551 #
fauigerzigerk ◴[] No.44619551[source]
If any rule is going to be broken and is impossible to enforce, how can that be a justification for keeping a bad rule rather than replacing it with a more sensible one?
replies(1): >>44623568 #
grues-dinner ◴[] No.44623568[source]
I said they'd try to break them, which requires vigilance and regulators stepping in with an enormous hammer. So far national regulators have been pretty weaksauce, which is indeed very frustrating.

I'm not against improving the system, and I even proposed something, but I am against letting data abusers run riot because the current system isn't quite 100% perfect.

I'll still take what we have over what we had before (nothing, good luck everyone).