297 points rntn | 49 comments
ankit219 ◴[] No.44608660[source]
Not just Meta: 40 EU companies urged the EU to postpone the rollout of the AI Act by two years due to its unclear nature. This code of practice is voluntary and goes beyond what is in the act itself. The EU published it in a way that suggests there would be less scrutiny if you voluntarily sign up for the code of practice. Meta would face scrutiny on all ends anyway, so it does not seem plausible for it to sign something voluntary.

One of the key aspects of the act is that a model provider is responsible if downstream partners misuse the model in any way. For open source models, that is a very hard requirement[1].

> GPAI model providers need to establish reasonable copyright measures to mitigate the risk that a downstream system or application into which a model is integrated generates copyright-infringing outputs, including through avoiding overfitting of their GPAI model. Where a GPAI model is provided to another entity, providers are encouraged to make the conclusion or validity of the contractual provision of the model dependent upon a promise of that entity to take appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works.

[1] https://www.lw.com/en/insights/2024/11/european-commission-r...

replies(7): >>44610592 #>>44610641 #>>44610669 #>>44611112 #>>44612330 #>>44613357 #>>44617228 #
1. t0mas88 ◴[] No.44610641[source]
Sounds like a reasonable guideline to me. Even for open source models, you can add a license term that requires users of the open source model to take "appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works"

This is European law, not US law. Reasonable means reasonable, and judges here are expected to weigh each side's interests and come to a conclusion, not just apply a literal interpretation of the law.

replies(4): >>44613578 #>>44614324 #>>44614949 #>>44615016 #
2. deanc ◴[] No.44613578[source]
Except that it's seemingly impossible to protect against prompt injection. The cat is out of the bag. Much like a lot of other legislation (e.g. the cookie law, or being responsible for user-generated content when millions of items are posted per day), it's entirely impractical, albeit well-meaning.
replies(1): >>44613667 #
3. lcnielsen ◴[] No.44613667[source]
I don't think the cookie law is that impractical? It's easy to comply with by just not storing non-essential user information. It would have been completely nondisruptive if platforms agreed to respect users' defaults via browser settings, and then converged on a common config interface.

It was made impractical by ad platforms and others who decided to use dark patterns, FUD and malicious compliance to deceive users into agreeing to be tracked.
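For what it's worth, a minimal sketch of what honouring such a browser-level default could look like server-side, assuming something like the Global Privacy Control / Do Not Track headers as the signal (the helper and parameter names are made up for illustration):

    type TrackingDecision = "essential-only" | "analytics-allowed";

    // Decide whether non-essential analytics may run, based on signals
    // the browser already sends plus any explicit opt-in we recorded.
    function trackingPolicy(
      headers: Record<string, string | undefined>,
      userOptedIn: boolean,
    ): TrackingDecision {
      // Sec-GPC: 1 (Global Privacy Control) or DNT: 1 are treated as an
      // explicit "no" - no banner, no tracking needed.
      if (headers["sec-gpc"] === "1" || headers["dnt"] === "1") {
        return "essential-only";
      }
      // Otherwise stay minimal unless the user has actually opted in.
      return userOptedIn ? "analytics-allowed" : "essential-only";
    }

    // Example: no banner needed, the browser default already answered.
    console.log(trackingPolicy({ "sec-gpc": "1" }, false)); // "essential-only"

If browsers and sites had converged on something like this, the per-site banner would be redundant for anyone whose default is already "minimal".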

replies(3): >>44613785 #>>44613896 #>>44613989 #
4. deanc ◴[] No.44613785{3}[source]
It is impractical for me as a user. I have to click through a notice on every website on the internet before interacting with it - notices which are often very obtuse and don’t have a “reject all” button, only a “manage my choices” button that takes you to an even more convoluted menu.

Instead of exactly as you say: a global browser option.

As someone who has had to implement this crap repeatedly, I can’t even begin to imagine the amount of global time that has been wasted by everyone implementing this, fixing mistakes related to it, and, more importantly, by users having to interact with it.

replies(3): >>44613848 #>>44615071 #>>44615338 #
5. lcnielsen ◴[] No.44613848{4}[source]
Yeah, but the only reason for this time wastage is that website operators refuse to accept what would become the fallback default of "minimal", for which they would not need to seek explicit consent. It's a kind of arbitrage, like those scammy websites that send you into redirect loops with enticing headlines.

The law is written to encourage such defaults, if anything; it just wasn't profitable enough, I guess.

replies(2): >>44613881 #>>44614219 #
6. deanc ◴[] No.44613881{5}[source]
The reality is that, when you are running a business, the data you gather is so much more valuable and accurate if you gather consent. Defaulting to a minimal config is just not practical for most businesses either. The decisions that are made with proper tracking data have a real business impact (I can see it myself, working at a client with 7-figure monthly revenue).

I’m fully supportive of consent, but the way it is implemented is impractical from everyone’s POV, and I stand by that.

replies(4): >>44613917 #>>44613943 #>>44614111 #>>44614127 #
7. jonathanlydall ◴[] No.44613896{3}[source]
I recently received an email[0] from a UK entity with an enormous wall of text talking about processing of personal information, my rights and how there is a “Contact Card” of my details on their website.

But with a little bit of reading, one could ultimately summarise the enormous wall of text simply as: “We’ve added your email address to a marketing list, click here to opt out.”

The huge wall of text email was designed to confuse and obfuscate as much as possible with them still being able to claim they weren’t breaking personal data protection laws.

[0]: https://imgur.com/a/aN4wiVp

replies(1): >>44614190 #
8. ta1243 ◴[] No.44613917{6}[source]
Why would I ever want to consent to you abusing my data?
9. user5534762135 ◴[] No.44613943{6}[source]
That is only true if you agree with ad platforms that tracking ads are fundamentally required for businesses, which is trivially untrue for most enterprises. Forcing businesses to get off privacy-violating tracking practices is good, and it's not the EU that's at fault for forcing companies to be open about ad networks' intransigence on that front.
10. mgraczyk ◴[] No.44613989{3}[source]
Even EU government websites have horrible, intrusive cookie banners. You can't blame ad companies; there are no ads on most of those sites.
replies(1): >>44614216 #
11. bfg_9k ◴[] No.44614111{6}[source]
Are you genuinely trying to defend businesses unnecessarily tracking users online? Why can't businesses sell their core product(s) and you know... not track users? If they did that, then they wouldn't need to implement a cookie banner.
replies(3): >>44614226 #>>44614240 #>>44614993 #
12. discreteevent ◴[] No.44614127{6}[source]
> just not practical for most businesses

I don't think practical is the right word here. All the businesses in the world operated without tracking until the mid 90s.

13. tester756 ◴[] No.44614190{4}[source]
>The huge wall of text email was designed to confuse and obfuscate as much as possible with

It is pretty clear

replies(1): >>44614293 #
14. lcnielsen ◴[] No.44614216{4}[source]
Because they track usage stats for site development purposes, and there was no convergence on an agreed upon standard interface for browsers since nobody would respect it. Their banners are at least simple yes/no ones without dark patterns.

But yes, perhaps they should have worked with e.g. Mozilla to develop some kind of standard browser interface for this.

15. fauigerzigerk ◴[] No.44614219{5}[source]
Not even EU institutions themselves are falling back on defaults that don't require cookie consent.

I'm constantly clicking away cookie banners on UK government or NHS (our public healthcare system) websites. The ICO (UK privacy watchdog) requires cookie consent. The EU Data Protection Supervisor wants cookie consent. Almost everyone does.

And you know why that is? It's not because they are scammy ad funded sites or because of government surveillance. It's because the "cookie law" requires consent even for completely reasonable forms of traffic analysis with the sole purpose of improving the site for its visitors.

This is impractical, unreasonable, counterproductive and unintelligent.

replies(3): >>44614559 #>>44614784 #>>44614823 #
16. deanc ◴[] No.44614226{7}[source]
Retargeting etc. is massive revenue for online retailers. I support their right to do it if users consent to it. I don’t support their right to do it if users have not consented.

The conversation is not about my opinion on tracking, anyway. It’s about the impracticality of implementing legislation that is hostile and time-consuming for website owners and users alike.

replies(1): >>44615060 #
17. lcnielsen ◴[] No.44614240{7}[source]
Plus, with any kind of effort put into a standard browser setting, you could easily have some granularity, like: accept anonymous, ephemeral data collected to improve the website, but not data shared with third parties, or anything collected for the purpose of tailoring content or recommendations for you.
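A rough sketch of the kind of granularity such a browser-managed setting could expose (the category names here are invented for illustration):

    // Hypothetical shape of a browser-managed consent preference that
    // sites could read instead of each showing its own banner.
    interface ConsentPreferences {
      essential: true;                 // always allowed: sessions, security
      anonymousSiteAnalytics: boolean; // ephemeral, first-party, site improvement only
      thirdPartySharing: boolean;      // hand-off to external processors/partners
      personalisation: boolean;        // tailoring content or recommendations
    }

    // A conservative default a browser could ship with.
    const conservativeDefault: ConsentPreferences = {
      essential: true,
      anonymousSiteAnalytics: false,
      thirdPartySharing: false,
      personalisation: false,
    };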
18. johnisgood ◴[] No.44614293{5}[source]
Only if you read it. Most people do not read it, same with ToSes.
replies(1): >>44614671 #
19. gkbrk ◴[] No.44614324[source]
> Even for open source models, you can add a license term that requires users of the open source model to take appropriate measures to avoid [...]

You just made the model not open source

replies(3): >>44614685 #>>44614721 #>>44615634 #
20. FirmwareBurner ◴[] No.44614559{6}[source]
>This is impractical, unreasonable, counterproductive and unintelligent.

It keeps the political grifters who make these regulations employed; that's kind of the main point of the EU's and UK's endless stream of regulations upon regulations.

21. octopoc ◴[] No.44614671{6}[source]
If you ask someone if they killed your dog and they respond with a wall of text, then you’re immediately suspicious. You don’t even have to read it all.

The same is true of privacy policies. I’ve seen some companies with very short policies that I could read in less than 30 seconds; those companies are not suspicious.

replies(2): >>44615333 #>>44617435 #
22. LadyCailin ◴[] No.44614685[source]
“Source available” then?
23. badsectoracula ◴[] No.44614721[source]
Instead of a license term you can put that in your documentation - in fact that is exactly what the code of practice mentions (see my other comment) for open source models.
24. troupo ◴[] No.44614784{6}[source]
> It's because the "cookie law" requires consent even for completely reasonable forms of traffic analysis with the sole purpose of improving the site for its visitors

Yup. That's what those 2000+ "partners" are all about if you believe their "legitimate interest" claims: "improve traffic"

25. grues-dinner ◴[] No.44614823{6}[source]
> completely reasonable

This is a personal decision to be made by the data "donor".

The NHS website cookie banner (which does have a correct implementation in that the "no consent" button is of equal prominence to the "mi data es su data" button) says:

> We'd also like to use analytics cookies. These collect feedback and send information about how our site is used to services called Adobe Analytics, Adobe Target, Qualtrics Feedback and Google Analytics. We use this information to improve our site.

In my opinion, it is not, as described, "completely reasonable" to consider such data hand-off to third parties as implicitly consented to. I may trust the NHS but I may not trust their partners.

If the data collected is strictly required for the delivery of the service and is used only for that purpose and destroyed when the purpose is fulfilled (say, login session management), you don't need a banner.
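As a sketch of that distinction (cookie names and attributes here are illustrative, not any particular site's actual setup):

    // Strictly necessary for delivering the service (login session):
    // no consent banner is needed for this.
    function sessionCookie(sessionId: string): string {
      return `session=${sessionId}; Path=/; HttpOnly; Secure; SameSite=Strict`;
    }

    // Analytics-style cookie: this is the part that triggers the consent
    // requirement, so it is only set after an explicit opt-in.
    function analyticsCookie(visitorId: string, optedIn: boolean): string | null {
      return optedIn ? `_analytics=${visitorId}; Path=/; Secure; SameSite=Lax` : null;
    }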

The NHS website is in a slightly tricky position, because I genuinely think they will be trying to use the data for site and service improvement, at least for now, and they hopefully have done their homework to make sure Adobe, say, are also not misusing the data. Do I think the same of, say, the Daily Mail website? Absolutely not; they'll be selling every scrap of data to anyone paying before the TCP connection even closes. Now, I may know the Daily Mail is a wretched hive of villainy and can just not go there, but I do not know about every website I visit. Sadly the scumbags are why no-one gets nice things.

replies(1): >>44615015 #
26. whatevaa ◴[] No.44614949[source]
There is no way to enforce that license. Free software projects don't have the funds for such lawsuits.
27. artathred ◴[] No.44614993{7}[source]
Are you genuinely acting this obtuse? What do you think Walmart and every single retailer does when you walk into a physical store? It’s always constant monitoring to be able to provide a better customer experience. This doesn’t change online: businesses want to improve their service, and they need the data to do so.
replies(2): >>44615030 #>>44615374 #
28. fauigerzigerk ◴[] No.44615015{7}[source]
>This is a personal decision to be made by the data "donor".

My problem is that users cannot make this personal decision based on the cookie consent banners because all sites have to request this consent even if they do exactly what they should be doing in their users' interest. There's no useful signal in this noise.

The worst data harvesters look exactly the same as a site that does basic traffic analysis for basic usability purposes.

The law makes it easy for the worst offenders to hide behind everyone else. That's why I'm calling it counterproductive.

[Edit] Wrt NHS specifically - this is a case in point. They use some tools to analyse traffic in order to improve their website. If they honour their own privacy policy, they will have configured those tools accordingly.

I understand that this can still be criticised from various angles. But is this criticism worth destroying the effectiveness of the law and burying far more important distinctions?

The law makes the NHS and the Daily Mail look exactly the same to users as far as privacy and data protection are concerned. This is completely misleading, don't you think?

replies(2): >>44615348 #>>44615961 #
29. sealeck ◴[] No.44615016[source]
> This is European law, not US. Reasonable means reasonable and judges here are expected to weigh each side's interests and come to a conclusion. Not just a literal interpretation of the law.

I think you've got civil and common law the wrong way round :). US judges have _much_ more power to interpret law!

replies(2): >>44615325 #>>44616612 #
30. owebmaster ◴[] No.44615030{8}[source]
> it’s always constant monitoring to be able to provide a better customer experience

This part gave me a genuine laugh. Good joke.

replies(1): >>44615088 #
31. owebmaster ◴[] No.44615060{8}[source]
> Retargetting etc is massive revenue for online retailers

Drug trafficking, stealing, scams are massive revenue for gangs.

32. tcfhgj ◴[] No.44615071{4}[source]
Just don't process any personal data by default when not inherently required -> no banner required.
33. artathred ◴[] No.44615088{9}[source]
Ah yes, because Walmart wants to harvest your in-store video data so they can eventually clone you, right?

adjusts tinfoil hat

replies(1): >>44616332 #
34. saubeidl ◴[] No.44615325[source]
It is European law, as in EU law, not law from a European state. In EU matters, the teleological interpretation, i.e. intent, applies:

> When interpreting EU law, the CJEU pays particular attention to the aim and purpose of EU law (teleological interpretation), rather than focusing exclusively on the wording of the provisions (linguistic interpretation).

> This is explained by numerous factors, in particular the open-ended and policy-oriented rules of the EU Treaties, as well as by EU legal multilingualism.

> Under the latter principle, all EU law is equally authentic in all language versions. Hence, the Court cannot rely on the wording of a single version, as a national court can, in order to give an interpretation of the legal provision under consideration. Therefore, in order to decode the meaning of a legal rule, the Court analyses it especially in the light of its purpose (teleological interpretation) as well as its context (systemic interpretation).

https://www.europarl.europa.eu/RegData/etudes/BRIE/2017/5993...

replies(1): >>44615527 #
35. 1718627440 ◴[] No.44615333{7}[source]
That's true because of the EU privacy regulation, which makes companies write a wall of text before doing something suspicious.
36. 1718627440 ◴[] No.44615338{4}[source]
I don't have to, because there are add-ons to reject everything.
37. 1718627440 ◴[] No.44615348{8}[source]
> even if they do exactly what they should be doing in their users' interest

If they only do this, they don't need to show anything.

replies(1): >>44615544 #
38. 1718627440 ◴[] No.44615374{8}[source]
If you're talking about the same jurisdiction as these privacy laws, then this is illegal. You are only allowed to retain video for 24 hours and only use it for, basically, calling the police.
replies(1): >>44617377 #
39. chimeracoder ◴[] No.44615527{3}[source]
> It is European law, as in EU law, not law from a European state. In EU matters, the teleological interpretation, i.e. intent, applies

I'm not sure why you and GP are trying to use this point to draw a contrast to the US? That very much is a feature in US law as well.

replies(1): >>44616428 #
40. fauigerzigerk ◴[] No.44615544{9}[source]
Then we clearly disagree on what they should be doing.

And this is the crux of the problem. The law helps a tiny minority of people enforce an extremely (and in my view pointlessly) strict version of privacy at the cost of misleading everybody else into thinking that using analytics for the purpose of making usability improvements is basically the same thing as sending personal data to 500 data brokers to make money off of it.

replies(1): >>44615875 #
41. h4ck_th3_pl4n3t ◴[] No.44615634[source]
An open source cocaine production machine is still an illegal cocaine production machine. The fact that it's open source doesn't matter.

You seem not to have understood that different kinds of appliances need to comply with different kinds of law. And whether you can call it open source or not doesn't change anything about its legal aspects.

And every law written is a compromise between two opposing parties.

42. 1718627440 ◴[] No.44615875{10}[source]
If you are talking for example about invasive A/B tests, then the solution is to pay for testers, not to test on your users.

What exactly do you think should be allowed that still respects privacy, which isn't allowed now?

43. grues-dinner ◴[] No.44615961{8}[source]
I don't think it's too misleading, because in the absence of any other information, they are the same.

What you could then add to this system is a certification scheme that permits implicit consent when all the data handling (including who you hand data off to and what they are allowed to do with it, as well as whether they have demonstrated themselves to be trustworthy) is audited to be compliant with some more stringent requirements. It could even be self-certification along the lines of CE marking. But that requires strict enforcement, and the national regulators have so far been a bunch of wet blankets.

That actually would encourage organisations to find ways to get the information they want without violating the privacy of their users and anyone else who strays into their digital properties.

44. owebmaster ◴[] No.44616332{10}[source]
yeah this one wasn't as funny.
replies(1): >>44617380 #
45. saubeidl ◴[] No.44616428{4}[source]
I will admit my ignorance of the finer details of US law - could you share resources explaining the parallels?
46. lowkey_ ◴[] No.44616612[source]
In the US, for most laws and most judges, there's actually much less power to interpret the law. Part of the benefit of the common law system is to provide consistency and take that interpretive power away from the judge in each case.
47. artathred ◴[] No.44617377{9}[source]
Walmart has sales associates running around gathering all those data points, as well as people standing around monitoring. Their “eyes” aren’t regulated.
48. artathred ◴[] No.44617380{11}[source]
I can see how it hits too close to home for you
49. johnisgood ◴[] No.44617435{7}[source]
I do not disagree. It could indeed be made shorter than usual, especially if you are not malicious.