324 points rntn | 38 comments
ankit219 ◴[] No.44608660[source]
Not just Meta; 40 EU companies urged the EU to postpone the rollout of the AI Act by two years due to its unclear nature. This code of practice is voluntary and goes beyond what is in the act itself. The EU published it in a way that suggests there would be less scrutiny if you voluntarily sign up for the code of practice. Meta would face scrutiny on all ends anyway, so it does not seem plausible for Meta to sign something voluntary.

One of the key aspects of the act is how a model provider is responsible if the downstream partners misuse it in any way. For open source, it's a very hard requirement[1].

> GPAI model providers need to establish reasonable copyright measures to mitigate the risk that a downstream system or application into which a model is integrated generates copyright-infringing outputs, including through avoiding overfitting of their GPAI model. Where a GPAI model is provided to another entity, providers are encouraged to make the conclusion or validity of the contractual provision of the model dependent upon a promise of that entity to take appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works.

[1] https://www.lw.com/en/insights/2024/11/european-commission-r...

replies(8): >>44610592 #>>44610641 #>>44610669 #>>44611112 #>>44612330 #>>44613357 #>>44617228 #>>44620292 #
t0mas88 ◴[] No.44610641[source]
Sounds like a reasonable guideline to me. Even for open source models, you can add a license term that requires users of the open source model to take "appropriate measures to avoid the repeated generation of output that is identical or recognisably similar to protected works"

This is European law, not US. Reasonable means reasonable and judges here are expected to weigh each side's interests and come to a conclusion. Not just a literal interpretation of the law.

replies(4): >>44613578 #>>44614324 #>>44614949 #>>44615016 #
deanc ◴[] No.44613578[source]
Except that it's seemingly impossible to prevent prompt injection. The cat is out of the bag. Much like a lot of other legislation (e.g. the cookie law, or being responsible for user-generated content when millions of items are posted per day), it's entirely impractical, albeit well-meaning.
replies(1): >>44613667 #
lcnielsen ◴[] No.44613667[source]
I don't think the cookie law is that impractical? It's easy to comply with by just not storing non-essential user information. It would have been completely nondisruptive if platforms had agreed to respect users' defaults via browser settings and then converged on a common config interface.

It was made impractical by ad platforms and others who decided to use dark patterns, FUD and malicious compliance to deceive users into agreeing to be tracked.
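
For illustration only, a minimal TypeScript sketch of what "respecting users' defaults via browser settings" could look like on the server side. It assumes the Global Privacy Control signal (the "Sec-GPC: 1" request header); the trackingAllowed helper and the /analytics.js script are hypothetical placeholders, not any site's real implementation.

    // Sketch: honour a browser-level privacy default instead of showing a banner.
    // Assumes the Global Privacy Control header ("Sec-GPC: 1"); the rest is illustrative.
    import { createServer, IncomingMessage, ServerResponse } from "http";

    function trackingAllowed(req: IncomingMessage): boolean {
      // "Sec-GPC: 1" means the user has asked not to be tracked; treat that as their default.
      return req.headers["sec-gpc"] !== "1";
    }

    const server = createServer((req: IncomingMessage, res: ServerResponse) => {
      // Only include the (hypothetical) analytics script if the browser default allows it.
      const analytics = trackingAllowed(req) ? '<script src="/analytics.js"></script>' : "";
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end(`<html><body><p>Hello</p>${analytics}</body></html>`);
    });

    server.listen(8080);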

replies(3): >>44613785 #>>44613896 #>>44613989 #
deanc ◴[] No.44613785[source]
It is impractical for me as a user. I have to click through a notice on every website on the internet before interacting with it. These notices are often very obtuse and don't have a "reject all" button, only a "manage my choices" button that takes you to an even more convoluted menu.

Instead of exactly as you say: a global browser option.

As someone who has had to implement this crap repeatedly, I can't even begin to imagine the amount of global time that has been wasted: everyone implementing it, fixing mistakes related to it and, more importantly, users having to interact with it.

replies(3): >>44613848 #>>44615071 #>>44615338 #
1. lcnielsen ◴[] No.44613848[source]
Yeah, but the only reason for this time wastage is that website operators refuse to accept what would become the fallback default of "minimal", for which they would not need to seek explicit consent. It's a kind of arbitrage, like those scammy websites that send you into redirect loops with enticing headlines.

If anything, the law is written to encourage such defaults; it just wasn't profitable enough, I guess.

replies(2): >>44613881 #>>44614219 #
2. deanc ◴[] No.44613881[source]
The reality is that, when you are running a business, the data you gather is so much more valuable and accurate if you gather consent. Defaulting to a minimal config is just not practical for most businesses either. The decisions that are made with proper tracking data have a real business impact (I can see it myself, working at a client with seven-figure monthly revenue).

I'm fully supportive of consent, but the way it is implemented is impractical from everyone's POV, and I stand by that.

replies(4): >>44613917 #>>44613943 #>>44614111 #>>44614127 #
3. ta1243 ◴[] No.44613917[source]
Why would I ever want to consent to you abusing my data?
4. user5534762135 ◴[] No.44613943[source]
That is only true if you agree with ad platforms that tracking ads are fundamentally required for businesses, which is trivially untrue for most enterprises. Forcing businesses to drop privacy-violating tracking practices is good, and it's not the EU that's at fault for forcing companies to be open about ad networks' intransigence on that point.
5. bfg_9k ◴[] No.44614111[source]
Are you genuinely trying to defend businesses unnecessarily tracking users online? Why can't businesses sell their core product(s) and you know... not track users? If they did that, then they wouldn't need to implement a cookie banner.
replies(3): >>44614226 #>>44614240 #>>44614993 #
6. discreteevent ◴[] No.44614127[source]
> just not practical for most businesses

I don't think practical is the right word here. All the businesses in the world operated without tracking until the mid 90s.

7. fauigerzigerk ◴[] No.44614219[source]
Not even EU institutions themselves are falling back on defaults that don't require cookie consent.

I'm constantly clicking away cookie banners on UK government or NHS (our public healthcare system) websites. The ICO (UK privacy watchdog) requires cookie consent. The EU Data Protection Supervisor wants cookie consent. Almost everyone does.

And you know why that is? It's not because they are scammy ad funded sites or because of government surveillance. It's because the "cookie law" requires consent even for completely reasonable forms of traffic analysis with the sole purpose of improving the site for its visitors.

This is impractical, unreasonable, counterproductive and unintelligent.

replies(3): >>44614559 #>>44614784 #>>44614823 #
8. deanc ◴[] No.44614226{3}[source]
Retargeting etc. is massive revenue for online retailers. I support their right to do it if users consent to it. I don't support their right to do it if users have not consented.

The conversation is not about my opinion on tracking, anyway. It's about the impracticality of implementing legislation that is hostile and time-consuming for website owners and users alike.

replies(1): >>44615060 #
9. lcnielsen ◴[] No.44614240{3}[source]
Plus, with any kind of effort put into a standard browser setting, you could easily have some granularity: for example, accept anonymous, ephemeral data collected to improve the website, but not anything shared with third parties or collected for the purpose of tailoring content or recommendations for you.
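
A hypothetical TypeScript sketch of the kind of granularity such a browser setting could expose; none of these category names come from any real standard or API.

    // Hypothetical browser-managed consent categories; names are illustrative only.
    interface PrivacyPreferences {
      essential: true;              // always on: sessions, security, fraud prevention
      anonymousAnalytics: boolean;  // ephemeral, first-party, aggregate-only statistics
      thirdPartySharing: boolean;   // hand-off to external processors
      personalization: boolean;     // tailoring content or recommendations
    }

    // A site could read browser-supplied defaults like these once and skip the banner entirely.
    const defaults: PrivacyPreferences = {
      essential: true,
      anonymousAnalytics: true,
      thirdPartySharing: false,
      personalization: false,
    };

    function allowedPurposes(prefs: PrivacyPreferences): string[] {
      return Object.entries(prefs)
        .filter(([, allowed]) => allowed)
        .map(([purpose]) => purpose);
    }

    console.log(allowedPurposes(defaults)); // ["essential", "anonymousAnalytics"]
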
10. FirmwareBurner ◴[] No.44614559[source]
>This is impractical, unreasonable, counterproductive and unintelligent.

It keeps the political grifters who make these regulations employed; that's kind of the main point of the EU's and UK's endless stream of regulations upon regulations.

11. troupo ◴[] No.44614784[source]
> It's because the "cookie law" requires consent even for completely reasonable forms of traffic analysis with the sole purpose of improving the site for its visitors

Yup. That's what those 2000+ "partners" are all about if you believe their "legitimate interest" claims: "improve traffic"

12. grues-dinner ◴[] No.44614823[source]
> completely reasonable

This is a personal decision to be made by the data "donor".

The NHS website cookie banner (which does have a correct implementation in that the "no consent" button is of equal prominence to the "mi data es su data" button) says:

> We'd also like to use analytics cookies. These collect feedback and send information about how our site is used to services called Adobe Analytics, Adobe Target, Qualtrics Feedback and Google Analytics. We use this information to improve our site.

In my opinion, it is not, as described, "completely reasonable" to consider such data hand-off to third parties as implicitly consented to. I may trust the NHS but I may not trust their partners.

If the data collected is strictly required for the delivery of the service and is used only for that purpose and destroyed when the purpose is fulfilled (say, login session management), you don't need a banner.

The NHS website is in a slightly tricky position, because I genuinely think they will be trying to use the data for site and service improvement, at least for now, and they have hopefully done their homework to make sure Adobe, say, are also not misusing the data. Do I think the same of, say, the Daily Mail website? Absolutely not; they'll be selling every scrap of data to anyone paying before the TCP connection even closes. Now, I may know the Daily Mail is a wretched hive of villainy and can just not go there, but I do not know about every website I visit. Sadly the scumbags are why no one gets nice things.
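
As a rough illustration of that distinction, a TypeScript sketch for a browser page: the session cookie needs no banner, while the analytics hand-off is gated on consent. hasConsented and the analytics URL are hypothetical placeholders for whatever consent mechanism and vendor a site actually uses.

    // Strictly necessary, used only to deliver the service: no consent banner required.
    document.cookie = "session=abc123; Secure; SameSite=Strict; Path=/";

    // Hypothetical consent lookup; a real site would back this with its own consent record.
    function hasConsented(purpose: string): boolean {
      return localStorage.getItem("consent:" + purpose) === "granted";
    }

    // Hand-off to a third-party analytics service: load only after an explicit opt-in.
    if (hasConsented("analytics")) {
      const s = document.createElement("script");
      s.src = "https://analytics.example.com/collect.js"; // placeholder third-party endpoint
      document.head.appendChild(s);
    }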

replies(1): >>44615015 #
13. artathred ◴[] No.44614993{3}[source]
Are you genuinely acting this obtuse? What do you think Walmart and every single retailer does when you walk into a physical store? It's always constant monitoring, to be able to provide a better customer experience. This doesn't change online: businesses want to improve their service, and they need the data to do so.
replies(2): >>44615030 #>>44615374 #
14. fauigerzigerk ◴[] No.44615015{3}[source]
>This is a personal decision to be made by the data "donor".

My problem is that users cannot make this personal decision based on the cookie consent banners because all sites have to request this consent even if they do exactly what they should be doing in their users' interest. There's no useful signal in this noise.

The worst data harvesters look exactly the same as a site that does basic traffic analysis for basic usability purposes.

The law makes it easy for the worst offenders to hide behind everyone else. That's why I'm calling it counterproductive.

[Edit] Wrt NHS specifically - this is a case in point. They use some tools to analyse traffic in order to improve their website. If they honour their own privacy policy, they will have configured those tools accordingly.

I understand that this can still be criticised from various angles. But is this criticism worth destroying the effectiveness of the law and burying far more important distinctions?

The law makes the NHS and the Daily Mail look exactly the same to users as far as privacy and data protection are concerned. This is completely misleading, don't you think?

replies(2): >>44615348 #>>44615961 #
15. owebmaster ◴[] No.44615030{4}[source]
> it’s always constant monitoring to be able to provide a better customer experience

This part gave me a genuine laugh. Good joke.

replies(1): >>44615088 #
16. owebmaster ◴[] No.44615060{4}[source]
> Retargetting etc is massive revenue for online retailers

Drug trafficking, stealing, scams are massive revenue for gangs.

replies(1): >>44620087 #
17. artathred ◴[] No.44615088{5}[source]
ah yes because walmart wants to harvest your in-store video data so they can eventually clone you right?

adjusts tinfoil hat

replies(1): >>44616332 #
18. 1718627440 ◴[] No.44615348{4}[source]
> even if they do exactly what they should be doing in their users' interest

If they only do this, they don't need to show anything.

replies(1): >>44615544 #
19. 1718627440 ◴[] No.44615374{4}[source]
If you're talking about the same jurisdiction as these privacy laws, then this is illegal. You are only allowed to retain video for 24 hours, and only to use it for, basically, calling the police.
replies(1): >>44617377 #
20. fauigerzigerk ◴[] No.44615544{5}[source]
Then we clearly disagree on what they should be doing.

And this is the crux of the problem. The law helps a tiny minority of people enforce an extremely (and in my view pointlessly) strict version of privacy at the cost of misleading everybody else into thinking that using analytics for the purpose of making usability improvements is basically the same thing as sending personal data to 500 data brokers to make money off of it.

replies(1): >>44615875 #
21. 1718627440 ◴[] No.44615875{6}[source]
If you are talking, for example, about invasive A/B tests, then the solution is to pay for testers, not to test on your users.

What exactly do you think should be allowed, that still respects privacy, but isn't allowed now?

replies(1): >>44618118 #
22. grues-dinner ◴[] No.44615961{4}[source]
I don't think it's too misleading, because in the absence of any other information, they are the same.

What you could then add to this system is a certification scheme that permits implicit consent where all of the data handling (including who you hand data off to and what they are allowed to do with it, as well as whether they have demonstrated themselves to be trustworthy) is audited to be compliant with some more stringent requirements. It could even be self-certification along the lines of CE marking. But that requires strict enforcement, and the national regulators so far have been a bunch of wet blankets.

That actually would encourage organisations to find ways to get the information they want without violating the privacy of their users and anyone else who strays into their digital properties.

replies(1): >>44618052 #
23. owebmaster ◴[] No.44616332{6}[source]
yeah this one wasn't as funny.
replies(1): >>44617380 #
24. artathred ◴[] No.44617377{5}[source]
Walmart has sales associates running around gathering all those data points, as well as people standing around monitoring. Their "eyes" aren't regulated.
replies(1): >>44617618 #
25. artathred ◴[] No.44617380{7}[source]
I can see how it hits too close to home for you
26. 1718627440 ◴[] No.44617618{6}[source]
Walmart in the EU?
replies(1): >>44618997 #
27. fauigerzigerk ◴[] No.44618052{5}[source]
>I don't think it's too misleading, because in the absence of any other information, they are the same.

But the other information is not absent, and we know that they are not the same. Just compare their privacy policies, for instance. The cookie law makes them appear similar in spite of the fact that they are very different (as of now; who knows what will happen to the NHS).

replies(1): >>44618647 #
28. fauigerzigerk ◴[] No.44618118{7}[source]
I would draw the line where my personal data is exchanged with third parties for the purpose of monetisation. I want the websites I visit to be islands that do not contribute to anyone's attempt to create a complete profile of my online (and indeed offline) life.

I don't care about anything else. They can do whatever A/B testing they want as far as I'm concerned. They can analyse my user journey across multiple visits. They can do segmentation to see how they can best serve different groups of users. They can store my previous search terms, choices and preferences. If it's a shop, they can rank products according to what they think might interest me based on previous visits. These things will likely make the site better for me or at least not much worse.

Other people will surely disagree. That's fine. What's more important than where exactly to draw the line is to recognise that there are trade-offs.

The law seems to make the assumption that the less sites can do without asking for consent, the better most people's privacy will be protected.

But this is a flawed idea, because it creates an opportunity for sites to withhold useful features from people unless and until they consent to a complete loss of privacy.

Other sites that want to provide those features without complete loss of privacy cannot distinguish themselves by not asking for consent.

Part of the problem is the overly strict interpretation of "strictly necessary" by data protection agencies. There are some features that could be seen as strictly necessary for normal usability (such as remembering preferences) but this is not consistently accepted by data protection agencies so sites will still ask for consent to be on the safe side.

29. grues-dinner ◴[] No.44618647{6}[source]
I do understand the point, but other than allowing an auditing process to create a middle ground of consent implied for first-party use only, within some strictly defined boundaries, what else can you do? It's a market for lemons in terms of trustworthy data processors. 90% (bum-pull figure, but it lines up with the number of websites that play silly buggers with hiding the no-consent button) of all people who want to use data will be up to no good and immediately try to bend and break every rule.

I would also be in favour of companies having to report all their negative data protection judgements against them and everyone they will share your data with in their cookie banner before giving you the choice as to whether you trust them.

replies(1): >>44619551 #
30. artathred ◴[] No.44618997{7}[source]
Replace Walmart with Tesco or your EU retailer of choice; the point still holds.

Playing with semantics makes you sound smart, though!

replies(1): >>44619233 #
31. 1718627440 ◴[] No.44619233{8}[source]
The question still stands then: Does it happen in Tesco in the EU? Because that is illegal.

The original idea was that it should be legal to track people online because it is OK in the analog world. But it really isn't OK there either, and I'm glad it is illegal in the EU. I think it should be in the US as well, but the EU can't change that, and I have no right to political influence over foreign countries, so that doesn't matter.

replies(1): >>44619493 #
32. artathred ◴[] No.44619493{9}[source]
It's illegal for Tesco to have any number of employees watching/monitoring/"tracking" in the store with their own eyes and using those in-store insights to drive better customer experiences?
replies(1): >>44622408 #
33. fauigerzigerk ◴[] No.44619551{7}[source]
If any rule is going to be broken and impossible to enforce, how can that be a justification for keeping a bad rule rather than replacing it with a more sensible one?
replies(1): >>44623568 #
34. JumpinJack_Cash ◴[] No.44620087{5}[source]
Bro, can you send me a link to the RJ community WhatsApp?

kwaigdc7 @ gmail.com

replies(1): >>44621227 #
35. owebmaster ◴[] No.44621227{6}[source]
hey! Which one? you can find them here: https://nomadbrazil.notion.site/Rio-WhatsApp-Groups-cc9ae8b8...
36. 1718627440 ◴[] No.44622408{10}[source]
Compiling statistics about sex, age, number of children, clothing choice and walking speed without consent sounds illegal. I think it isn't only forbidden for the company, but already for the individual, because that's voyeuristic behaviour.

Watching what is bought is fine, but walking around to do that is useless work, because you have that in the accounting/sales data already.

There is stuff like PayPal and now per-company apps, which work the same as on the web: you first need to sign a contract. I would rather that be cracked down on, but I see that it is difficult, because you can't forbid individual choice. But I think the incentive is that products become cheaper when you opt in to data collection. This is already forbidden, though: you can't tie consent to other benefits, because then it isn't freely given consent anymore. I expect a lawsuit in the next decades.

replies(1): >>44627086 #
37. grues-dinner ◴[] No.44623568{8}[source]
I said they'd try to break them, which requires vigilance and regulators stepping in with an enormous hammer. So far national regulators have been pretty weaksauce, which is indeed very frustrating.

I'm not against improving the system, and I even proposed something, but I am against letting data abusers run riot because the current system isn't quite 100% perfect.

I'll still take what we have over what we had before (nothing, good luck everyone).

38. artathred ◴[] No.44627086{11}[source]
EXTREMELY curious to see where in EU law it states that it is illegal for a store to create internal reports based on purely VISUAL statistics that employees can observe, like walking speed, sex, number of children, etc.