
747 points porridgeraisin | 1 comment
ljosifov | No.45064773
Excellent. What were they waiting for up to now?? I thought they already trained on my data. I assume they train, and even hope that they train, even when they say they don't. People who want to be data privacy maximalists - fine, don't use their data. But there are people out there (myself included) at the opposite end of the spectrum, and we are mostly ignored by the companies. Companies just assume people only ever want to deny them their data.

It annoys me greatly that I have no tick box on Google to tell them "go and adapt the models I use on my Gmail, Photos, Maps etc." I don't want Google to ever be mistaken about where I live - I have told them 100 times already.

This idea that "no one wants to share their data" is just assumed, and it permeates everything. Like the soft-ball interviews that a popular science communicator did with DeepMind folks working in medicine: every question was prefixed by a litany of caveats, all about 1) people's assumed aversion to sharing their data and 2) the horrors and disasters that are to befall us should we share it. I have not suffered any horrors. I'm not aware of any major disasters. I am aware of major advances in medicine in my lifetime, and ultimately that process does involve controlled data collection and experimentation. Looks like a good deal to me tbh. I go out of my way to tick all the NHS boxes too, to "use my data as you see fit". It's an uphill struggle. The defaults are always "deny everything". The tick boxes never go away; there is no master checkbox "use any and all of my data and never ask me again" to tick.

bgwalter | No.45065065
I realize this might be satire. If not, you are using the same aggressive table-turning strategy as Palantir:

https://www.theguardian.com/technology/2025/jul/08/palantir-...

Most people do want to deny companies their data, as we have recently seen in the various DOGE backlashes.

ljosifov | No.45072978
It's not satire; you can check my comments on this topic easily.

I dispute 'most people'. The revealed preference of most people is that they value their data privacy very cheaply, at almost zero. Even one extra click to share their data less is one click too many, too high an effort - for most people. This is their real, observed behaviour. I think our current predicament is a case of "public lies, private truths": a small cadre of vocal proponents of a particular view established "the ground truth" of what is desirable (in this case - maximum privacy, ideally zero information sharing). The public goes along with it in words, pays lip service - but in reality behaves differently, even opposite to what they say they desire.

And even if 'most people' wanted what you say they do, I still think the companies could and should accommodate a minority like myself that wants the opposite of what 'most people' want. I don't think the will of the majority is the highest ideal - not so high that it trumps what I personally want.