
747 points porridgeraisin | 8 comments
ljosifov ◴[] No.45064773[source]
Excellent. What were they waiting for up to now?? I thought they already trained on my data. I assume they train, and even hope that they train, even when they say they don't. People who want to be data privacy maximalists - fine, don't use their data. But there are people out there (myself included) on the opposite end of the spectrum, and we are mostly ignored by the companies. Companies just assume people only ever want to deny them their data.

It annoys me greatly that I have no tick box on Google to tell them "go and adapt the models I use on my Gmail, Photos, Maps etc." I don't want Google to ever be mistaken about where I live - I have told them 100 times already.

This idea that "no one wants to share their data" is just assumed, and permeates everything. Like soft-ball interviews that a popular science communicator did with DeepMind folks working in medicine: every question was prefixed by litany of caveats that were all about 1) assumed aversion of people to sharing their data 2) horrors and disasters that are to befall us should we share the data. I have not suffered any horrors. I'm not aware of any major disasters. I'm aware of major advances in medicine in my lifetime. Ultimately the process does involve controlled data collection and experimentation. Looks a good deal to me tbh. I go out of my way to tick all the NHS boxes too, to "use my data as you see fit". It's an uphill struggle. The defaults are always "deny everything". Tick boxes never go away, there is no master checkbox "use any and all of my data and never ask me again" to tick.

replies(16): >>45064814 #>>45064872 #>>45064877 #>>45064889 #>>45064911 #>>45064921 #>>45064967 #>>45064974 #>>45064988 #>>45065001 #>>45065005 #>>45065065 #>>45065128 #>>45065333 #>>45065457 #>>45065554 #
12ian34 ◴[] No.45064814[source]
not remotely worried about leaks, hacks, or sinister usage of your data?
replies(3): >>45064920 #>>45065057 #>>45072864 #
1. londons_explore ◴[] No.45064920[source]
I would far prefer the service use my data to work better and take a few privacy risks.

People die all the time from cancer or car accidents. People very rarely die from data leaks.

Some countries like Sweden make people's private financial data public information - and yet their people seem happier than ever. Perhaps privacy isn't as important as we think for a good society.

replies(6): >>45065000 #>>45065055 #>>45065141 #>>45065772 #>>45065823 #>>45066321 #
2. soiltype ◴[] No.45065000[source]
Public/private isn't a binary, it's a spectrum. We Americans mostly sit in the shithole middle ground where our data is widely disseminated among private, for-profit actors, for the explicit purpose of being used to manipulate us, but is mostly not available to us, creating an asymmetric power balance.
replies(1): >>45066547 #
3. 12ian34 ◴[] No.45065055[source]
Sweden is a very poor example: all that is public is personal taxable income. That's it. You're comparing apples to oranges. And how are your home address and AI chatbot history going to cure cancer?
4. Gud ◴[] No.45065141[source]
That financial data is very limited. Would it be just as acceptable if these companies knew where and what you purchased?
5. ◴[] No.45065772[source]
6. nojs ◴[] No.45065823[source]
Would you be comfortable posting all of this information here, right now? Your name, address, email address, search history, ChatGPT history, emails, …

If not, why?

7. ljosifov ◴[] No.45066321[source]
In the past I have found the obstacles to data sharing codified in UK law frustrating. I'm reasonably sure some people have died because of this who would not have died otherwise, if only they could have communicated with the NHS the way they communicate in their private and professional lives (email, WhatsApp).

Within the UK NHS and UK private hospital care, these are my personal experiences.

1) I can't email my GP to pass information back and forth. The GP withholds their email contact, so I can't send them e.g. pictures of scans or lab work reports. In theory they should already have those on their side. In practice they rarely do. The exchange of information goes SMS -> web link -> web form -> submit - for one single turn. There will be multiple turns. Most people just give up.

2) The private hospital that did my MRI scan made me jump through 10 hoops before sending me a link to download my MRI scan videos and pictures. Most people would have given up. There were several forks in the process that, in retrospect, could have delayed the download even more.

3) Blood test scheduling can't tell me that scheduling a test for a given date failed. Apparently it's somewhere between too much work and impossible for them to keep my email address on record and email me back that the test was scheduled, or that the scheduling failed and I should re-run the process.

4) I would like to volunteer my data to benefit R&D in the NHS. I'm a user of medical services. I'm cognisant that all of those help me, and that the process of establishing them relied on people unknown to me sharing very sensitive personal information. If it weren't for those unknown-to-me people, I would be far worse off. I'd like to do the same, and be able to tell the UK NHS "here are my lab work reports, 100 GB of my DNA paid for by myself, my medical histories - take them all in, use them as you please."

In all cases vague mutterings of "data protection... GDPR..." have been relayed back as "reasons". I take it that's mostly B/S. Yes, there are obstacles, but the staff could work around them if they wanted to. However, there is a kernel of truth - it's easier for them not to try to share, it's less work and less risk, so the laws are used as a fig leaf (in the worst case, an alibi for laziness).

8. ljosifov ◴[] No.45066547[source]
I agree with your stance there. Further - the conventional response to the power imbalance that comes from the information imbalance (state/business know a lot about me; I know little about them) is that we citizens and consumers should reduce our "information surface" towards them, and address the imbalance that way. But.

There exists another, often unmentioned option: for state/business to open up, to increase their "information surface" towards us, their citizens/consumers. That would also rebalance information (and, one hopes, power). Every time it's actually measured how much value we put on our privacy - when we have to weigh privacy against convenience and other gains from more data sharing - the revealed preference is close to zero. We put the value of our privacy close to zero, despite forever saying otherwise (that we value privacy very, very much; it seems "it ain't so").

So the option of state/business revealing more data to us citizens/consumers is actually more realistic. Yes, there is extra work on the part of state/business to open their data to us. But it's worth it. The more advanced the society, the more coordination it needs to achieve the right cooperation-competition balance in interactions between ever greater numbers of people.

There is an old book, "Data for the People", by the early AI pioneer and former Amazon chief scientist Andreas Weigend. Afaics it describes well the world we live in, and the one we are likely to live in even more in the future.