
747 points porridgeraisin | 55 comments
1. ljosifov ◴[] No.45064773[source]
Excellent. What were they waiting for up to now?? I thought they already trained on my data. I assume they train, even hope that they train, even when they say they don't. People that want to be data privacy maximalists - fine, don't use their data. But there are people out there (myself) that are on the opposite end of the spectrum, and we are mostly ignored by the companies. Companies just assume people only ever want to deny them their data.

It annoys me greatly that I have no tick box on Google to tell them "go and adapt the models I use on my Gmail, Photos, Maps etc." I don't want Google to ever be mistaken about where I live - I have told them 100 times already.

This idea that "no one wants to share their data" is just assumed, and permeates everything. Like the soft-ball interviews that a popular science communicator did with DeepMind folks working in medicine: every question was prefixed by a litany of caveats that were all about 1) the assumed aversion of people to sharing their data 2) the horrors and disasters that are to befall us should we share the data. I have not suffered any horrors. I'm not aware of any major disasters. I'm aware of major advances in medicine in my lifetime. Ultimately the process does involve controlled data collection and experimentation. Looks like a good deal to me tbh. I go out of my way to tick all the NHS boxes too, to "use my data as you see fit". It's an uphill struggle. The defaults are always "deny everything". Tick boxes never go away; there is no master checkbox "use any and all of my data and never ask me again" to tick.

replies(16): >>45064814 #>>45064872 #>>45064877 #>>45064889 #>>45064911 #>>45064921 #>>45064967 #>>45064974 #>>45064988 #>>45065001 #>>45065005 #>>45065065 #>>45065128 #>>45065333 #>>45065457 #>>45065554 #
2. 12ian34 ◴[] No.45064814[source]
not remotely worried about leaks, hacks, or sinister usage of your data?
replies(3): >>45064920 #>>45065057 #>>45072864 #
3. j4hdufd8 ◴[] No.45064872[source]
> But there are people out there (myself) that are on the opposite end of the spectrum, and we are mostly ignored by the companies. Companies just assume people only ever want to deny them their data.

What? I think you're exactly the kind of person that companies pay attention to, and why they pull moves like this

replies(1): >>45072923 #
4. blipmusic ◴[] No.45064877[source]
My life does in fact have priorities above "LLMs should work a bit better".
5. j4hdufd8 ◴[] No.45064889[source]
Are you okay getting ads for shit holistic medication because you had a mental health conversation with AI?
6. imiric ◴[] No.45064911[source]
What a ridiculous stance.

Do you lock your front door, or use passwords on any of your accounts? Because what you're essentially saying is that you're OK with strangers having access to your personal information. That's beyond the already flawed "I have nothing to hide" argument.

replies(1): >>45064928 #
7. londons_explore ◴[] No.45064920[source]
I would far prefer the service use my data to work better and take a few privacy risks.

People die all the time from cancer or car accidents. People very rarely die from data leaks.

Some countries like Sweden make people's private financial data public information - and yet their people seem happier than ever. Perhaps privacy isn't as important as we think for a good society.

replies(6): >>45065000 #>>45065055 #>>45065141 #>>45065772 #>>45065823 #>>45066321 #
8. franga2000 ◴[] No.45064921[source]
Try living in a place with privatised health insurance and you'll quickly see why medical data is some of the most important to keep private.
9. JumpCrisscross ◴[] No.45064928[source]
> Do you lock your front door

In trusted neighborhoods? No. But that respect goes both ways.

10. behnamoh ◴[] No.45064967[source]
Are you trolling us, or do you live in a hypothetical world where companies have our best interests at heart?
11. beepbooptheory ◴[] No.45064974[source]
This may very well be a rational stance, but either way, I wish one could somehow teleport this sentiment to the Cypherpunk mailing list in the 80s/90s. Of all the things they projected, concocted, fought for... nothing could truly prepare them for this kind of thing: the final victory of the product over the people, the happy acceptance of surveillance. They were all imagining terrible dystopias born of state violence and repression; never could they begin to imagine it could all transpire anyway because people like not having to type their address in!

> The angel would like to stay, awaken the dead, and make whole what has been smashed. But a storm is blowing from Paradise; it has got caught in his wings with such violence that the angel can no longer close them. The storm irresistibly propels him into the future to which his back is turned, while the pile of debris before him grows skyward. This storm is what we call progress.

replies(1): >>45065638 #
12. 827a ◴[] No.45064988[source]
It's incredible to me how seriously people can hold an opinion they've so clearly done so little to critically interrogate.
replies(1): >>45065758 #
13. soiltype ◴[] No.45065000{3}[source]
public/private isn't a binary, it's a spectrum. We Americans mostly sit in the shithole middle ground where our data is widely disseminated among private, for-profit actors, for the explicit purpose of being used to manipulate us, but it's mostly not available to us, creating an asymmetric power balance.
replies(1): >>45066547 #
14. Razengan ◴[] No.45065001[source]
> I'm not aware of any major disasters.

Oh boy. Did you somehow miss all the news about data leaks and password dumps etc. being sold on the "dark" web and shit?

Would you mind if I followed you around and noted everything you do and constantly demanded your attention?

The shit done by corporations is akin to a clingy stalker and would be absolutely despised if it was an individual person doing something like that.

As for benefits, which?? In my entire life I have never seen an ad for anything (that I did not already know about via other means) that made me want to look up the product, let alone buy it. Nor do I know anyone who did. In fact, it turns me off from a product if its ad appears too frequently.

Google etc. and various storefronts also almost never recommend me anything that actually matches my interests, beyond just a shallow word similarity, in fact they forcibly shove completely unrelated shit into my searches cause they were paid to. Like searching for RPGs and seeing Candy f'ing Crush.

----

You know what though, I kinda agree with the potential intent behind your charade:

Yes, LET ME TELL YOU ABOUT ME.

I will gladly TELL companies EXACTLY what I like, and I WANT you to use that. Show me other shit that is actually relevant to MY interests instead of the interests of whomever paid you to shove their shit into my face.

ASK! DON'T SPY! Because you can't ever get it right anyway!

15. calmbonsai ◴[] No.45065005[source]
I don't think you understand how...humanity works?! Is this deliberate parody?

Abuse of medical data is just the tip of the iceberg here and, at least in the States, privatized healthcare presents all sorts of for-profit pricing abuse scenarios, to say nothing of nasty scenarios for social coercion.

replies(1): >>45072994 #
16. 12ian34 ◴[] No.45065055{3}[source]
Sweden is a very poor example: all that is public is personal taxable income. That's it. You're comparing apples to oranges. And how are your home address and AI chatbot history going to cure cancer?
17. ljosifov ◴[] No.45065057[source]
If they leaked bank account numbers, or private keys - I would be worried. That has not happened in the past.

About myself personally - my Name Surname is googleable, I'm on the open electoral register, so my address is not a secret, my company information is also open in the companies register, and I have a personal website I have put up willingly and share information about myself there. Training models on my data doesn't seem riskier than that.

Yeah, I know I'd be safer if I was completely dark, opaque to the world. I like the openness though. I also think my life has been enriched in infinitely many ways by people sharing parts of their lives via their data with me. So it would be mildly sociopathic of me, if I didn't do similar back to the world, to some extent.

replies(2): >>45065103 #>>45068227 #
18. bgwalter ◴[] No.45065065[source]
I realize this might be satire. If not, you are using the same aggressive turning-the-tables strategy as Palantir:

https://www.theguardian.com/technology/2025/jul/08/palantir-...

Most people do want to deny their data, as we have recently seen in various DOGE backlashes.

replies(1): >>45072978 #
19. 12ian34 ◴[] No.45065103{3}[source]
So you are projecting sociopathy on those that choose to keep their lives more private than you? Like you said, basic personal details are essentially public knowledge anyway. Where do you draw the line personally on what should be private?
replies(1): >>45065529 #
20. Gud ◴[] No.45065128[source]
Have you considered the drawbacks of sharing your data with the most unscrupulous people on this planet?
replies(1): >>45072724 #
21. Gud ◴[] No.45065141{3}[source]
That financial data is very limited. Would it be just as acceptable if these companies knew where and what you purchased?
22. AlexandrB ◴[] No.45065333[source]
I think I'd have more understanding for this position if I thought that these companies were still fundamentally interested in serving their users. They are not. Any information you provide is more likely to be used against your interests (even if that's "just" targeting you with some ads for a scammy product) than for your benefit.

Basically all AI companies are fruit from the same VC-poisoned tree and I expect these products will get worse and more user-hostile as they try to monetize. We're currently living in the "MoviePass"[1] era of AI where users are being heavily subsidized to try to gain market share. It will not last and the potential for abuse is enormous.

[1] https://en.wikipedia.org/wiki/MoviePass

replies(1): >>45068317 #
23. mrbombastic ◴[] No.45065457[source]
You should read up on the Dutch Civil Registry and the holocaust in the Netherlands and reevaluate if you are serious. I would love to live in a world where everyone had good intentions and the powers that be wouldn’t abuse data to their ends, we will never live in that world.
24. ljosifov ◴[] No.45065529{4}[source]
Not at all - on the contrary, I chose my words carefully ("mildly sociopathic OF ME") so as to avoid casting shade on others. Saying "this is how I feel", so as to preclude judging others. Everyone makes their own choices, and that's fine.

Boundaries - yes sure they exist. I don't have my photo albums open to the world. I don't share info about family and friends - I know people by default don't want to share information about them, and I try to respect that. Don't share anything on Facebook, where plenty share, for example.

At the same time, I find the obstacles to data sharing codified in UK law frustrating. With the UK NHS: 1) Can't email my GP to pass information back and forth - the GP withholds their email contact; 2) The private hospital that did my MRI scan makes me jump through 10 hops before sharing my data with me; 3) Blood test scheduling can't tell me that scheduling for a date failed - apparently it's too much for them to have my email address on record; 4) Can't volunteer my data to benefit R&D in the NHS. ("here are - my lab works reports, 100 GB of my DNA paid for by myself, my medical histories - take them all in, use them as you please...") In all cases vague mutterings of "data protection... GDPR..." have been relayed back as "reasons". I take it that's mostly B/S. They could work around it if they wanted to. But there is a kernel of truth - it's easier for them to not try to share, so the law is used as a fig leaf. (in the worst case - an alibi for laziness.)

I'm for having the power to share, or not share, what I want. With Google - I do want them to know about me and use that for my (and their) benefit. With the UK gov (trying to break encryption) - I don't want them to be able to read my WhatsApps. I detest the UK gov for effectively forcing me (by forcing the online pharmacy) to take photos of myself (face, figure) in order to buy Wegovy online earlier today.

replies(1): >>45066579 #
25. soraminazuki ◴[] No.45065554[source]
You can train commercial AI with your data right now without screwing over everyone else on the planet. It's easy, just publish your entire trove of personal data on a website and AI crawlers will happily gobble it all up. You can publish your name, home address, work, government-issued ID, financial transactions, chats, browser history, location history, surveillance footage of your home, all for free. So what are you waiting for? Just do it now if you want to share data that badly, there's no need to wait for the approval of "privacy maximalists."
replies(1): >>45072819 #
26. ljosifov ◴[] No.45065638[source]
Well Yes and No. Funny you mention the 80s/90s. I grew up in the pre-Internet world. I remember home computers, then PCs, then modems to access BBSes, then FIDO, uucp email, the academic Internet, and then the private commercial Internet after 1990. On some parts of the privacy agenda I'm strongly pro-privacy - the more the better. I don't want encryption broken. The UK gov (I live in the UK) are being morons for that, forever trying that play. Atm there is at least part of the US admin pushing back on that. I don't like the UK Parliament forcing the online ID on me. I'm pro private citizens having private keys on un-snoop-able dongle devices.
27. soraminazuki ◴[] No.45065758[source]
It makes sense when you see it as indoctrination rather than a mere opinion. Quashing critical thinking is the point. How else can you convince people to work against their own interests?
replies(1): >>45065963 #
28. ◴[] No.45065772{3}[source]
29. nojs ◴[] No.45065823{3}[source]
Would you be comfortable posting all of this information here, right now? Your name, address, email address, search history, ChatGPT history, emails, …

If not, why?

30. ljosifov ◴[] No.45065963{3}[source]
I put it to you - consider that you may be wrong. That I indeed know what's best for me. The same way my default is that you know what's best for you. "Critical thinking" and "indoctrination" - you are on the path to the dark side there. I grew up in a socialist/communist country. One of the ways in which the vast majority of the population were oppressed, mistreated, or worse, was by being denied agency and the capability for critical thinking, for recognising their own interests, by a mechanism called "false consciousness". The ideas you expressed in your comment are of a similar kind.
replies(1): >>45066163 #
31. soraminazuki ◴[] No.45066163{4}[source]
Says the person advocating for companies to get rid of consent, the bare minimum they can do when screwing over people for profit. That's not deciding what's best for you. That's you unilaterally deciding that no one deserves consumer protection. You are trying to force on everyone what 96% of people are opposed to [1]. So don't you dare pull off that DARVO nonsense and accuse me of being an oppressive dictator.

Also in what universe are utter fantasies like "'no one wants to share their data' is just assumed" or "the defaults are always 'deny everything'" true? Tech companies are bypassing user consent all [2] the [3] time [4].

[1]: https://arstechnica.com/gadgets/2021/05/96-of-us-users-opt-o...

[2]: https://hn.algolia.com/?q=opt%20out

[3]: https://hn.algolia.com/?q=opt%20in

[4]: https://hn.algolia.com/?q=consent

replies(1): >>45067835 #
32. ljosifov ◴[] No.45066321{3}[source]
In the past I have found the obstacles to data sharing codified in UK law frustrating. I'm reasonably sure some people will have died because of this who would not have died otherwise - if they could have communicated with the NHS similarly (email, WhatsApp) to how they communicate in their private and professional lives.

Within the UK NHS and UK private hospital care, these are my personal experiences.

1) Can't email my GP to pass information back and forth. The GP withholds their email contact, so I can't email them e.g. pictures of scans or lab work reports. In theory they should already have those on their side. In practice they rarely do. The exchange of information goes sms->web link->web form->submit - for one single turn. There will be multiple turns. Most people just give up.

2) The private hospital that did my MRI scan made me jump through 10 hops before sending me a link so I could download my MRI scan videos and pictures. Most people would have given up. There were several forks in the process which in retrospect could have delayed the data DL even more.

3) Blood test scheduling can't tell me that a scheduled blood test for a date failed. Apparently it's somewhere between too much trouble and impossible for them to have my email address on record and email me back that the test was scheduled, or that the scheduling failed and I should re-run the process.

4) I would like to volunteer my data to benefit R&D in the NHS. I'm a user of medical services. I'm cognisant that all of those help, but the process of establishing them relied on people unknown to me sharing very sensitive personal information. If it wasn't for those people unknown to me, I would be way worse off. I'd like to do the same, and be able to tell the UK NHS "here are my lab work reports, 100 GB of my DNA paid for by myself, my medical histories - take them all in, use them as you please."

In all cases vague mutterings of "data protection... GDPR..." have been relayed back as "reasons". I take it that's mostly B/S. Yes there are obstacles, but the staff could work around them if they wanted to. However there is a kernel of truth - it's easier for them to not try to share, it's less work and less risk, so the laws are used as a fig leaf. (in the worst case - an alibi for laziness.)

33. ljosifov ◴[] No.45066547{4}[source]
I agree with your stance there. Further - the conventional opinion, given the power imbalance coming from the information imbalance (state/business know a lot about me; I know little about them), is that we citizens and consumers should reduce our "information surface" towards them. And address the imbalance that way. But.

There exists another, often unmentioned option. And that option is for state/business to open up, to increase their "information surface" towards us, their citizens/consumers. That will also achieve an information (and one hopes power) rebalance. Every time it's actually measured - how much value we put on our privacy when we have to weigh privacy against convenience and other gains from more data sharing - the revealed preference is close to zero. The revealed preference is that we put the value of our privacy close to zero, despite us forever saying otherwise. (that we value privacy very very much; seems - "it ain't so")

So the option of state/business revealing more data to us citizens/consumers, is actually more realistic. Yes there is extra work on part of state/business to open their data to us. But it's worth it. The more advanced the society, the more coordination it needs to achieve the right cooperation-competition balance in the interactions between ever greater numbers of people.

There is an old book "Data For the People" by an early AI pioneer and former Amazon chief scientist, Andreas Weigend. Afaics it well describes the world we live in, and the one we are likely to live in even more in the future.

34. 12ian34 ◴[] No.45066579{5}[source]
Thanks for this considered response. I find it difficult to disagree with anything you said in this particular comment :) however I do think each instance you mention in this message is quite different to the topic at hand, regarding the big tech data machine. Additionally, I think I would rather our UK level of privacy regarding healthcare data than the commercialised free for all in the US. One counterpoint could be that Palantir got a significant amount of UK NHS data.
replies(1): >>45067008 #
35. ljosifov ◴[] No.45067008{6}[source]
Thanks for the consideration. Yeah, the US and UK are different in that respect. I got the impression that the US ends up with the worst deal on both ends: organisations that could help you are denied your data, while the most unscrupulous organisations, the ones most bent on doing their worst with your data, get almost free access to it.

For the UK - I'm reasonably sure some people will have died because of the difficulties sharing their data, who would not have died otherwise. "Otherwise" being - they could communicate with the NHS and share their data via email, WhatsApp etc., similarly to how they communicate and share data in their private and professional lives.

People at a personal level have a fairly reasonable stance in how they behave when it comes to sharing their data. They are surprisingly subtle in their cost-benefit analysis. It's only when they answer surveys, or talk in public, that they are less-than-entirely-truthful. We know this b/c their revealed preferences are at odds with what they say they value, and how much they value it.

36. ljosifov ◴[] No.45067835{5}[source]
I see reading comprehension is not something you enjoy indulging in.

These -

> utter fantasies like "'no one wants to share their data' is just assumed" or "the defaults are always 'deny everything'" true?

...far from being fantasies, are my personal experiences in the UK medical systems. This -

https://news.ycombinator.com/item?id=45066321

replies(1): >>45068948 #
37. int_19h ◴[] No.45068227{3}[source]
LLMs can and do sometimes regurgitate parts of training data verbatim - this has been demonstrated many times on things ranging from Wikipedia articles to code snippets. Yes, it is not particularly likely for that damning private email of yours to be memorized, but if you throw a dataset with millions of private emails onto a model, it will almost certainly memorize some of them, and nobody knows what exact sequence of input tokens might trigger it to recite.
replies(1): >>45072902 #
38. ljosifov ◴[] No.45068317[source]
Whether Google is interested in serving me or not is not only untestable (i.e. what counts as 'Google', 'interested', and 'serving' there - one could argue to the end of time) - but beside the point. I want to be able to tell Google "My home is XYZ", and for Google to use that information across all of Google's ecosystem. When I talk to Gemini it should know what/where "LJ home" is; when I write in a Gdoc it should know my home address (so it can insert it if I want); ditto for Gmail; and when I search in Google Photos for "photos taken at home" it should also know what "home" is for me.

Atm Google vaguely knows, and uses that for ads targeting, sometimes. Most of the time the targeting is bad, very low quality slop. To the level of "he bought a mattress yesterday, will keep buying mattresses in the next 30-60 days". I have the impression that we ended up in the worst case scenario. People I don't want to have my data have access to it. People I do want to have my data are afraid to touch it, and use it - yes! - for theirs, but also for my benefit too. The current predicament seems to me a case of "public lies, private truths."

A small cadre of vocal proponents of a particular view established "the ground truth of what is desirable". (in this case - maximum privacy, ideally zero information sharing) The public goes along with it in words, pays lip service, while in deeds the revealed preferences show they value their data privacy very cheaply, almost zero. Even one extra click to share their data less is one click too many, an effort too high, for most people. Again - these are revealed preferences, for people keep lying when asked. It's not even a case of "you are lying to me" - no, it's more like "you are lying to yourself."

The conventional opinion, given the power imbalance coming from the information imbalance (state/business know a lot about me; I know little about them), is that we citizens and consumers should reduce our "information surface" towards them. And address the imbalance that way. But. There exists another, often unmentioned option. And that option is for state/business to open up, to increase their "information surface" towards us, their citizens/consumers. That will also achieve an information (and one hopes power) rebalance. Yes there is extra work on the part of state/business to open their data to us. But it's worth it. The more advanced the society, the more coordination it needs to achieve the right cooperation-competition balance in the interactions between ever greater numbers of people. There is an old book "Data For the People" by an early AI pioneer and former Amazon chief scientist, Andreas Weigend. Afaics it well describes the world we live in, and the one we are likely to live in even more in the future.

replies(1): >>45070922 #
39. soraminazuki ◴[] No.45068948{6}[source]
See, this is what I meant by indoctrination. I showed you links containing dozens, maybe even hundreds of examples showing how companies don't obtain consent from users. But you ignore all that and cherry pick your highly exaggerated spin on the UK medical system. "I'm reasonably sure some people will have died because of this." Sigh, give me a break. Your take on privacy sounds just like the auto industry's take on right to repair. According to them, right to repair laws will get women raped in parking lots [1]. Corporate activists making absurd claims resorting to the same old fearmongering tactics.

This isn't me having problems with reading comprehension. It's you arguing in bad faith. Which is inevitable given your desire to demolish consumer protection for everyone. You're defending the indefensible.

[1]: https://www.vice.com/en/article/auto-industry-tv-ads-claim-r...

replies(1): >>45069127 #
40. ljosifov ◴[] No.45069127{7}[source]
I know indoctrination well. Reading what you write - I get the impression that you don't know much about indoctrination. But I don't know you, so I allow that I may be wrong. You asked "in what universe". I showed you concrete examples in one universe. For my claim to be true, one example suffices. None of your claims (latest: "demolish consumer protection") about my alleged intentions, character, thoughts, etc. have any basis in reality. You are wrong in almost everything that you wrote about me. It's all in your head, in your imagination. How do I know? B/c I know me, and you don't know me. That easy.
replies(1): >>45069257 #
41. soraminazuki ◴[] No.45069257{8}[source]
> Excellent. What were they waiting for up to now?? I thought they already trained on my data. I assume they train, even hope that they train, even when they say they don't.

These are your exact words, not my imagination. You very clearly want consumer protection to be gone, because you said so.

> For my claim to be true, one example suffices.

To be clear, your claim is that we live in a world where there's too much privacy protection. So much in fact that you're, gasp, "reasonably sure some people will have died because of this." Nope, a single spin on the UK medical system is nowhere near as sufficient for that absurd claim.

As for your attempted word lawyering about indoctrination? Classic.

replies(1): >>45069585 #
42. ljosifov ◴[] No.45069585{9}[source]
Yes - my data, not your data. You stay away from my data. I stay away from your data. I don't care about your data. But I do want them to train on my data. And to serve me better. Was disappointed that they didn't do that already.

But now you gave me ideas. ;-) Yeah - I think ideally we should go further, much further. The Internet was not built by po-faced, lemon-sucking prudes, tut-tut-ing about everything and anything. It was built by happy-go-lucky, live-and-let-live, altruistic, mildly autistic nerds. It was permission-less - one didn't need to ask anyone in order to do anything, and that's why it lived. Whereas many other networks and protocols, technically more sophisticated, but with the fatal flaw that a gatekeeper with the power to say "NO" was built into them, just died off. I wish people went back to the original permission-less Net, and tore down all manner of laws that make moving bits around illegal and are used to jail humans for the crimes of reading, copying and writing data.

replies(1): >>45075871 #
43. danparsonson ◴[] No.45070922{3}[source]
You started by saying that it's difficult or impossible to define what 'serving the user' looks like, then immediately gave examples of what it would look like to you. It's not that Google can't do these things or is afraid to, but rather that operating in your best interests does not benefit their shareholders. Sure, it'd be great if we could all just get along, but we're living in the worst case scenario you describe because we can't all just get along. Not trusting companies like Google with your personal data is the pragmatic choice; regardless of what they could do with our data, what they actually do with it is what counts.

Side note: they know exactly where you live. My colleague's Android used to tell him, without any prompting or specific configuration, how long his drive home from work would take that day. That was over ten years ago.

replies(1): >>45073056 #
44. ljosifov ◴[] No.45072724[source]
I already share lots of my data with Google. I have Gmail, where a lot of my online life is reflected. I have Photos, Gmaps, Gdrive. Also Google knows about my YouTube viewing and my Android phone use. So no matter what I say - with my actions, my revealed preference is that I trust Google. So far, Google have not betrayed my trust, afaics. So I actually want Google to adapt Gemini to me, either via the context, or even with a thin layer of LoRA. If Google treats me like a complete stranger it knows nothing about, then Google, and plenty of other people, make use of my data, but I, the creator (and nominal owner) of my data, don't benefit from their knowledge of me?? That sounds like the worst of the possible options to me.
45. ljosifov ◴[] No.45072819[source]
I'm not 'screwing' anyone. I'm saying - the same way some people don't want their data used, I DO want my data used. I'm not saying they should use YOUR data. I'm saying they should use MY data.

Likewise, I'm not telling you what to publish. In the same manner, I dislike you telling me what to publish. And so on for

> name, home address, work, government-issued ID, financial transactions, chats, browser history, location history, surveillance footage of your home, all for free.

It's up to me, not you, what I decide to publish or not. Fwiw, I already publish

> name, home address, work,

willingly. My name is public (how can it be otherwise?) and my home address is in the electoral register, which is public. My work info is in the UK companies register, available for reading by all, on the web.

I publish to selected parties

> government-issued ID

even if I don't want to. (we don't have a specific 'government-issued ID' for ID purposes like on the Continent; my driving licence is used for that) I did it yesterday, because the UK gov requires companies to collect that information. Yesterday I had to give two photos of myself to an online pharmacy because the UK gov mandates that they collect that info - and I disliked that very much. The online pharmacy is not the one pushing for that data; it's the UK gov forcing that on them via regulation of how that particular medication is to be sold online.

I don't want to publish and don't publish

> financial transactions, chats, browser history, location history, surveillance footage of your home

...and I don't understand where this gall to tell perfect strangers what they should do with their lives comes from?? I don't tell you what you should or should not publish. Ditto for the pricing

> all for free.

Up to me to decide. I don't tell you what you do - so you don't tell me what I do, pretty please.

I am not waiting on "privacy maximalists." I try to share my data for purposes I need. I loathe the privacy maximalists in the UK for having influenced the current laws of the land to cater to their obsessions and ignore my desires. I think I'm in the majority, not the minority. Our current predicament seems to me a case of "public lies, private truths." A small cadre of vocal proponents of a particular view established "the ground truth of what is desirable" (in this case, maximum privacy, ideally zero information sharing). The public goes along with it in words, pays lip service, while in deeds the revealed preferences show that we value our data privacy very little - almost zero. Even one extra click to share our data less is one click too many, an effort too high for most people. Again, these are revealed preferences, because people keep lying when asked. It's not even a case of "you are lying to me" - no, it's more like "you are lying to yourself."

46. ljosifov ◴[] No.45072864[source]
I'm worried; it's not like I don't care. For example, I'm worried that Google is such a huge, ginormous target that at some point their Gmail will be broken. At the same time, there are benefits to sharing data. There are benefits to me in Google using the information it has on me to make my life easier. In this case, I judge that Gemini training on my data is a low extra risk for me, compared to all the other risks I take by doing things in public - including writing this on public forums, as you do too.

In general, I find the ongoing public scare about sharing data to be the antithesis of the original spirit of the Net, which was all about sharing data. Originally, we were delighted to connect to perfect strangers on the other side of the world, people we would never have gotten to communicate with otherwise. I accept there might have been an element of self-selection there that aided that view: the people one would communicate with, although maybe from a different culture, would be from a similar niche sub-culture of people messing with computers, looking forward to communication and having a favourable view of it.

replies(1): >>45073180 #
47. ljosifov ◴[] No.45072902{4}[source]
That's a consideration, for sure. But given that LLMs don't have the ground truth - everything is controlled hallucination - then if the LLM tells you an imperfect version of my email or chat, you can never be sure whether what the LLM told you is true. So maybe you don't gain that much extra knowledge about me. For example, you can reasonably guess I'm typing this on a computer, and having coffee too. So if you ask the LLM "tell me a trivial story" and the LLM comes back with "one morning, LJ was typing HN replies on the computer while having his morning coffee" - did you learn that much about me that you didn't know or couldn't guess before?
48. ljosifov ◴[] No.45072923[source]
My experience in the UK medical systems has been the opposite - wrote here

https://news.ycombinator.com/item?id=45066321

Google knows what "Home" is for me only in Gmaps, because I went out of my way (put a Label, etc.) to tell it. I want to be able to tell Google "my home is XYZ" and have Google use that information across the whole Google ecosystem. When I talk to Gemini, it should know what and where "LJ home" is; when I write in a Gdoc, it should know my home address (so it can insert it if I want); ditto for Gmail; and when I search Google Photos for "photos taken at home", it should also know what "home" is for me.

I have the impression that we ended up in the worst-case scenario. People I don't want to have my data have access to it. People I do want to have my data are afraid to touch it and use it - yes! - for their benefit, but for mine too.

49. ljosifov ◴[] No.45072978[source]
It's not satire; you can check my comments on this topic easily.

I dispute 'most people'. The revealed preferences of most people are that they value their data privacy very cheaply, almost at zero. Even one extra click to share their data less is one click too many, an effort too high, for most people. This is their real, observed behaviour. I think our current predicament is a case of "public lies, private truths." A small cadre of vocal proponents of a particular view established "the ground truth of what is desirable" (in this case, maximum privacy, ideally zero information sharing). The public goes along with it in words, pays lip service, but in reality behaves differently, even opposite to what they say they desire.

And even if 'most people' wanted what you say they do, I still think companies could and should accommodate a minority group like me that wants something other than what 'most people' want. I don't think the will of the majority is the highest ideal, so high as to trump what I personally want.

50. ljosifov ◴[] No.45072994[source]
You know little about me, so it's better to assume less, no? My personal experience with medical data specifically is that I would have been harmed by the obstacles to data sharing the UK medical system has in place, not having been familiar enough with computers and tech to anticipate the ways a lack of data sharing would lead to outcomes undesirable to me. I wrote about that in a comment here: https://news.ycombinator.com/item?id=45067219
replies(1): >>45088237 #
51. ljosifov ◴[] No.45073056{4}[source]
Yes - I meant 'impossible or difficult' to define for all people, at all times. Agreed, it's easy for me to define how that looks; it doesn't mean the same is true for you. That's why I went from the very general to the very specific.

I'm saying we ended up in a situation where people are lying when they say "I don't trust Google," b/c they have Gmail and use Google services - so their trust can't be zero. It's more than zero. Obviously it's a trade-off; people are pragmatic, they do their cost-benefit analysis and act accordingly. They just lie when they talk about the subject. I think it'd be better for all if the public discussion moved from "I trust Google zero" (which is obviously untrue) to "There is a cost-benefit to this, and I personally chose xyz."

52. 12ian34 ◴[] No.45073180{3}[source]
> the ongoing public scare about sharing data

I think this might be a bit of a social bubble thing - I think it isn't a forefront concern for the vast majority of people.

replies(1): >>45073297 #
53. ljosifov ◴[] No.45073297{4}[source]
I think you are correct there - the majority of the public don't care. They just try to go about their daily business and act the best they can under the circumstances. So we click "Accept" on any popup banner to make it go away, accept "All cookies" 100 times every day, and use Google mail/maps/photos/drive - all of which involves giving away data, even if in words we say we don't want to give data. So yes, the public by necessity act in a rational way, doing their cost-benefit analysis. Meanwhile a cadre of privacy obsessives have made my life worse by lobbying and having their bad ideas codified into UK law. Wrote about my experience in the UK medical systems here: https://news.ycombinator.com/item?id=45066321
54. igor47 ◴[] No.45075871{10}[source]
You should read "The Cuckoo's Egg", written by a happy-go-lucky nerd in the 1980s dawn of networked systems. Already there were bad actors in the system, and he fought an uphill battle to implement network security. You're already standing on the shoulders of giants like him who saved the net -- I don't believe it could survive without a robust permission structure.
55. calmbonsai ◴[] No.45088237{3}[source]
I read your comment and I'm sorry you had to jump through all those hoops just to get copies of your medical records.

I dealt with some of that while being a foreign national working in Düsseldorf a few years back.

Medical care is definitely screwed up in the states, but getting my basic test results and emailing my doctor has always been a straightforward and rapid comms loop.