
439 points diggan | 1 comment
ljosifov ◴[] No.45064731[source]
Excellent. What were they waiting for up to now?? I thought they already trained on my data. I assume they train, even hope that they train, even when they say they don't. People that want to be data privacy maximalists - fine, don't use their data. But there are people out there (myself) that are on the opposite end of the spectrum, and we are mostly ignored by the companies. Companies just assume people only ever want to deny them their data.

It annoys me greatly that I have no tick box on Google to tell them "go and adapt the models I use on my Gmail, Photos, Maps etc." I don't want Google to ever be mistaken about where I live - I have told them 100 times already.

This idea that "no one wants to share their data" is just assumed, and it permeates everything. Like the soft-ball interviews that a popular science communicator did with DeepMind folks working in medicine: every question was prefixed by a litany of caveats that were all about 1) people's assumed aversion to sharing their data and 2) the horrors and disasters that are to befall us should we share it. I have not suffered any horrors. I'm not aware of any major disasters. I am aware of major advances in medicine in my lifetime. Ultimately the process does involve controlled data collection and experimentation. Looks like a good deal to me tbh. I go out of my way to tick all the NHS boxes too, to "use my data as you see fit". It's an uphill struggle. The defaults are always "deny everything". The tick boxes never go away, and there is no master checkbox "use any and all of my data and never ask me again" to tick.

replies(7): >>45064789 #>>45064878 #>>45064882 #>>45064906 #>>45065043 #>>45065125 #>>45066501 #
koolba ◴[] No.45065043[source]
> It annoys me greatly that I have no tick box on Google to tell them "go and adapt the models I use on my Gmail, Photos, Maps etc." I don't want Google to ever be mistaken about where I live - I have told them 100 times already.

As we’ve seen that LLMs can fully regenerate text from their training sources (or at least come close), aren’t you the least bit worried about your personal correspondence magically appearing in the wild?

replies(1): >>45073184 #
1. ljosifov ◴[] No.45073184[source]
I am a little bit worried, for sure. But I think that's a small extra risk on my side, for a small extra gain for me personally but a large extra gain for the wider group I belong to (ultimately, all of humanity), in the sense of working towards ameliorating the "tragedy of the commons".

On the personal side: given that LLMs don't have the ground truth - everything is controlled hallucination - then if the LLM tells you an imperfect version of my email or chat, you can never be sure whether what it told you is true or not. So maybe you don't gain that much extra knowledge about me. For example, you can reasonably guess I'm typing this on a computer, and having coffee too. So if you ask the LLM "tell me a trivial story", and the LLM comes back with "one morning, LJ was typing HN replies on the computer while having his morning coffee" - did you learn that much new about me that you didn't already know or couldn't guess before?

On the "tragedy of the commons" side. We all benefit immensely from other people sharing their data, even very personal data. Any drug discovery, testing, approval - relies on many people allowing their data to be shared. Wider context - living in a group of people, involves radiating data outwards, and using data other people emit towards myself (and others), to have a functioning society. The more advanced the society, the more coordination it needs to achieve the right cooperation-competition balance in the interactions between ever greater numbers of people.

I think it's bad for me personally, and for everyone, that the "data privacy maximalists" had their desires codified in UK law. My personal experience with the UK medical systems has been that those laws made my life worse, not better. I wrote about it here: https://news.ycombinator.com/item?id=45066321