
467 points by bundie | 1 comment
ryanrasti No.44501761
> With Gemini Apps Activity turned off, their Gemini chats are not being reviewed or used to improve our AI models.

Indeed bizarre, as the statement doesn't say anything about data collection or retention.

More generally, I'm conflicted here -- I'm big on personal privacy, but the power and convenience AI brings will probably be too great for privacy concerns to win out. I'm hoping that powerful, locally run AI models become a mainstream alternative.

_verandaguy No.44501876
My approach has been to lock AI assistants (for me, that's just Apple Intelligence, as far as I can help it) out of integrations with the vast majority of apps, and especially chat and email apps.

At some point, some reverse engineer will publish a writeup confirming or refuting claims about how local these models are, how much data (and maybe even what data) is being sent up to the mothership, and how these integrations appear to be implemented.
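
For what it's worth, you don't need a full teardown to get a crude point-in-time view yourself: route the device through an intercepting proxy and watch which hosts the assistant talks to and how much it uploads. A minimal mitmproxy addon sketch (assuming the device trusts your proxy CA and nothing is certificate-pinned; the class name and log format are just illustrative):

    # assistant_traffic.py -- run with: mitmdump -s assistant_traffic.py
    import logging
    from mitmproxy import http

    class AssistantTrafficLogger:
        """Log the destination host and request-body size of every
        outbound request, to get a rough sense of what gets sent up."""

        def request(self, flow: http.HTTPFlow) -> None:
            req = flow.request
            logging.info("%s %s -> %d bytes uploaded",
                         req.method, req.pretty_host,
                         len(req.content or b""))

    addons = [AssistantTrafficLogger()]

Certificate pinning will hide some of the traffic, which is exactly the gap a proper reverse-engineering writeup would have to dig into.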

It's not perfect, and it only offers a point-in-time view of the situation, but it's the best we can do in an intensely closed-source world. I'd be happier if these companies published the code (regardless of the license) and allowed users to test for build parity.
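
"Build parity" here just means: build the published source yourself and check that the result is bit-for-bit identical to what actually ships. The comparison step is trivial (sketch below, with hypothetical paths); the hard part is reproducing the toolchain, flags, and timestamps so the hashes can match at all:

    import hashlib
    import sys

    def sha256(path: str) -> str:
        """Hash a file in chunks so large binaries don't blow up memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Usage: python parity_check.py ./my_build/assistant.bin ./shipped/assistant.bin
    local_build, shipped_binary = sys.argv[1], sys.argv[2]
    print("build parity:", sha256(local_build) == sha256(shipped_binary))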