Gemini CLI (blog.google)
1348 points

ipsum2 ◴[] No.44379036[source]
If you use this, all of your code data will be sent to Google. From their terms:

https://developers.google.com/gemini-code-assist/resources/p...

When you use Gemini Code Assist for individuals, Google collects your prompts, related code, generated output, code edits, related feature usage information, and your feedback to provide, improve, and develop Google products and services and machine learning technologies.

To help with quality and improve our products (such as generative machine-learning models), human reviewers may read, annotate, and process the data collected above. We take steps to protect your privacy as part of this process. This includes disconnecting the data from your Google Account before reviewers see or annotate it, and storing those disconnected copies for up to 18 months. Please don't submit confidential information or any data you wouldn't want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies.

replies(20): >>44379046 #>>44379132 #>>44379301 #>>44379405 #>>44379410 #>>44379497 #>>44379544 #>>44379636 #>>44379643 #>>44380425 #>>44380586 #>>44380762 #>>44380864 #>>44381305 #>>44381716 #>>44382190 #>>44382418 #>>44382537 #>>44383744 #>>44384828 #
mattzito ◴[] No.44379301[source]
It's a lot more nuanced than that. If you use the free edition of Code Assist, your data can be used UNLESS you opt out; the opt-out instructions are at the bottom of the support article you link to:

"If you don't want this data used to improve Google's machine learning models, you can opt out by following the steps in Set up Gemini Code Assist for individuals."

and then the link: https://developers.google.com/gemini-code-assist/docs/set-up...

If you pay for Code Assist, no data is used to improve the models. If you use a Gemini API key on a pay-as-you-go account instead, it doesn't get used either. Data is only used for improvement if you're on a non-paid consumer account and haven't opted out.

That seems different than what you described.
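
To make the distinction concrete: the pay-as-you-go path just means calling the Gemini API directly with your own billed key instead of going through the free Code Assist tier. A rough sketch in Python using the google-generativeai package (the model name, env var, and diff file are placeholders, not anything from the linked docs):

    import os
    import google.generativeai as genai

    # Pay-as-you-go path: configure the client with your own API key
    # (GEMINI_API_KEY is just a placeholder env var name).
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])

    # Send a prompt plus some code to a Gemini model; per the terms discussed
    # above, paid API traffic is not used to improve Google's models.
    model = genai.GenerativeModel("gemini-1.5-flash")
    response = model.generate_content(
        "Review this diff for bugs:\n" + open("change.diff").read()
    )
    print(response.text)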

replies(4): >>44379517 #>>44380349 #>>44381012 #>>44382425 #
foob ◴[] No.44380349[source]
your data can be used UNLESS you opt out

It's even more nuanced than that.

Google recently testified in court that they still train on user data after users opt out of training [1]. The loophole is that the opt-out only applies to one organization within Google; other organizations are still free to train on the data. They may or may not have cleaned up their act given that they're under active investigation, but their recent actions haven't exactly earned them the benefit of the doubt on this topic.

[1] https://www.business-standard.com/technology/tech-news/googl...

replies(6): >>44380772 #>>44381196 #>>44381287 #>>44382297 #>>44382437 #>>44385194 #
TrainedMonkey ◴[] No.44380772[source]
Another dimension here is that any "we don't train on your data" promise is useless without a matching data retention policy that actually deletes your data. Case in point: 23andMe didn't sell your data, right up until they decided to change that policy.
replies(3): >>44381263 #>>44381313 #>>44381964 #
decimalenough ◴[] No.44381263[source]
Google offers a user-configurable retention policy for all data.

https://support.google.com/accounts/answer/10549751

That said, once your data is inside an LLM, you can't really unscramble the omelette.

replies(1): >>44381541 #
elictronic ◴[] No.44381541[source]
Lawsuits and laws seem to work just fine at unscrambling it. Once a company has a financial incentive, they seem to change very quickly.