
Gemini CLI

(blog.google)
1348 points
ipsum2 (No.44379036):
If you use this, all of your code data will be sent to Google. From their terms:

https://developers.google.com/gemini-code-assist/resources/p...

When you use Gemini Code Assist for individuals, Google collects your prompts, related code, generated output, code edits, related feature usage information, and your feedback to provide, improve, and develop Google products and services and machine learning technologies.

To help with quality and improve our products (such as generative machine-learning models), human reviewers may read, annotate, and process the data collected above. We take steps to protect your privacy as part of this process. This includes disconnecting the data from your Google Account before reviewers see or annotate it, and storing those disconnected copies for up to 18 months. Please don't submit confidential information or any data you wouldn't want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies.

jart (No.44379046):
Mozilla and Google provide an alternative called gemmafile which gives you an airgapped version of Gemini (which Google calls Gemma) that runs locally in a single file without any dependencies. https://huggingface.co/jartine/gemma-2-27b-it-llamafile It's been deployed into production by 32% of organizations: https://www.wiz.io/reports/the-state-of-ai-in-the-cloud-2025
ipsum2 (No.44379814):
There's nothing wrong with promoting your own projects, but it's a little weird that you don't disclose that you're the creator.
jart (No.44380274):
It would be more accurate to say I packaged it. llamafile is a project I did for Mozilla Builders where we compiled llama.cpp with Cosmopolitan Libc so that LLMs can run as portable binaries. https://builders.mozilla.org/ Last year I concatenated the Gemma weights onto llamafile, called it gemmafile, and it got hundreds of thousands of downloads. https://x.com/JustineTunney/status/1808165898743878108 I currently work at Google on Gemini, improving TPU performance. The point is that if you want to run this stuff 100% locally, you can. Others and I did a lot of work to make that possible.
elbear (No.44380746):
I keep meaning to investigate how I can use your tools to create single-file executables for Python projects, so thanks for posting and reminding me.
ahgamut (No.44381552):
My early contributions to https://github.com/jart/cosmopolitan focused on getting a single-file Python executable. I wanted my Python scripts to run on both Windows and Linux, and now they do. To try out Python, you can:

    wget https://cosmo.zip/pub/cosmos/bin/python -qO python.com   # fetch the prebuilt portable binary
    chmod +x python.com                                          # mark it executable
    ./python.com                                                 # drops you into a Python REPL
Adding pure-Python libraries just means downloading the wheel and adding files to the binary using the zip command:

    ./python.com -m pip download Click    # fetch the wheel without installing it
    mkdir -p Lib && cd Lib
    unzip ../click*.whl                   # unpack the wheel's files into Lib/
    cd ..
    zip -qr ./python.com Lib/             # append Lib/ into the binary's zip section
    ./python.com # can now import click
Cosmopolitan Libc provides some nice APIs to load arguments at startup, like cosmo_args() [1], if you'd like to run the Python binary as a specific program. For example, you could set the startup arguments to `-m datasette`.

[1]: https://github.com/jart/cosmopolitan/commit/4e9566cd3328626d...