
Gemini CLI

(blog.google)
1342 points | 9 comments
1. danavar ◴[] No.44378477[source]
Is there a way to instantly, quickly prompt it in the terminal, without loading the full UI? Just to get a short response without filling the terminal page.

like to just get a short response - for simple things like "what's a nm and grep command to find this symbol in these 3 folders". I use gemini a lot for this type of thing already

Or would that have to be a custom prompt I write?

replies(5): >>44378594 #>>44378617 #>>44378653 #>>44378886 #>>44380297 #
2. peterldowns ◴[] No.44378594[source]
I use `mods` for this https://github.com/charmbracelet/mods

other people use simon willison's `llm` tool https://github.com/simonw/llm

Both allow you to switch between models, send short prompts from a CLI, optionally attach some context. I prefer mods because it's an easier install and I never need to worry about Python envs and other insanity.
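As a rough sketch, a one-shot prompt with either tool looks like this (assumes `mods` and `llm` are installed with API keys configured, and that an `llm` plugin provides the Gemini model; the `command -v` guards make each line a no-op when the tool is absent):

```shell
# one-shot prompt with charmbracelet's mods, if installed
if command -v mods >/dev/null 2>&1; then
  mods "what's a nm and grep command to find this symbol in these 3 folders"
fi

# same question via simonw's llm, selecting a model with -m
# (model name is illustrative; depends on which plugins/keys you have set up)
if command -v llm >/dev/null 2>&1; then
  llm -m gemini-2.5-pro "what's a nm and grep command to find this symbol in these 3 folders"
fi
```

Both tools also accept piped stdin as context, e.g. `cat build.log | mods "why did this fail"`.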

replies(1): >>44378637 #
3. ◴[] No.44378617[source]
4. indigodaddy ◴[] No.44378637[source]
Didn't know about mods, looks awesome.
5. cperry ◴[] No.44378653[source]
-p is your friend
replies(1): >>44384133 #
6. hiAndrewQuinn ◴[] No.44378886[source]
gemini --prompt "Hello"
7. irthomasthomas ◴[] No.44380297[source]
If you `uv tool install llm`, then grab my shelllm scripts (github.com/irthomasthomas/shelllm) and source them in your terminal, you can use premade prompt functions like `shelp "what's a nm and grep command to find this symbol in these 3 folders" -m gemini-pro`

There's also wrappers that place the command directly in your terminal prompt if you run shelp-c

replies(1): >>44386575 #
8. 8n4vidtmkvmk ◴[] No.44384133[source]
And if the prompt is too long for -p due to shell arg limits, pipe it into stdin instead
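The limit in question is the kernel's cap on argv+environment size, visible via `getconf ARG_MAX`; piping sidesteps it because stdin is a stream, not an argument. A sketch (the temp-file path is illustrative, and the stdin invocation assumes, as this thread suggests, that `gemini` reads a piped prompt; it's guarded so the snippet is safe to run without the CLI):

```shell
# maximum combined size of argv + environment, in bytes
getconf ARG_MAX

# write an oversized prompt to a file instead of passing it as an argument
printf 'summarize the following logs...' > /tmp/prompt.txt

# pipe it in rather than using -p (guarded: only runs if gemini is installed)
if command -v gemini >/dev/null 2>&1; then
  gemini < /tmp/prompt.txt
fi
```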
9. Game_Ender ◴[] No.44386575[source]
The link you have posted 404s and I couldn't seem to find a command like that in your repos. Can you be more specific?