
Gemini CLI

(blog.google)
1339 points | 1 comment
asadm ◴[] No.44377180[source]
I have been using this for about a month and it's a beast, mostly thanks to Gemini 2.5 Pro being SOTA, and also to how it leverages that huge 1M-token context window. Other tools either preemptively compress context or read files only partially.

I have thrown very large codebases at this and it has been able to navigate and learn them effortlessly.
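To make the contrast concrete, here is a toy sketch (not Gemini CLI's actual code; the function, the ~4 chars/token heuristic, and the reserve number are all made up for illustration) of why window size changes file handling: a small window forces partial reads or compression, while a 1M-token window can often take files whole.

```python
# Toy illustration, NOT Gemini CLI's real mechanism: with a small context
# window a tool must truncate files before sending them; with a very large
# window it can pass whole files through untouched.

def pack_files(files, window_tokens, reserve_for_chat=8_000):
    """Return file contents that fit the window, truncating when needed."""
    budget = window_tokens - reserve_for_chat
    packed = []
    for name, text in files:
        tokens = len(text) // 4  # rough heuristic: ~4 chars per token
        if tokens <= budget:
            packed.append((name, text))          # fits whole
            budget -= tokens
        else:
            keep = budget * 4
            packed.append((name, text[:keep]))   # partial-read fallback
            break
    return packed

big_file = [("main.py", "x" * 100_000)]
small = pack_files(big_file, window_tokens=16_000)     # truncated
large = pack_files(big_file, window_tokens=1_000_000)  # whole file fits
```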

replies(2): >>44377204 #>>44378419 #
_zoltan_ ◴[] No.44378419[source]
what's your workflow?
replies(2): >>44382980 #>>44383035 #
leoh ◴[] No.44382980[source]
+1. I have not found Gemini 2.5 better than Claude's latest models -- different, and better at some things, but not better in general. In particular, I have found Gemini 2.5 Pro to be worse at dealing with large codebases despite its huge context window. So I too am quite curious about the workflow here.
replies(1): >>44383039 #
asadm ◴[] No.44383039[source]
The core idea is to not use up all of the context on files; instead, sessions go on longer before becoming a useless pursuit, i.e. more turns are possible with the larger context window.
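A back-of-the-envelope sketch of that idea (the function name, token counts, and per-turn cost are all hypothetical, just to show the arithmetic): if each turn adds roughly the same number of tokens, the number of useful turns before the session fills up scales with whatever context is left after loading the codebase.

```python
# Hypothetical numbers, purely illustrative of the commenter's point:
# a bigger window leaves more room for conversation turns after the
# codebase is loaded into context.

def turns_before_full(window_tokens, codebase_tokens, tokens_per_turn=2_000):
    """How many turns fit after loading the codebase into context."""
    remaining = window_tokens - codebase_tokens
    return max(remaining // tokens_per_turn, 0)

print(turns_before_full(200_000, 150_000))    # smaller window: 25 turns left
print(turns_before_full(1_000_000, 150_000))  # 1M window: 425 turns left
```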
replies(1): >>44385361 #
leoh ◴[] No.44385361[source]
Are you doing this with an LLM interface directly, as opposed to using Cursor or another tool like that?