
Gemini CLI (blog.google)

1428 points by sync | 3 comments
joelm ◴[] No.44379446[source]
Been using Claude Code (4 Opus) fairly successfully in a large Rust codebase, but sometimes frustrated by it on complex tasks. Tried Gemini CLI today (easy to get working, which was nice) and it was pretty much a failure: it did a notably worse job than Claude at producing Rust modifications that actually compiled.

However, Gemini at one point output what will probably be the highlight of my day:

"I have made a complete mess of the code. I will now revert all changes I have made to the codebase and start over."

What great self-awareness and willingness to scrap the work! :)

replies(8): >>44379714 #>>44380383 #>>44380768 #>>44380866 #>>44381146 #>>44381754 #>>44383245 #>>44386866 #
ZeroCool2u ◴[] No.44379714[source]
Personally, my theory is that Gemini benefits from being able to train on Google's massive internal codebase, and because Rust has seen very little uptake internally at Google, especially since they have some really nice C++ tooling, Gemini is comparatively bad at Rust.
replies(5): >>44380405 #>>44380865 #>>44381697 #>>44382948 #>>44383662 #
1. data-ottawa ◴[] No.44383662[source]
Tangential, but I worry that LLMs will cause a great stagnation in programming language evolution, and possibly in a bunch of other tech.

I've tried using a few newer languages, and the LLMs would all swap in syntax from similar-looking languages, even after I told them to read the doc pages.

Whether that's for better or worse I don't know, but solving genuinely hard problems is usually a new language's raison d'être.

replies(2): >>44385738 #>>44395610 #
2. breakingcups ◴[] No.44385738[source]
Not just that, I think this will happen on multiple levels too: think de facto ossified libraries, tools, etc.

LLMs thrive because they had a wealth of high-quality training data in the form of Stack Overflow, GitHub, etc., and ironically their uptake is strangling that very source of training data.

3. sillystu04 ◴[] No.44395610[source]
Perhaps the next big programming language will be designed specifically for LLM friendliness. Some things that are human-friendly, like long keywords, are just a waste of tokens for an LLM, and there could be other optimisations too.
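
For a rough sense of the cost, here's a minimal sketch (assuming the tiktoken tokenizer library and its cl100k_base encoding; both snippets are invented pseudocode, not any real language) that counts how many tokens a verbose keyword style burns compared with a terse one:

    # Minimal sketch: compare token counts for verbose vs. terse keyword styles.
    # Assumes the `tiktoken` library; `cl100k_base` is the encoding used by
    # several OpenAI models. Both snippets are made up for illustration.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    verbose = "public static function calculate_total(items) begin return summation(items) end"
    terse = "fn tot(xs){sum(xs)}"

    for label, snippet in (("verbose", verbose), ("terse", terse)):
        n = len(enc.encode(snippet))
        print(f"{label}: {n} tokens for {len(snippet)} characters")

The exact numbers depend on the tokenizer, but the terse spelling reliably lands at a fraction of the token count, which is the kind of saving an LLM-first language design would be chasing.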