
135 points barddoo | 4 comments

Writing Redis from scratch in Zig.
johnisgood ◴[] No.45308123[source]
Seems like LLMs are getting good at Zig (with some help, I presume).
replies(2): >>45308193 #>>45308553 #
mtlynch ◴[] No.45308193[source]
Is there anything about this project that seems LLM-generated?

I've found that LLMs are particularly bad at writing Zig because the language evolves quickly, so LLMs that are trained on Zig code from two years ago will write code that no longer compiles on modern Zig.

replies(4): >>45308296 #>>45308429 #>>45308798 #>>45311161 #
1. jasonjmcghee ◴[] No.45308798[source]
I skimmed, for me it was this: https://github.com/barddoo/zedis/blob/87321b04224b2e2e857b67...

There seems to be a fair amount of stigma around using LLMs, and many people who use them are uncomfortable talking about it.

It's a weird world. Depending on who is at the wheel, whether an LLM was used _can_ make no difference.

But the problem is, you can have no idea what you're doing and still produce something that feels carefully hand-crafted - a really great project - except there are hidden gaps or outright lies about functionality, often to the surprise of the author. They weren't trying to mislead; they just didn't take the time to check whether it did everything the LLM said it did.

replies(3): >>45308978 #>>45310151 #>>45311116 #
2. boredemployee ◴[] No.45308978[source]
Three months ago I was vibe coding an idea, and for some reason (and luck) I went to check a less important part of the code and saw that the LLM had replaced the env variable for an API key and hard-coded the key explicitly in the code. That was scary. I'm glad I saw it before the PR and shit like that.
3. barddoo ◴[] No.45310151[source]
Agree. I used it mostly for getting ideas, the memory management for example: Gemini listed so many different ways of managing memory that I didn't even know existed. I knew I wanted to pre-allocate memory like TigerBeetle does, so the hybrid approach was perfect. Essentially it has three different allocators: a huge one for the cache, an arena allocator for context and intermediate state like pub/sub, and a temporary one for requests. It was 100% Gemini's idea.
4. johnisgood ◴[] No.45311116[source]
I generally do not think it is a bad thing. I use LLMs too, and I know what I am doing, so I do not know if that would qualify as vibe coding.

I think it is not inherently a bad thing to use LLMs; it is only a problem if you have absolutely no clue what you are doing, but even then, if the project is usable and works as advertised, why not? *shrugs*

As for the link, that is exactly the same code that caught my eye, besides the README.md itself. The LRU eviction thing is what GPT (and possibly other LLMs) always comes up with, in my experience, and he could at least have had it implemented properly then. :D

Edit: I am glad author confirmed the use of an LLM. :P