
135 points | barddoo | 1 comment

Writing Redis from scratch in Zig.
johnisgood ◴[] No.45308123[source]
Seems like LLMs are getting good at Zig (with some help, I presume).
replies(2): >>45308193 #>>45308553 #
mtlynch ◴[] No.45308193[source]
Is there anything about this project that seems LLM-generated?

I've found that LLMs are particularly bad at writing Zig because the language evolves quickly, so LLMs that are trained on Zig code from two years ago will write code that no longer compiles on modern Zig.

replies(4): >>45308296 #>>45308429 #>>45308798 #>>45311161 #
jasonjmcghee ◴[] No.45308798[source]
I skimmed, for me it was this: https://github.com/barddoo/zedis/blob/87321b04224b2e2e857b67...

There seems to be a fair amount of stigma around using LLMs, and many people who use them are uncomfortable talking about it.

It's a weird world. Depending on who is at the wheel, whether an LLM is used _can_ make no difference.

But the problem is, you can have no idea what you're doing and still make something that feels like it was carefully hand-crafted by someone - a really great project - but with hidden gaps or outright lies about functionality, often to the surprise of the author. They weren't trying to mislead; they just didn't take the time to check whether it did all of what the LLM said it did.

replies(3): >>45308978 #>>45310151 #>>45311116 #
johnisgood ◴[] No.45311116[source]
I generally do not think it is a bad thing. I use LLMs too, and I know what I am doing, so I do not know whether it would qualify as vibe coding.

I think using LLMs is not inherently a bad thing; it is only a problem if you have absolutely no clue what you are doing. But even then, if the project is usable and works as advertised, why not? *shrugs*

As for the link, that is exactly the code that caught my eye, along with the README.md itself. The LRU eviction thing is what GPT (and possibly other LLMs) always comes up with in my experience, and he could have just had it implemented properly then. :D
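(For context, the stock LRU eviction pattern that LLMs tend to reach for looks roughly like this - a minimal Python sketch for illustration, not the project's actual Zig code. Real Redis, for what it's worth, uses an approximate LRU based on random sampling rather than a strictly ordered structure like this.)

```python
from collections import OrderedDict


class LRUCache:
    """Minimal LRU cache: evicts the least-recently-used key at capacity."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used key
```

E.g. with capacity 2: after `put("a", 1)`, `put("b", 2)`, `get("a")`, then `put("c", 3)`, the key `"b"` is evicted because it was the least recently used.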

Edit: I am glad author confirmed the use of an LLM. :P