416 points floverfelt | 1 comments | source
jeppester ◴[] No.45057505[source]
In my company I feel that we're getting totally overrun with code that's 90% good, 10% broken, and almost exactly what was needed.

We are producing more code, but quality is definitely taking a hit now that no-one is able to keep up.

So instead of slowly inching towards the result, we get 90% there in no time and then spend lots and lots of time getting to know the code, fixing it, and fine-tuning everything.

Maybe we ARE faster than before, but it wouldn't surprise me if the two approaches are closer than one might think.

What bothers me the most is that I much prefer building stuff to fixing code I'm not intimately familiar with.

replies(8): >>45057537 #>>45058508 #>>45061118 #>>45061272 #>>45061732 #>>45062347 #>>45065856 #>>45070745 #
1. Cthulhu_ ◴[] No.45061118[source]
I'd argue that this awareness is a good thing; it means you're actually measuring and analyzing all the code.

Best practice in software development has always been to verify everything: CI, code reviews, unit tests, linters, etc. I'd argue that with LLM-generated code, a software developer's job, and that of the organization as a whole, has shifted even more towards reviewing and verification.

If quality is taking a hit, you need to stop and ask: how important is quality to you? How do you define quality in your organization? And what steps do you take to ensure and improve quality before merging LLM-generated code? Remember that you're still the boss, and there is no excuse for merging substandard code.
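
To make the "verify everything" point concrete, here is a minimal sketch of a pre-merge gate in Python, assuming ruff for linting and pytest for tests (both are my own assumptions, not tools named in this thread); the idea is simply that LLM-generated code has to pass the same automated checks as any other change before it merges:

    # Minimal pre-merge gate sketch (illustrative only): run the linter and the
    # test suite, and refuse the merge if either one fails. Assumes ruff and
    # pytest are installed; swap in whatever your project actually uses.
    import subprocess
    import sys

    CHECKS = [
        ("lint", ["ruff", "check", "."]),
        ("tests", ["pytest", "-q"]),
    ]

    def main() -> int:
        for name, cmd in CHECKS:
            print(f"running {name}: {' '.join(cmd)}")
            if subprocess.run(cmd).returncode != 0:
                print(f"{name} failed -- do not merge")
                return 1
        print("all checks passed")
        return 0

    if __name__ == "__main__":
        sys.exit(main())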