
421 points briankelly | 1 comment
necovek No.43575664
The premise might possibly be true, but as an actually seasoned Python developer, I've taken a look at one file: https://github.com/dx-tooling/platform-problem-monitoring-co...

All of it smells of a (lousy) junior software engineer: from configuring the root logger at the top, at module level (which relies on import caching to avoid being reapplied), to building a config file parser themselves instead of using the stdlib one, to a race in load_json, which checks for file existence with an if and then carries on as if the file is certainly still there...
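For the last point, a minimal sketch of the race-free pattern (this is a hypothetical rewrite, not the repo's actual code): rather than checking existence with an `if` and then opening, just attempt the open and handle the failure, since the file can disappear between the check and the open (a classic TOCTOU race).

```python
import json


def load_json(path):
    """Load JSON from `path`, returning None if the file is missing.

    Attempting the open directly avoids the check-then-use race:
    os.path.exists() followed by open() can fail if the file is
    deleted in between, whereas catching FileNotFoundError cannot.
    """
    try:
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)
    except FileNotFoundError:
        return None
```

The same "ask forgiveness, not permission" style is generally preferred in Python for filesystem access.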

In a nutshell, if the rest of it is like this, it simply sucks.

gerdesj No.43576977
My current favourite LLM wankery example is this beauty: https://blog.fahadusman.com/proxmox-replacing-failed-drive-i...

Note how it has invented the faster parameter for the zpool command. It is possible that the blog writer hallucinated a faster parameter themselves without needing an LLM - who knows.

I think all developers should add a faster parameter to all commands to make them run faster. Perhaps an LLM could write the faster code.

I predict an increase in man-page reading, and better-quality documentation at authoritative sources. We will also get better at finding authoritative sources of docs. My uBlacklist is getting quite long.

rotis No.43579253
How can this article have been written by an LLM? It's dated November 2021. Not judging the article as a whole, but the command you pointed out seems to be correct: "faster" is the name of the pool, not a flag.
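If "faster" really is the pool name, the command in the post is syntactically normal: zpool takes the pool name as a positional argument right after the subcommand. A sketch, assuming a pool named faster and hypothetical device names:

```shell
# Replace a failed disk in a pool named "faster".
# "faster" is the pool name, not an option; device paths here are
# hypothetical examples.
zpool replace faster /dev/sdb /dev/sdc

# Watch resilver progress for that pool.
zpool status faster
```

So a reader skimming the post could easily misparse the pool name as an invented flag.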
bdhcuidbebe No.43584721
There was a lot going on in the years before ChatGPT. Text generation was already going strong in interactive fiction before anyone was talking about OpenAI.