183 points | WolfOliver | 1 comment
WheelsAtLarge (No.45067421):
True, but it's catching up fast. A year ago, I used AI for small OS scripts. It worked fine and saved me the time of looking up switches for commands. Now, I can ask it to create a simple game of about 200 lines, and it does a pretty good job of writing bug-free code within a few seconds. It's only going to get better. Even if the tech doesn't improve further, I can see a future where all apps are endlessly configurable.

A big part of my career has been modifying enterprise software to fit a company's needs. Rarely was any one addition more than a few hundred lines of code. I can see a future where a non-coder has simple options for adding that kind of change to an app themselves.

True, it's not a coder, but that doesn't mean it won't fundamentally change how apps are made. It won't replace all programmers, but it will greatly reduce the number that are needed, and it will change which countries they work in and which languages they use to program apps.

Programming has mainly been a career that requires the individual to understand English. That is changing: I can see a future where code can be created in multiple human languages. Programming has been well paid because relatively few people had the expertise to do it. That won't stay true, and pay will adjust downward accordingly. AI might not be a coder, but it will let many more people become coders. In the future, coding will be in the same pay range as clerical work; companies will be hiring Programming Clerks rather than Programming Engineers.

ForHackernews (No.45070358):
I think you're right that LLMs are democratizing access to coding, but unless and until AI models reach a point where they can say 'no' to their users, the scenario you're imagining ('endlessly configurable apps') will probably lead to software that collapses under its own complexity.

Years ago, I supported a team of finance professionals who were largely quite competent at coding but knew nothing about software engineering. They had thousands of scripts and spreadsheets: they used version control, but kept separate long-lived branches for client-specific variations of different models. There were no tests for anything; half the tools would break when the clocks changed.
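
As an aside, the clock-change breakage is the classic daylight-saving pitfall: scripts that do arithmetic on naive local timestamps mis-count elapsed time across the transition. A minimal sketch of that failure mode in Python (the timezone, dates, and job here are invented for illustration, not taken from that team's actual code):

    # Illustrative only: naive local timestamps logged across an autumn clock
    # change under-count elapsed time by an hour.
    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    LONDON = ZoneInfo("Europe/London")

    # An overnight job logs wall-clock times with no timezone attached.
    started_local = datetime(2024, 10, 27, 0, 30)    # 00:30 BST
    finished_local = datetime(2024, 10, 27, 2, 30)   # 02:30 GMT; clocks went back at 02:00

    print(finished_local - started_local)            # 2:00:00 -- what the script reports

    # Real elapsed time via UTC: the 01:00-02:00 local hour happened twice.
    started_utc = started_local.replace(tzinfo=LONDON).astimezone(timezone.utc)
    finished_utc = finished_local.replace(tzinfo=LONDON).astimezone(timezone.utc)
    print(finished_utc - started_utc)                # 3:00:00 -- what actually elapsed

Whether the naive answer comes out too large or too small depends on the direction of the change, but it is wrong either way.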

They weren't dumb, but their incentives weren't about building anything we might recognize as an engineered application. I suspect something similar will happen when end users are turned loose with AI.