From where I sit, right now, this does not seem to be the case. It is as if writing down the code were not the biggest problem, or the biggest time sink, in building software.
You had all these small-by-modern-standards teams (though sometimes in large companies) putting out desktop applications, sometimes on multiple platforms, with shitloads of features. On fairly tight schedules. To address markets that are itty-bitty by modern standards.
Now people are like “We’ll need (3x the personnel) and (2x the time) and you can forget about native, it’s webshit or else you can double those figures… for one platform. What’s that? Your TAM is only (the size of the entire home PC market circa 1995)? Oh forget about it then, you’ll never get funded”
It seems like we’ve gotten far less efficient.
I’m skeptical that this problem has much to do with code-writing, and so I’m skeptical that LLMs are going to even get us back to our former baseline.
While the old stuff wasn't perfect, I'd argue software has gotten much worse since, and I blame SaaSification and the push for web-based centralization.
Take Docker, for example. Linux had a real problem: there was no standard way to host server software in an isolated manner.
The kernel already had all the features to make it work (namespaces, cgroups, chroot); all we needed was a nice userland to take advantage of them.
What was needed was a small loader program that set up the sandbox and then started executing the target software.
What we got was Docker, which somehow came with its own daemon (what about cron, systemd, etc.?), its own way of doing IPC (we had that), its own package distribution format (why not just leave files on the disk?), weird shit (layers, wtf), its own repository (we had those as well), and its own CLI (why?).
All this stuff was wrapped into a nice package you have to pay monthly subscription fees for.
Duplicating and enshittifying standard system functions, what a way to go.