
752 points | dceddia | 1 comment
waboremo ◴[] No.36447387[source]
We just rely on layers and layers of cruft. We then demand improvements when things get too bad, but we're only operating on the very top layer where even dramatic improvements and magic are irrelevant.

Windows is especially bad at this because it relies on so many legacy layers, which is also partly why people still bother with Windows. Not to claim that Linux or macOS don't have similar problems (ahem, Catalyst), but it's not as overt.

A lot of the blame gets placed on easy-to-spot things like Electron apps, but the problem runs so deep that even native apps perform slower, use more resources, and aren't doing a whole lot more than they used to. Windows Terminal is a great example of this.

Combine this with the fact that most teams aren't given the space to actually maintain (because maintaining doesn't result in direct profits), and you've got a winning combination!

replies(7): >>36447692 #>>36447714 #>>36447761 #>>36448103 #>>36448804 #>>36449621 #>>36458475 #
leidenfrost ◴[] No.36447692[source]
Not only that. It's not Wirth's Law.

It's the fact that manpower can't keep up with the explosion of complexity and use cases that computing has seen over the last few decades.

We went from CLI commands and a few graphical tools, for the few who actually wanted to engage with computers, to an entire entertainment ecosystem where everyone in the world wants 'puters to predict what they might want to see or buy next.

To maintain the code efficiency we had in the '90s and 2000s, we would need to instantly jump the seniority of every developer in the world from junior to senior+. Yes, you can recruit and train developers, but how many Tanenbaums and Torvaldses can you train per year?

The cruft didn't only go toward dark patterns and features like animations and rendering that some people regard as "useless" (which is debatable at minimum). The layers also went toward improving "developer experience".

And I'm not talking only about NodeJS. I'm talking about languages like Python and Lua, or even the JVM.

There's a whole universe of hoops, guardrails, and safeguards built so that the not-so-genius developer doesn't shoot themselves in the foot so easily.
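A tiny sketch of the kind of safeguard meant here (a hypothetical example, not from the thread): Python bounds-checks every list index at runtime, where the equivalent C would silently read past the end of the array. That check costs cycles on every access, but it turns a memory-corruption foot-gun into a catchable error.

```python
# Hypothetical illustration of a runtime safeguard in a managed language.
# Python raises IndexError on an out-of-range index; equivalent C code
# would read past the end of the array (undefined behavior) instead of
# failing loudly.
def safe_lookup(items, i):
    """Return items[i], or None when the index is out of range."""
    try:
        return items[i]
    except IndexError:
        return None

print(safe_lookup([1, 2, 3], 1))   # in range: returns 2
print(safe_lookup([1, 2, 3], 10))  # out of range: caught, returns None
```

Every one of those checks is exactly the sort of layer that a hand-tuned C program would omit, which is the trade-off being described.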

I'm sure you could delete all of that, leave only languages like Rust, C, and C++, and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie on Netflix or count calories on a smartwatch.

replies(4): >>36448237 #>>36448470 #>>36448830 #>>36453605 #
hcarvalhoalves ◴[] No.36448237[source]
I'm afraid you're putting too much weight on language. Windows is largely built on C++, no? The impact of adding features and multiple layers of architecture, maintained by many people over a long time, is worse. Don't underestimate the capacity to create slow software in ANY language. Software architecture and project management are unsolved problems in the industry.
replies(3): >>36448324 #>>36448395 #>>36453287 #
leidenfrost ◴[] No.36448395[source]
Windows is a behemoth of legacy protocols, abandoned projects, and software support that ranges from bleeding-edge gaming hardware to obscure ancient machines required by long-term support partners and investors.

Devs at MS have to make everything work in a universe where everything else is dead or crap. The fact that Windows 11 can even run without crashing daily is an engineering marvel.

PS: Not to defend MS, but I'm sure their current devs are very capable and doing their best.