
752 points | dceddia | 1 comment
waboremo ◴[] No.36447387[source]
We just rely on layers and layers of cruft. We then demand improvements when things get too bad, but we're only operating on the very top layer where even dramatic improvements and magic are irrelevant.

Windows is especially bad at this because of how heavily it relies on legacy code, which is also kind of why people still bother with Windows. Not to claim that Linux or macOS don't have similar problems (ahem, Catalyst), but it's not as overt.

A lot of the blame gets placed on easy-to-see things like Electron apps, but really the problem is so pervasive that even native apps perform slower, use more resources, and aren't doing a whole lot more than they used to. Windows Terminal is a great example of this.

Combine this with the fact that most teams aren't given the space to actually maintain (because maintaining doesn't result in direct profits), and you've got a winning combination!

replies(7): >>36447692 #>>36447714 #>>36447761 #>>36448103 #>>36448804 #>>36449621 #>>36458475 #
leidenfrost ◴[] No.36447692[source]
Not only that. It's not Wirth's Law.

It's the fact that manpower can't keep up with the explosion of complexity and use cases that computing has taken on over the last few decades.

We went from CLI commands and a few graphical tools for the few who actually wanted to engage with computers, to an entire ecosystem of entertainment where everyone in the world wants 'puters to predict what they might want to see or buy next.

To maintain the same efficiency in code that we had in the '90s and 2000s, we would need to instantly jump the seniority of every developer in the world, right from junior to senior+. Yes, you can recruit and train developers, but how many Tanenbaums and Torvalds can you train per year?

The biggest amount of cruft didn't just go into dark patterns and into features like animations and rendering that some people regard as "useless" (which is debatable, at minimum). The layers also went into improving the "developer experience".

And I'm not talking only about Node.js. I'm talking about Python, Lua, or even the JVM.

There's a whole universe of hoops, guardrails, and safeguards built so that the not-so-genius developer doesn't shoot themselves in the foot so easily.

I'm sure you could delete all of that, leave only languages like Rust, C, and C++, and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie on Netflix or count calories on a smartwatch.

replies(4): >>36448237 #>>36448470 #>>36448830 #>>36453605 #
1. TeMPOraL ◴[] No.36448830[source]
> everyone in the world wants 'puters to predict what they might want to see or buy next.

This doesn't strike me as something "everyone in the world wants", but rather something a small group of leeches is pushing on the rest of the population, to enrich themselves at the expense of everyone else. I have yet to meet a person who would tell me they actually want computers to tell them what to see or buy. And if I met such a person, I bet they'd backtrack once they learned how those systems work.

Exercise for the reader: name one recommendation system that doesn't suck. They all do, and it's not because recommendations are hard. Rather, it's because those systems aren't tuned to recommend what users would like - they're optimized to recommend what maximizes the vendor's revenue. This leads to well-known absurdities like Netflix recommendations being effectively random, with the whole UX optimized to mask how small the catalogue is; or Spotify pushing podcasts whether you want them or not; or buying a thing and then getting spammed for weeks by ads for that same thing, because, as stupid as it is, it seems to maximize effectiveness at scale. Etc.

> I'm sure you could delete all of that, leave only languages like Rust, C, and C++, and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie on Netflix or count calories on a smartwatch.

I'll say the same thing I say to people when they claim banning ads would annihilate 90% of the content on the Internet: good. riddance.

Netflix would still be there. So would smartwatches and calorie-counting apps. We're now drowning in a deluge of shitty software, a lot of which is actually malware in disguise; "annihilating 90% of the software development workforce" would vastly improve the SNR.