752 points dceddia | 11 comments
waboremo ◴[] No.36447387[source]
We just rely on layers and layers of cruft. We then demand improvements when things get too bad, but we're only operating on the very top layer where even dramatic improvements and magic are irrelevant.

Windows is especially bad at this due to so much legacy reliance, which is also kind of why people still bother with Windows. Not to claim that Linux or macOS don't have similar problems (ahem, Catalyst), but it's not as overt.

A lot of the blame gets placed on easy to see things like an Electron app, but really the problem is so substantial that even native apps perform slower, use more resources, and aren't doing a whole lot more than they used to. Windows Terminal is a great example of this.

Combine this with the fact that most teams aren't given the space to actually maintain (because maintaining doesn't result in direct profits), and you've got a winning combination!

replies(7): >>36447692 #>>36447714 #>>36447761 #>>36448103 #>>36448804 #>>36449621 #>>36458475 #
1. leidenfrost ◴[] No.36447692[source]
Not only that. It's not Wirth's Law.

It's the fact that manpower can't keep up with the explosion of complexity and use cases that has hit computing over the last few decades.

We went from CLI commands and a few graphical tools for the few who actually wanted to engage with computers, to an entire ecosystem of entertainment where everyone in the world wants 'puters to predict what they might want to see or buy next.

To maintain the same efficiency in code that we had in the '90s and 2000s, we would need to instantly bump the seniority of every developer in the world from Junior to Senior+. Yes, you can recruit and train developers, but how many Tanenbaums and Torvaldses can you train per year?

The bulk of the cruft didn't only go into dark patterns and into features like animations and rendering that some people regard as "useless" (which is debatable, at minimum). The layers also went toward improving the "developer experience".

And I'm not talking about NodeJS only. I'm talking about languages like Python, Lua, or even the JVM.

There's a whole universe of hoops, guardrails, and safeguards built so that the not-so-genius developer doesn't shoot themselves in the foot so easily.

I'm sure that you could delete all of that, leave only languages like Rust, C, and C++, and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie on Netflix or counting calories on a smartwatch.

replies(4): >>36448237 #>>36448470 #>>36448830 #>>36453605 #
2. hcarvalhoalves ◴[] No.36448237[source]
I'm afraid you're putting too much weight on language. Windows is largely built on C++, no? The impact of adding features and multiple layers of architecture, maintained by many people over a long time, is worse. Don't underestimate the capacity to create slow software in ANY language. Software architecture and project management are unsolved problems in the industry.
replies(3): >>36448324 #>>36448395 #>>36453287 #
3. anthk ◴[] No.36448324[source]
Windows uses C# a lot.
4. leidenfrost ◴[] No.36448395[source]
Windows is a behemoth of legacy protocols, abandoned projects, and software support that ranges from bleeding edge gaming hardware to obscure ancient machines required by long term support partners and investors.

Devs at MS have to make everything right in a universe where everything else is dead or crap. And the fact that Windows 11 can even run without crashing daily is an engineering marvel.

PS: Not to defend MS, but I'm sure their current devs are very capable and doing their best.

5. enterprise_cog ◴[] No.36448470[source]
Oh please, get over yourself. How many of those oh-so-smart '90s devs used those elite skills to write code littered with exploit vectors? Or full of bugs? Or, hell, even really performant?

You are looking back with rose tinted glasses if you think all software was blazing fast back then. There was a reason putting your cursor on a progress bar to track whether it was moving was a thing.

replies(1): >>36448869 #
6. TeMPOraL ◴[] No.36448830[source]
> everyone in the world wants 'puters to predict what they might want to see or buy next.

This doesn't strike me as something "everyone in the world wants", but rather something a small group of leeches is pushing on the rest of the population, to enrich themselves at the expense of everyone else. I've yet to meet a person who would tell me they actually want computers to tell them what to see or buy. And if I met such a person, I bet they'd backtrack if they learned how those systems work.

Exercise for the reader: name one recommendation system that doesn't suck. They all do, and it's not because recommendations are hard. Rather, it's because those systems aren't tuned to recommend what the users would like - they're optimized to recommend whatever maximizes the vendor's revenue. This leads to well-known absurdities like Netflix recommendations being effectively random, and the whole UX being optimized to mask how small their catalogue is; or Spotify recommendations pushing podcasts whether you want them or not; or how you buy a thing and then get spammed for weeks by ads for the same thing, because as stupid as it is, it seems to maximize effectiveness at scale. Etc.

> I'm sure that you could delete all of that, leave only languages like Rust, C, and C++, and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie on Netflix or counting calories on a smartwatch.

I'll say the same thing I say to people when they claim banning ads would annihilate 90% of the content on the Internet: good. riddance.

Netflix would still be there. So would smartwatches and calorie-counting apps. We're now drowning in a deluge of shitty software, a lot of which is actually malware in disguise; "annihilating 90% of the software development workforce" would vastly improve the SNR.

7. TeMPOraL ◴[] No.36448869[source]
> There was a reason putting your cursor on a progress bar to track whether it was moving was a thing.

The reason it's not a thing today is because those progress bars got replaced by spinners and "infinite progress bars". At least back then you had a chance to learn or guess how long slow operations would take. These days, users are considered too dumb to be exposed to such "details".

replies(1): >>36451532 #
8. akarlsten ◴[] No.36451532{3}[source]
The real reason people moved to the infinite ones is that the determinate progress bar is almost never accurate or representative, hence useless.

Like, beyond truly "dumb" tasks like downloading a file, it's basically a guessing game how long anything will take anyway, right? Say you split the whole loading bar into percentages based on the number of subtasks; suddenly you end up with a progress bar stuck at 89% for 90% of the total loading time.

Obviously you could post-hoc measure things and adjust it so each task was roughly "worth" as much as the time it took, but people rarely did that back in the day and my boss would get mad at me for wasting time with it today. Hence, spinners.
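
For illustration, a minimal sketch of that idea (hypothetical task and function names, not taken from any real installer): the naive equal split versus weights calibrated from post-hoc measured durations.

    # Hypothetical installer subtasks (assumed names); weights are either the
    # naive equal split or durations measured once, post hoc.
    import time

    def run_with_progress(tasks, weights=None, report=print):
        """tasks: list of (name, fn) pairs; weights: optional per-task cost."""
        if weights is None:
            weights = [1.0] * len(tasks)  # default: every subtask counts the same
        total = sum(weights)
        done = 0.0
        for (name, fn), weight in zip(tasks, weights):
            fn()
            done += weight
            report(f"{name}: {100 * done / total:.0f}%")

    def measure(tasks):
        """Post-hoc calibration: time each subtask once, reuse the durations."""
        weights = []
        for _, fn in tasks:
            start = time.perf_counter()
            fn()
            weights.append(time.perf_counter() - start)
        return weights

    tasks = [("unpack", lambda: time.sleep(0.1)),
             ("copy files", lambda: time.sleep(1.0)),
             ("register", lambda: time.sleep(0.1))]
    run_with_progress(tasks)                          # 33% -> 67% -> 100%
    run_with_progress(tasks, weights=measure(tasks))  # ~8% -> ~92% -> 100%

Even the calibrated version goes stale as soon as the workload or the hardware changes, which is part of why few people bothered.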

replies(1): >>36452554 #
9. TeMPOraL ◴[] No.36452554{4}[source]
> Say you split the whole loading bar into percentages based on the number of subtasks, suddenly you end up with a progress bar stuck on 89% for 90% of the total loading time.

Sure. But now, as a user, I get a glimpse of what's going on under the hood. Combined with other information, such as a log of installation steps (if you provide it), or the sounds made by the spinning rust drive, those old-school determinate progress bars "leaked" a huge amount of information, giving users both greater confidence and the ability to solve their own problems. In many cases, you could guess why that progress bar was stuck at 89% indefinitely, just by ear, and then fix it.

Conversely, spinners and indeterminate progress bars deny users agency and disenfranchise them. And it's just one case of many, which adds up to the sad irony of the UI/UX field: it works hard to dumb down or hide everything about how computers work, and justifies it by claiming it's too difficult for people to understand. But how can they understand, how can they build a good mental model of computing, when the software does its best to hide or scramble anything that would reveal how the machine works?

10. qball ◴[] No.36453287[source]
>Windows is largely built on C++, no?

Microsoft has been trying to migrate Windows development to a managed language for over 20 years; their first attempt at this was a complete disaster and NT 6.0 (Vista) would ultimately be developed the old way.

It's only really been in the last 5-7 years, with Windows 10 and 11, that MS has managed to get its wish as far as UI elements go, which is why the taskbar no longer reacts immediately when you click on it and has weird bugs that it didn't have before.

11. sigotirandolas ◴[] No.36453605[source]
For the most part, that 90% of the development workforce is not working on solving a new and unique use case, but rather on solving an already-solved use case for yet another walled garden (think Disney+, Amazon Prime Video, HBO Max, etc., or the analogous diaspora of fitness tracking apps).

The early '00s "open standard" of web forum + eMule + VLC would still be light years ahead of Netflix & co. if it weren't for how hard it's been gutted by governments, copyright lobbies, ISPs, and device/platform vendors over the years. Heck, the modern equivalent often still is (despite all the extra hoops), unless you're trying to watch the latest popular show in English.