
752 points dceddia | 3 comments
waboremo No.36447387
We just rely on layers and layers of cruft. Then we demand improvements when things get too bad, but we're only operating on the very top layer, where even dramatic improvements and magic barely register.

Windows is especially bad at this because it leans so heavily on legacy code, which is also kind of why people still bother with Windows. Not to claim that Linux or macOS don't have similar problems (ahem, Catalyst), but it's not as overt.

A lot of the blame gets placed on easy targets like Electron apps, but the problem is so pervasive that even native apps perform slower, use more resources, and aren't doing a whole lot more than they used to. Windows Terminal is a great example of this.

Combine this with the fact that most teams aren't given the space to actually maintain anything (because maintenance doesn't show up as direct profit), and you've got a winning combination!

replies(7): >>36447692 #>>36447714 #>>36447761 #>>36448103 #>>36448804 #>>36449621 #>>36458475 #
leidenfrost No.36447692
Not only that. It's not Wirth's Law.

It's the fact that manpower can't keep up with the explosion of complexity and use cases that computing has seen over the last few decades.

We went from CLI commands and a few graphical tools for the few who actually wanted to engage with computers, to an entire entertainment ecosystem where everyone in the world wants 'puters to predict what they might want to see or buy next.

To maintain the same efficiency in code we had in the '90s and 2000s, we would need to instantly jump the seniority of every developer in the world from junior to senior+. Yes, you can recruit and train developers, but how many Tanenbaums and Torvaldses can you train per year?

The cruft didn't only go to dark patterns and to features like animations and rendering that some people regard as "useless" (which is debatable, at minimum). The layers also went toward improving "developer experience".

And I'm not talking only about Node.js. I'm talking about languages and runtimes like Python, Lua, or even the JVM.

There's a whole universe of hoops, guardrails, and safeguards built so that the not-so-genius developer doesn't shoot themselves in the foot so easily.
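
To make it concrete, here's a toy illustration (in Python, since I mentioned it) of the kind of safeguard that costs performance at every step but saves the developer:

    # Python pays for a bounds check on every index, but the failure
    # mode is a loud exception instead of silent memory corruption.
    items = [10, 20, 30]
    try:
        print(items[7])          # out of bounds: raises IndexError
    except IndexError as exc:
        print(f"caught: {exc}")  # caught: list index out of range

    # The equivalent C (int a[3]; then reading a[7]) compiles cleanly
    # and returns whatever garbage happens to sit past the array.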

I'm sure you could delete all of that, leave only languages like Rust, C, and C++, and get a 100x jump in performance. But you'd also be annihilating 90% of the software development workforce. Good luck trying to watch a movie on Netflix or count calories on a smartwatch.

replies(4): >>36448237 #>>36448470 #>>36448830 #>>36453605 #
enterprise_cog No.36448470
Oh please, get over yourself. How many of those oh-so-smart '90s devs used those elite skills to write code littered with exploit vectors? Or full of bugs? Or, hell, was any of it even really performant?

You're looking back with rose-tinted glasses if you think all software was blazing fast back then. There was a reason putting your cursor on a progress bar to track whether it was still moving was a thing.

replies(1): >>36448869 #
1. TeMPOraL No.36448869
> There was a reason putting your cursor on a progress bar to track whether it was moving was a thing.

The reason it's not a thing today is that those progress bars got replaced by spinners and "infinite progress bars". At least back then you had a chance to learn or guess how long slow operations would take. These days, users are considered too dumb to be exposed to such "details".

replies(1): >>36451532 #
2. akarlsten No.36451532
The real reason people moved to the infinite ones is that the determinate progress bar is almost never accurate or representative, hence useless.

Beyond truly "dumb" tasks like downloading a file, it's basically a guessing game how long anything will take, right? Say you split the whole loading bar into equal percentages based on the number of subtasks; suddenly you end up with a progress bar stuck on 89% for 90% of the total loading time.
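
To sketch what I mean (subtask names and timings are made up, but the shape is what matters):

    import time

    # Naive determinate progress: every subtask advances the bar by
    # an equal share, no matter how long it actually takes.
    subtasks = [
        ("unpack archive",     0.1),   # fast
        ("copy files",         0.2),   # fast
        ("register services",  0.1),   # fast
        ("build search index", 9.0),   # the step everything waits on
    ]

    for i, (name, duration) in enumerate(subtasks, start=1):
        time.sleep(duration)   # stand-in for real work
        print(f"[{100 * i // len(subtasks):3d}%] finished: {name}")

    # The bar leaps to 75% in under half a second, then sits there
    # for nine seconds before jumping to 100%: the "stuck at 89%"
    # effect.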

Obviously you could measure things post hoc and adjust it so each task was "worth" roughly as much as the time it took, but people rarely did that back in the day, and my boss would get mad at me for wasting time on it today. Hence, spinners.
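
For completeness, that weighted version would look something like this (same made-up subtasks, with weights hard-coded from a one-off measurement, which is exactly the chore nobody budgets for):

    import time

    # Weighted determinate progress: each subtask advances the bar
    # in proportion to how long it took on some reference machine.
    subtasks = [
        ("unpack archive",     0.1),
        ("copy files",         0.2),
        ("register services",  0.1),
        ("build search index", 9.0),
    ]
    total = sum(d for _, d in subtasks)

    elapsed = 0.0
    for name, duration in subtasks:
        for _ in range(10):   # pretend each task reports ten slices
            time.sleep(duration / 10)
            elapsed += duration / 10
            print(f"\r[{100 * elapsed / total:5.1f}%] {name}",
                  end="", flush=True)
    print("\ndone")

    # Now the bar crawls honestly through the slow step, but the
    # weights go stale the moment the installer changes.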

replies(1): >>36452554 #
3. TeMPOraL No.36452554
> Say you split the whole loading bar into percentages based on the number of subtasks, suddenly you end up with a progress bar stuck on 89% for 90% of the total loading time.

Sure. But now as a user, I get a glimpse of what's going on under the hood. Combined with other information, such as a log of installation steps (if you provide one) or the sounds of the spinning rust drive, those old-school determinate progress bars "leaked" a huge amount of information, giving users both greater confidence and the ability to solve their own problems. In many cases, you could guess why that progress bar was stuck on 89% indefinitely, just by ear, and then fix it.

Conversely, spinners and indeterminate progress bars deny users agency and disenfranchise them. And it's just one case of many, all adding up to the sad irony of the UI/UX field: it works hard to dumb down or hide everything about how computers work, and justifies this by claiming it's too difficult for people to understand. But how can people understand, how can they build a good mental model of computing, when the software does its best to hide or scramble anything that would reveal how the machine works?