
752 points by dceddia | 6 comments
jmmv
Hey folks, author of the Twitter thread here.

It's pretty funny how a pair of crappy videos I recorded in 5 minutes have gone viral and landed here. I obviously did not expect this to happen, which is why I didn't give the comparison a second thought. There are many things wrong in there (including inaccuracies, as some have reported), and Twitter really doesn't leave room for nuance. (Plus my notifications are now unusable, so I can't even reply where necessary.)

I don't want to defend the "computers of 20 years ago" because they sucked in many aspects. Things have indeed gotten better in many ways: faster I/O, better graphics, and insanely fast networks are a few of them, and they have allowed many new types of apps to surface. Better languages and the pervasiveness of virtual machines have also allowed for new types of development and deployment environments, which can make things safer. Faster CPUs enable things we simply couldn't do before, like on-the-fly video transcoding. The existence of GPUs gives us graphics animations for free. And the list goes on.

BUT. That still doesn't mean everything is better. UIs have generally gotten slower, as you can see. There is visible lag even on fast computers: I noticed it on a ~2021 Z4 workstation I had at work, I noticed it on an i7 Surface Laptop 3 I had, and I still notice it on the Mac Pro I'm running Windows 11 on (my primary machine). It's mind-blowing to me that we need super-fast multi-core systems and GBs of RAM to approach, but not reach, the responsiveness native desktop apps used to have. And this is really my pet peeve and what prompted the tweets.

Other random thoughts:

* Some massive wins we got in the past, like the switch from HDDs to SSDs, have been eaten away and now SSDs are a requirement.

* Lag is less visible on macOS and Linux desktops as they still feature mostly-native apps (unscientific claim as well).

* The Surface Go 2 admittedly isn't a very performant machine, but note that it ships with Windows 11 and the lag exists out of the box, which is enough to qualify it as a fair comparison. The specs I quoted were wrong, though, because I misread them from whichever website returned them to me. I don't care much, because this is the experience I get on all reasonably modern machines.

* Yes, I had opened the apps on both computers before recording the video, so they were all cached in memory (which puts the newer system in a worse light?).

* One specific thing that illustrates the problem is Notepad: the app was recently "rewritten" (can't recall exactly what the changes were). It used to open instantaneously on the Go 2, but not any more.

* NT 3.51 wasn't truly fair game because it was years older than the machine. But if you scroll down the thread you'll see the same "test" rerun on Windows 2000 (released the same year as the hardware).

I might come back to extend the list of random thoughts. A proper follow-up blog post would be nice, but I'm not going to have time to write one right away.

1. Wowfunhappy
But, like, I literally want to know why this is happening on a technical level. Notepad is Notepad. What the heck is it doing that makes it take so much longer? What would it take to get back to this level of responsiveness on a machine that has, let's say, 2 GB of memory and an Intel Atom chip, which is still a heck of a lot more than what that NT machine was running on?
2. LordShredda
Layers upon layers of abstraction. Windows testing every single application in case it needs to drop down to 32-bit or Windows 95 compatibility. Security checks, telemetry, theming, spinning up the GPU to draw what you see.
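
If you want to put a rough number on all of that, one crude way is to time launch-to-first-idle yourself. A minimal sketch, assuming a Windows box with a C++ compiler; WaitForInputIdle is only an approximation of "the window is usable", and the modern Notepad starts through a stub for the packaged app, so treat the result as indicative rather than exact:

    // Time how long a GUI app takes from CreateProcess to its first idle
    // message loop. Crude proxy for perceived launch lag.
    #include <windows.h>
    #include <cstdio>

    int main() {
        LARGE_INTEGER freq, start, end;
        QueryPerformanceFrequency(&freq);

        wchar_t cmd[] = L"notepad.exe";   // any GUI app you want to time
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi = {};

        QueryPerformanceCounter(&start);
        if (!CreateProcessW(nullptr, cmd, nullptr, nullptr, FALSE, 0,
                            nullptr, nullptr, &si, &pi)) {
            std::fprintf(stderr, "CreateProcess failed: %lu\n", GetLastError());
            return 1;
        }
        // Returns once the new process is waiting for input with nothing
        // pending. For the packaged Notepad this may under-measure, since the
        // stub hands off to the real app.
        WaitForInputIdle(pi.hProcess, INFINITE);
        QueryPerformanceCounter(&end);

        double ms = (end.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
        std::printf("launch-to-idle: %.1f ms\n", ms);

        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
        return 0;
    }
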
3. Swisstone
Because nowadays Notepad is unfortunately a "Packaged Application" (UWP), and starting a Packaged Application involves a ton of system components (IIRC it relies on an Activation Manager to make RPC and DCOM calls and create an AppContainer and various tokens, to, maybe, if you are lucky, spawn a process). There are more details about this in Windows Internals, 7th Edition, Part 2.
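
Roughly, from the caller's side, the difference looks like this. A sketch only: the classic path is a single CreateProcess call, while the packaged path hands everything to the Activation Manager over COM, which is where the RPC/DCOM, AppContainer and token work happens before any process appears. The AppUserModelID for the Store Notepad below is an assumption, and error handling is stripped down.

    // Classic Win32 launch vs. packaged-app activation, side by side.
    #include <windows.h>
    #include <shobjidl_core.h>
    #include <cstdio>

    int main() {
        // 1) Classic Win32 app: the kernel creates the process directly.
        wchar_t cmd[] = L"C:\\Windows\\System32\\cmd.exe";  // any plain .exe
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi = {};
        CreateProcessW(nullptr, cmd, nullptr, nullptr, FALSE, 0,
                       nullptr, nullptr, &si, &pi);

        // 2) Packaged (UWP/MSIX) app: go through the Activation Manager.
        CoInitializeEx(nullptr, COINIT_APARTMENTTHREADED);
        IApplicationActivationManager* aam = nullptr;
        HRESULT hr = CoCreateInstance(CLSID_ApplicationActivationManager,
                                      nullptr, CLSCTX_LOCAL_SERVER,
                                      IID_PPV_ARGS(&aam));
        if (SUCCEEDED(hr)) {
            DWORD pid = 0;
            // AppUserModelID for the Store Notepad: a guess, not verified.
            hr = aam->ActivateApplication(
                L"Microsoft.WindowsNotepad_8wekyb3d8bbwe!App",
                nullptr, AO_NONE, &pid);
            std::printf("ActivateApplication: hr=0x%08lx pid=%lu\n", hr, pid);
            aam->Release();
        }
        CoUninitialize();
        return 0;
    }
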
4. Wowfunhappy
But it's not like this is limited to Windows. I'd guess that if you tried to open TextEdit on an extremely old and low-end Intel Mac (which again is still an order of magnitude faster than computers 20 years ago which could edit text just fine), the app would take a while to launch. And it probably wouldn't be a great experience.

I haven't used desktop Linux recently, but my expectation would be that e.g. GNOME is a little better than Windows and macOS, but still not great compared to what we had 20 years ago.

So there's just something broadly bloated about modern software. And, sure, people largely don't notice because modern hardware is so damn fast, but we're also forcing users to buy hardware that is much faster than what they should need.

GP points out that faster CPUs allow us to do things that weren't possible in the old days, like real-time video transcoding, and I agree that's great! But if you just need to check your damn email, you should be able to save a great deal of money with a machine that's laughably underpowered by "modern" standards, while still checking your email at the speed of light.

What is all this software doing? I guess we're e.g. rendering at higher resolutions than we used to, but isn't the GPU supposed to take care of that?

5. immibis
Abstraction abstraction abstraction abstraction abstraction abstraction abstraction abstraction manager factory impl mushroom.
6. immibis
Developers are falling victim to the same illness that pervades our whole economy: ignoring the ground truth. When Tesla announces a Cybertruck, it's not because Elon is thinking about selling Cybertrucks; it's because the announcement boosts the stock price. When developers develop, they're developing to fulfil buzzword checklists, not to tell the CPU what to do to accomplish a goal - that's seen as secondary.