
837 points | turrini
titzer
I like to point out that since ~1980, computing power has increased about 1000X.

If dynamic array bounds checking cost 5% (narrator: it is far less than that), and we turned it on everywhere, we could have computers that are a mere 950X faster.
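To make that concrete, here is a minimal Rust sketch (purely illustrative, not anything from the thread; the function names are mine): safe indexing carries the dynamic bounds check, the unsafe accessor skips it, and the trade-off described above is just 1000 × 0.95 = 950.

    // Illustrative sketch (not from the thread) of what a dynamic array
    // bounds check is: safe indexing compares i against the length and
    // branches to a panic; the unsafe accessor emits no check at all.
    fn checked_get(xs: &[u64], i: usize) -> u64 {
        xs[i] // compare + branch, then the load
    }

    fn unchecked_get(xs: &[u64], i: usize) -> u64 {
        // SAFETY: caller must guarantee i < xs.len(); nothing is verified here.
        unsafe { *xs.get_unchecked(i) }
    }

    fn main() {
        let xs = vec![1u64, 2, 3];
        assert_eq!(checked_get(&xs, 2), unchecked_get(&xs, 2));
        // The arithmetic above: a 5% tax on a 1000X speedup still leaves
        // 1000.0 * 0.95 = 950X.
        println!("{}", 1000.0 * 0.95);
    }

In simple loops an optimizing compiler can often prove the index is in range and drop the check entirely, which is one reason measured overheads tend to come in well under that 5% figure.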

If you went back in time to 1980 and offered the following choice:

I'll give you a computer that runs 950X faster, eliminates a huge class of memory safety vulnerabilities, and lets you debug your programs orders of magnitude more easily. Or you can have a computer that runs 1000X faster, where software is just as buggy or worse and debugging is even more of a nightmare.

People would have their minds blown at 950X. You wouldn't even have to offer 1000X. But guess what we chose...

Personally I think the 1000Xers kinda ruined things for the rest of us.

_aavaa_
Except we've squandered that 1000x not on bounds checking but on countless layers of abstractions and inefficiency.
Gigachad
Am I taking crazy pills, or are programs not nearly as slow as HN comments make them out to be? Almost everything loads instantly on my 2021 MacBook and 2020 iPhone. Every program is incredibly responsive. Five-year-old mobile CPUs load modern SPA web apps with no problems.

The only thing I can think of that’s slow is Autodesk Fusion starting up. Not really sure how they made that so bad but everything else seems super snappy.

KapKap66
There's a problem when people who aren't very sensitive to latency try to track it: their perception of what "instant" actually means is off. For them, instant is something like one second. For someone who cares about latency, instant is under 10 milliseconds, or whatever threshold makes the gap between input and result imperceptible. People have the same problem judging video game framerates because they rarely compare them back to back (there are perceptual differences between 30, 60, 120, 300, and 500 fps at the minimum, even on displays incapable of refreshing at those higher speeds), yet you'll often hear people say that 60 fps is "silky smooth," which is not true whatsoever lol.
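For a concrete sense of those numbers (a back-of-the-envelope sketch of mine, not the commenter's), the frame budget is simply 1000 / fps milliseconds:

    // Per-frame time for the refresh rates mentioned above.
    fn main() {
        for fps in [30u32, 60, 120, 300, 500] {
            println!("{:>3} fps -> {:>5.2} ms per frame", fps, 1000.0 / fps as f64);
        }
        // 30 -> 33.33, 60 -> 16.67, 120 -> 8.33, 300 -> 3.33, 500 -> 2.00
    }

A 16.7 ms frame at 60 fps already sits above the roughly 10 ms threshold mentioned above, while 120 fps dips below it, so the gaps between these rates are real even before arguing about what the eye can perceive.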

If you haven't compared high and low latency directly, side by side, there are good odds you don't know what the difference looks like. There was a Twitter video from a while ago that does a good job of showing it off; it's one of the replies to the OP: https://x.com/jmmv/status/1671670996921896960

Sorry if I'm being too presumptuous, though; you might be completely right, and instant really is instant in your case.

JoeAltmaier
I fear that such comments are similar to the old 'a Monster cable makes my digital audio sound more mellow!' claims.

The eye perceives at about 10 Hz. That's 100 ms per capture. For anything beyond that, I'd have to see a study showing how a higher framerate can possibly be perceived or useful.

zahlman
>The eye perceives at about 10 Hz. That's 100 ms per capture. For anything beyond that, I'd have to see a study showing how a higher framerate can possibly be perceived or useful.

It takes effectively no effort to conduct such a study yourself. Just try re-encoding a video at different frame rates up to your monitor refresh rate. Or try looking at a monitor that has a higher refresh rate than the one you normally use.
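For anyone who wants to try that, here is a minimal sketch of the re-encoding experiment (assumptions: ffmpeg is installed and on PATH; input.mp4, the output names, and the rate list are placeholders; the source clip should be recorded at a high frame rate, since duplicating frames upward adds no information):

    // Produce copies of the same clip at several frame rates to compare
    // back to back. File names and the rate list are placeholders.
    use std::process::Command;

    fn main() {
        for fps in [24, 30, 60, 120] {
            let out = format!("out_{fps}.mp4");
            let status = Command::new("ffmpeg")
                .args(["-i", "input.mp4", "-filter:v"])
                .arg(format!("fps={fps}"))
                .arg(&out)
                .status()
                .expect("failed to run ffmpeg");
            println!("{out}: {status}");
        }
    }

Play the outputs back to back on the same display to make the kind of direct comparison the earlier comments describe.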