
838 points | turrini
titzer ◴[] No.43971962[source]
I like to point out that since ~1980, computing power has increased about 1000X.

If dynamic array bounds checking cost 5% (narrator: it is far less than that), and we turned it on everywhere, we could have computers that are a mere 950X faster.
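
To make that concrete, here's a minimal Rust sketch of the trade-off (purely illustrative, not a benchmark of the actual cost): indexing a slice is bounds-checked by default, and skipping the check requires an explicit unsafe opt-out. In a simple loop like this the optimizer can often elide the check entirely, which is part of why the real cost is so far below 5%.

    fn sum_checked(xs: &[u64]) -> u64 {
        let mut total = 0;
        for i in 0..xs.len() {
            // xs[i] is bounds-checked: an out-of-range index panics
            // instead of silently reading arbitrary memory.
            total += xs[i];
        }
        total
    }

    fn sum_unchecked(xs: &[u64]) -> u64 {
        let mut total = 0;
        for i in 0..xs.len() {
            // The "1000X" version: no check. Correct only because the loop
            // bound happens to be right; any mistake is undefined behavior.
            total += unsafe { *xs.get_unchecked(i) };
        }
        total
    }

    fn main() {
        let data = vec![1u64; 1_000];
        assert_eq!(sum_checked(&data), sum_unchecked(&data));
        // The arithmetic in the comment: keep 95% of a 1000X speedup
        // and you still get 1000.0 * 0.95 = 950X.
        println!("{}", 1000.0 * 0.95);
    }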

If you went back in time to 1980 and offered the following choice:

I'll give you a computer that runs 950X faster, doesn't have a huge class of memory safety vulnerabilities, and lets you debug your programs orders of magnitude more easily; or you can have a computer that runs 1000X faster, with software just as buggy or worse, and debugging that's even more of a nightmare.

People would have their minds blown at 950X. You wouldn't even have to offer 1000X. But guess what we chose...

Personally I think the 1000Xers kinda ruined things for the rest of us.

replies(20): >>43971976 #>>43971990 #>>43972050 #>>43972107 #>>43972135 #>>43972158 #>>43972246 #>>43972469 #>>43972619 #>>43972675 #>>43972888 #>>43972915 #>>43973104 #>>43973584 #>>43973716 #>>43974422 #>>43976383 #>>43977351 #>>43978286 #>>43978303 #
_aavaa_ ◴[] No.43972050[source]
Except we've squandered that 1000x not on bounds checking but on countless layers of abstractions and inefficiency.
replies(6): >>43972103 #>>43972130 #>>43972215 #>>43974876 #>>43976159 #>>43983438 #
Gigachad ◴[] No.43972215[source]
Am I taking crazy pills or are programs not nearly as slow as HN comments make them out to be? Almost everything loads instantly on my 2021 MacBook and 2020 iPhone. Every program is incredibly responsive. 5 year old mobile CPUs load modern SPA web apps with no problems.

The only thing I can think of that’s slow is Autodesk Fusion starting up. Not really sure how they made that so bad but everything else seems super snappy.

replies(40): >>43972245 #>>43972248 #>>43972259 #>>43972269 #>>43972273 #>>43972292 #>>43972294 #>>43972349 #>>43972354 #>>43972450 #>>43972466 #>>43972520 #>>43972548 #>>43972605 #>>43972640 #>>43972676 #>>43972867 #>>43972937 #>>43973040 #>>43973065 #>>43973220 #>>43973431 #>>43973492 #>>43973705 #>>43973897 #>>43974192 #>>43974413 #>>43975741 #>>43975999 #>>43976270 #>>43976554 #>>43978315 #>>43978579 #>>43981119 #>>43981143 #>>43981157 #>>43981178 #>>43981196 #>>43983337 #>>43984465 #
KapKap66 ◴[] No.43975741[source]
There's a problem when people who aren't very sensitive to latency try to track it: their perception of what "instant" actually means is off. For them, instant is something like one second. For someone who cares about latency, instant is less than 10 milliseconds, or whatever threshold makes the gap between input and result imperceptible. People have the same problem judging video game framerates because they rarely compare them back to back (there are perceptual differences between 30, 60, 120, 300, and 500 fps, at a minimum, even on displays incapable of refreshing at those higher rates), but you'll often hear people say that 60 fps is "silky smooth," which is not true whatsoever.
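
For reference, here's the quick arithmetic behind those numbers (my own sketch, not from any study): the frame time at a given refresh rate is just 1000 / Hz, and even a perfectly delivered 60 fps frame already sits above that ~10 ms threshold.

    fn main() {
        // Frame time in milliseconds at each refresh rate: 1000 / Hz.
        for hz in [30.0, 60.0, 120.0, 300.0, 500.0] {
            println!("{:>3} Hz -> {:>5.2} ms per frame", hz, 1000.0 / hz);
        }
        // 30 Hz -> 33.33 ms, 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms,
        // 300 Hz -> 3.33 ms, 500 Hz -> 2.00 ms
    }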

If you haven't compared high and low latency directly next to each other, there are good odds you don't know what it looks like. There was a Twitter video from a while ago that did a good job of showing it off; it's one of the replies to the OP: https://x.com/jmmv/status/1671670996921896960

Sorry if I'm being too presumptuous, though; you might be completely right, and instant really is instant in your case.

replies(3): >>43975762 #>>43975835 #>>43976654 #
JoeAltmaier ◴[] No.43975762[source]
I fear that such comments are similar to the old 'a Monster cable makes my digital audio sound more mellow!' claims.

The eye perceives at about 10 Hz. That's 100 ms per capture. For anything beyond that, I'd have to see a study showing how a higher framerate could possibly be perceived or useful.

replies(4): >>43976063 #>>43977562 #>>43981646 #>>43985975 #
dahart ◴[] No.43985975[source]
> The eye perceives at about 10 Hz.

Not sure what this means; the eye doesn’t perceive anything. Maybe you’re thinking of saccades or round-trip response times or something else? Those are in the ~100ms range, but that’s different from whether the eye can see something.

This paper shows pictures can be recognized from a 13 ms exposure, shorter than a single frame at 60Hz (~16.7 ms), and that's for full scenes, not even motion tracking or small localized changes. https://link.springer.com/article/10.3758/s13414-013-0605-z

replies(1): >>44009925 #
JoeAltmaier ◴[] No.44009925[source]
From that, then, we conclude that somehow 500Hz is important or meaningful?
replies(1): >>44015152 #
dahart ◴[] No.44015152[source]
Is the choice only 500Hz or 10Hz, with nothing in between? You could have argued against 500Hz using the GP comment instead of countering with something that's demonstrably untrue. I handed you the study you asked for.

Movies' 24Hz is too slow; just watch a horizontal pan. 24Hz is good enough for slow-moving content, but it was chosen that low for cost reasons, not because it's the limit of perception. US TV's 60Hz interlace isn't the limit either, which horizontal pans also show. 60Hz progressive looks different from 30Hz; just watch YouTube or turn on frame interpolation on a modern TV.

The limit of meaningful motion-tracking perception might be in the 100-200Hz range. The reason 500Hz matters to gamers is, I think, latency rather than frequency. Video systems often have multiple frames of latency, so there actually is a perceptible difference to them between 60Hz and 500Hz.
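
A quick sketch of that latency argument (illustrative numbers, assuming a hypothetical 3-frame pipeline): each buffered frame adds one refresh period, so the refresh rate multiplies into the total input-to-photon delay.

    fn pipeline_latency_ms(frames_of_latency: f64, refresh_hz: f64) -> f64 {
        // Each buffered frame adds one refresh period of delay.
        frames_of_latency * 1000.0 / refresh_hz
    }

    fn main() {
        // Same 3-frame pipeline, different refresh rates:
        println!("{:.1} ms at  60Hz", pipeline_latency_ms(3.0, 60.0));  // 50.0 ms
        println!("{:.1} ms at 500Hz", pipeline_latency_ms(3.0, 500.0)); //  6.0 ms
    }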

replies(1): >>44017480 #
1. ajolly ◴[] No.44017480[source]
I notice a big difference between running my desktop at 60Hz versus 144Hz, both in how smoothly the mouse moves and in how easy it is to click on a small area of the screen during fast mouse movement.
replies(1): >>44018115 #
2. dahart ◴[] No.44018115[source]
Yep, and I think a lot of people who’ve tried it would agree. I’ve heard the same from others too, and I believe it.

I don't know what typical display latency is for just browsing files on a monitor these days. I'd guess it's probably a few frames, and I'd bet we'd be able to feel the difference between 3 frames of latency at 144Hz and 1 frame of latency at 144Hz. I'm also curious whether mouse cursor motion blur would make any difference.
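
Running the same back-of-the-envelope numbers on that guess (again just illustrative arithmetic): at 144Hz a frame is about 6.9 ms, so 3 frames of latency versus 1 is a ~14 ms difference, which lines up with it being something you can feel.

    fn main() {
        let frame_ms = 1000.0 / 144.0;     // ~6.94 ms per frame at 144Hz
        let three_frames = 3.0 * frame_ms; // ~20.8 ms
        let one_frame = 1.0 * frame_ms;    // ~6.9 ms
        println!(
            "3 frames: {:.1} ms, 1 frame: {:.1} ms, difference: {:.1} ms",
            three_frames, one_frame, three_frames - one_frame
        );
    }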