838 points by turrini | 2 comments

titzer:
I like to point out that since ~1980, computing power has increased about 1000X.

If dynamic array bounds checking cost 5% of performance (narrator: it costs far less than that), and we turned it on everywhere, we could have computers that are a mere 950X faster.
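
As a concrete sketch of what's being traded, here's a hypothetical Rust example (illustrative, not from the comment, and not a benchmark): slice indexing is checked on every access, while the unchecked variant is exactly the kind of code those checks protect against.

    // Checked access: an out-of-bounds index panics instead of
    // silently corrupting memory. This is the cost being debated.
    fn sum_checked(v: &[u64], indices: &[usize]) -> u64 {
        indices.iter().map(|&i| v[i]).sum()
    }

    // Unchecked access: undefined behavior on a bad index, i.e. the
    // huge class of memory safety bugs the ~5% (or less) buys off.
    fn sum_unchecked(v: &[u64], indices: &[usize]) -> u64 {
        indices.iter().map(|&i| unsafe { *v.get_unchecked(i) }).sum()
    }

    fn main() {
        let v = vec![1u64; 1024];
        let idx: Vec<usize> = (0..1024).collect();
        println!("{} {}", sum_checked(&v, &idx), sum_unchecked(&v, &idx));
    }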

If you went back in time to 1980 and offered the following choice:

I'll give you a computer that runs 950X faster, doesn't have a huge class of memory safety vulnerabilities, and lets you debug your programs orders of magnitude more easily; or you can have a computer that runs 1000X faster, where software will be just as buggy or worse and debugging will be even more of a nightmare.

People would have their minds blown at 950X. You wouldn't even have to offer 1000X. But guess what we chose...

Personally I think the 1000Xers kinda ruined things for the rest of us.

_aavaa_:
Except we've squandered that 1000x not on bounds checking but on countless layers of abstractions and inefficiency.

Gigachad:
Am I taking crazy pills, or are programs not nearly as slow as HN comments make them out to be? Almost everything loads instantly on my 2021 MacBook and 2020 iPhone. Every program is incredibly responsive. 5-year-old mobile CPUs load modern SPA web apps with no problems.

The only thing I can think of that’s slow is Autodesk Fusion starting up. Not really sure how they made that so bad but everything else seems super snappy.

flohofwoe:
I guess you don't need to wrestle with Xcode?

Somehow the Xcode team managed to make startup and some features slower in newer Xcode versions than in older Xcode versions running on old Intel Macs.

The ARM Macs are a perfect illustration that software gets slower faster than hardware gets faster.

After a very short 'free lunch' right after the Intel => ARM transition, we're now back in the same old software performance regression spiral (new software only gets optimized until it feels 'fast enough', and that 'fast enough' threshold stays the same no matter how fast the hardware is).

Another excellent example is the recent release of the Oblivion Remaster on Steam (which uses the brand-new UE5 engine):

On my fairly mid-range PC I have to reduce the graphics quality in the Oblivion Remaster so much that the result looks worse than 14-year-old Skyrim (especially in outdoor environments), and even that doesn't yield a stable 60Hz frame rate, while Skyrim runs at a rock-solid 60Hz and objectively looks better outdoors.

In other words: even though the old Skyrim engine isn't nearly as technologically advanced as UE5, and had plenty of performance issues at launch on a ca. 2010 PC, the Oblivion Remaster (which uses a "state of the art" engine) looks and performs worse than its own 14-year-old predecessor.

I'm sure the UE5-based Oblivion remaster can be properly optimized to beat Skyrim both in looks and performance, but apparently nobody cared about that during development.

jayd16:
You're comparing the art(!) of two different games that targeted two different sets of hardware, while using the ideal hardware for one and not the other. Kind of a terrible example.

flohofwoe:
> You're comparing the art(!)

The art direction, modelling and animation work is mostly fine. The worse look results from the lack of dynamic lighting and ambient occlusion in the Oblivion Remaster when switching Lumen (UE5's realtime global illumination feature) to the lowest setting; this results in completely flat lighting for the vegetation, but is needed to get an acceptable base frame rate (it doesn't solve the random stuttering, though).
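
To put numbers on "acceptable base frame rate", here's some back-of-the-envelope budget math (the per-feature costs below are made-up illustrative values, not measurements from the Oblivion Remaster):

    fn main() {
        let budget_ms = 1000.0 / 60.0; // ~16.7 ms per frame at 60 Hz
        let base_ms = 14.0;            // assumed frame cost without GI
        let lumen_ms = 5.0;            // assumed extra cost of Lumen above the lowest setting
        let total: f64 = base_ms + lumen_ms;
        println!("budget {:.1} ms, spent {:.1} ms -> ~{:.0} fps",
                 budget_ms, total, 1000.0 / total);
        // 19.0 ms per frame blows the 16.7 ms budget: ~53 fps, not a stable 60.
    }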

Basically, the best art will always look bad without good lighting (and even baked or faked ambient lighting like in Skyrim looks better than no ambient lighting at all).
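
A minimal sketch of why a missing ambient term reads as 'flat', in the spirit of a basic Lambert-plus-ambient model (hypothetical code, illustrative values):

    // Diffuse lighting plus a flat ambient term. With ambient = 0,
    // anything facing away from the light goes completely black.
    fn shade(n_dot_l: f32, albedo: f32, ambient: f32) -> f32 {
        albedo * (n_dot_l.max(0.0) + ambient)
    }

    fn main() {
        // A leaf facing away from the sun:
        println!("no ambient:    {:.2}", shade(-0.3, 0.5, 0.0)); // 0.00 -> black
        println!("faked ambient: {:.2}", shade(-0.3, 0.5, 0.2)); // 0.10 -> still readable
    }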

Digital Foundry has an excellent video about the issues:

https://www.youtube.com/watch?v=p0rCA1vpgSw

TL;DR: the 'ideal hardware' for the Oblivion Remaster doesn't exist, even if you get the best gaming rig money can buy.

KronisLV:
> …when switching Lumen (UE5's realtime global illumination feature) to the lowest setting; this results in completely flat lighting for the vegetation, but is needed to get an acceptable base frame rate (it doesn't solve the random stuttering, though).

This also happens in many other UE5 games like S.T.A.L.K.E.R. 2, where they try to push the graphics envelope with expensive techniques: most people without expensive hardware have to turn the settings way down, or even lean on upscaling and frame generation (which degrade the experience further when the starting point is already bad and they're used as a crutch), often making these modern games look worse than something a decade old.
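
For reference, the reason upscaling is such a tempting crutch is simple pixel math (the 0.67 scale factor below is an assumed 'quality'-style preset, not any specific upscaler's exact value):

    fn main() {
        let (out_w, out_h) = (3840.0_f64, 2160.0); // 4K output
        let scale = 0.67;                          // assumed internal resolution scale
        let native = out_w * out_h;
        let internal = (out_w * scale) * (out_h * scale);
        println!("shading {:.0}% of native pixels -> ~{:.1}x fewer",
                 100.0 * internal / native, native / internal);
        // ~45% of native resolution, i.e. ~2.2x fewer pixels to shade per frame.
    }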

Whatever UE5 is doing (or rather, how so many developers choose to use it) is a mistake now, and might be less of one in 5-10 years when hardware advances further and becomes more accessible. Right now it feels like a ploy by Big GPU to force people to upgrade to overpriced hardware if they want to enjoy any of these games; or, silliness aside, it's an attempt by studios to save resources by having artists spend less time faking and optimizing effects and detail that can just be brute-forced by the engine.

In contrast, most big CryEngine and idTech games run great even on mid-range hardware and still look great.

flohofwoe:
It's like (usable) realtime global illumination is the fusion power of rendering, always just 10 years away ;)

I remember that UE4 also hyped a realtime GI solution, which was then hardly used in real-world games because its performance hit was too big.