
205 points samspenc | 10 comments
WhereIsTheTruth ◴[] No.45147352[source]
I once made the mistake of buying some sound effects from Fab; I had to download the entire Unreal Engine, launch it, and create a project just to import the assets.

It took the whole afternoon

It's no wonder UE5 games have a reputation for being poorly optimized; you need an insane machine just to run the editor.

State-of-the-art graphics pipeline, but webdev-level bloat when it comes to the software. I'd even argue Electron is a smoother experience than the Unreal Engine editor.

Insanity

replies(4): >>45147501 #>>45147527 #>>45147764 #>>45148495 #
1. daemin ◴[] No.45147501[source]
Yet it is the engine that dominates the industry and is beloved by artists of all kinds.

To get UE games that run well, you either need your own engine team to optimise it or you drop all the fancy new features.

replies(1): >>45147642 #
2. Ekaros ◴[] No.45147642[source]
I was around back in the days when LCDs replaced CRTs and learned the importance of native resolutions. I feel like recent games have been saved too much by frame generation and all sorts of weird resolution hacks... mostly from Nvidia and AMD.

I am kinda sad we have reached a point where native resolution is not the standard for upper-mid-tier and lower-high-tier GPUs. Surely games should run natively at non-4K resolutions on my 700€+ GPU...

replies(3): >>45147710 #>>45148182 #>>45148322 #
3. daemin ◴[] No.45147710[source]
Games haven't been running at full native resolution for quite some time, maybe even the last decade: they tend to render to a smaller buffer and then upscale to the target resolution in order to achieve better frame rates. That doesn't even include frame generation, which trades supposedly higher frame rates for worse response times, so games can feel worse to play.

By games I mean modern AAA first- and third-person games. 2D and other genres will often run at full resolution all the time.
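
To put rough numbers on that render-scale tradeoff, here's a minimal C++ sketch (the resolution and scale values are hypothetical, not any particular engine's settings):

    #include <cstdio>

    int main() {
        // Hypothetical example: a 4K output target with a 67% render scale,
        // similar in spirit to a typical "quality" upscaling preset.
        const int outputW = 3840, outputH = 2160;
        const double renderScale = 0.67;

        // The engine renders the 3D scene into a smaller internal buffer...
        const int internalW = static_cast<int>(outputW * renderScale);
        const int internalH = static_cast<int>(outputH * renderScale);

        // ...so per-pixel shading cost shrinks with the square of the scale.
        const double pixelCost = renderScale * renderScale;
        std::printf("internal buffer: %dx%d (~%.0f%% of the output pixels)\n",
                    internalW, internalH, pixelCost * 100.0);

        // A spatial or temporal upscaler then fills the 3840x2160 output.
        return 0;
    }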

4. cheschire ◴[] No.45148182[source]
You mean back in the day when 30 fps at 1024x768 was the norm?

New monitors default to 60 Hz, but folks looking to game are convinced by ads that the reason they lost that last round was not the SBMM algorithm, but because the other player undoubtedly had a 240 Hz 4K monitor rendering the player coming around the corner a tick faster.

Competitive gaming and Twitch are what pushed the current priorities, and the hardware makers were only too happy to oblige.

replies(4): >>45148224 #>>45148448 #>>45149234 #>>45149510 #
5. rkomorn ◴[] No.45148224{3}[source]
I don't play any online competitive games or FPSes, but I can definitely tell that 144 FPS on a synced monitor is nicer than 60 FPS, especially when I play anything that uses mouse look.

For me, it's not quite as big of a jump as, say, when we went from SD to HD TV, but it's still a big enough leap that I don't consider it gimmicky.

Gaming in 4K, on the other hand, I don't really care for. QHD is plenty, but I do find 4K makes for slightly nicer desktop use.

Edit: I'll add that I almost always limit FPS anyway because my GPU turns into a jet engine under high load and I hate fan noise, but that's a different problem.

6. ThatPlayer ◴[] No.45148322[source]
Native resolution was never good enough, though. That's why antialiasing is a thing: to fake a higher-than-native resolution.

And now antialiasing is so good that you can start from lower resolutions and still fake even higher quality.

replies(1): >>45149480 #
7. daemin ◴[] No.45148448{3}[source]
There is something to a monitor displaying at a frame rate higher than 60 fps, especially if the game runs and can process inputs at that higher rate as well. This decreases the response time a player can attain, as there is literally less time between seeing something on screen and the game reacting to the player's input.

For a bit of background: modern games tend to do game processing and rendering in parallel, which means the frame being processed by the rendering system is the previous frame, and once rendering has been submitted to the "graphics card" it can take one or more further frames before it's actually visible on the monitor. So you end up with a lag of 3+ frames rather than the single frame you had in old DOS games and such. Having a faster monitor and being able to render frames at that faster rate will therefore give you some benefit.

This is also why frame generation can actually hurt the gaming experience: instead of waiting 3+ frames to see your input reflected on screen, you end up with something like 7+ frames, because the fake in-between frames don't actually respond to any input.
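
A quick back-of-the-envelope in C++, assuming the frame counts described above (the exact numbers vary by engine, driver, and display):

    #include <cstdio>

    // Input-to-display latency if input is sampled N frames before it shows up.
    double latencyMs(double refreshHz, double framesOfLag) {
        return framesOfLag * 1000.0 / refreshHz;
    }

    int main() {
        std::printf("60 Hz,  3 frames of lag: %.1f ms\n", latencyMs(60.0, 3.0));   // 50.0
        std::printf("144 Hz, 3 frames of lag: %.1f ms\n", latencyMs(144.0, 3.0));  // ~20.8
        // With frame generation, input is reflected several frames later still,
        // so even at a high refresh rate the felt latency grows again.
        std::printf("144 Hz, 7 frames of lag: %.1f ms\n", latencyMs(144.0, 7.0));  // ~48.6
        return 0;
    }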

8. Strom ◴[] No.45149234{3}[source]
30 fps was not the norm, at least not with competitive games. Take Counter-Strike in 2000 on a CRT: yes, 1024x768 was common, but at 100 fps. Alternatively you would drop to 800x600 to reach 120 fps.

It’s only when LCDs appeared that 60 Hz started being a thing on PCs, and 60 fps followed as a consequence, because the display can’t show more anyway.

It’s true that competitive gaming has pushed the priority of performance, but that happened already in the 90s with Quake II. There’s nothing fake about it either: at the time a lot of playing happened at LANs, not online, and the person with the better PC got better results, repeatedly reproduced by rotating people around on the available PCs.

9. boomlinde ◴[] No.45149480{3}[source]
I don't agree with framing it as "faking" a higher-than-native resolution. The native resolution is what it is; the problem lies in how the view is sampled as it is rendered to the screen. When you have content at higher frequencies than the screen can represent, what you ideally do is oversample, filter, and downsample the view, as in SSAA, or you approximate that effect, or apply it selectively where there is high-frequency content, using some cleverer methods.

It's really the same problem as synthesizing audio. 44.1 kHz is adequate for most audio purposes, but if you are generating sounds with content past the Nyquist frequency, it's going to alias and fold back in undesirable ways, causing distortion in the audible content. So you oversample, filter to remove the high-frequency content, and downsample in order to antialias (roughly equivalent to SSAA), or you build the audio from band-limited impulses or steps.
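
A minimal C++ sketch of the oversample-filter-downsample idea (2x2 SSAA with a box filter; the checkerboard "scene" is just an illustrative stand-in for high-frequency content):

    #include <cstdio>
    #include <vector>

    // Evaluate the "scene" at a point in [0,1)^2: a fine checkerboard that
    // aliases badly if sampled only once per output pixel.
    double scene(double x, double y) {
        return ((int(x * 8) + int(y * 8)) % 2) ? 1.0 : 0.0;
    }

    int main() {
        const int outW = 4, outH = 4;          // tiny output for demonstration
        const int ssW = outW * 2, ssH = outH * 2;

        // Oversample: take 2x2 samples per output pixel.
        std::vector<double> hi(ssW * ssH);
        for (int y = 0; y < ssH; ++y)
            for (int x = 0; x < ssW; ++x)
                hi[y * ssW + x] = scene((x + 0.5) / ssW, (y + 0.5) / ssH);

        // Filter + downsample: box-average each 2x2 block into one pixel,
        // turning hard 0/1 alternation into fractional coverage.
        for (int y = 0; y < outH; ++y) {
            for (int x = 0; x < outW; ++x) {
                double v = (hi[(2 * y) * ssW + 2 * x] +
                            hi[(2 * y) * ssW + 2 * x + 1] +
                            hi[(2 * y + 1) * ssW + 2 * x] +
                            hi[(2 * y + 1) * ssW + 2 * x + 1]) / 4.0;
                std::printf("%.2f ", v);
            }
            std::printf("\n");
        }
        return 0;
    }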

10. boomlinde ◴[] No.45149510{3}[source]
> You mean back in the day when 30 fps at 1024x768 was the norm?

I recall playing games at 100 fps on my 100 Hz CRT. People seriously interested in multiplayer shooters at the time turned vsync off and aimed for even higher frame rates. It was with this in mind that I was quick to upgrade to a 144 Hz display when they got cheap enough: I was taking back territory lost when the relatively awful (but much more convenient) flat screens took over.

> because the other player undoubtedly had a 240 Hz 4K monitor rendering the player coming around the corner a tick faster.

I play 99% single-player games, and in most of those, response-time differences at that scale seem inconsequential. The important difference to me is motion clarity: it's much easier to track moving objects and anticipate where they will be when you get more frames of animation along their path. This makes fast-paced games much more fun to play, especially first-person games where you're always rapidly shifting your view around.