
205 points samspenc | 5 comments
WhereIsTheTruth No.45147352
I once made the mistake of buying some sound effects from Fab. I had to download the entire Unreal Engine and launch it to create a project, just to import the assets..

It took the whole afternoon

It's no wonder UE5 games have a reputation for being poorly optimized; you need an insane machine just to run the editor..

State-of-the-art graphics pipeline, but webdev levels of bloat when it comes to the software.. I'd even argue Electron is a smoother experience than the Unreal Engine Editor

Insanity

replies(4): >>45147501, >>45147527, >>45147764, >>45148495
daemin No.45147501
Yet it is the engine that dominates the industry and is beloved by artists of all kinds.

To get UE games that run well, you either need your own engine team to optimise it or you have to drop all the fancy new features.

replies(1): >>45147642
Ekaros No.45147642
Having been around back when LCDs replaced CRTs, and having learned the importance of native resolutions, I feel like recent games have been saved too much by frame generation and all sorts of weird resolution hacks... mostly by Nvidia and AMD.

I am kinda sad we have reached the point where native resolution is not the standard for high-mid-tier/low-high-tier GPUs. Surely games should run natively at non-4K resolutions on my 700€+ GPU...

replies(3): >>45147710, >>45148182, >>45148322
1. cheschire No.45148182
You mean back in the day when 30 fps at 1024x768 was the norm?

New monitors default to 60 Hz, but folks looking to game are convinced by ads that they lost that last round not because of the SBMM algorithm, but because the other player undoubtedly had a 240 Hz 4K monitor rendering the opponent coming around the corner a tick faster.

Competitive gaming and Twitch are what pushed the current priorities, and the hardware makers were only too happy to oblige.

replies(4): >>45148224, >>45148448, >>45149234, >>45149510
2. rkomorn No.45148224
I don't play any online competitive games or FPSes, but I can definitely tell that 144 FPS on a synced monitor is nicer than 60 FPS, especially when I play anything that uses mouse look.

For me, it's not quite as big of a jump as, say, when we went from SD to HD TV, but it's still a big enough leap that I don't consider it gimmicky.

Gaming in 4K, on the other hand, I don't really care for. QHD is plenty, but I do find 4K makes for slightly nicer desktop use.

Edit: I'll add that I almost always limit FPS anyway because my GPU turns into a jet engine under high load and I hate fan noise, but that's a different problem.

3. daemin No.45148448
There is something to having a monitor that displays at a frame rate higher than 60 fps, especially if the game runs and can process inputs at that higher rate as well. It decreases the response time a player can attain, since there is literally less time between seeing something on screen and the game reacting to the player's input.

For a bit of background: modern games tend to do game processing and rendering at the same time, in parallel, which means the frame being processed by the rendering system is the previous frame. Once rendering has been submitted to the "graphics card", it can take one or more additional frames before it's actually visible on the monitor. So you end up with a lag of 3+ frames, rather than the single frame you had in old DOS games and the like. Having a faster monitor, and being able to render frames at that faster rate, therefore gives you some benefit.
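
To put rough numbers on it, here is an illustrative sketch (mine, not measurements from any particular engine; the ~3-frame depth is just the figure above):

    # Rough latency model for a pipelined game loop (illustrative only).
    # Assumes the ~3-frame depth described above: one frame of game/sim work,
    # one frame of rendering, and at least one frame queued for display.

    def input_to_photon_ms(fps: float, pipeline_frames: int = 3) -> float:
        """Approximate delay from an input being sampled to it appearing on screen."""
        return pipeline_frames * (1000.0 / fps)

    for fps in (60, 100, 144, 240):
        print(f"{fps:>3} fps: ~{input_to_photon_ms(fps):.1f} ms of pipeline latency")

    # 60 fps -> ~50 ms, 144 fps -> ~21 ms: the same 3-frame pipeline costs far
    # less wall-clock time at the higher rate, which is the benefit described.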

In addition, this is why using frame generation can actually hurt the gaming experience: instead of waiting 3+ frames to see your input reflected on screen, you end up with something like 7+ frames, because the fake in-between frames don't actually process any input.
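
Continuing the same rough sketch with that 7+ frame figure (again, illustrative numbers only, not measurements):

    # Frame generation doubles the *displayed* rate by inserting interpolated
    # frames, but those frames never sample input, and holding real frames back
    # for interpolation deepens the pipeline (the 7+ frame figure above).
    # Purely illustrative numbers.

    base_fps = 60
    native_ms = 3 * (1000.0 / base_fps)            # ~50 ms at a native 60 fps
    framegen_ms = 7 * (1000.0 / (base_fps * 2))    # ~58 ms at a displayed "120 fps"

    print(f"native 60 fps, 3-frame pipeline:     ~{native_ms:.0f} ms")
    print(f"framegen to 120 fps, 7-frame depth:  ~{framegen_ms:.0f} ms")

    # Smoother-looking motion, but the input-to-photon delay is no better (here
    # slightly worse) than plain 60 fps, which is the downside being described.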

4. Strom No.45149234
30 fps was not the norm, at least not in competitive games like Counter-Strike in 2000 on a CRT. Yes, 1024x768 was common, but at 100 fps; alternatively you would drop to 800x600 to reach 120 fps.

It's only when LCDs appeared that 60 Hz became a thing on PCs, and 60 fps followed as a consequence, because the display couldn't show more anyway.

It's true that competitive gaming has pushed the priority of performance, but that happened already in the 90s with Quake II. There's nothing fake about it either. At the time a lot of the playing happened at LANs, not online, and the person with the better PC got better results, something repeatedly reproduced by rotating people around on the available PCs.

5. boomlinde No.45149510
> You mean back in the day when 30 fps at 1024x768 was the norm?

I recall playing games at 100 FPS on my 100 Hz CRT. People seriously interested in multiplayer shooters at the time turned vsync off and aimed for even higher frame rates. It was with this in mind that I was quick to upgrade to a 144 Hz display when they got cheap enough: I was taking back territory lost when the relatively awful (but much more convenient) flat screens took over.

> because the other player undoubtedly had a 240hz 4K monitor rendering the player coming around the corner a tick faster.

I play 99% single player games and in most of those, response time differences at that scale seem inconsequential. The important difference to me is in motion clarity. It's much easier to track moving objects and anticipate where they will be when you get more frames of animation along their path. This makes fast-paced games much more fun to play, especially first person games where you're always rapidly shifting your view around.