
1345 points philosopher1234
MuffinFlavored No.34628720
Met what ended up being a great friend in real life somewhere in some random IRC room looking for a 5th member to join my friend's group

He had a special CRT monitor to get the best refresh rate to be as competitive as possible for the game

Feels like a lifetime ago

replies(5): >>34628819 #>>34629194 #>>34630350 #>>34630754 #>>34636855 #
Zurrrrr No.34629194
"He had a special CRT monitor to get the best refresh rate to be as competitive as possible for the game"

People like that always trying to compensate for a lack of skill

replies(3): >>34629275 #>>34629278 #>>34629770 #
dbttdft No.34629770
Bad monitors are just a gimped setup, and ungimping your setup doesn't mean compensating for a lack of skill. 60Hz LCDs are extremely hard on the eyes because of the large amount of motion blur inherent to displaying something at 60Hz without strobing. They also had very bad pixel response in 1999, and medium-high input lag depending on the model and on what colors were being displayed on the screen.

You also wanted a high-end CRT, for both better still-image sharpness and better refresh rate: lots of them only did 60Hz or 75Hz, and anything that maxed out at 75Hz probably had bad focus, because focus decreases as you raise the refresh rate.

Once you start fixing your system (changing the mouse polling rate from 125Hz, disabling mouse acceleration), the monitor is just one more thing to fix. All of this is needed just to be able to compete with the top, say, 50% of players (unless your play style just avoids aiming).

I remember in UT99, for years, always running into situations where my aim was slightly off when I was dead sure it should have hit. Turns out the game just used the mouse acceleration feature in Windows: the speed at which you move the mouse influences how far the crosshair (or cursor) moves. Once I disabled that I became about 5x better. The next big jumps were turning off vsync (and making sure it doesn't turn itself back on) and going back to a CRT from an LCD.
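For anyone who's never felt it: acceleration makes cursor travel depend on speed, not just distance, so the same physical flick lands in a different spot depending on how fast you make it. A toy model in Python (the threshold curve and all its numbers are made up for illustration; Windows' real pointer-ballistics transfer function is more complicated):

```python
def cursor_travel(counts_per_poll, polls, accel=False, threshold=10, boost=2.0):
    """Total cursor travel for a motion reported as `counts_per_poll`
    mouse counts in each of `polls` polling intervals.

    With accel=False, travel depends only on total counts (physical
    distance moved). With a simple threshold-style acceleration (a crude
    stand-in for a real acceleration curve), fast motion gets multiplied,
    so the same physical distance yields different cursor travel at
    different speeds.
    """
    travel = 0.0
    for _ in range(polls):
        step = counts_per_poll
        if accel and counts_per_poll > threshold:
            step *= boost  # fast motion: amplify this interval's movement
        travel += step
    return travel

# Same physical distance (400 counts total), moved slowly vs. quickly:
slow = cursor_travel(counts_per_poll=4, polls=100)            # no accel
fast = cursor_travel(counts_per_poll=40, polls=10)            # no accel
slow_a = cursor_travel(counts_per_poll=4, polls=100, accel=True)
fast_a = cursor_travel(counts_per_poll=40, polls=10, accel=True)

print(slow, fast)      # equal without acceleration: 400.0 400.0
print(slow_a, fast_a)  # diverge with acceleration: 400.0 800.0
```

Muscle memory trains a fixed mapping from hand distance to crosshair distance; acceleration replaces it with a speed-dependent one, which is why disabling it helps.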

replies(2): >>34629907 #>>34630162 #
brezelgoring No.34630162
In cycling there's a term, MAMIL, which means 'Middle Aged Man in Lycra'. It describes a grown man, clearly out of shape, with the latest and greatest equipment that probably cost him north of ten thousand dollars, singing praises about the edge his equipment gives him over everyone else - whilst sporting legs worth about 5 dollars.

I play RTS games, which don't need any fancy equipment to play and win, but I still don't believe you need a 5K USD monitor to play CS.

My point is that those things that cost money don't help you win, reader - unless you're talking about those Nike sprinting shoes that were all the rage at the Olympics last year, those things rule.

replies(3): >>34630416 #>>34630488 #>>34630802 #
dbttdft No.34630416
Yes, I'm well aware of the consumer-whore stereotype, but you don't understand: none of this is about money, it's about proper configuration.

Vsync: An unbearable amount of input lag; it feels like you're moving your mouse through molasses. It can be disabled for free by setting an option. (The amount of lag decreases as refresh rate increases: 60Hz vsync was the worst thing ever; 120Hz is somewhat acceptable.)

My friends and I actually enable vsync from time to time to train ourselves to rely less on aim.

125Hz mouse: Visibly jittery. Just set it to 1000Hz; that works on most mice, even ones from 1999.

60Hz CRT: Kills your eyes due to flicker. Get 75Hz.

75Hz CRT: Might have bad focus. Get one with good focus

60Hz LCD: Kills your eyes due to motion blur. I suggest 120Hz at a bare minimum (yup, motion blur decreases as refresh rate increases). Some models lag as badly as vsync; just get a model that doesn't lag. This has nothing to do with cost; it's a common firmware bug that some models have and some don't.

120Hz/240Hz LCD: Might be some garbage with so much overshoot that it's just as bad as the slow pixel response it tried to prevent. Get one without that issue.
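The intervals behind all of these numbers are just reciprocals of the rates. A quick back-of-envelope in Python (the "~two refresh intervals" figure for double-buffered vsync is a rough worst-case rule of thumb, not a measurement):

```python
def period_ms(hz):
    """One cycle of a rate in milliseconds (refresh or polling interval)."""
    return 1000.0 / hz

# Refresh intervals, plus a rough worst case for double-buffered vsync
# (about two intervals: wait for the next tick, then the next scanout):
for hz in (60, 75, 120, 240):
    print(f"{hz:>3}Hz: frame {period_ms(hz):5.2f}ms, "
          f"~worst-case vsync lag {2 * period_ms(hz):5.2f}ms")

# Mouse polling: a 125Hz mouse reports every 8ms, a 1000Hz mouse every 1ms.
print(period_ms(125), period_ms(1000))  # → 8.0 1.0
```

This is why 60Hz vsync feels so much worse than 120Hz: every delay in the chain scales with the refresh interval.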

replies(2): >>34631852 #>>34632630 #
fergal_reid No.34631852
Is this actually true, or BS? If true, can someone explain?

My mental model is that vsync will lock the framerate to the same as the monitor refresh.

So if the monitor is 60Hz, my game's graphical update rate runs at 60fps.

Let's say without vsync, my graphical update rate would generously go up to 120hz.

Worst case, we're talking about additional input latency due to vsync of maybe 1/60s, i.e. ~16ms?

I don't believe 16ms is significant for non-pro-level esports players, and I'm even skeptical it's a big factor at that level.

What am I missing here?

I could understand it if the game clock speed were tied to the graphical update rate, but I presume that's not the case for CS (server-side game, etc.); and even if it were, it's still not going to be that material.

I'm just skeptical - what am I missing here?

Is the mouse input code tied to graphical clock rate or something in some surprising bad way?

replies(5): >>34632427 #>>34632902 #>>34633180 #>>34633634 #>>34636256 #
r9550684 No.34633180
if you have 0 input delay, the worst-case latency is still 32ms, because your input might come in when the back frame is ready but the blit hasn't happened yet: render frame A, input comes in, blit frame A, render frame B, blit frame B.

but your input delay is not 0, so your input might come in before frame A above, and frame A doesn't reflect the input yet, which makes your worst-case input latency 48ms: input comes in, blit ..., render frame A, blit frame A, render frame B, blit frame B.

there are also bad vsync implementations that, by virtue of being enabled, introduce further delay between state and graphics. or, if fps drops under the refresh rate, things go out of sync and your vsync becomes a compounding effect.

finally, the vsync delay exists in addition to whatever other delays there are. a 30ms delay for whatever reason becomes an 80ms delay with vsync on top.
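the two-frame part of this can be sketched as a tiny timeline model in Python (assuming instant rendering and one game tick per vblank; purely illustrative):

```python
import math

T = 1000.0 / 60  # one refresh interval at 60Hz, ~16.67ms

def input_to_photon_ms(t_input):
    """Double-buffered vsync with instant rendering: the game samples
    inputs and renders at each vblank (multiples of T); that frame is
    then scanned out one vblank later."""
    next_tick = math.ceil(t_input / T) * T  # first tick that sees the input
    displayed = next_tick + T               # shown at the following vblank
    return displayed - t_input

# Input arriving just after a tick waits almost two full intervals (~33ms);
# input arriving just before a tick waits just over one interval (~17ms):
print(input_to_photon_ms(0.01), input_to_photon_ms(T - 0.01))
```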

replies(1): >>34637528 #
dbttdft No.34637528
Here's how I see it:

99% of the time, when a game after 1998 or so says "vsync", it means double-buffered vsync, so I'll explain that version.

Let's say the game renders frames instantly.

Without vsync but locked at 60FPS, an input can take up to 16ms to cause an effect on the monitor because the game loop only applies and renders pending inputs once every 16ms at this framerate. Each input will have between 0ms and 16ms of lag.

With double-buffered vsync at 60Hz, it's the same thing: the game loop applies and renders pending inputs once every 16ms. But now the frame is not shown on the monitor right away; instead, the loop waits for the monitor to be ready. And because the loop will restart right after that, this wait will always be another 16ms. Each input will have between 16ms and 32ms of lag.

Of course if your render takes more than 16ms you will have more issues, but that's not the problem here. Even with a computer that renders instantly, the lag will be too much.

And yes this will be on top of the already existing lag of the game and peripherals.

I don't understand how you get 48ms. If I have a mouse with 4ms of lag, it will just add a constant 4ms to the total, making the worst case 32ms + 4ms. I did think it was 48ms at some point, but now I think I just imagined it.
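the accounting above fits in one line. a sketch in Python (same idealized model: instant rendering, one game tick per refresh interval, and peripheral lag as a constant add-on):

```python
def worst_case_lag_ms(hz, vsync, device_lag_ms=0.0):
    """Worst-case input-to-photon lag for a game that renders instantly
    and ticks once per refresh interval.

    Without vsync, an input can sit for up to one interval before the
    next tick applies and displays it. With double-buffered vsync, the
    finished frame then waits up to one more interval for the next
    refresh. Constant peripheral lag just shifts the total."""
    period = 1000.0 / hz
    intervals = 2 if vsync else 1
    return device_lag_ms + intervals * period

print(worst_case_lag_ms(60, vsync=False))                  # ~16.7ms
print(worst_case_lag_ms(60, vsync=True))                   # ~33.3ms
print(worst_case_lag_ms(60, vsync=True, device_lag_ms=4))  # ~37.3ms: two frames + 4
```

a 4ms mouse shifts every input by the same 4ms; it never interacts with the frame pipeline to create a third 16ms cycle.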

replies(1): >>34650159 #
r9550684 No.34650159
you're right, I fucked up the 48ms logic. an input will just go into the second 16ms cycle instead of the first one, rather than somehow magically creating a third one.