He had a special CRT monitor to get the best refresh rate so he could be as competitive as possible in the game
Feels like a lifetime ago
People like that are always trying to compensate for a lack of skill
I remember in UT99, for years, running into situations where my aim was slightly off when I was dead sure it should have hit. Turns out the game just used the mouse acceleration feature in Windows: the speed at which you move the mouse influences how far the crosshair (or cursor) moves. Once I disabled that I became about 5x better. The next big jumps were turning off vsync (and making sure it didn't turn itself back on) and going back to CRT from LCD.
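A toy sketch of why that ruins consistency (the real Windows acceleration curve is more complicated; the threshold and multiplier here are made up): with raw input the crosshair moves a fixed amount per mouse count, while with acceleration the same physical flick lands differently depending on how fast you made it.

    # Toy model of mouse acceleration vs raw input (numbers are illustrative).
    SENSITIVITY = 1.0          # crosshair units per mouse count
    SPEED_THRESHOLD = 5000.0   # counts/second above which acceleration kicks in

    def raw_delta(counts):
        # Raw input: movement depends only on distance travelled.
        return counts * SENSITIVITY

    def accelerated_delta(counts, dt):
        # Accelerated input: movement also depends on how fast you moved.
        speed = abs(counts) / dt
        multiplier = 2.0 if speed > SPEED_THRESHOLD else 1.0
        return counts * SENSITIVITY * multiplier

    # The same 400-count flick, done slowly vs quickly:
    print(accelerated_delta(400, dt=0.10))  # slow flick -> 400.0 units
    print(accelerated_delta(400, dt=0.05))  # fast flick -> 800.0 units
    print(raw_delta(400))                   # raw input  -> always 400.0 units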
I play RTS games, which don't need any fancy equipment to play and win, but I still don't believe you need a 5K USD monitor to play CS.
My point is that those things that cost money don't help you win, reader, unless you're talking about those Nike sprinting shoes that were all the rage at the Olympics last year. Those things rule.
Vsync: Unbearable amount of input lag, feels like you're moving your mouse in molasses. It can be disabled for free by setting an option. (The amount of lag decreases as refresh rate increases: 60Hz vsync was the worst thing ever, 120Hz is somewhat acceptable. Rough numbers for these refresh and polling rates are sketched after this list.)
My friends and I actually enable vsync from time to time to train ourselves to rely less on aim.
125Hz mouse: Visibly jittery. Just set it to 1000Hz; it works on most mice, even ones from 1999.
60Hz CRT: Kills your eyes due to flashing. Get 75Hz.
75Hz CRT: Might have bad focus. Get one with good focus.
60Hz LCD: Kills your eyes due to motion blur. Suggest 120Hz at a bare minimum (yup, motion blur decreases as refresh rate increases). Some models lag as badly as vsync; just get a model that doesn't lag. This has nothing to do with cost, it's a common firmware bug that some models have and some don't.
120Hz/240Hz LCD: Might be some garbage model with so much overshoot that it's just as bad as the slow pixel response it was trying to prevent. Get one without that issue.
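To put rough numbers on the refresh and polling rates above (back-of-the-envelope only; real input lag also depends on buffering, the game, and the panel itself):

    # Worst-case wait for the next refresh or mouse poll is one full period.
    for name, hz in [("60Hz vsync", 60), ("120Hz vsync", 120),
                     ("125Hz mouse polling", 125), ("1000Hz mouse polling", 1000)]:
        period_ms = 1000.0 / hz
        print(f"{name}: up to {period_ms:.1f} ms per frame/poll")
    # 60Hz vsync           -> up to 16.7 ms
    # 120Hz vsync          -> up to  8.3 ms
    # 125Hz mouse polling  -> up to  8.0 ms
    # 1000Hz mouse polling -> up to  1.0 ms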
My mental model is that vsync will lock the framerate to the monitor's refresh rate.
So if the monitor is 60Hz, my game's graphical update rate runs at 60fps.
Let's say without vsync, my graphical update rate would generously go up to 120fps.
Worst case, we're talking about an additional input latency due to vsync of maybe 1/60s? 16ms?
I don't believe 16ms is significant for non-pro-level esports players, and I'm even skeptical it's a big factor at that level.
What am I missing here?
I could understand if the game clock speed was tied to the graphical update rate, but I presume that's not the case for CS, being a server-side game etc; and even if it was, it's still not going to be that material.
I'm just skeptical - what am I missing here?
Is the mouse input code tied to the graphical clock rate or something in some surprisingly bad way?
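For what it's worth, here's a rough way to put numbers on it. This assumes the common double-buffered case where a finished frame waits for the next vblank; real pipelines (driver queues, triple buffering, the display itself) can add more, so treat it as a sketch, not a measurement:

    # Back-of-the-envelope input-to-photon latency on a 60Hz monitor.
    refresh_ms = 1000.0 / 60    # ~16.7 ms per refresh
    uncapped_ms = 1000.0 / 120  # ~8.3 ms per frame if the GPU can do 120fps

    # No vsync: input sampled at the start of a frame is on screen roughly
    # one render time later (with tearing, but fast).
    no_vsync = uncapped_ms                 # ~8 ms

    # Vsync: rendering is stretched to the refresh period, and the finished
    # frame may wait up to one more refresh for the next vblank.
    vsync = refresh_ms + refresh_ms        # ~33 ms

    print(round(no_vsync, 1), round(vsync, 1), round(vsync - no_vsync, 1))
    # -> 8.3 33.3 25.0  (so the gap can be closer to 25 ms than 16 ms)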
> I could understand if the game clock speed was tied to the graphical update rate, but I presume that's not the case for CS, being a server-side game etc; and even if it was, it's still not going to be that material.
In CS: Source, the client could not send updates to the server at a faster rate than it was drawing frames. In other words, if you were playing on a 66-tick server but only rendering 50 FPS, you were actually losing some simulation fidelity. Of course, if you're not the type of person to notice your frame rate dropping to 50 in the first place, you would probably also not notice this consequence. Just an interesting technical fact.
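A toy way to state that constraint (cl_cmdrate is the Source cvar for client-to-server update rate; collapsing the netcode to a simple min() is a deliberate simplification):

    # If the client sends at most one update per rendered frame, its effective
    # update rate is capped by FPS as well as by the server tickrate / cvar.
    def effective_update_rate(fps, server_tickrate, cl_cmdrate=66):
        return min(fps, server_tickrate, cl_cmdrate)

    print(effective_update_rate(fps=50, server_tickrate=66))   # 50 -> losing fidelity
    print(effective_update_rate(fps=120, server_tickrate=66))  # 66 -> full tickrate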