He had a special CRT monitor to get the best refresh rate to be as competitive as possible for the game
Feels like a lifetime ago
I remember that for years in UT99 I kept running into situations where my aim was slightly off when I was dead sure it should have hit. Turns out the game just used the mouse acceleration feature in Windows: the speed at which you move the mouse influences how far the crosshair (or cursor) moves. Once I disabled that I became about 5x better. The next big jumps were turning off vsync (and making sure it didn't turn itself back on) and going back to a CRT from an LCD.
I play RTS games, which don't need any fancy equipment to play and win, but still don't believe you need a 5K USD monitor to play CS.
My point is that those things that cost money don't help you win, reader, unless you're talking about those Nike sprinting shoes that were all the rage at the Olympics last year; those things rule.
Vsync: Unbearable amount of input lag; it feels like you're moving your mouse through molasses. It can be disabled for free by setting an option (the amount of lag decreases as refresh rate increases; 60Hz vsync was the worst thing ever, 120Hz is somewhat acceptable).
Me and my friends actually enable vsync from time to time to train ourselves to rely on aim less.
125Hz mouse: Visibly jittery. Just set it to 1000Hz, works on most mice even from 1999.
60Hz CRT: Kills your eyes due to flicker. Get 75Hz
75Hz CRT: Might have bad focus. Get one with good focus
60Hz LCD: Kills your eyes due to motion blur. Suggest 120Hz at a bare minimum (yup, motion blur decreases as refresh rate increases). Some models lag as badly as vsync; just get a model that doesn't lag. This has nothing to do with cost, it's a common firmware bug that some models have and some don't.
120HZ/240Hz LCD: Might be some garbage with so much overshoot that it's just as bad as the slow pixel response it tried to prevent. Get one without that issue
https://www.youtube.com/watch?v=F2dPIHPNOeY
You don't need to spend lots of money to get 120+Hz screens. They can be had for $150 if you're willing to make compromises.
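The 125Hz mouse jitter mentioned above can be put in numbers with a quick back-of-envelope (my sketch, not from the thread): at 125Hz, position reports arrive every 8ms, so a high-refresh screen gets an uneven number of mouse updates per frame, which shows up as jitter.

```python
# Illustrative calculation: interval between mouse position reports.
# At 125 Hz a ~8.3 ms frame (120 Hz screen) sometimes contains two
# reports and sometimes none; at 1000 Hz the interval is only 1 ms.
def report_interval_ms(polling_hz: float) -> float:
    return 1000 / polling_hz

print(report_interval_ms(125))   # 8.0
print(report_interval_ms(1000))  # 1.0
```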
in Eastern Europe we had computer clubs, which were like internet cafes but without internet. you went there to play lan games, paid by the hour. they were usually packed, and people were usually decent. every club had its computer club rats, and I was one of them. I played q1, q2 and then cs for money. you show up to a club, strike a pose, "your club's cs fu is that of a dog, I challenge any one of you lamers to de_dust 1x1 deagle only", and then sometimes, if you met your match, you'd put cash up.

even working, well maintained hardware ranged in quality to the point of making a significant difference to a game. you always carried your own mouse at least (and sometimes a keyboard), and a config on a floppy disk. but the one thing you learned to spot was monitor makes and brands, for the reasons the guy you're responding to stated. in fact, occasionally computer club admins, in collusion with their house teams, would put you on a machine with a poor monitor. if you had a big team vs team match planned, you'd schedule an outing to a downtown club with known high refresh rate monitors and quality hardware (they were never "the club next door" and their hourly rate was usually higher) to both level the playing field and provide the peak possible playing experience.
I would guesstimate about 100% more effectiveness compared to a subpar setup. Meaning, all else equal, the guy with the better setup would win an encounter twice as often.
You also need to know about all of the configuration steps to maximize the advantage. What looks superhuman on youtube, for instance, may very well just be a good setup.
it is also from a world that's entirely unconnected to the American obsession with "buying the best skis, recommended by the skiing daily magazine, before ever getting to a slope", which is a real thing, and I agree with you there.
there's definitely a Boris somewhere in the middle of nowhere who can cyka blyat on a 60hz crt and a pentium 4 in cs:go to this day smh, and there's also a lamer with a souped-up ryzen who can't aim for shit. but all else being equal, the competitive advantage from hardware is not always and not exclusively "golden vacuum tube amp connectors".
My mental model is that vsync will lock the framerate to the same as the monitor refresh.
So if the monitor is 60hz, my game's graphical update rate runs at 60fps.
Let's say without vsync, my graphical update rate would generously go up to 120hz.
Worst case we're talking about an additional input latency due to vsync of maybe 1/60s? 16ms?
I don't believe 16ms is significant for non-pro-level eSports players, and I'm even skeptical it's a big factor at that level.
What am I missing here?
I could understand it if the game clock speed was tied to the graphical update rate, but I presume that's not the case for CS (server-side game etc); and even if it was, it's still not going to be that material.
I'm just skeptical - what am I missing here?
Is the mouse input code tied to graphical clock rate or something in some surprising bad way?
> I could understand it if the game clock speed was tied to the graphical update rate, but I presume that's not the case for CS (server-side game etc); and even if it was, it's still not going to be that material.
in cs source, the client could not send updates to the server at a faster rate than it was drawing frames. in other words, if you were playing on a 66 tickrate server but only rendering 50 FPS, you were actually losing some simulation fidelity. of course, if you're not the type of person to notice your frame rate dropping to 50 in the first place, you would probably also not notice this consequence. just an interesting technical fact.
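That cap can be sketched in a couple of lines (hypothetical helper names, not anything from the engine): the client's effective simulation rate is the smaller of the server tickrate and the rendered framerate.

```python
# Illustrative sketch of the cs source behavior described above: the
# client can't send updates to the server faster than it renders frames.
def effective_update_rate(server_tickrate: float, render_fps: float) -> float:
    """Effective client update rate, capped by both tickrate and FPS."""
    return min(server_tickrate, render_fps)

# 66-tick server but only 50 rendered FPS: simulation fidelity drops to 50/s.
print(effective_update_rate(66, 50))   # 50
# Rendering faster than the tickrate gains nothing beyond 66 updates/s.
print(effective_update_rate(66, 120))  # 66
```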
I could play tennis with a $30 racket from ALDI but it would be a lot less fun.
I bet you use a high resolution monitor for work? You could argue that is blowing money on something for even a slight advantage, since you could do the same work on a 15" 1024x768 monitor too. Oh, but the experience sucks? Yeah exactly - that's why people want to improve their gaming experience even if those people are casual/only playing as a recreational hobby.
but your input delay is not 0, so your input might come in before frame A above while frame A doesn't reflect the input yet, which makes your worst-case input latency 48ms: input comes in, blit ..., render frame A, blit frame A, render frame B, blit frame B.
there are also bad vsync implementations that, by virtue of being enabled, introduce further delay between game state and graphics. or, if fps drops under the refresh rate, things go out of sync and your vsync becomes a compounding effect.
finally, the vsync delay exists in addition to whatever other delays there are: a 30ms delay, from whatever source, becomes an 80ms delay with vsync on top.
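The stacking can be sketched numerically (my illustrative model, not a measurement; note that three frames at exactly 60 Hz is 50ms, which the thread rounds to 48ms using 16ms frames):

```python
# Illustrative model of how double-buffered vsync stacks onto other
# latency sources at 60 Hz. Numbers are for intuition, not measurements.
FRAME_MS = 1000 / 60  # one refresh, ~16.7 ms

def worst_case_latency_ms(other_delays_ms: float, vsync: bool) -> float:
    # Without vsync: an input that just misses the current loop iteration
    # waits at most one full frame before it is rendered and shown.
    latency = other_delays_ms + FRAME_MS
    if vsync:
        # Double buffering adds up to two more refreshes: one while the
        # finished frame waits for the buffer swap, and one because the
        # input may arrive just after the previous render started.
        latency += 2 * FRAME_MS
    return latency

print(round(worst_case_latency_ms(0, vsync=True)))   # 50 (the "48ms" case, with exact 60 Hz frames)
print(round(worst_case_latency_ms(30, vsync=True)))  # 80 (the 30ms-becomes-80ms case)
```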
We were one of the first to get cable in the area, so our ping was generally about 5ms.
This was when many others were still on bonded ISDN for a (nice stable) ping of about 120ms or dial up (150+ ping).
That made up for a lot of skill :D
it's generally closer to 2 frames with V-Sync [1][2]
> I don't believe 16ms is significant for non pro level eSports players and even skeptical it's a big factor at that level.
It actually is fairly significant. LTT did a series of tests with pro players in CS:GO focused on monitor refresh rate, but one test they did was 60hz/60fps vs 60hz/300fps and found that reducing the render latency drastically improved performance despite the display still being locked to 60hz.
https://youtu.be/OX31kZbAXsA?t=1911
[1] https://displaylag.com/reduce-input-lag-in-pc-games-the-defi...
[2] https://www.cse.wustl.edu/~jain/cse567-15/ftp/vsync/index.ht...
Run a monitor at 60Hz. Run whatever FPS locker you want (whatever the game has built in, RTSS, etc) at 60FPS then run with the frame locker removed but vsync on. The average person will notice the huge difference. It will be impossible to aim on the latter without getting used to it, and even then you'll still miss lots of shots that you know you should have hit.
> [1]
They measured Freesync as having less lag than no synchronization at all, which means their measurements are likely wrong.
> [2]
All the lag can be calculated on paper, so why do they need an empirical study? Their definition of triple buffering is one of the following: one is a FIFO queue, used by Microsoft, which causes even more lag than plain double buffering; the other is some obscure mode I barely remember, which is incorrect because it drops or doubles various frames.
> LED
It's actually still an LCD. Manufacturers calling them "LED monitors" is a scam: they just changed the backlight from CCFL to LED, which makes little to no visual difference. They actually made billions of dollars from that scam (as in, people straight up buy them thinking it solves viewing-angle shift) and nobody noticed, it's pretty funny.
99% of the time, when a game from after 1998 or so says "vsync", it means double-buffered vsync, so I'll explain that version.
Let's say the game renders frames instantly.
Without vsync but locked at 60FPS, an input can take up to 16ms to cause an effect on the monitor because the game loop only applies and renders pending inputs once every 16ms at this framerate. Each input will have between 0ms and 16ms of lag.
In double-buffered vsync at 60Hz, it's the same thing: the game loop applies and renders pending inputs once every 16ms. But now the frame is not shown on the monitor right away; instead, the loop waits for the monitor to be ready. And because the loop will restart right after that, this wait will always be another 16ms. Each input will have between 16ms and 32ms of lag.
Of course if your render takes more than 16ms you will have more issues, but that's not the problem here. Even with a computer that renders instantly, the lag will be too much.
And yes this will be on top of the already existing lag of the game and peripherals.
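The bounds above can be checked with a tiny model of the loop, under the same assumptions as the explanation (instant rendering, 60Hz, 16ms frames; hypothetical function names):

```python
# Minimal model of the loop described above: the game applies pending
# inputs once per frame. Lag = time from input arrival to the frame
# containing it appearing on the monitor.
FRAME = 16  # ms between loop iterations / refreshes, approximating 1/60 s

def lag_no_vsync(input_offset_ms: float) -> float:
    # Input arriving `input_offset_ms` after a loop iteration waits for
    # the next iteration; the rendered frame is then shown immediately.
    return FRAME - input_offset_ms

def lag_double_buffered_vsync(input_offset_ms: float) -> float:
    # Same wait for the next loop iteration, plus one full refresh while
    # the finished frame sits in the back buffer waiting for the swap.
    return (FRAME - input_offset_ms) + FRAME

print(lag_no_vsync(0))                # 16: upper bound without vsync
print(lag_double_buffered_vsync(0))   # 32: upper bound with vsync
print(lag_double_buffered_vsync(16))  # 16: lower bound with vsync
```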
I don't understand how you get 48ms. If I have a mouse with 4ms of lag, it will just add a constant 4ms to the total, making the worst case 32ms + 4ms. I did think it was 48ms at some point but now I think I just imagined it.
What does that mean? You mean if player A can beat player B 70% of the time, then once player A upgrades his equipment he'll beat player B 140% of the time?
Do you mean that if player A can beat player B 20% of the time, then after upgrading his equipment his win rate will jump to 60%?
Neither of those seems at all plausible, and one is gibberish, but those are the only two ways I can think of to interpret "win an encounter twice as often".