It's already been discussed in this thread. The performance impact should be negligible. I'm getting ~110-150 FPS with my Nvidia 560 Ti in CS:GO at 1920x1080.
Precise testing hasn't been done to explicitly measure the increase/decrease, but feel free to test it yourself. ;)
To be fair though, if you took a game that seriously stresses the GPU you wouldn't see a 20% difference. You'd probably see roughly the same constant frametime overhead of about 0.5 ms.
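To make that concrete, here's a quick back-of-the-envelope sketch (the 0.5 ms figure is just the assumption above, and the baseline framerates are made up for illustration). A fixed per-frame cost shows up as a big percentage at very high FPS and barely registers when you're GPU-bound:

```python
# Rough sketch: what a fixed 0.5 ms per-frame overhead (assumed, not measured)
# does to FPS at a few different baseline framerates.
OVERHEAD_MS = 0.5

for base_fps in (300, 144, 60, 30):
    base_frametime_ms = 1000.0 / base_fps
    new_fps = 1000.0 / (base_frametime_ms + OVERHEAD_MS)
    drop_pct = 100.0 * (base_fps - new_fps) / base_fps
    print(f"{base_fps:>3} fps -> {new_fps:6.1f} fps  ({drop_pct:.1f}% drop)")
# At 300 fps the same 0.5 ms costs ~13%, at 60 fps it's under 3%.
```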
Thanks for the support, appreciated! We're trying to develop a lightweight CV algorithm so that it has minimal impact on the performance of the game client. So far we haven't run into any problems with FPS drops.
Yeah, I'm very curious as to how this affected 99th-percentile framerates and frame pacing.
I suspect only a modest hit to average framerate, but I can only imagine it hurt the worst-case frametimes, which make it "feel choppy" even if the framerate is still higher than your monitor's refresh rate.
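For anyone who wants to check this themselves, here's a minimal sketch of how 1% lows / 99th-percentile frametimes are usually computed from a frametime log (the sample numbers here are made up, not from any real capture):

```python
# Minimal sketch: average FPS vs. the 99th-percentile frametime ("1% lows")
# from a list of per-frame times in milliseconds. Sample data is illustrative only.
import statistics

frametimes_ms = [3.3] * 990 + [14.0] * 10  # mostly fast frames, a few spikes

avg_fps = 1000.0 / statistics.mean(frametimes_ms)
p99_ms = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms))]
one_percent_low_fps = 1000.0 / p99_ms

print(f"average: {avg_fps:.0f} fps")
print(f"99th-percentile frametime: {p99_ms:.1f} ms (~{one_percent_low_fps:.0f} fps 1% low)")
# The average looks great while the worst frames are what you actually feel.
```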
Probably in games that use ray tracing or similar esoteric features, in which case your FPS will go from 10 to 20 and Nvidia's claim will still be technically true.
I've recently upgraded from a GTX 550 Ti (~10 years old) to an RX 570 (3 years old), and even though I get way more performance, it introduced noticeable input lag in CS:GO.
It just doesn't feel right, and slow-mo video capture confirms it.
When profiling games, using raw FPS deltas to measure performance changes is kinda silly. For example, going from 25 to 30 fps (a 5 fps increase) is a bigger performance improvement than going from 90 to 120 fps (a 30 fps increase), because it saves more milliseconds per frame. Much better to use frame times.
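Here's that same comparison spelled out in frame-time terms (numbers taken straight from the example above):

```python
# Why frame times beat raw FPS deltas: convert each FPS pair to
# milliseconds per frame and compare how much time is actually saved.
pairs = [(25, 30), (90, 120)]

for before_fps, after_fps in pairs:
    saved_ms = 1000.0 / before_fps - 1000.0 / after_fps
    print(f"{before_fps} -> {after_fps} fps: {saved_ms:.2f} ms saved per frame")
# 25 -> 30 fps saves ~6.67 ms per frame; 90 -> 120 fps saves only ~2.78 ms.
```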
There were some cases where there was a noticeable impact; there are quite a few benchmarks out there that show a clear difference in FPS and frametimes. But it always seemed like a borked implementation to me, rather than a problem with the product itself.
It doesn't look significant because the game is old and already runs pretty fast on modern hardware. If it were a more GPU-intensive game, a 20% difference could be the difference between a tolerably fluid framerate and a jerky one.
Surely if they're both doing 60 FPS then by definition the performance will be the same? I guess the time spent actually rendering vs. waiting for the next frame might be different. Not sure if there's a reason the characteristics would differ between 300 FPS and 60 FPS though, since surely the only difference is how often the render code is called?
Bear in mind that Valve is running this test on a much higher-end PC than most consumers are likely to have. People will want to be able to run these games on their $500 laptops, so the difference between 270 and 315 FPS might translate into a difference between 50 FPS and 60 FPS, which would be noticeable.
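Rough math behind that guess (assuming the percentage loss carries over one-to-one to slower hardware, which is a simplification, and the 60 FPS laptop baseline is hypothetical):

```python
# Back-of-the-envelope check of the scaling in the comment above, assuming
# the slower machine loses roughly the same fraction of performance.
high_end_fast, high_end_slow = 315, 270           # FPS figures from the test
ratio = high_end_slow / high_end_fast             # ~0.857

laptop_fast = 60                                  # hypothetical $500-laptop figure
laptop_slow = laptop_fast * ratio
print(f"{laptop_fast} fps -> ~{laptop_slow:.0f} fps")  # ~51 fps, roughly the 50 vs 60 claim
```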
Keep in mind I'm talking about 99th-percentile frames. I'm running a 4670K right now, which easily gets over 250 FPS most of the time in CS:GO. With multiple players, flashes, smokes, and weapon animations on screen, it can drop down to around 70. It's hard for me to notice this visually, but it does make the input feel weird.
Pretty minuscule. Putting things into perspective, it's still entirely possible to hit 120 FPS on my GTX 1050 Ti in Overwatch. A more direct comparison can better elucidate the tradeoffs though: https://youtu.be/voXc1nCD4IA