
This has already been discussed in this thread. The performance impact should be negligible. I get ~110-150 FPS with my Nvidia 560 Ti in CS:GO at 1920x1080.

Precise testing hasn't been done explicitly to measure the increase/decrease, but feel free to test it. ;)




To be fair though, if you took a game that seriously stresses the GPU, you wouldn't see a 20% difference. You would probably see roughly the same constant difference of about 0.5 ms per frame.

At 60 FPS (a ~16.7 ms frame), saving 0.5 ms only gets you to roughly 62 FPS.
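
A back-of-the-envelope sketch of that arithmetic (assuming a constant 0.5 ms saved per frame; the numbers are illustrative):

    # Convert FPS to a frame time, subtract a fixed per-frame saving,
    # and convert back to FPS.
    def fps_after_saving(fps: float, saved_ms: float) -> float:
        frame_ms = 1000.0 / fps              # e.g. 60 FPS -> ~16.7 ms
        return 1000.0 / (frame_ms - saved_ms)

    print(fps_after_saving(60, 0.5))   # ~61.9 FPS
    print(fps_after_saving(300, 0.5))  # ~352.9 FPS

The same 0.5 ms saving looks much bigger at high framerates, which is why headline FPS deltas measured in very fast games can be misleading.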


I knew this would come up :-)

Doesn't improve overall game performance though.


Thanks for the support, appreciated! We are trying to develop a lightweight CV algorithm, since it will affect the performance of the game client. So far we haven't had any problems with FPS drops.

Yeah, I'm very curious how this affected 99th-percentile framerates and frame pacing.

I suspect only a modest hit to the average framerate, but I can imagine it hurt the worst-case frametimes, which make it "feel choppy" even if the framerate is still higher than your monitor's refresh rate.
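
A minimal sketch (with made-up frame times) of how 1% low / 99th-percentile figures are typically derived from a frame-time log:

    import statistics

    # Hypothetical per-frame render times in milliseconds.
    frame_times_ms = [4.0, 4.2, 3.9, 5.1, 4.0, 14.5, 4.1, 4.3, 9.8, 4.0]

    avg_fps = 1000.0 / statistics.mean(frame_times_ms)

    # "1% low": average of the slowest 1% of frames
    # (just the single worst frame in this tiny sample).
    n_worst = max(1, len(frame_times_ms) // 100)
    worst = sorted(frame_times_ms)[-n_worst:]
    low_fps = 1000.0 / statistics.mean(worst)

    print(f"average: {avg_fps:.0f} FPS, 1% low: {low_fps:.0f} FPS")

An average well above the refresh rate can coexist with 1% lows far below it, which is exactly the "feels choppy" case.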


Probably in games that use ray tracing or similar esoteric features, in which case your FPS will go from 10 to 20 and Nvidia's claim will still be technically true.

The PCs are provided by the tourney organisers, and it's a marginal benefit really.

Most professional CS is played at 4:3 stretched, so the PC is pushing hundreds of FPS.


I've recently upgraded from a GTX 550 Ti (~10 years old) to an RX 570 (3 years old), and even though I get way more performance, it introduced noticeable input lag in CS:GO.

It just doesn't feel right, and slow-motion video capture confirms it.


FPS in CS:GO isn't as dependent on the GPU as it is in more modern games, so comparing with a different game might be more accurate.

When profiling games, using FPS to measure performance increases/decreases is kinda silly. For example, going from 25 to 30 fps (a 5 fps increase) saves about 6.7 ms per frame, which is a bigger performance improvement than going from 90 to 120 fps (a 30 fps increase), which saves only about 2.8 ms. Much better to use frame times.
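
A quick sketch of that conversion (illustrative numbers only):

    # Frame time in milliseconds for a given FPS.
    def frame_ms(fps: float) -> float:
        return 1000.0 / fps

    # 25 -> 30 FPS saves about 6.7 ms per frame...
    print(frame_ms(25) - frame_ms(30))    # ~6.67 ms
    # ...while 90 -> 120 FPS saves only about 2.8 ms per frame.
    print(frame_ms(90) - frame_ms(120))   # ~2.78 ms

Because FPS is the reciprocal of frame time, equal FPS gains represent smaller and smaller real improvements as the baseline framerate rises.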

It's quite easy to get 200+ FPS in CS:GO, and that's with max graphics settings or very close.

Minor performance loss, around 5% FPS on average. MS recommends turning it off if gaming is your primary use.

Not if GPU accelerated. If software rendered (or on a bad GPU), then yes, you should see a big difference.

There were some cases where there was a noticeable impact; quite a few benchmarks out there show a clear difference in FPS and frame times. But it always seemed like a borked implementation to me rather than a problem with the product itself.

It doesn't look significant because the game is old and already runs pretty fast on modern hardware. In a more GPU-intensive game, a 20% difference could be the difference between a tolerably fluid framerate and a jerky one.

Surely if they are both doing 60 FPS then by definition the performance is the same? I guess the time spent on actual rendering vs. waiting for the next frame might be different. I'm not sure why the characteristics would differ between 300 FPS and 60 FPS, though, since surely the only difference is how often the render code is called?
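
A tiny illustration of that render-vs-wait split (the render times are hypothetical):

    # At a 60 FPS cap the frame budget is ~16.7 ms. Two systems can both
    # hold 60 FPS while spending very different fractions of it rendering.
    FRAME_BUDGET_MS = 1000.0 / 60

    for render_ms in (3.0, 15.0):
        idle_ms = FRAME_BUDGET_MS - render_ms
        print(f"render {render_ms:.1f} ms, idle {idle_ms:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms budget")

The system with more idle headroom can absorb expensive frames (smokes, flashes, lots of players) without missing the cap, which is where the characteristics diverge.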

Bear in mind that Valve is running this test on a much higher-end PC than most consumers are likely to have. People will want to be able to run these games on their $500 laptops, so the difference between 270 and 315 FPS might translate into a difference between 50 FPS and 60 FPS, which would be noticeable.


Keep in mind I'm talking about 99th-percentile frames. I'm running a 4670K right now, which easily gets over 250 FPS most of the time in CS:GO. With multiple players, flashes, smokes, and weapon animations on screen, it can drop down to around 70. It's hard for me to notice this visually, but it does make the input feel weird.

We are talking about a few percent of FPS at 1080p, where you are going from 200 FPS to 220 FPS.

To call that a "significant" performance lead is silly.


Well, that was quite useless and I'm surprised no one has mentioned it.

1. What you really care about is consistently having a framerate of more than 60 fps. A higher fps count can actually be worse.

2. Relatively few Linux users have a GTX 680. It's a $500 graphics card, and there just aren't that many Linux applications that need it.

3. If this down-scales linearly, which it might not, we are looking at around a 3 FPS difference at 60 FPS (a rough sketch of that scaling is below).
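
A rough sketch of that linear down-scaling assumption (the benchmark numbers here are hypothetical):

    # If a benchmark shows a gap at a very high framerate, a linear
    # down-scaling assumption shrinks the gap proportionally at 60 FPS.
    def scaled_gap(fps_a: float, fps_b: float, target_fps: float) -> float:
        return (fps_a - fps_b) * target_fps / max(fps_a, fps_b)

    # e.g. a 15 FPS gap measured at around 300 FPS...
    print(scaled_gap(300.0, 285.0, 60.0))  # ~3 FPS at 60 FPS

Whether the gap really scales this way depends on whether the overhead is a fixed per-frame cost or grows with frame complexity, which is why the "which it might not" caveat matters.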


Pretty minuscule. To put things into perspective, it's still entirely possible to hit 120 FPS on my GTX 1050 Ti in Overwatch. A more direct comparison better elucidates the tradeoffs, though: https://youtu.be/voXc1nCD4IA
