"But how much do they really help? We found that the latency of a system with one of the NVIDIA GeForce RTX graphics cards is halved compared to a GeForce GTX 750 Ti, and nearly 6x less than a system without one."
In other words, systems with graphics cards from the newest generation perform much better than those with graphics cards from three generations ago, and better still than systems without graphics cards.
And this is surprising... to whom, exactly? This is just a marketing article that tries, poorly, to connect the product it's hawking with a currently-popular trend in gaming.
Except that’s not the case here. The RT units sit alongside the CUDA cores and make the die physically larger than the 10-series parts. Graphics is, for the most part, trivially parallelizable, and rendering performance scales roughly linearly with the number of transistors available.
This is a physical reality: the new chip is bigger, and yet it ray traces an ancient game at under 60 fps at 1080p, with significant noise and aliasing artifacts, despite having specialized hardware.
I’m not sure why I’m expected to engage in a Gish Gallop of NVidia marketing buzzwords when the facts are clear. It’s bullshit, plain and simple. Neither you nor the grandparent are giving actual facts.
The biggest benefit of the RTX cards isn't that they'll make existing video games faster or that the new effects are strikingly more beautiful than current technology. The biggest benefit is how easy it is to add these new effects, achieving a look as beautiful as the previous state-of-the-art trickery with a simple toggle.
Yes, but we've all been burned by GPU marketing in the past. Notice that they're only remarking on RTX performance. It's entirely possible that this card has a lower Geekbench score or lower FLOPS output.
Likewise, the last time I can recall a notable increase in my own enjoyment from a new GPU was my GTX 660. It ran all the games I was playing back then (TF2, Skyrim, Civ V, Bioshock Infinite...) on mid to high settings. The newer cards I've bought since then have only felt like marginal improvements. Yeah, the graphics have more detail, more effects, better lighting. So what? Do the games actually look better, aesthetically? Are the worlds more immersive? Is the gameplay any better? Nope.
This x100. It's weird how so many sites, Tom's Hardware being the most infamous recent example, are trying to push the hype on these cards with so little information.
Yeah, seems like marketing bunk. Especially considering how their shiny new GeForce 5xx line is only a 10-15% improvement over the last gen. Show me the numbers!
Super cards are allegedly going to be only 10% faster than non-Super cards, at similar pricing. Not significant enough to change the argument of the article, I'd suggest.
While they are still more performant, nvidia tries to lock customers into artificially expensive products.
RTX consumer cards suck at AI because they simply lack the memory. That is where even a MacBook Air with passive cooling and a maxed-out memory config is nearly as capable as an RTX 4090, and the larger Macs plainly beat everything nvidia puts into consumer tech, because they can load larger models.
But sure, smaller models run faster on nvidia cards, and the difference is still very noticeable. Larger models fail outright, though, because nvidia doesn't put enough memory on their consumer cards. If they don't change here, other hardware will prevail. And if the other platforms gain optimizations similar to those that already exist for CUDA, they might already be on par.
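To put rough numbers on the memory point, here's a back-of-the-envelope sketch. The parameter counts, precisions, and card capacities below are illustrative, and it only counts the weights themselves, not activations, KV cache, or framework overhead:

```python
# Rough VRAM needed just to hold a model's weights.
# params_billion: parameter count in billions
# bytes_per_param: 2 for fp16/bf16, 1 for int8, etc.
def weight_gb(params_billion: float, bytes_per_param: int) -> float:
    return params_billion * 1e9 * bytes_per_param / 1e9

# A 7B model in fp16 fits comfortably on a 24 GB RTX 4090:
print(weight_gb(7, 2))   # 14.0 GB

# A 70B model in fp16 does not, but it would fit in the unified
# memory of a high-spec Mac (e.g. 128 GB):
print(weight_gb(70, 2))  # 140.0 GB
```

The point is just that weight size scales linearly with parameter count, so a fixed 24 GB ceiling rules out whole classes of models regardless of how fast the card is.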
It depends on the game. In some games, AMD cards blow Nvidia cards (of the same "tier") away; in others it's the opposite. The GP commenter who said they "suck" is incredibly wrong. I can't understand people who fan-boy over giant corporations.
High-end graphics cards are usually much better than a gaming console, in that they offer much better frame rates and a lot more resolution; at the same time, even a single card can be more expensive than a complete gaming console.
So, yes, they are for gaming 99.9% of the time. But please don't compare them with consoles.