
"But how much do they really help? We found that the latency of a system with one of the NVIDIA GeForce RTX graphics cards is halved compared to a GeForce GTX 750 Ti, and nearly 6x less than a system without one."

In other words, systems with graphics cards from the newest generation perform much better than those with graphics cards from three generations ago, and better still than systems without graphics cards.

And this is surprising... to whom, exactly? This is just a marketing article that tries, poorly, to connect the product it's hawking with a currently-popular trend in gaming.




Except that’s not the case here. The RTX units are attached to the CUDA cores and are physically larger than the 10-series CUDA cores. Graphics is trivially parallelizable for the most part and the number of transistors available mostly linearly impacts rendering performance.
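As a toy illustration of the "trivially parallelizable" point, here's a minimal sketch; the shade() function is a made-up stand-in for a real fragment shader, not anything from an actual renderer:

    from multiprocessing import Pool

    WIDTH, HEIGHT = 320, 240

    def shade(pixel):
        # Stand-in "fragment shader": each pixel's colour is a pure
        # function of its own coordinates, so no pixel waits on another.
        x, y = pixel
        return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)

    if __name__ == "__main__":
        pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
        with Pool() as pool:
            # Doubling the worker count roughly halves the wall time,
            # which is the "transistors scale linearly" intuition above.
            framebuffer = pool.map(shade, pixels)
        print(len(framebuffer), "pixels shaded")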

This is a physical reality: the new chip is bigger, and it ray traces an ancient game at less than 60 fps at 1080p, with significant noise and aliasing artifacts, despite having specialized hardware.

I’m not sure why I’m expected to engage in a Gish gallop of Nvidia marketing buzzwords when the facts are clear. It’s bullshit, plain and simple. Neither you nor the grandparent is giving actual facts.


The biggest benefit of the RTX cards isn't that they'll make existing video games faster or that the new effects are strikingly more beautiful than current technology. The biggest benefit is how easy it is to add these new effects, achieving a look as beautiful as the previous state-of-the-art trickery with a simple toggle.

This is the newest Nvidia card:

The brand-new Nvidia RTX 4060 “can be slower than its predecessor in some cases”.

“Nvidia GeForce RTX 4060 Ti (8GB) review: Disappointing for $400”

https://www.pcworld.com/article/1925928/nvidia-geforce-rtx-4...

“Do not buy”

“The Nvidia RTX 4060 ti is a waste of sand.”

https://youtu.be/Y2b0MWGwK_U


Yes, but we've all been burned by GPU marketing in the past. Notice that they're only remarking on RTX performance. It's entirely possible that this card has a lower Geekbench score or FLOPS output.

Likewise, the last time I can recall a notable increase in my own enjoyment from a new GPU was my GTX 660. It ran all the games I was playing back then (TF2, Skyrim, Civ V, Bioshock Infinite...) on mid to high settings. The newer cards I've bought since then have only felt like marginal improvements. Yeah, the graphics have more detail, more effects, better lighting. So what? Do the games actually look better, aesthetically? Are the worlds more immersive? Is the gameplay any better? Nope.

For gaming (which is what these cards are sold and marketed for), it doesn't make a difference today.

This x100. It's weird how so many sites, Tom's Hardware being the most infamous recent example, are trying to push the hype on these cards with so little information.

Take a peek at the comments on this article: https://www.tomshardware.com/news/nvidia-rtx-gpus-worth-the-...


In both graphics and compute?

It's OK for Nvidia to release an architecture that does compute very well but barely improves graphics, and vice versa.

But I don't recall any compute generation where there was a better product from the competition.


BTW, the NVIDIA results they are comparing to are from the previous generation. The current generation has 50% more FLOPS and 33% more bandwidth.
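As a rough sanity check on what those spec bumps can buy: if a kernel is entirely bandwidth-bound, the extra FLOPS are irrelevant and the speedup is capped at 1.33x. A crude Amdahl-style model, using the figures above (the compute-bound fractions are illustrative, not measured):

    # Crude Amdahl-style bound on generational speedup, using the
    # figures from the comment above (assumed, not benchmarked).
    flops_gain = 1.50      # 50% more FLOPS
    bandwidth_gain = 1.33  # 33% more bandwidth

    def speedup(compute_fraction):
        # Fraction of runtime limited by compute; the rest by bandwidth.
        return 1.0 / (compute_fraction / flops_gain
                      + (1.0 - compute_fraction) / bandwidth_gain)

    for f in (0.0, 0.5, 1.0):
        print(f"{f:.0%} compute-bound -> {speedup(f):.2f}x")
    # Prints 1.33x for fully bandwidth-bound, 1.50x for fully compute-bound.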

Yeah, seems like marketing bunk, especially considering how their shiny new GeForce 5xx line is only a 10-15% improvement over the last gen. Show me the numbers!

Super cards are allegedly going to be only 10% faster than non-Super cards, at similar pricing #. Not significant enough to change the article's argument, I'd suggest.

# https://www.tomshardware.com/pc-components/gpus/alleged-nvid...


While they are still more performant, Nvidia tries to lock customers into artificially expensive products.

RTX consumer cards suck at AI because they simply lack the memory. That is why even a MacBook Air with passive cooling and a maxed-out memory config is nearly as capable as an RTX 4090, and the larger Macs plainly beat everything Nvidia puts into consumer tech, because they can load larger models.

But sure, smaller models run faster on Nvidia cards, and the difference is still very noticeable. Complex models fail, though, because Nvidia doesn't put enough memory on their consumer cards. If they don't change here, other hardware will prevail. If optimizations similar to those that currently exist for CUDA appear on that hardware, it might already be on par.
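To put numbers on the memory point, a back-of-the-envelope sketch; the model sizes and the 20% runtime overhead are illustrative assumptions, not measurements:

    # Back-of-the-envelope VRAM check. Parameter counts, dtypes, and the
    # 20% overhead for KV cache/activations are illustrative assumptions.
    BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

    def weights_gib(params_billion, dtype):
        return params_billion * 1e9 * BYTES_PER_PARAM[dtype] / 2**30

    def fits(params_billion, dtype, mem_gib, overhead=1.2):
        return weights_gib(params_billion, dtype) * overhead <= mem_gib

    for params, dtype in ((7, "fp16"), (70, "fp16"), (70, "int4")):
        for name, mem in (("24 GB RTX 4090", 24), ("128 GB unified-memory Mac", 128)):
            verdict = "fits" if fits(params, dtype, mem) else "does not fit"
            print(f"{params}B {dtype} on {name}: {verdict}")

A 7B model fits either way, but a quantized 70B model only fits in the Mac's unified memory, which is the "larger models" argument in a nutshell.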


It depends on the game. In some games AMD cards blow Nvidia cards (of the same "tier") away; in others it's the opposite. The GP commenter who said they "suck" is incredibly wrong. I can't understand people who fanboy over giant corporations.

Except it’s three or four generations behind in performance. The article compares it to a 1060 from Nvidia.

Modern AMD GPUs are fantastic, and I would recommend them over NVIDIA cards for most people. I'm not really sure where you're getting this from.

Most cards that come out seem to sell relatively well, so it seems they are doing what consumers want: better performance at the expense of everything else.

> gaming console graphics hardware

High-end graphics cards are usually much better than a gaming console, in that they offer much better frame rates and much higher resolutions; at the same time, even a single card can cost more than a complete gaming console.

So, yes, they are for gaming 99.9% of the time. But please don't compare them with consoles.


The main point was that it's a three-generation-old desktop card, which is obviously not as efficient as modern mobile devices.

Let's see what a 3000-series Nvidia mobile design does on a more recent process before declaring victory.


"Graphics cards doing statistics"
next

Legal | privacy