The sum of all hardware produced, sure. But the amount of hardware a single player currently possesses? Doubtful.


That's debatable, and really depends on what hardware you count.

I assumed they were talking about the 'world' economy, and if that's the case it may be both true and irrelevant.

Ex: ~20,000 player cities are uploaded into a model. They do a simple calculation based on excess energy, pollution, etc. The results of that are fed back down, and then they run the model again, adjusting for new cities and client city updates. Now even if 10MHz per city is used, you're talking about 200GHz worth of processing, which is far more than an i7, but it's shared, and mostly irrelevant in single player as you could just as easily fake the global numbers.
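
Purely as an illustration of that loop, a minimal sketch; the types (CitySummary, GlobalState), the two aggregate fields, and the numbers are all made up for the example:

    // Hypothetical server-side pass: each uploaded city is reduced to a few
    // aggregate numbers, a cheap O(n) calculation runs over all of them, and
    // the resulting globals are what gets fed back down to clients.
    #include <cstdio>
    #include <vector>

    struct CitySummary {
        double excess_energy;  // energy the city exports to the shared grid
        double pollution;      // pollution the city contributes globally
    };

    struct GlobalState {
        double energy_pool = 0.0;
        double pollution_level = 0.0;
    };

    // One iteration of the shared model: linear in the number of cities,
    // which is why the per-city cost can stay tiny.
    GlobalState global_tick(const std::vector<CitySummary>& cities) {
        GlobalState g;
        for (const CitySummary& c : cities) {
            g.energy_pool += c.excess_energy;
            g.pollution_level += c.pollution;
        }
        return g;
    }

    int main() {
        std::vector<CitySummary> cities(20000, CitySummary{1.5, 0.3});
        GlobalState g = global_tick(cities);  // rerun as cities upload updates
        std::printf("energy=%f pollution=%f\n", g.energy_pool, g.pollution_level);
    }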


Game developers also belong to that group of people who squeeze work out of every cycle. Less than 18 months ago, two of the most commonly used devices were the Xbox 360, with 512MB of unified RAM (plus 10MB of eDRAM), and the PS3, with 256MB of RAM and 256MB of VRAM,

alongside very aged processors. The hardware was almost 10 years old in both cases. Even the current gen isn't particularly powerful: the Xbox One comes in at 8GB of RAM, a 1.75GHz processor, and a GPU comparable to a 3-4 year old PC, while the PS4 has a 1.6GHz processor, 8GB of RAM, and a slightly beefier GPU.


That seems like quite a bold assumption about what I'll be running. While I do expect a game to be able to take what it needs, and certainly that could be a lot, I've definitely run multiple games, or other intensive software, at the same time. It's not always so simple.

> Do we need all the power a PC provides natively to make great games?

I think the market has declared this to be a definitive "yes". Users don't want to waste their hardware dollars so that you can spend them on inefficient solutions.

When your competition takes advantage of the hardware, and you don't, then your application (or game) falls behind in the marketplace.

There's the argument that users are willing to have lesser performance ... for lesser cost. This is true, but quite different from your code performing more poorly than your competition's on the same hardware.


However, since you seem to count the GPU, fully utilising a modern system is definitely not easy.

This is not true. If nothing else, a lot of the work done by a game engine pipeline is actually handled by the OS or device driver (GPU, audio, and network) and scheduled on different threads. So even a “single threaded” game would benefit from 2-4 cores.
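
To make that concrete, a toy sketch (not any real driver API; std::thread just stands in for the OS/driver worker): the game loop stays "single threaded" and its submissions return immediately, while another thread, likely on another core, drains the actual work.

    #include <condition_variable>
    #include <mutex>
    #include <queue>
    #include <thread>

    struct Command { int frame; };

    std::queue<Command> commands;
    std::mutex m;
    std::condition_variable cv;
    bool shutting_down = false;

    // Stand-in for the driver/OS worker that real GPU and audio stacks run.
    void driver_thread() {
        std::unique_lock<std::mutex> lock(m);
        for (;;) {
            cv.wait(lock, [] { return shutting_down || !commands.empty(); });
            if (commands.empty()) return;  // only reachable when shutting down
            Command c = commands.front();
            commands.pop();
            lock.unlock();
            // ...expensive submission/mixing work happens here, off the game
            // thread and typically on a different core...
            (void)c;
            lock.lock();
        }
    }

    int main() {
        std::thread driver(driver_thread);
        for (int frame = 0; frame < 100; ++frame) {
            // "Single threaded" game logic runs here; the submit is cheap.
            {
                std::lock_guard<std::mutex> lock(m);
                commands.push(Command{frame});
            }
            cv.notify_one();
        }
        {
            std::lock_guard<std::mutex> lock(m);
            shutting_down = true;
        }
        cv.notify_one();
        driver.join();
    }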

Again, this isn't about the Steam hardware survey; it's about reality.

Out of the latest 10 AAA titles, only one is something I would call possibly worth more than 4 cores, and that's WD2.

2 cores at 100%, 2 more at 60-70%, and 2 more at 20%. With HT it will be 2 at 100% and the 10 remaining "cores" at about 15%.

And this is by far the best "multithreaded" game that came out in the past 8-12 months.

What devs do for consoles doesn't translate to PC. PCs come with a huge variety of hardware, and unlike consoles, where devs get 6-7 of the 8 cores exclusively for their game, on PC they have to live with everything else, from AV scans to streaming.

No one is taking advantage of multicore CPUs because no one can do it right on a fragmented platform where you have no control over the run state of the app or what it's co-hosted with, and zero knowledge of its hardware and configuration.


Gonna disagree there. The game lets you run on 5 different hardware configurations at once. Is this really a problem?

I don't think that extreme is true either. I work in games and we have access to workstation hardware even though most of our players will inevitably end up playing on substantially lower end hardware. We set performance targets for subsystems, have thorough profiling available, and regularly _test_ on consumer hardware to gain the above metrics and work with them. That can be (and often is) done.

Genuine question - other than increasing the number of players, what is the horizon of value for new processors to begin with? Is there anything more impactful than squeezing a bit more performance/watt?

Games have necessarily become more and more parallelised (whether through multithreading or multiprocessing) over the last decade, as even consoles are multi-core these days.

Both XB1 and PS4 have 8 cores (two quad-core Jaguar modules). Both also reserve one of the cores exclusively for the system, so a single-threaded game would only leverage one of the seven available cores, about 14% of the available CPU compute power.

The Switch is somewhat similar, though to a lesser extent: it effectively uses a quad-core ARM chip, with one of the cores reserved for the system, leaving 3 to game developers.
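
A PC title can't rely on a fixed reservation like that; it has to guess at runtime. Purely as a sketch (the "leave one core for the OS" policy here is a convention I'm assuming, not something any engine mandates), the PC-side equivalent looks roughly like:

    #include <algorithm>
    #include <thread>
    #include <vector>

    int main() {
        // hardware_concurrency() replaces the console's fixed core count;
        // it may return 0 if unknown, so fall back to a safe guess.
        unsigned hw = std::thread::hardware_concurrency();
        unsigned workers = std::max(1u, (hw == 0 ? 4u : hw) - 1);  // reserve one for the OS

        std::vector<std::thread> pool;
        for (unsigned i = 0; i < workers; ++i)
            pool.emplace_back([] { /* per-frame jobs would run here */ });
        for (auto& t : pool) t.join();
    }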


Well, actually making games requires that much power. I work at a games studio, run an 8-core (16-thread) Xeon + 64GB of RAM + 1TB SSD, and everything I do daily is just sloooow. Running the game in debug mode uses up all my RAM, compilation times are between 20-40 minutes for the whole project even when using distributed build systems, and that 1TB SSD doesn't help much when it's nearly full all the time. If we could have our workstations upgraded to something like 16-core Xeons + 256GB of RAM, it would be a godsend.

Indeed, and also what about the effect of multiple cores? You could have 7 cores working on game logic and one doing GC.
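
C++ engines don't have a GC proper, so purely as a hypothetical analogue of that split (deferred deallocation standing in for collection, all names made up): seven threads run game logic and retire garbage to a queue that one dedicated thread drains.

    #include <atomic>
    #include <memory>
    #include <mutex>
    #include <thread>
    #include <vector>

    std::vector<std::unique_ptr<int[]>> retired;  // "garbage" awaiting collection
    std::mutex retired_mutex;
    std::atomic<bool> running{true};

    // The one thread dedicated to cleanup, analogous to a GC core.
    void collector() {
        while (running.load()) {
            std::vector<std::unique_ptr<int[]>> batch;
            {
                std::lock_guard<std::mutex> lock(retired_mutex);
                batch.swap(retired);  // grab everything retired so far
            }
            std::this_thread::yield();
        }  // batch destructs each iteration, freeing memory off the logic threads
    }

    void game_logic() {
        for (int i = 0; i < 1000; ++i) {
            auto scratch = std::make_unique<int[]>(4096);  // per-frame allocation
            scratch[0] = i;  // ...simulate using it...
            std::lock_guard<std::mutex> lock(retired_mutex);
            retired.push_back(std::move(scratch));  // retire rather than free inline
        }
    }

    int main() {
        std::thread gc(collector);
        std::vector<std::thread> workers;
        for (int i = 0; i < 7; ++i)  // the "7 cores working on game logic"
            workers.emplace_back(game_logic);
        for (auto& t : workers) t.join();
        running = false;
        gc.join();
    }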

Really? You're suggesting they use a software renderer instead of the GPU? Even if 70% of the CPU was wasted on a modern machine, that's like a quarter core per user that they'd need to rent from Amazon. I don't think so.

As far as the multiplayer, I don't think anyone has a problem needing servers for multiplayer.

All I'm saying is that the suggestion that computations need to be done online to save local CPU, even on a singleplayer game, is obviously ridiculous.


Compared to playing a game, which would definitely use both GPU and CPU, instead of mostly just CPU?

They used ~1000 CPUs and ~200 GPUs 5 months ago.

I think you underestimate just how much stuff is moving towards being GPU-run instead of CPU-run.

But, does that matter?

Do we need all the power a PC provides natively to make great games?
