The most hilarious enhancement chip is the SA-1, used in Super Mario RPG and other games.

https://en.wikipedia.org/wiki/List_of_Super_NES_enhancement_...

It's essentially the exact same chip as the SNES's main CPU, except about three times faster.

Imagine buying a game for your 3 GHz PC that ships on a flash drive. And that flash drive also contains a 9 Hz (edit: whoops, 9 GHz, haha) Intel Core i9 CPU.

This was almost a thing with the Xeon Phi... 57 CPU cores on a dual-slot card...

A 9 Hz microprocessor would indeed be something. Start the game up, come back 10 days later, and one frame would almost be rendered...
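For scale, a rough aside in Python (assuming one operation per cycle, which is generous): ten days at 9 Hz only buys you a few million operations, nowhere near what a modern frame takes.

    # Cycles available from a 9 Hz CPU running for 10 days
    cycles = 9 * 60 * 60 * 24 * 10
    print(f"{cycles:,} cycles")  # 7,776,000 -- about 7.8 million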

Never had that centered Nintendo logo been a bigger disappointment.

It might even be longer than that... I wonder how long it would take to render one frame of a modern game by hand if everyone on the planet joined in (let's make it easy and render at 160×120).

I don't even have a napkin to write on, so this will be the most back-of-the-hand of calculations.

One floating-point calculation would probably take a human on the order of 2 minutes. That's 1/120 FLOPS. Let's round it to 1/100; it's not like we're being accurate here.

Let's say a GPU does 1 TFLOPS. That means the GPU is 100 trillion times faster than a human. With 10 billion humans on Earth (again, let's just round these numbers), that makes the GPU 10,000 times faster than the combined power of all humans.

I don't think computational requirements scale linearly with resolution, but let's just assume they do. Your regular 1080p screen has roughly 100 times the pixels of a 160×120 screen. 10,000/100 = 100, which means the framerate of your manually computed game will be 1/100 of what a regular GPU manages at 1080p. If the GPU does 60 fps, that's about 0.6 fps, or a second or two per frame.
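Here's the same arithmetic as a quick Python sketch. Everything is the round numbers from above; the 60 fps GPU baseline is my own added assumption.

    # Back-of-the-envelope: all of humanity rendering a frame by hand
    human_flops = 1 / 100                 # one human: ~1 FLOP per 100 seconds
    gpu_flops = 1e12                      # one GPU: ~1 TFLOPS
    population = 1e10                     # ~10 billion people

    humanity_flops = population * human_flops      # 1e8 FLOPS combined
    gpu_vs_humanity = gpu_flops / humanity_flops   # GPU is ~10,000x faster

    pixel_ratio = (1920 * 1080) / (160 * 120)      # ~108; 1080p vs 160x120

    gpu_fps_1080p = 60                    # assumed GPU framerate at 1080p
    humanity_fps = gpu_fps_1080p * pixel_ratio / gpu_vs_humanity

    print(f"GPU vs humanity: {gpu_vs_humanity:,.0f}x")      # 10,000x
    print(f"Humanity at 160x120: {humanity_fps:.2f} fps")   # ~0.65 fps
    print(f"Seconds per frame: {1 / humanity_fps:.1f}")     # ~1.5 s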

I welcome someone else to make the same calculation. It'll be interesting to learn if I'm close to a reasonable answer.


They could do that because the SNES's main CPU was ridiculously pokey even for its time. Even calling it 16-bit was a bit of a white lie.
