3dfx Voodoo 5 6000 recreated via reverse engineering (2021) (hexus.net)
181 points by userbinator | 2022-09-24 | 97 comments




Original discussion (Russian): https://www.modlabs.net/forum/topic/60674/

Wow, glad to see Modlabs still going! Modding my own and my friends' PCs back in 2004, after reading through it and Overclockers.ru, is among my fondest childhood memories.

What would you use this for?

Playing Deus Ex, of course.

Retro gaming is the common use case. GLIDE is something people like to run natively and 3Dfx ruled the roost long enough for a ton of games to be specially tuned to the API.

It's viable enough that clone cards have been built based on the reverse engineering efforts:

https://www.youtube.com/watch?v=UgbVmYn1xZ8


Not sure if you're aware, but the card in that video is the same one in the article.

I think they sold them for about $1500; the original post was made more than a year ago.

> GLIDE is something people like to run natively

I wonder why? Because of delays during emulation?


For some retro gamers, any emulation, no matter how perfect, is a no-go because they don't consider it an authentic recreation of the experience. For some, even this reverse engineered clone would be unacceptable.

> For some, even this reverse engineered clone would be unacceptable.

I could see that if it were a software emulation or e.g. an FPGA reimplementation, but this one still uses the original GPUs, so it would be comparable to having a modern GPU from Asus or Gigabyte instead of the reference designs from AMD or Nvidia.


Doesn't matter. For some retro enthusiasts only original hardware produced during the era is acceptable.

By what measure? Are they actually performing electrical tests to determine minute clocking anomalies?

It mostly has nothing to do with any technical issues. It is about recapturing a feeling for days gone by, and for them, part of recapturing that feeling is using only period-manufactured hardware.

I understand.

I have a lot of nostalgia for the Playstation 2. That system occupied a lot of late nights during my college and young professional years. My memories of those games are wrapped up in those tumultuous times.

As fond as those memories may be, I just can't play emulations/remasters of those games nowadays. Why not? I'm convinced it's because the loud fan noise of the PS2 is missing without the original system. That fan blast was on before, during, and after every gaming session. It filled the silence when the game paused. It picked up when the game peaked. It was there when I had loud joyful matches of Burnout 3 with friends. And it was there during long lonely nights following a breakup. The PS2 fan's noise pierced my soul.

Nostalgia is a very fickle thing. It's never just the favorite game, or food, or jacket. It's where you were at the time, the smells in the air, the twitches around you, the people you were with, the way you felt.

What is an authentic recreation of an old game?

For one, it might just be running it on the original hardware. It may be hearing some specific electric hum or click before the system starts. It may be the way the buttons on those old controllers felt. And that may be enough.

For others, it may be playing the game with a friend lost to time. Or playing the game in one's childhood bedroom, long since demolished. Or playing the game while eating a special pizza, from a place that's long shuttered.

So if recreating the experience is as simple as getting the old hardware assembled, that sounds relatively reasonable and achievable.

Still, ultimately, I prefer not chasing ghosts. Why cling to the experiences of the past when we can make new experiences today? It's a hard lesson learned. Nostalgia bites me hard.


The GLIDE API was quite different from OpenGL or Direct3D. There are a lot of examples of games from that era that played well on 3DFx hardware but absolutely sucked on OGL or D3D. The first Unreal engine is one example, along with games built on it like Deus Ex.

Obviously hardware has improved so much that you can play a game like Deus Ex on any hardware today and it performs perfectly well. But there are also games that run into problems with high clock rates for unrelated reasons. So for an accurate emulation, you would want a clocked-down CPU and a 3DFx chip running on an FPGA or something.


Most old games are patched, and you can find good GLIDE wrappers everywhere.

Also, very famously, Diablo 2 ran amazingly using Glide. I had a Voodoo 5 5500 and didn't lag even when the whole screen was filled with mobs, summons, fire, etc., while other people died to lag.

Unreal and Unreal Tournament ran flawlessly and fast on a GeForce 2 MX DDR using DirectX.

Bitchin fast 3d accelerated gaming, of course.

Interpolated pixels kind of look fine sometimes. NFS2:SE looked most fine. They don't make them like that anymore.


AIUI, Glide cards didn't really do 3d acceleration in anything like a modern sense. They accelerated the rasterization part of the rendering pipeline; they were basically very fast (for the time) 2D triangle renderers.

(Thus, reimplementing this under either OpenGL or Vulkan ought to be reasonably trivial using modern hardware.)


If we're arguing semantics, I feel like these early cards are more fitting to the "3D accelerator" name than the modern ones. They were specialized for accelerating 3D graphics (which, in the end, means pushing triangles into a 2D framebuffer). Modern GPUs are more like general-purpose vector processors that happen to also be good at calculations related to 3D graphics and can (sometimes) render triangles to a framebuffer.

I'd argue that 3D-accelerated graphics from that era is still true 3D acceleration. Of course, back then the pipelines were mostly fixed (nothing like programmable shaders), but you're passing in 3D coordinates, generating matrices to determine the MVP, passing in textures, and getting most of that "heavy lifting" of getting the pixels on the screen from the API and the card.

> they were basically very fast (for the time) 2D triangle renderers.

That's still true of GPUs today :) In fact, with the exception of H.264/H.265 encode/decode and "AI", that's all that graphics cards have ever done!


The point is that the 3D-geometry portion of the pipeline is still done on the CPU with these early cards. The APIs may have been slightly broader, but the hardware-accelerated part was limited to rasterization, i.e. painting 2D triangles.

Indeed, and I agree. What I'm saying is that even today, the rasterization of triangles is still 2D :) (and necessarily always will be, at least as long as we use 2D displays...)

The triangles are in 3D space. Part of what the cards do is project them into 2D space. See pp. 18-20 of the Glide reference: https://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=18...

It explicitly states that vertices are given as (x, y). Voodoo cards did not do projection in hardware.

The GrVertex structure has z and w components too. But it's a fair point that x and y are specified in screen space, not world space. I had forgotten that. But the hardware does depth buffering based on the z component, so it's still 3D.
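
For reference, the vertex layout looked roughly like this; this is a from-memory sketch, so treat the exact field names and ordering as approximate (the linked reference is authoritative):

    /* Rough shape of Glide 2.x's GrVertex: x/y arrive already projected
       to screen space, while depth reaches the card as ooz (~65535/z)
       and oow (1/w), the latter also driving perspective-correct
       texturing. */
    typedef struct {
        float sow, tow, oow;    /* s/w, t/w, 1/w for one texture unit */
    } GrTmuVertex;

    typedef struct {
        float x, y, z;          /* screen-space X/Y; Z largely ignored */
        float r, g, b;          /* Gouraud color */
        float ooz;              /* 65535/Z, used for Z-buffering */
        float a;                /* alpha */
        float oow;              /* 1/W, for W-buffering and texturing */
        GrTmuVertex tmuvtx[2];  /* one per TMU */
    } GrVertex;

    /* The application transformed and projected on the CPU, then
       submitted one triangle at a time: grDrawTriangle(&a, &b, &c); */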

Yeah, that's a nice "hack" the hardware offered. You can still shove in the depth after having done the screen projection so you don't need to worry about the order in which you draw the triangles. So maybe 2.5D? :-)

> rasterization, i.e. painting 2D triangles

As I explained in a sister comment, there's a lot more going on with rasterization, even back then, than "painting 2D triangles" might imply. Yes, only the XY coordinates determine the screen location of a rendered pixel, but the Z coordinate even at that stage has a lot to do with its color (e.g. for perspective-correct texture lookup) and whether it is painted at all (Z buffering).
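
To make that concrete, here's a minimal sketch of the perspective-correction idea (a toy scanline loop, not Voodoo-specific code): s/w, t/w and 1/w are linear in screen space, so the rasterizer steps them linearly and recovers the true texture coordinate with one divide per pixel.

    #include <stdio.h>

    /* Step s/w, t/w, 1/w linearly across a span; divide per pixel to
       get perspective-correct texture coordinates. */
    static void shade_span(float sow, float tow, float oow,
                           float dsow, float dtow, float doow, int n)
    {
        for (int i = 0; i < n; i++) {
            float s = sow / oow;                /* true s */
            float t = tow / oow;                /* true t */
            printf("px %d: s=%.3f t=%.3f\n", i, s, t);
            /* ... a real rasterizer would sample the texture at (s,t)
               and do the Z/W depth test here ... */
            sow += dsow; tow += dtow; oow += doow;
        }
    }

    int main(void)
    {
        /* 4-pixel span from (s,w)=(0,1) to (s,w)=(1,4). Affine stepping
           of s would give 0, .333, .667, 1; the divide by 1/w instead
           gives 0, .111, .333, 1 -- the perspective-correct result. */
        shade_span(0.0f, 0.0f, 1.0f,
                   0.25f / 3, 0.25f / 3, -0.75f / 3, 4);
        return 0;
    }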


The Rendition GPU was a RISC processor, but alas it was way too early and underperforming.

GLIDE was low-level; it would map to Vulkan perfectly.

I’m sure it can be emulated efficiently enough nowadays, but I don’t think the mapping is very direct: Glide was mostly imperative and synchronous, Vulkan is declarative and explicitly asynchronous. Both are low-level, but for hardware that works very differently.
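
To illustrate the mismatch, here's the general shape a wrapper ends up taking; purely a sketch of the idea, not how dgVoodoo or any real wrapper is actually written:

    #include <stddef.h>

    /* Immediate-mode grDrawTriangle calls get appended to a CPU-side
       batch; the batch is only later uploaded and recorded into a
       command buffer for asynchronous execution. */
    typedef struct { float x, y, ooz, oow, r, g, b, a; } Vtx;

    static Vtx    batch[3 * 4096];
    static size_t batch_len;

    static void flush_batch(void)
    {
        /* A real wrapper would upload `batch` to a GPU vertex buffer
           and record a draw into the current command buffer here. It
           must also flush on every Glide state change (texture binds,
           blend modes, ...), which is where the impedance mismatch and
           all the batching heuristics live. */
        batch_len = 0;
    }

    static void my_grDrawTriangle(const Vtx *a, const Vtx *b, const Vtx *c)
    {
        if (batch_len + 3 > sizeof batch / sizeof batch[0])
            flush_batch();
        batch[batch_len++] = *a;
        batch[batch_len++] = *b;
        batch[batch_len++] = *c;
    }

    int main(void)
    {
        Vtx a = {0}, b = {0}, c = {0};
        my_grDrawTriangle(&a, &b, &c);
        flush_batch();  /* e.g. at grBufferSwap time */
        return 0;
    }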

Even 15 years ago, Glide wrappers (Glide to DirectX or OpenGL) did a great job.

In terms of API design and usage Glide is far closer to classic 1.x GL.

I guess Glide could be defined as a low-level API since it was defined by a single vendor's architecture instead of some more abstract machine like GL and other APIs.


Being 2D rasterizers was arguably true for some cheap early graphics chips like that of the PS1. For Voodoos, there was no point in the pipeline after which rendering is “only 2D” in any relevant sense. At the time, the Z coordinate was passed to the hardware and used e.g. for important things like perspective correct texture lookup, environment maps (fake reflections), and Z buffering — that is right until the very last moment of committing the pixel to memory.

They also do texture mapping, shading, and z buffering. Those things are still the core of 3D rendering.

I prefer the Voodoo 5 9000: https://youtu.be/_3iHV0NvLPI

> The graphics chip was codenamed Napalm

It’s too bad this name has already been used. It would be great for current GPUs.


It's just an internal code name; they didn't trademark it, so someone else could use it again. See the recent case where "Mendocino" can refer to the one good Intel Celeron, or now to some low-end AMD Zen 2 chips for cheap laptops.

It's a cool name if it is a good product. If it is less-than-stellar, though, it's not that far from "face-palm". LOL

Back then if something was hot, it was new, cool, and coveted. Today thermal management is a huge problem so they tend not to equate products with heat.

Afaik, thermite is still available for marketing departments to adopt.


Not sure the Vietnamese customers would agree.

Is there a good book on the history of development of graphics hardware, including the PC-era cards also?


There's no single good book that I know of. However, it's easy enough to piece together the information from Wikipedia [1] and a few articles [2]. In addition, there are a few YouTube channels that offer great commentary on specific cards/generations and comparisons: PixelPipes [3], PCRetroTech [4], LGR [5], and the various other obscure channels recommended once you go down that rabbit hole.

1 - https://en.wikipedia.org/wiki/Graphics_processing_unit#Histo...

2 - https://www.techspot.com/article/650-history-of-the-gpu/#par...

3 - https://www.youtube.com/c/PixelPipes

4 - https://www.youtube.com/c/PCRetroTech

5 - https://www.youtube.com/c/Lazygamereviews


Not quite what you want, but Michael Abrash’s Graphics Programming Black Book is available for free online, and covers the software rendering side of computer graphics in the early-mid 90s, ending at Quake, just before 3d accelerators hit the scene.

There are a few anecdotes from hardware designers of 2D graphics cards mixed in, but the bulk of the book is about writing optimised x86 assembly (don't follow any of its advice these days) and writing fast polygon rasterisers.


I'm not aware of any book, either.

If you want an architectural overview, an alternative approach is to read the websites' coverage chronologically, starting from the late 90s; it may be relatively superficial, but there's a decent amount of information:

- https://vintage3d.org/index.php: dedicated to vintage GPUs; very interesting, but I would have liked it to be more in-depth

- https://www.anandtech.com/tag/gpus/106: start of GPU coverage for Anandtech; probably middle ground between Vintage3d and Tom's Hardware

- https://www.tomshardware.com/archive/1998/2: start of (more dedicated) GPU coverage for Tom's Hardware, but it's more awkward to navigate, and I think more superficial, than Anandtech


Psst, thg.ru has copies of old articles with non-dead images and non-broken design.

http://www.thg.ru/graphic/19980227/onepage.html

http://www.thg.ru/graphic/1997.html

http://www.thg.ru/graphic/1998.html

etc.

It's pretty sad that the modern generation has no idea that back in the day tech was cool, and "enthusiasts" were those who discussed chip architecture and texture sampling, not video personalities discussing the color of old plastic. Well, the consumerist approach has always beaten specialist dedication.

https://www.anandtech.com/print/854/

http://web.archive.org/web/20071213131847/http://www.xbitlab...

http://web.archive.org/web/20040804115729/http://www.xbitlab...

http://web.archive.org/web/20080104000403/http://www.ixbt.co...

http://web.archive.org/web/20071229163953/http://www.ixbt.co...


(campfire crackling sounds): It all started back with E&S...

I had some fun digging through the PowerVR Series 1 source code [0] that was released earlier in the year. It includes a simulator that you can use to piece together how the hardware itself worked; some of it certainly looks like a rough translation from the RTL...

[0] - https://github.com/powervr-graphics/PowerVR-Series1


You can always use dgVoodoo [0] to translate Glide to DirectX. With great performance, too.

[0] https://github.com/dege-diosg/dgVoodoo2


This brings back some memories of making 3D games in the 90s. 3DFX cards were the best. I think our graphics programmer bought some 3dfx stock. It was great at the time, but who wanted to support GLIDE? Our engineers did. But DirectX was the obvious thing to do.

At the time we had deals with most of the 3D card manufacturers. I remember we had a press event with NVIDIA around 1999, with the GeForce 2, and I think we had one card beforehand, though I'm not sure. Every computer at this event had one, with lots of journalists there to play. At the event, it was the first time our frame rate wasn't smooth. Up to that point, all we did was make the rendering time even, in milliseconds, so we didn't think that much about unevenness in physics or networking. Rendering was the bottleneck. But once the rendering started to get really fast, we realized we had a new problem.

Happy ending: I bought stock in that NVIDIA company.


I knew a guy who bought Voodoo stock. He had a shirt and everything. Didn't end well for him.

Damn, he lost his shirt.

That vintage original 3dfx shirt is worth a pretty penny today. Could actually make up the losses from their stock.

So how much NVIDIA stock did you buy? How much profit were you able to make from your foresight?

Not much. It's just a nice memory of an exciting time.

> It was great at the time, but who wanted to support GLIDE? Our engineers did. But DirectX was the obvious thing to do.

And OpenGL; John Carmack was extremely vocal about it. That was before the internet and Twitter, so it was interviews in computer magazines like PC Gamer.


"The graphics chip was codenamed Napalm, packed in about 14 million transistors, and was built on the 250nm process." - we have progressed by 35x in 20 years.

I think more if you count teraflops.

Should be more like 35^2 considering the same area but scaled down features?

Moore's Law would correctly give a factor of 35x linear shrink, so 35^2 times as many transistors on a square or rectangular die of the same size. But speed doesn't scale linearly with transistor count, both because there are diminishing returns per transistor, and because clock speed plateaus.

(Remember Moore's Law is about density, not speed. Density sometimes begets speed but not always; one big counterexample is memory access latency time.)
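
Rough numbers, assuming a ~7 nm node as the 20-years-later comparison point:

    250 nm / 7 nm  ≈  36x    linear feature shrink (the "35x" above)
    36^2           ≈  1300x  transistor density at equal die area
    14 M × 1300    ≈  18 B   transistors, which is the right order of
                             magnitude for a recent flagship GPU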


Good old times, when I owned a Voodoo 3 and used to play Team Fortress.

Ah, Team Fortress... the age-old enemy of Starsiege: Tribes. Tribes looked better under Glide too.

Aw yeah, I saved up for a VooDoo 3 just for playing Tribes on my 56k modem, fall 1999 - running windows 98 on a Packard-Bell that we got from Circuit City…

Truly the golden age of gaming. It leapfrogged everything at the arcade and consoles of the time.

I played way more TFC than Tribes, but man, Tribes had a certain quality to it that I have never found again in a game.

Speaking of nostalgia… the _click_ of the relay when your 3D card turned on.

(Although I think that was only with the original Voodoo and Voodoo2)


> He created his own custom PCB and even improved the design by fitting a 4-pin Molex power connector, so an external power brick wouldn't be needed.

A graphics card with its own power supply! With this week's news of the 4090 using 450 watts at peak, this Voodoo card puts things in perspective.


I get the snark, but that's peak power. Chances are that Voodoo uses more power at idle than the 4090. Modern silicon is very efficient at throttling down at idle, while old GPUs would use max power all the time.

My Voodoo 6000 had a Molex connector, I guess to facilitate testing in ordinary PCs.

(I didn’t realise what I had until years after I threw it out! I only picked it up because it was so impressive-looking: comically enormous board, four fans.)


Uh, as far as I recall, the 6000 never reached retail shelves.

There were at least 1000 test units, so it wasn’t impossible to get one’s hands on a board.

The Voodoo5 5500 had a Molex power connector on the card. The Voodoo5 6000 had an external PSU.

Sorry - was just speaking about the possibility of buying a 6k. It’s been so long I wouldn’t trust my own memory of which board had which configuration so didn’t intend to comment on that discussion.

You had a Voodoo5 5500.

That's what I assumed, and it was only when I was idly googling for 3dfx stuff a few years ago that I realised it probably wasn't. The 5500s all look to have 2 chips, and are normal sized. Mine had four chips, and it was... large. Like, very large. Also had a plastic handle.

Looked a bit like this one: https://robertkrau.se/blog/3dfx-voodoo-5-6000-stripe-artifac...

Here's one with a Molex connector: http://www.x86-secret.com/articles/divers/v5-6000/v56kgb-6.h... (I can't remember whether it was wires or soldered in on mine, this was 20 years ago and I didn't spend much time playing with it)

I got it from the storeroom at the video games company I was working for at the time. They were having a clear out, and we could take whatever we wanted. This was one of the things I wanted.

(Sadly they didn't let me have the Jaguar devkit, doubly annoying when I later discovered that Atari had put all their Jaguar stuff into the public domain. Mind you, I'd probably only have thrown that out as well, so same result in the end.)


Interesting stuff. I worked at 3dfx in the last few months of their existence. All the 6000s I saw had the external PSU. I was just an IT nerd, so I wasn't involved with the cards.

> With this week's news of the 4090 using 450 watts at peak,

Flagship GPUs like the 3090 Ti have had 450W TDPs for a while now and they work fine. It's not a new thing with the 40xx series cards.


Voodoo Linux drivers were open source, IIRC. No firmware blob was required. The day nvidia bought 3dfx was a very sad one for me.

There was actually not much need for drivers or firmware. 3Dfx products had a very simple and straightforward way to write commands directly to registers [1][2][3].

1.: https://web.archive.org/web/20180220234020/https://www.openg...

2.: https://web.archive.org/web/20180629191415/http://www.opengl...

3.: https://web.archive.org/web/20180628125330/http://www.opengl...
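
To give a flavor of that programming model, here's a toy sketch; every register name and offset below is invented for illustration, the real ones are in the linked docs:

    #include <stdint.h>

    /* The card's registers appear as a memory-mapped window: write the
       triangle's parameters, then one final write kicks off
       rasterization. */
    static volatile float    *vtx_regs;  /* would come from mmap()ing the BAR */
    static volatile uint32_t *cmd_reg;

    static void draw_triangle(const float xy[6])
    {
        for (int i = 0; i < 6; i++)
            vtx_regs[i] = xy[i];   /* Ax, Ay, Bx, By, Cx, Cy */
        *cmd_reg = 1;              /* "go": hardware rasterizes the triangle */
    }

    int main(void)
    {
        /* Point the "registers" at plain memory so the sketch runs. */
        static float    fake_regs[6];
        static uint32_t fake_cmd;
        vtx_regs = fake_regs;
        cmd_reg  = &fake_cmd;

        const float tri[6] = { 10, 10, 100, 10, 55, 90 };
        draw_triangle(tri);
        return 0;
    }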


It's a fixed-function accelerator. They were all like that AFAIK.

It was sad for everyone in the gaming industry.

As much as I loved 3Dfx (with a capital "D"!), by the time it was bought by nvidia it was already in decline. It did some interesting things, but clearly, they couldn't keep up with nvidia.

I actually had some hope when nvidia bought it, but unfortunately, they killed the brand, only keeping "SLI", for something that isn't even SLI (scanline interleaving).

And yeah, open source drivers, but don't you find it annoying when you only get open source drivers for subpar hardware? It is unfortunately a recurring problem.


I love seeing folks pour monumental effort into their passion projects.

This card looks pretty good, but it's no match for the Bitchin'fast!3D 2000. 425 BungholioMarks don't lie!

https://imgur.io/d3SSTNT


It's such a shame that poor management decisions killed the potential for LMNOPRAM.

GDDR6 is only now, decades later, catching up.


Ha, I must be the latest person to fall for this gag. 20 years on and I'm googling LMNOPRAM!

I love that having 256MB of GPU memory is just as absurd today as it was back then but for the exact opposite reasons.


That was my dream card for a long time. A Voodoo 2 (3D only, no 2D output, had to run through a second card, wow) overclocked to hell lasted me a good number of years until I upgraded to a GeForce MX 440 lol

Voodoo 2 SLI, chef's kiss.

Ahh, brings back memories of the golden days of playing Deus Ex and Diablo 2 with a Voodoo4.

Can't forget Quake 2 and Star Wars, mind blowing experiences in the late 90s https://youtu.be/cnktx06feII

I had the 5500 back in the day, and it was so long I had to bend it to get it to fit in my ATX case.
