Introduction
3dfx Voodoo 5 5500 Review
This content was originally featured on Amdmb.com and has been converted to PC Perspective’s website. Some color changes and flaws may appear.
It sure has been an interesting few days in the testing labs around here. With the recent arrival of our Thunderbird hardware, and with both Hercules and 3dfx sending in their video cards, we have had plenty to do. Since I had already reviewed the 32MB version of the Hercules GeForce2 GTS, I had an idea of what to expect from the 64MB version; but since I had yet to use the new Voodoo, I was excited to see what I could pull off with this accelerator.

Let's start off by giving you some statistical information on the card itself:
| Feature | Voodoo4 5000 | Voodoo5 5500 | Voodoo5 6000 |
| --- | --- | --- | --- |
| Processors | Single | Dual | Quad |
| Est. Fillrate | 333 MPixels/s | 667 MPixels/s | 1.3 GPixels/s |
| Memory | 32MB | 64MB | 128MB |
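The fillrate column scales linearly with the number of VSA-100 chips on the card. As a quick back-of-the-envelope check (assuming the commonly cited ~166.67 MHz clock and two pixel pipelines per VSA-100 chip; neither figure is stated in the table):

```python
# Rough fillrate estimate for the VSA-100 line: each chip contributes
# pipelines * clock pixels per second. Clock and pipeline count are
# assumptions (~166.67 MHz, 2 pipes per chip), not stated in the article.
CLOCK_MHZ = 166.67
PIPES_PER_CHIP = 2

def est_fillrate_mpixels(chips: int) -> int:
    """Peak single-textured fillrate in MPixels/s for `chips` VSA-100 chips."""
    return round(chips * PIPES_PER_CHIP * CLOCK_MHZ)

for name, chips in [("Voodoo4 5000", 1), ("Voodoo5 5500", 2), ("Voodoo5 6000", 4)]:
    print(f"{name}: ~{est_fillrate_mpixels(chips)} MPixels/s")
# Lines up with the table: ~333, ~667, and ~1333 (1.3 G) MPixels/s.
```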
Piss poor review. The V5 5500 wins hands down in Glide games, which were the majority at the time. Never thought of testing Unreal Tournament, did you? Ever hear of WickedGL? An easy 5-20% performance boost in OpenGL games. The V5 5500 wins hands down in 4x FSAA mode, and delivers similar performance with 2x FSAA. Your CPU was too blasted slow to unleash the V5 5500, and for the record, the V5 6000 had enough muscle to compete with the GeForce 3 series. Oh yeah, and the 32MB GeForce 2 most certainly was NOT faster than the V5 5500.
GeForce 2 GTS cons:
Only 2-3 games supported T&L, which was basically marketing hype at the time.
No Glide support, and the wrappers sucked.
Expensive.
Didn't have the Molex connector, which for the record was on the V5 5500 because of AGP slot issues with some motherboards; i.e., some AGP slots didn't provide enough power to meet the 3.3V spec.
The fact of the matter is that this is a biased review, quite possibly influenced by some $$ from Nvidia.
Nvidia lied about their TNT being faster than Voodoo2 SLI, when in most cases it was beaten by a single V2 8MB card. Then they screamed about 32-bit rendering on the TNT2, which it could only run at very low resolutions; forget 800×600 or 1024×768, not to mention the measly 3 games that actually supported 32-bit rendering. So most people were running 1024×768 in 16-bit color mode. Meanwhile, the Voodoo 3 could run 22-bit color at the same res. Then f'n Nvidia jumped the gun and released the GeForce 1 to further steal market share from 3dfx. Then there were the GeForce 2 shenanigans.
Oh wow.. echoing your rage 12 years later is the saddest thing ever. Get out of your mom's house and get some sun.
One thing that really stands out years later is that I still have this card, and even though it can't compete with my modern 8GB card, for its time it is still one of the best video cards ever put on the market. Yes, HANDS DOWN!!
Oh wow.. echoing your rage 12 years later is the saddest thing ever. Get out of your mom's house and get some sun.
Typical comment from a troll who knows nothing. Can't even make an intelligent comeback.
Trolling is trolling; some people really can't handle the truth, regardless of when it happened. 🙂
For the “anonymous” with the long answer: your text is 24k solid gold! Superbly said.
Sadly, I don't have the patience anymore to start up and tweak my V5 5500, so I'll be thinking about selling it for about its original listing price.
Indeed, nVidia was an ass at the time (not that anything has changed nowadays 🙂 ), but games that supported Glide after 2000 were a burden for this card at high detail.
On a P3 600 rig I can't even breach 20fps at 1024×768 with ProRally Championship 2001.
Nice-looking card, useless fans (a larger passive heatsink would've been better), power hungry, and weak as f in 3D.
Gonna revert to a V3 3000 eventually for glide games.
Nice article.
And yes, I too believe nVidia buttered some pockets in order to obtain this kind of review.
Yes, the rant, ages after the cards were released, is funny, but I basically agree with him. It was known that Q3 performed better on Nvidia cards, in the same way that later Doom 3 performed better on Nvidia cards and Half-Life 2 on Radeons. So basing a review on one game is biased and unprofessional. Nvidia definitely used marketing hype in a very dubious manner in order to get to market first, pushing people to buy cards for features which barely existed in games at the time. The 32bpp feature at the time of the V3 was pointless: the cards of the time simply didn't have the bandwidth to play at decent resolutions, and 3dfx's 22bpp-equivalent rendering, while using only 16bpp of bandwidth, really looked good and was a stroke of engineering genius. Remember Unreal in Glide in 16bpp?
The SDR GeF1 should never have been released, or it should have been released as a cheaper variant. The memory bogged the card down terribly. Imagine paying top dollar for it, only to see the performance increase of the DDR version a few months later: no architectural changes, just the chip operating with its intended memory. And by the time T&L was more widespread, these cards didn't have the bandwidth for many of the games, especially not the SDR version. Then they did the same with the GeF2. I don't recall the exact issue, but it was also related to a drastic imbalance between GPU and memory; the GeF2's proper potential was only realised in the GeF2 Ultra.
I'm not saying the Nvidia cards of that era were all bad, but they must have known that their first releases were way below potential. I owned a DDR GeF1 and a GeF2 GTS, but I also owned a V3 and a V5 (all sold now except for the V5). The DDR GeF1 was awesome, mainly due to its superb bandwidth, which came close to its theoretical maximum. And the early GeF2 cards mostly beat the V5 on pure speed at low res, without a doubt. But if I recall correctly, the V5 was at least equivalent at higher res due to its bandwidth. This is impressive, as it was intended to compete with the GeF1, but was released late. And FSAA was a useful feature; it made a huge improvement in some games, even given its speed hit. I never had a problem with any game on the Voodoo cards. They all just played. Smooth as butter.
I think that 3dfx were more realistic in their designs, and honest in their feature and performance marketing claims. They focused on what could be implemented in games at the time, and adopted the cutting-edge features once bandwidth increases made them more viable to implement.
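The "22bpp-equivalent" rendering praised in the comments above refers to a real technique: draw into an ordered-dithered 16-bit framebuffer, then run a small reconstruction filter on scanout to recover intermediate shades. A minimal sketch of the idea, for a single color channel (the actual Voodoo3/VSA-100 filter kernel and dither matrix are not documented here, so the specific values below are illustrative assumptions):

```python
# Illustrative sketch of dither-then-reconstruct, the idea behind 3dfx's
# "22-bit" output: quantize to 5 bits per channel with ordered dithering,
# then box-filter a 2x2 neighborhood on scanout. The real hardware's
# kernel and dither matrix differ; these values are assumptions.

BAYER_2X2 = [[0, 2], [3, 1]]  # ordered-dither threshold pattern

def dither_to_5bit(value8: int, x: int, y: int) -> int:
    """Quantize an 8-bit channel value to 5 bits with 2x2 ordered dithering."""
    # One 5-bit step spans 8 in 8-bit scale; spread the 0..3 pattern over it.
    threshold = BAYER_2X2[y % 2][x % 2] * 8 // 4
    return min(31, (value8 + threshold) >> 3)

def reconstruct(block):
    """Average a block of 5-bit samples back on the 8-bit scale."""
    expanded = [v << 3 for v in block]
    return sum(expanded) // len(expanded)

# A mid-grey of 100/255 quantizes unevenly pixel by pixel, but averaging
# the dithered 2x2 block lands back on the original value.
grey = 100
block = [dither_to_5bit(grey, x, y) for y in (0, 1) for x in (0, 1)]
print(block, reconstruct(block))  # the average comes back to 100
```

The payoff is that the framebuffer stays 16bpp (so bandwidth is unchanged), while the filtered output carries more effective color precision than any single 16-bit pixel.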