Difficult Transition and Final Thoughts
With any new technology or idea, even a great one, there is a period of transition that cannot be avoided and can be very painful. In many cases the transition is so hard that the technology or company runs out of steam and/or money, and we are left with nothing to show for it. The most recent transition in the enthusiast PC market has been PCI Express, and to say that it has been picking up slower than expected would be an understatement. But with Intel and the entire industry behind it, there is little doubt it will make it eventually. Another transition from years ago that is very similar to what AGEIA is attempting is the migration of graphics work from the CPU to the GPU.
It wasn’t that long ago that video games didn’t require a graphics accelerator card to run. Every game engine’s math and graphics calculations were done on the processor and output through the standard 2D frame buffer video card that was required for a computer to drive a monitor at all. As games and graphics became more complicated, though, the idea of a dedicated 3D graphics processor emerged and took hold. There was a transition period for this technology as well, when some people had GPUs and others did not, and because of that games continued to be developed for CPU-only rendering so developers could reach a larger market. Soon, though, anyone gaming on a PC had a GPU, and developers could move past the CPU as a graphics processor and dedicate their time to the faster, much better suited GPU for their game engines.
But the GPU had a couple of advantages in its transition that the PPU doesn’t have. First, the 3D graphics processing unit had a very good stepping stone in the form of an already existing card category. 2D video cards were in every PC, and the jump to 3D was a small one: users wanting a 3D GPU would simply buy a new video card to replace their current one or, more likely, wait until their older 2D card was outdated and then make sure the replacement included 3D acceleration. This meant the GPU transition was going to happen to everyone eventually, and when it did, developers knew that all PC gamers had the technology in their machines and could stop holding back their games by keeping support for CPU-based graphics.
This is NOT the case with AGEIA and their PhysX processor. There are no PPUs in anyone’s system today that I know of, so upgrading to an AGEIA PPU isn’t really on the table. AGEIA is instead attempting to add an entirely new category of card to your PC.
The other main disadvantage the AGEIA PhysX PPU faces is that in order for games to use physics in a way that drastically affects gameplay, developers have to write ONLY for the physics model of a PPU card. On the other hand, in order for finicky gamers to buy a PPU card, they are going to want to see games that take full advantage of the physics processor. It brings about a “chicken or egg” debate over who will bite the bullet and spend the money first: a gamer on a PPU card, or a developer on PPU development. Why is this a requirement? Simply put, developers can’t dramatically change gameplay in a game engine by utilizing a PPU in a way that is impossible on a CPU-based physics engine without alienating a HUGE portion of their market.
That means that current game engines, and those of the immediate future, that have implemented support for the PhysX processor through the NovodeX API are merely going to add game “fluff” on systems with a PPU. By “fluff” I mean new effects and interactions that may be very, very cool, but that won’t be required to finish or play the game. No game is coming out that will require you to blow apart a building full of completely interactive crates (something a CPU couldn’t handle) to find a key and move on to the next level. Instead you might see added mist in a jungle section of a game that, instead of being stagnant and permanent, moves when you or another character walks through it.
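The constraint described above can be sketched in a few lines. This is purely my own illustration — none of the names below come from the actual NovodeX/PhysX API — but it shows the pattern developers are stuck with: gameplay-critical physics must behave identically on every machine, and only cosmetic “fluff” can scale up when a PPU is present.

```python
# Hypothetical sketch of the "fluff only" constraint. These function and
# field names are invented for illustration, not taken from NovodeX/PhysX.

def detect_ppu() -> bool:
    """Stand-in for a hardware query; returns False on a CPU-only system."""
    return False

def plan_effects(has_ppu: bool) -> dict:
    # Gameplay-critical simulation is identical everywhere, so a
    # CPU-only player can still finish the game.
    plan = {"critical_crates": 20}
    # Cosmetic extras are enabled only when a PPU can absorb the cost.
    plan["debris_particles"] = 5000 if has_ppu else 200
    plan["volumetric_mist"] = has_ppu
    return plan

cpu_plan = plan_effects(detect_ppu())
ppu_plan = plan_effects(True)
print(cpu_plan)
print(ppu_plan)
```

Note that the crate count a player must interact with never changes; only the decorative particle budget does — which is exactly why early PPU support amounts to “fluff.”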
What is it going to take for the industry to require a PPU to play a PC game? It’s hard to tell, but there are several possibilities. First, a BIG name developer like id or Epic could require a PPU in a new engine it is creating. This would force gamers who want to play those games to take the plunge and buy a physics processor card. But any developer is unlikely to do this, as it would cut out a very big portion of the market, and thereby their game sales. Another option might be to bundle a PPU-enabled game and card together in a way that makes the purchase more appealing to gamers. Again, this would have to be a big name title from a very big name game company to work at all. Finally, and most likely, we’ll have to wait several years for PPUs to come down in price and for developers to find enough neat “tricks” that gamers want a PPU in their system for the added “fluff”. Only then can developers fully utilize a PhysX processor for in-game physics and real-world interaction.
What, that’s not enough of a hassle for you? Well then think about this: say that by some miracle it only takes a year to get the current-gen PhysX processor into every gaming PC. Game developers can then create games that allow for a great level of interaction: now you can destroy every item in the game, down to fist-sized objects, and still maintain good performance. But then AGEIA or some other technology company releases a newer PPU with the power to calculate that same level of destruction down to coin-sized pieces. Great, right? Well, then we have to go through this same transition time again, as a developer cannot fully utilize the power of the new PPU until the majority of users have upgraded.
From my perspective, it just doesn’t seem that the upgrade/life cycle of a physics processor is going to be ANYWHERE NEAR as easy to handle as the current upgrade cycle of video cards in relation to PC games.
A PPU Belongs in a Console
Another thought that occurred to me during my meeting with AGEIA at E3 was that the perfect place for a dedicated physics processor is in a console. There, developers have a static platform to write games for and no longer have to worry about what percentage of the market actually has a PPU. This would allow console gamers to see the benefits of physics processing almost immediately. It might also pique PC users’ interest in the technology and accelerate PPU card purchases in that market.
Chances are the AGEIA PhysX chip simply came along too late, and was too expensive, for Microsoft, Sony, or Nintendo to implement in this generation of consoles. Though in 3-4 years, who knows what might happen?
A Competitor on the Horizon?
Something that has come up in discussions of a physics processor is whether a company like NVIDIA or ATI would consider entering the market. With their large install bases, either might be able to use that advantage to easily dominate the market over AGEIA. I have asked both companies about their intentions in this area and both replied with a “not at this time” answer. What is interesting to note, though, is that an NVIDIA rep pointed me toward a website called GPGPU.ORG that collects and categorizes research into using a GPU as a general-purpose processor. Several tests on the site touch on topics used heavily in physics simulations, though not many use the latest GPU hardware (6800, X800), so it’s hard to get a direct GPU-vs-CPU comparison for physics work. It does point to the possibility, though, of either of the graphics giants making a move into the world of PPUs.
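To see why researchers look at GPUs for this kind of work, consider what a physics timestep actually is. The toy update below (my own illustration, not anything taken from GPGPU.ORG) applies the same gravity-and-integrate arithmetic to every particle independently — exactly the sort of data-parallel loop that maps well onto a GPU’s many pipelines, or onto a dedicated PPU, rather than a serial CPU loop.

```python
# Toy data-parallel physics step (illustration only): semi-implicit Euler
# integration of a handful of falling particles. Each particle runs the
# same arithmetic with no dependence on its neighbors, which is what makes
# the workload a natural fit for a GPU or PPU.

DT = 0.01       # timestep in seconds
GRAVITY = -9.8  # acceleration in m/s^2

positions = [0.0, 1.0, 2.0, 3.0]      # one height per particle
velocities = [5.0] * len(positions)   # all launched upward at 5 m/s

for _ in range(100):  # simulate one second
    # These two comprehensions are embarrassingly parallel: each element
    # could be computed by its own GPU thread.
    velocities = [v + GRAVITY * DT for v in velocities]
    positions = [p + v * DT for p, v in zip(positions, velocities)]

print([round(p, 3) for p in positions])
```

Real GPGPU physics work adds the hard parts — collision detection and constraint solving, which do couple particles together — but the bulk integration step really is this uniform.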
This article is really only the tip of the iceberg on the subject of the AGEIA PhysX processor. There is a lot more about the technology that we know nothing about, including how fast the chip runs, what kind of pipeline it has, and, in reality, how it functions at all. We could also have spent a lot of time looking at how the current move to dual-core processors can handle physics processing, and whether the AGEIA chip will be able to outpace the idea of developers using a second core for all physics calculations.
AGEIA has a great new technology idea on its hands and if they can get the right support from the game developers AND from gamers, they may just change the way games are played on the PC forever. They have a long, uphill battle ahead of them to get there though, and PC Perspective will be following it all.
I am curious to know what you all think about this development, and I have started a thread in the PCP forums to discuss it.
Update (May 2, 2006): You can find two newer articles on AGEIA’s PhysX technology at PC Perspective. At GDC 2006, AGEIA revealed a lot of details on the PhysX processor hardware and the features it sports. As of today we also have several videos of Ghost Recon: Advanced Warfighter being played, as well as the Cell Factor demo. In GRAW we show pairs of videos that demonstrate how gameplay changes when adding an AGEIA PhysX processor.
Be sure to use our price checking engine to find the best prices on the BFG AGEIA PhysX PPU Card, and anything else you may want to buy!