AGEIA’s Implementation and Demo
Now that you have a general view of what a physics processor is used for and what the physical card is going to look like and cost you, let’s look at how this affects gamers and game developers. After all, integrating a completely new piece of hardware like this isn’t going to be an easy task for AGEIA.
AGEIA’s Software Integration Approach
AGEIA’s approach to getting the Physics Processing Unit into every gamer’s machine is a multi-pronged one. The first step is what we have been seeing recently in the host of press releases and game announcements from AGEIA: the integration of AGEIA’s physics software API, NovodeX, into developers’ game engines. In much the same way that gamers have seen and heard about the Havok physics engine being incorporated into games, these NovodeX announcements show that developers are integrating AGEIA-owned software into their games.
The NovodeX software API allows game developers to use currently available hardware (your CPU) to perform physics calculations and functions on a par with competing physics engines. This particular API is more important to developers and gamers, though, because having support for NovodeX in a game means that support for the PhysX processor, when it is released, is either automatic or much easier to enable.
This diagram illustrates what I am referring to:
High level overview of AGEIA’s PhysX Architecture
Here the game content and engine communicate with the AGEIA NovodeX software; if there is no PhysX processor in the system, the software handles all the calculations on the CPU and returns the results to the game engine as required. If a gamer’s system does have an AGEIA PhysX processor, the driver communicates with the NovodeX software, and most of the physics calculations are in turn offloaded to the PPU.
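The routing logic in that diagram can be sketched as a simple dispatch layer. The following is an illustrative Python sketch, not the actual NovodeX API; every class and method name in it (PhysicsLayer, SoftwarePhysics, simulate, step) is hypothetical:

```python
class SoftwarePhysics:
    """Fallback path: run all physics math on the CPU."""
    def simulate(self, bodies, dt):
        for b in bodies:
            b["vy"] -= 9.81 * dt   # apply gravity to vertical velocity
            b["y"] += b["vy"] * dt  # integrate position
        return bodies

class PhysicsLayer:
    """Hypothetical middleware layer. The game engine always talks to this
    object; it routes work to a PPU driver if one is present, otherwise to
    the CPU implementation."""
    def __init__(self, ppu_driver=None):
        self.backend = ppu_driver if ppu_driver is not None else SoftwarePhysics()

    def step(self, bodies, dt):
        return self.backend.simulate(bodies, dt)

# With no PPU driver installed, the same game code silently uses the CPU path.
engine = PhysicsLayer()
state = engine.step([{"y": 10.0, "vy": 0.0}], dt=0.1)
```

The point of a design like this is that the game engine only ever talks to the middleware layer, so adding a PPU driver later requires no changes to the game code itself.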
This indicates that any game released this year with NovodeX integrated at all should see at least some performance increase with a PPU installed in the system. By offloading the physics work from the processor and moving it to the PPU, the CPU is less likely to be a bottleneck on overall game performance, as it has been many times in current games like Half-Life 2.
This has another important effect as well: it gets game developers familiar with and efficient in using the NovodeX engine, which gives the NEXT generation of games a better chance of having features and support for the PhysX processor.
So what can a PPU do for you?
As you can probably guess from what I have shown you above, simply having the NovodeX engine in a game isn’t going to bring about any of the gameplay revolutions the AGEIA staff would want you to see. After all, a game developer that implements the NovodeX engine today, or even this year, knows that very few if any gamers will have a PPU in their system, and so has to develop the game for the lowest common denominator (LCD). That means any physics required for the game has to be simple enough to run efficiently on a mid-level processor. PPU owners will only see an increase in performance in this case, and how much of an increase is a complete unknown for now.
What AGEIA, and even game developers, envision a PPU enabling for gamers is a world with physics unlike anything we have seen in a real-time game before. We are talking about thousands of rigid bodies, real flowing water, hair simulation, avalanches of rock, clothing simulations and more. Even more impressive is the idea of a universal collision detection system that allows you to interact with absolutely ANYTHING in the game world. All of it calculated in real time with nothing scripted in the game engine.
Sure, you might have seen explosions in games you have played before, ones that might destroy an entire building. In nearly all cases, those have been scripted, meaning the debris, fire and dust were all created specifically for that explosion scene. Their motions and reactions were probably all scripted so that they moved in a particular direction at a particular time and speed. But what if you could change that? What if the explosion of a dam on a river changed in real time depending on YOUR placement of the explosives? You might place them at the very center of the dam, creating a big hole that water rushes through; or you might use only a small amount of explosives to destroy a small side portion, letting the water move out more slowly and letting the water pressure be the force that eventually destroys the entire dam.
Damn. That would be a cool scene, and I didn’t even see a demo of that — just made it up!
What I did see demos of was all pretty impressive. Building destructions, avalanches and car wash scenes were shown to me during E3 this past week. I have some pictures of the demos I saw, but keep in mind that a large portion of the “wow” factor of these demos comes from seeing them in motion. If and when AGEIA puts them online for public consumption, I’ll be sure to let you all know. For now you’ll have to make do with some pictures of monitors and my brief descriptions.
First up is a demo of fluid dynamics running on a PPU-powered system.
This image shows fluid being poured over the top of a car and running off it according to the angle, amount and direction of the pour, following the laws of physics. The person controlling the demo could move the stream and change the amount of fluid being dispensed, and it was fairly impressive to see it running.
The avalanche demo showed several hundred rigid bodies interacting with each other and their surroundings in real time.
The demo allowed the user to control how many boulders came tumbling down the side of the mountain, affecting how they bounced off each other and off the ground. Almost every object you can see in the image above is being calculated in real time on the PhysX chip.
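To give a rough feel for what “several hundred rigid bodies in real time” means computationally, here is a toy Python sketch of a per-frame update for bodies falling and bouncing off a ground plane. It is a minimal illustration under simplified assumptions (point masses, one axis, a flat ground at y=0), not how the PhysX chip or NovodeX actually computes anything:

```python
import random

def step_bodies(bodies, dt, gravity=-9.81, restitution=0.5):
    """Advance every body one timestep; bounce them off the ground plane."""
    for b in bodies:
        b["vy"] += gravity * dt      # gravity accelerates the body downward
        b["y"] += b["vy"] * dt       # integrate position
        if b["y"] < 0.0:             # hit the ground plane
            b["y"] = 0.0
            b["vy"] = -b["vy"] * restitution  # rebound, losing energy each bounce
    return bodies

# Several hundred boulders starting at random heights, as in the demo.
boulders = [{"y": random.uniform(5.0, 50.0), "vy": 0.0} for _ in range(500)]
for _ in range(1000):                # simulate 10 seconds at dt = 0.01
    step_bodies(boulders, dt=0.01)
```

A real engine would add full 3D motion, rotation, and body-to-body collision detection on top of this, and that extra work is exactly what the PPU is built to offload from the CPU.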
The final demo was actually a movie through an unannounced engine that shows a completely immersive world.
Though still very early in development graphically, the level of interaction I saw was very intriguing. Here we see a couple of crates being shot and breaking into smaller pieces than you would usually see, all dependent on where each box was shot. It didn’t stop there, though, as the demo movie showed the player shooting a parked airplane and having pieces of it fall off and dent.
This screenshot does not convey much of what is happening or how complex the scene is, but a plane has just flown into this hangar and run into a couple of stacks of items, knocking them all over and causing a large number of interactions between the various environmental pieces. Everything in the image was affected by the plane’s “entry” into the scene, including the bomb that has just fallen on the walkway you are standing on!
Throughout the demos, I was very impressed with what the PhysX PPU allowed them to do; it easily surpassed the very best that current CPU-based physics allows for. That being said, AGEIA has a lot of work ahead of it before this type of realism hits home in a gamer’s machine.