Realism has been a driving force behind gaming, but games themselves are nothing without the hardware they run on. Without graphics cards we would not be able to experience the immersive virtual worlds we have been spoiled with. There is, however, a problem with realism: it is very resource intensive.
The more game developers try to create realistic worlds, the more current technology is pushed to its limits. Ask the average Joe what realism in gaming is all about and they would probably tell you about the graphics, but any gamer knows graphics are nothing if the world they render is only skin deep. This is where physics comes into play. Physics has been present in games in some form since the start. Take Pong, for example: two paddles, one on each side of the screen, bounce a ball between them, and if the ball strikes a paddle at a certain angle while the paddle is moving at a certain speed, the resulting bounce is determined by those variables.
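The original Pong obviously wasn't written this way, but a rough C++ sketch (with made-up names and constants) shows how the ball's hit position and the paddle's speed can feed into the rebound:

    #include <cstdio>

    // Minimal sketch of Pong-style paddle physics (illustrative only, not the
    // original game's code). The rebound depends on where the ball hits the
    // paddle, and the paddle's own speed adds to the resulting velocity.
    struct Ball {
        float x, y;    // position
        float vx, vy;  // velocity
    };

    void bounceOffPaddle(Ball& ball, float paddleCenterY, float paddleHalfHeight,
                         float paddleSpeedY)
    {
        // Offset of the impact point from the paddle's centre, roughly in [-1, 1].
        float offset = (ball.y - paddleCenterY) / paddleHalfHeight;

        ball.vx = -ball.vx;                // reflect horizontally
        ball.vy += offset * 2.0f           // hit near the edge -> steeper rebound
                 + paddleSpeedY * 0.5f;    // a moving paddle adds to the rebound speed
    }

    int main() {
        Ball ball{0.0f, 1.5f, -3.0f, 0.5f};
        bounceOffPaddle(ball, 1.0f, 1.0f, 2.0f);
        std::printf("new velocity: (%.2f, %.2f)\n", ball.vx, ball.vy);
        return 0;
    }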
Game physics is simpler than any explanation makes it sound: the easiest way to describe how it should work is to look at the real world. That is what game developers have been doing for years; trying to bring the real world into the virtual one.
Cry Havok
There are two elements to game physics: the hardware and the engine. There are two main physics engines at present, Havok and PhysX. Havok's physics engine is, unsurprisingly, developed by Havok; it was first released back in 2000 and is currently on its sixth version, released in August 2008. Havok has been used in over 200 game titles, including Company of Heroes, Soul Calibur IV, and the upcoming StarCraft II and Diablo III. Havok is released to developers (after they pay a license fee, of course) as a Software Development Kit, or SDK. The SDK allows developers to use the engine in every aspect of their game that requires physics. Havok is in fact so successful an engine that it has been integrated into the PlayStation, PlayStation 2, PlayStation 3, PSP, Nintendo Wii, and Xbox 360.
Havok's success has been due to its easy implementation, not only by game developers but also by movie studios. Havok proudly lists on its site a few of the movies the Havok physics SDK has been used on, including 10,000 BC, X-Men: The Last Stand, and Troy. Havok also quotes on its site the numerous compliments its engine has received from major development studios, among them LucasArts, Bungie Studios, Bethesda Studios, Midway, and Irrational Games.
The competition to Havok is PhysX, which is where all the physics commotion started. Ageia burst onto the scene touting its new hardware and engine as the greatest thing since sliced bread, and unfortunately some people believed them. Not long after the launch of the Ageia PhysX card and its accompanying SDK in 2006, game developers started incorporating the engine into their games.
One of the most notable games at the time was Tom Clancy's Ghost Recon Advanced Warfighter. Ironically, Havok claimed that the game also used the Havok engine. While Havok was not hardware dependent and could therefore run on any system configuration, PhysX was proprietary and coded to run on specific hardware. In the beginning that hardware was the Ageia PhysX card, but since Nvidia's buyout of Ageia, all CUDA-enabled Nvidia graphics cards are capable of utilizing the PhysX SDK. A few gentlemen on the internet have even hacked the drivers and gotten PhysX to run on a pair of HD 4870s.
PhysX has an impressive list of titles under its belt, but before the takeover by Nvidia there was very little reason to warrant the purchase of an expensive Physics Processing Unit (PPU). The frame rate hit in the beginning was a shock to some, though after a few tweaks there was new hope for the fledgling tech. Unfortunately, in the long run a frame rate boost was too little, too late, and hardly the point of the PPU, which was to provide more interaction in the virtual world without any performance degradation.
Since the acquisition of Ageia by Nvidia there has been hope for those who wanted a PPU but were unable to afford one. Nvidia released CUDA, which takes advantage of the unified architecture of the 8-series and all subsequent GPUs and allows the system to use the processing capabilities of the GPU for more general applications such as physics. This approach turns the GPU into a GPGPU (General-Purpose computing on Graphics Processing Units). What this basically means is that if the GPU has a few clock cycles to spare, why not throw some more calculations at it and score a more realistic environment in the process? That is the Utopian idea; the reality is that if you want PhysX to run well on your Nvidia GPU, you had best have at least two of them.
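As a rough illustration of the idea, and not actual PhysX code, a minimal CUDA kernel (hypothetical names and numbers throughout) that integrates a batch of debris particles on the GPU might look like this:

    #include <cuda_runtime.h>
    #include <cstdio>

    // Hypothetical example of GPGPU physics: each GPU thread advances one
    // particle by a single Euler integration step. This is not PhysX code,
    // just an illustration of the kind of work a CUDA-capable GPU can absorb.
    __global__ void integrateParticles(float3* pos, float3* vel, int n, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        vel[i].y += -9.81f * dt;          // apply gravity
        pos[i].x += vel[i].x * dt;        // advance position
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }

    int main() {
        const int n = 1 << 16;            // 65,536 particles
        float3 *pos, *vel;
        cudaMalloc((void**)&pos, n * sizeof(float3));
        cudaMalloc((void**)&vel, n * sizeof(float3));
        cudaMemset(pos, 0, n * sizeof(float3));
        cudaMemset(vel, 0, n * sizeof(float3));

        int threads = 256;
        int blocks  = (n + threads - 1) / threads;
        integrateParticles<<<blocks, threads>>>(pos, vel, n, 1.0f / 60.0f);
        cudaDeviceSynchronize();

        cudaFree(pos);
        cudaFree(vel);
        std::printf("integrated %d particles on the GPU\n", n);
        return 0;
    }

Each thread handles one particle, which is exactly the kind of embarrassingly parallel workload that makes spare shader cycles so attractive for physics.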
AMD, the swing vote...
With the acquisition of Ageia by Nvidia and of Havok by Intel, AMD was left to decide which side it wanted to fight on, and in this case it chose the devil it knew. AMD prefers Havok even though Nvidia made PhysX an open standard in March 2008. While AMD has its own SDK available, it is aimed at its FireStream cards, which are not targeted at the consumer market but rather at high-performance computing sectors such as medicine and finance.
AMD and Intel have been working on improved implementations of Havok at the hardware level; this alone should be a statement of epic proportions. Does Nvidia's PhysX really stand a chance if both of the largest microprocessor manufacturers are supporting the competition? AMD's decision to stick with Havok is probably due to Havok's maturity as an engine and the fact that it is more 'open' than PhysX, which keeps in line with AMD's market strategy. AMD has said that it sees no reason to switch to PhysX at this point, but, true to form, it didn't deny the future potential.
Newton's four horsemen
The simulation of real-world Newtonian physics is a partnership between hardware and software. While the SDKs and APIs can pass information to the hardware, the hardware itself must be capable of handling the immense calculations required to provide this realism. In the world of physics hardware there are four players: Ageia, Nvidia, AMD, and Intel. Ageia is now redundant thanks to Nvidia, whom AMD is copying, while Intel is reinventing the wheel.
It is an interesting state of affairs considering that Havok is, for now, limited to CPUs. Fear not, though: as you read this, very smart people are punching away at their keyboards, hammering out code that will move current Havok calculations from the CPU to the GPU, changing the landscape once again and justifying AMD's close work with the Havok team.
So where does Intel fall into the picture? The answer is two-fold. Firstly, as the owner of Havok, Intel benefits from AMD's interest and support; having a major GPU player backing your software is exactly the kind of support a company needs when its competition is in a similar situation. Secondly, AMD GPUs are present in two of the three major gaming consoles. It is this pressure that might see developers adopt Havok instead of PhysX.
Larrabee, all that you can be
Intel's upcoming GPU, Larrabee, is rumored to be the next big thing in GPU design. Intel has designed Larrabee from the ground up as a GPU based on the x86 instruction set. What this means is that instead of the simplistic stream processors found in current GPUs, Larrabee will hypothetically pack 8, 16, or 32 CPU cores, speculatively built on the same fab process as the new Core i7. These cores are based on the old Pentium design and, even though they have been updated to support x86-64, are incapable of out-of-order execution. Since these are CPU cores, they require an entirely new approach to graphics, meaning that while GPUs are trying to be CPUs, the opposite is also taking place. Because Larrabee is based on x86 CPUs, the transition for Havok should be a lot smoother, as the engine already runs on that platform. Intel has hinted that Larrabee will be capable of, among other things, the elusive art of real-time ray tracing, something GPUs have never been powerful enough to achieve. Details are still sketchy, but the graphics world is holding its breath to see if a multi-core CPU can be a GPU and, if so, what API it would use.
Long story short, if Intel gets it right it will have hardware capable of doing physics calculations, and thanks to the acquisition of Havok it will be laughing all the way to the bank if Larrabee launches capable of both GPU and PPU tasks.
How will the Apple Fall?
Things are not as cut and dried as they appear. Information is sketchy regarding some deals, but it seems AMD could also have adopted PhysX. The challenge developers face is whether to code for Havok or PhysX; this choice could spell disaster for either. Some speculate that developers could start calling for Microsoft to write a physics API into DirectX, because supporting a single universal API is easier than supporting two. An API war could see one side, or even both, making their API fully open in an attempt to lure developers and make it the standard.
This sort of competition is good for consumers in the long run, but it will leave early adopters cursing if things don't go their way. Whichever way you slice it, game physics is still young and, like the graphics of old, needs time to mature. Hopefully in a year or two the future of physics will be a clear path and consumers will be able to spend their hard-earned money on technology they can actually use.