As if my blog inspired the Inquirer to write this post (yeah, right). I just discovered this piece regarding Ageia and Nvidia's integration of PhysX technology within the GeForce lineup of video cards. The article comments on a slightly older article that examined the technology through Unreal Tournament 3:
"NVIDIA HAS BEEN embroiled in controversy over the last few weeks, thanks to the massive score boost it's enjoyed in 3DMark after enabling processing of Ageia Physx routines on the GPU.
Some folks have complained that running physics on the GPU in that benchmark constitutes cheating - Nvidia says it's simply using the technology it paid for - but regardless, what seems to have been slightly overshadowed is the fact that physics on the GPU has real-world gameplay implications right now.
The folks over at TechGage busted out their copy of Unreal Tournament 3 and gave it the GPU-physics once-over. The results are pretty interesting. When running at a decent-but-not-killer resolution (1680x1050 in this case), the GPU is easily able to handle the load of running both graphics and physics calculations, returning a 50FPS average framerate versus a 31FPS rate for CPU physics calculations (interestingly enough, a dedicated PhysX PPU hits 60)."
The full news clip from the Inquirer: http://www.theinquirer.net/gb/inquirer/news/2008/07/07/gpu-physics-works-kinda
The older TechGage article it references is here: http://techgage.com/article/nvidias_physx_performance_and_status_report/3