Soubriquet - Monday, February 18, 2008
Ageia was always a dog without a home. They were simply looking for an exit strategy, and they must have made nVidia a tempting offer.
IMHO nVidia didn't need Ageia, but thought it might come in handy for reasons mentioned hereabouts. Not nearly as expensive as some speculative acquisitions which spring to mind (AMD+ATI), but not likely to be particularly revolutionary either, IMHO.
Physics processing for the mainstream has to be API led, and we can expect that to come from MS, so Intel, AMD and nVidia will all need to get in touch with them and develop their own hardware to suit.
I get the sense that virtualisation may assist here, since physics on a GPU is not like physics on a CPU unless you have a layer between software and hardware that makes the distinction insignificant to the software. In which case AMD have a decision to make: do they add their own version to the GPU (and compete with nVidia), to the CPU (and compete with Intel), or to both?
In any case virtualisation is just jargon for the time being, and we are heading down the multicore CPU route, so the CPU seems the obvious place for physics. But nVidia don't make CPUs, and I wonder if not a little of their motivation in getting Ageia was to prevent anyone else getting it. A dog in the manger, as it were!
goku - Saturday, February 16, 2008
Thanks for destroying the best thing that could've happened to gaming. Now I'll be waiting 10 years for a feature to be added to a game when I could've had it in 2 years had there been dedicated physics cards.
I don't want to have to buy an nVidia GPU just to get add-on physics. At least with the PPU card, it didn't matter what video card I had; if I don't want or need to improve the visual effects of the game I'm playing but would like more interactivity, all I have to do is buy a new PPU and not a whole new video card.
perzy - Thursday, February 14, 2008
Everyone knows that the CPU is dead; it's not developing because of the frequency/heat wall.
So the foreseeable future is discrete processors: GPU, FPU and maybe SPU. Whatever x-PU you can imagine or come up with.
The GPU makers surely want the FPU market as well, but they want to do it on their own product, in the channel they know and trust.
This is like when GM and Ford bought the bus companies in the USA and closed them down.
Kill the competition while it's cheap and low-selling... a no-brainer.
I'm just waiting for Intel to release their high-performance GPUs and, later, FPUs.
They are in desperate need to branch out.
mlambert890 - Thursday, February 14, 2008
Everyone knows that the CPU is dead? Really? Have you sent a note off to Intel and AMD? I don't think they got the memo.
GPUs, FPUs and PPUs are all processing units. Any efficiency implemented in those parts can be implemented in a CPU. Any physical challenges in terms of die size, heat, and signaling noise faced by the CPU are also faced by those other, transistor-based, parts. What are you getting on about?
All of these parts are a collection of a ton of transistors arranged on a die and coupled with some defined microcode. How they are arranged is a shell game where various sides basically bet on the most commercially viable packaging for any given market segment.
There is no such thing as "dead" or "alive". With transistor-based electronic ICs, there are simply various ways of solving various problems and an ever-moving target based on what end users want to do. Semiconductor firms can adapt pretty easily and the semantics don't matter at all.
I remember similarly ridiculous comments with the advent of digital media, when people were saying the "CPU is dead" and the future would be "all DSPs". Back then I was equally amazed at just how far some can be from "getting it".
forsunny - Wednesday, February 13, 2008
So basically, from the article it appears that Intel wants PPU hardware to fail so that the CPU is needed for processing physics and they can continue to push (sell) newer CPUs for higher performance.
Therefore, it appears that Nvidia may want to compete in the CPU market, or even in the PPU market, so that extra performance can be gained beyond the existing CPU power. Nvidia is already in competition with Intel in the general chipset market. They could then claim better performance than Intel by adding the PPU technology; unless Intel starts to do the same.
I don't see where AMD fits into the picture, though.
FXi - Wednesday, February 13, 2008
I'm thinking there might be some near-term benefit from adding a few secondary chips to the GPU card. There is bandwidth enough in PCIe 2.0 to handle some additional calculations.
I'm thinking:
physics: PPU
sound: APU
Nvidia has experience (some of it a bit dated) in both, but they now own the harder of the two to provide. They may run into power and heat budgets that constrain them from pursuing this route, but there is plenty of PCIe bandwidth to handle all these things on a single card, or even dual (SLI-style) cards.
With even Asus going the sound route, that sounds like an easy one to cover (and one that has benefits in the home theater arena as well). Now that they have the physics, I'd think the trio would work well. And when the GPU advances enough to cover one or both of the secondary chips, you just remove them, let the GPU take over the calculations, lower the transistor count and move along. You keep the same set of calculations all on the same card and all under the same roof.
And with their experience in OpenGL acceleration, I'd take a reasonable bet they could do OpenAL acceleration as well. Who knows, will we eventually see OpenPL? :)
haplo602 - Wednesday, February 13, 2008
I think the whole PPU market missed its target. Instead of targeting the single-player FPS/RPG etc. market, they should have targeted the MMO developer market.
Any advanced physics needed for a single-player FPS can be handled by a multicore CPU, or alternatively on the GPU shaders with a bit of work.
However, imagine a large non-instanced MMORPG with tens of thousands of players and NPCs interacting in combat and other tasks. All the collision handling, hit calculations etc. HAVE to be done on the server side to prevent cheating. This puts a large strain on the server hardware.
Now imagine a server farm, basically a multinode cluster, handling a large world with each server hosting an area of the game. Each node has to handle all that interaction plus network traffic and backend database work. A single or dual PPU setup with proper software could make worlds (if not universes) of difference to the player experience, offloading a bulk of those operations from the server CPUs.
Huge improvements to player experience here. Maybe I've just missed the information, but I have yet to hear about an MMO that actually uses this kind of technology.
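To make the idea concrete, here's a minimal CUDA-style sketch (entirely my own hypothetical illustration, not code from any shipping MMO or from Ageia) of why that server-side workload parallelizes so well: one thread per actor, brute-force sphere overlap tests, with the results handed back to the server's game logic.
```cuda
// broad_phase.cu - hypothetical sketch: server-side collision checks
// offloaded to a massively parallel coprocessor, one thread per entity.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

struct Entity { float x, y, z, r; };  // position and bounding-sphere radius

__global__ void broadPhase(const Entity* e, int n, int* hits)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    int count = 0;
    for (int j = 0; j < n; ++j) {          // brute-force O(n^2) sketch
        if (j == i) continue;
        float dx = e[i].x - e[j].x, dy = e[i].y - e[j].y, dz = e[i].z - e[j].z;
        float rr = e[i].r + e[j].r;
        if (dx*dx + dy*dy + dz*dz < rr*rr) ++count;
    }
    hits[i] = count;                       // game logic reads this back
}

int main()
{
    const int n = 10000;                   // "tens of thousands" of actors
    std::vector<Entity> h(n);
    for (int i = 0; i < n; ++i)
        h[i] = { float(i % 100), 0.f, float(i / 100), 0.5f };

    Entity* dE; int* dHits;
    cudaMalloc(&dE, n * sizeof(Entity));
    cudaMalloc(&dHits, n * sizeof(int));
    cudaMemcpy(dE, h.data(), n * sizeof(Entity), cudaMemcpyHostToDevice);

    broadPhase<<<(n + 255) / 256, 256>>>(dE, n, dHits);

    std::vector<int> hits(n);
    cudaMemcpy(hits.data(), dHits, n * sizeof(int), cudaMemcpyDeviceToHost);
    printf("entity 0 overlaps: %d\n", hits[0]);
    cudaFree(dE); cudaFree(dHits);
    return 0;
}
```
A real server would use a spatial grid rather than all-pairs tests, but the point stands: each actor's checks are independent of the others, which is exactly the kind of work a PPU-style part is built for.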
DigitalFreak - Tuesday, February 12, 2008
HA HA HA HA HA HA HA! Chumps!
Zan Lynx - Tuesday, February 12, 2008
Nvidia should start calling their next-gen cards with PhysX support "VRPU"s instead of GPUs. Call it a Virtual Reality Processing Unit.
The CPU feeds it objects. The VRPU can use the vertex and texture data for both graphics and physics. Add some new data to the textures for physics and off it goes. The CPU can sit back and feed in user and network inputs to update the virtual world state.
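As a rough sketch of that division of labor (written as a CUDA-style kernel purely for illustration; the struct layout and names are my own invention): the world state lives in card memory, the GPU steps it each frame, and the CPU only drives the loop.
```cuda
// vrpu_loop.cu - hypothetical sketch of the "VRPU" idea: object state
// stays on the card; the CPU only feeds inputs and kicks off each frame.
#include <cuda_runtime.h>
#include <cstdio>

struct Obj { float3 pos, vel; };

__global__ void stepWorld(Obj* o, int n, float3 g, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    o[i].vel.x += g.x * dt; o[i].vel.y += g.y * dt; o[i].vel.z += g.z * dt;
    o[i].pos.x += o[i].vel.x * dt;       // same buffer the renderer would
    o[i].pos.y += o[i].vel.y * dt;       // consume (e.g. via graphics
    o[i].pos.z += o[i].vel.z * dt;       // interop), never copied back
}

int main()
{
    const int n = 1 << 16;
    Obj* d;                               // world state lives on the GPU
    cudaMalloc(&d, n * sizeof(Obj));
    cudaMemset(d, 0, n * sizeof(Obj));

    float3 gravity = { 0.f, -9.8f, 0.f };
    for (int frame = 0; frame < 100; ++frame)   // CPU just drives the loop
        stepWorld<<<(n + 255) / 256, 256>>>(d, n, gravity, 1.f / 60.f);

    cudaDeviceSynchronize();
    printf("simulated 100 frames of %d objects on-card\n", n);
    cudaFree(d);
    return 0;
}
```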
cheburashka - Tuesday, February 12, 2008
Larrabee
kilkennycat - Tuesday, February 12, 2008
Larrabee... marketing speak for Intel's GPU-killer wannabe. Now about 2 years out? ROTFL.....
recoiledsnake - Tuesday, February 12, 2008
Isn't DirectX 11 supposed to ship with physics support? AMD/ATI has said accelerated physics is dead till DirectX 11 comes out, and I tend to agree with them.
http://news.softpedia.com/news/GPU-Physics-Dead-an...
http://www.xbitlabs.com/news/multimedia/display/20...
I am shocked that the author didn't even mention DirectX 11 in this two page article.
PrinceGaz - Tuesday, February 12, 2008
OpenPL would be better than an MS-dependent solution like DirectPhysics. AFAIK though, no work is being done towards developing an OpenPL standard.
It's a shame that CUDA is nVidia-specific, as otherwise it might be viable as a starting point for OpenPL development.
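Purely as a thought experiment, a thin "OpenPL"-style layer over a CUDA backend might look like the sketch below. Every name in it (OplWorld, oplCreateWorld, oplStep) is invented for illustration; no such API exists, and a real standard would have to abstract over more than one vendor's backend.
```cuda
// openpl_sketch.cu - purely hypothetical: a vendor-neutral C-style front
// hiding a vendor-specific (here CUDA) physics backend underneath.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void integrate(float* y, float* v, int n, float g, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    v[i] += g * dt;                     // vendor-specific kernel, hidden
    y[i] += v[i] * dt;                  // behind the neutral API below
}

struct OplWorld { float *y, *v; int n; };   // hypothetical handle type

OplWorld* oplCreateWorld(int n)             // hypothetical API entry point
{
    OplWorld* w = new OplWorld{ nullptr, nullptr, n };
    cudaMalloc(&w->y, n * sizeof(float));
    cudaMalloc(&w->v, n * sizeof(float));
    cudaMemset(w->y, 0, n * sizeof(float)); // heights and velocities start
    cudaMemset(w->v, 0, n * sizeof(float)); // at zero
    return w;
}

void oplStep(OplWorld* w, float dt)         // backend could be CUDA, CPU...
{
    integrate<<<(w->n + 255) / 256, 256>>>(w->y, w->v, w->n, -9.8f, dt);
}

int main()
{
    OplWorld* w = oplCreateWorld(1024);
    for (int i = 0; i < 60; ++i) oplStep(w, 1.f / 60.f);
    cudaDeviceSynchronize();
    printf("one second of hypothetical OpenPL simulation done\n");
    cudaFree(w->y); cudaFree(w->v); delete w;
    return 0;
}
```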
mlambert890 - Thursday, February 14, 2008
Why? I mean really. Why would some theoretical "OpenPL" be "better"? Unless you have some transformational OSS agenda, it wouldn't be.
Come back to practical reality. I'm sure it's important to niche Linux fans or the small Mac installed base that everything NOT be based on DirectX, so go ahead and spark that up.
For (literally) 90+% of gamers, DirectX works out just fine and DirectPhysics *will* be the best solution.
A theoretical "OpenPL" would be the same as OpenGL: marginally supported on the PC, loudly and often rudely evangelized by the OSS holy warriors and, ultimately, not all that much different from a proprietary API in practical application when put in context *on Windows*.
Griswold - Thursday, February 14, 2008
If I build the church, will you come and preach it?
MrKaz - Tuesday, February 12, 2008
"Who is NVIDIA's competition when it comes to their burgeoning physics business? It's certainly not AMD now that Havok is owned by Intel, and with the removal of AGEIA, we've got one option left: Intel itself."
I think you are wrong. Microsoft is the principal actor in this "useful technology".
If Microsoft adds physics to the DX11 API (or even as some add-on to DX10), with the processing done at the CPU and/or GPU level, AMD will not lose anything. In fact, any Intel/Havok or Nvidia/Ageia implementation might not even be supported. So there goes the Intel and Nvidia investment down the drain.
Of course, you are right to exclude Microsoft for now, because they haven't shown anything yet while Ageia and Havok have real products. But I think it is all up to DirectX.
Summing up, I think it's all up to Microsoft for physics to succeed, because I don't see multiple different implementations succeeding or getting support from developers.
In the end, AMD might even have the final word: whether to use the Intel version or the Nvidia version ;) or wait for Microsoft…
Mr Alpha - Tuesday, February 12, 2008
Two things that weren't discussed that should have been:
1.
We are already GPU limited in most games, so where exactly are we supposed to get this processing power for physics from? Will I in the future have to buy a second 8800GTX for $500, instead of a PhysX card for $99, like I can do now?
2.
Haven't there been some rumours about Direct Physics, where Microsoft does vendor-neutral, GPU-accelerated physics?
spidey81 - Tuesday, February 12, 2008
I could be mistaken on this, but hasn't Nvidia decided to start putting onboard graphics on ALL motherboards with their chipsets? Wouldn't that be a great way to have a second GPU to handle physics? It may be an underpowered GPU, but it might work well enough to offload physics from the CPU. Like it was pointed out, it beats having that horsepower sitting there going to waste.
7Enigma - Tuesday, February 12, 2008
I had the same question until I re-read the "official" position of Nvidia. They don't plan (for the mainstream market) to integrate the chip into upcoming gfx cards; rather, it's a technology they can keep in-house until the time arrives when it could be beneficial.
At some point, if/when physics becomes the norm instead of a feature, I could see the company offering a hybrid gfx chip as the higher-end part(s). Just imagine if in 5 years the equivalent of the current AMD 3870X2 were a single high-end GPU with an attached physics chip (using this as an example since IMO it has been the most effective dual-chip solution in real-world performance to date; Nvidia's previous try, not so much). That would be practical, again as long as the software is coded for it.
Here's hoping THAT is the future.....I don't want ANOTHER piece of equipment that requires upgrading at every new build.
wingless - Tuesday, February 12, 2008
Intel is going down in 2010! That's all I'll say about that, lol.
Seriously though, if AMD manages to get on board with this PhysX thing, their Fusion CPUs and CrossfireX will make a helluva lot of sense. Think about GPU physics on the processor itself, the motherboard north bridge, and the GPU all adding their powers together to run games with tons of physics. An AMD/ATI or AMD/Nvidia box would make more gaming sense than using Intel CPUs/GPUs. This could be a big help to the ailing AMD right now...
Rezurecta - Tuesday, February 12, 2008
I don't believe it has to be AMD that makes this acquisition successful. The true golden apple is the software that integrates the hardware with games. That must be Microsoft and DirectX. If they decide to implement PhysX technology in DirectX, don't you believe that will make the product and acquisition successful and force AMD to compete or integrate PhysX technology?
BTW, I am not an expert, just making an observation.
SuperGee - Wednesday, February 20, 2008
Well, the key point of this takeover for nV is a replacement for HavokFX, a popular middleware tool, which the PhysX SDK now takes over from.
As for the dedicated physics hardware part:
There are a lot of graphics cards out there.
nV has much more momentum with which to push dedicated physics harder.
AMD/ATI also getting on it would give it a tad more push.
Think of the whole PC market, 100% of it, as a pie.
The PPU piece of that pie might not be a very visible target market. 1%?
nV already has a large piece, so nV GPUs alone are a much bigger target platform, even just the G80 and G92 parts.
Those have some reserves to do rendering and PhysX on one GPU.
The unified shader approach can balance VS, PS, GS and PhysX tasks over the roughly 100 shader units.
Or a spare GPU can do it dedicated.
So in the long run, for nV it means more GPU sales in this new hardware-accelerated market that Ageia started.
knitecrow - Tuesday, February 12, 2008
The sales trends are clear: the mass market belongs to consoles.
Why would any developer support a solution that cannot work across the board? For hardware physics to be successful it has to be supported within the DirectX framework and by AMD. Ideally it should work on consoles and on AMD hardware as well.
Going forward, developers are going to put more resources into consoles and less into the PC.
tmouse - Tuesday, February 12, 2008
I do not know what "trends" you are quoting. If you mean the botched-up hardwareshack redirected article that was pointed to by DailyTech, you should go back and read my comments. In a nutshell, the PC market for retail sales of game software was at least 46%, NOT 18%. It was probably over 50%, since software prices are generally higher for the consoles and the numbers were only given in dollars sold, not units shipped.
Just think about it: the console race is very much a three-horse race, with the "lowest tech" player leading. The second-place player's development platform is trivial from a port aspect. As far as physics goes, it will not be a major player for the Wii; that leaves the 360 and PS3. The PS3 is a difficult platform to program for, and while it probably will move up from last place, it simply will not be "THE" dominant gaming platform the PS2 was in its generation. The 360 is basically the PC platform, which means developing for it is developing for at least 75% of the gamers. Physics development will only help, NOT detract from, PC gaming.
Now I agree the developers want the PC platform to die, because consoles protect their IP much better, but they have to eat, so trying to force the issue would be suicide for any gaming company. There will always be FAR more PCs than ANY gaming platform, and that simply is not going to change.
Add to this the severe changes that occur when the next generation of platforms comes, which almost always radically change the way developers have to program to account for the hardware changes. PC development is a good fallback until they can ramp up for the next platforms; it takes at least a year after a platform is released to see volume and quality on it. While PCs do have rapid advances in graphics capabilities, they have always had backwards compatibility, so games are rarely developed only for the newest PCs. I simply do not see it going away soon.
SuperGee - Wednesday, February 20, 2008
PhysX supports consoles.
But the PS3 is something special.
Its Cell has somewhat more in common with the PPU architecture.
But it misses the internal bandwidth and has some restrictions.
Still, PhysX would run well on a PS3, as if it had a second GPU.
I expect exclusive PS3 games supporting PhysX to be more physics-rich games. The devs have this opportunity to differentiate the PS3 version from the other consoles. With more physics.
Mr Alpha - Tuesday, February 12, 2008
Isn't PhysX already available on PS3?
Besides, the PC gaming market has been a proving ground for new technologies before they end up in consoles. Except for when somebody convinced a certain console maker to stick a supercomputer processor in a console, but I shan't name any names. We might end up seeing some form of hardware physics acceleration on the next generation of consoles.
Cygni - Tuesday, February 12, 2008
They said this when the PS1 came out. They were wrong then, and it is still wrong now.
MadBoris - Tuesday, February 12, 2008
"How is NVIDIA going to be successful where AGEIA failed? After all, not everyone has or will have NVIDIA graphics hardware. That's the interesting bit."
My first thought was that licensing the technology for GPUs or even chipsets could work. Then Tony made it clear they would share it. That's really what has to be done. But if AMD puts their nose up, they can virtually stick Nvidia with an acquisition that does them no bit of good. If AMD doesn't get onboard, it will be chicken-and-egg all over again, devs won't integrate physics into games, and Ageia will make NVIDIA a nice paperweight.
paydirt - Friday, February 15, 2008
I disagree here. I _think_ one of the main reasons framerates stink in Crysis is that Crysis is physics-intense...? If AMD doesn't come onboard, then their benchmarks will stink for all the pretty shooters out there that use lots of physics. Imagine if, after implementing physics on the 8800 GT, it doubled Crysis framerates...? (Has anyone tested Crysis with an AGEIA card? I guess they have.) That would be a huge boost over AMD if it made a big difference in frame rates for the "resource hog" games.
paydirt - Friday, February 15, 2008
From the Crysis FAQ:
"11) Does Crysis support any specialty hardware such as Ageia PhysX?
Yes, however Crysis will not support the Ageia PhysX card due to the fact that Crytek have built their own proprietary physics engine which is not only more advanced than Ageia's, but performs very well on ordinary CPUs (especially multi-core platforms). Crysis will support various technologies out of the box including 64-bit operating systems, multi-core processors (as mentioned above), DX9 through to DX10 and many gaming devices such as gamepads. On top of that, the Sandbox editor that comes with Crysis supports the use of a webcam (or any other video capturing device or camera) to animate character expressions."
This is probably the reason why Crysis framerates suxxors.
kilkennycat - Tuesday, February 12, 2008
Ageia will certainly not make a paperweight for nVidia. This was a very smart acquisition. There is a side of nVidia's business that you may currently know very little about, but that particular side of their business is probably the fastest growing and potentially the most profitable of all their ventures, considering the type of customer involved. nVidia is providing massively-parallel-processing-on-the-desktop capability for both academic research and the engineering industry, thanks to their CUDA toolset. Time costs money in research and can represent lost-opportunity cost in engineering-related businesses. Waiting in line for centralized servers to crunch numbers is a time-losing game for those involved. Ageia brings a powerful physics package to the party for those in research or industry who need it. No doubt nVidia is furiously working on integrating the Ageia physics library into their CUDA toolset.
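For a flavor of the desktop number-crunching being described, here's an illustrative all-pairs N-body force kernel in CUDA; this is a generic textbook-style sketch of my own, not code from nVidia's toolset or from Ageia's library.
```cuda
// nbody_forces.cu - illustrative only: the kind of embarrassingly parallel
// workload (gravitational N-body forces) that CUDA moved from the server
// queue onto the desktop.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void forces(const float4* body, float3* f, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float3 acc = { 0.f, 0.f, 0.f };
    for (int j = 0; j < n; ++j) {                  // all-pairs accumulation
        float dx = body[j].x - body[i].x;
        float dy = body[j].y - body[i].y;
        float dz = body[j].z - body[i].z;
        float d2 = dx*dx + dy*dy + dz*dz + 1e-6f;  // softening term
        float inv = rsqrtf(d2);
        float s = body[j].w * inv * inv * inv;     // w holds the mass
        acc.x += dx * s; acc.y += dy * s; acc.z += dz * s;
    }
    f[i] = acc;
}

int main()
{
    const int n = 4096;
    std::vector<float4> h(n);
    for (int i = 0; i < n; ++i) h[i] = { float(i), 0.f, 0.f, 1.f };

    float4* dB; float3* dF;
    cudaMalloc(&dB, n * sizeof(float4));
    cudaMalloc(&dF, n * sizeof(float3));
    cudaMemcpy(dB, h.data(), n * sizeof(float4), cudaMemcpyHostToDevice);

    forces<<<(n + 255) / 256, 256>>>(dB, dF, n);
    cudaDeviceSynchronize();
    printf("computed %d x %d pair interactions\n", n, n);
    cudaFree(dB); cudaFree(dF);
    return 0;
}
```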
MadBoris - Wednesday, February 13, 2008
Fair enough; if the technology is used to better map proteins, it may help nVidia become more prominent in those venues. I agree I know nothing of it, but if it's geared towards those uses primarily and gains a following there, it's doubtful its uses will be as successful in gaming, for the reasons mentioned.