NVIDIA to Acquire AGEIA Technologies

This topic is locked from further discussion.

#1 Paganstar
Member since 2002 • 3180 Posts

http://www.nvidia.com/object/io_1202161567170.html

If they can come up with something good, this could really hurt AMD/ATI, especially when you add in the Intel/Havok buyout.

#2 musclesforcier
Member since 2004 • 2894 Posts

http://www.gamespot.com/pages/forums/show_msgs.php?topic_id=26197985

#3 yoyo462001
Member since 2005 • 7535 Posts
There was a thread about this just a few days ago...
#4 Jd1680a
Member since 2005 • 5960 Posts
With multicore CPUs here and dual-core GPUs on the way, it doesn't make any sense to buy a physics chip. My only guess is that Nvidia bought Ageia to head off a potential new competitor in the graphics card market; Ageia has the technical ability to move into graphics cards itself.
#5 9mmSpliff
Member since 2005 • 21751 Posts
[QUOTE="Jd1680a"]With multicore CPUs here and dual-core GPUs on the way, it doesn't make any sense to buy a physics chip. My only guess is that Nvidia bought Ageia to head off a potential new competitor in the graphics card market; Ageia has the technical ability to move into graphics cards itself.[/QUOTE]

Multicore CPUs can't produce the physics calculations a dedicated PPU can. The whole point of the PPU is to take the stress off the CPU and let the PPU do that work. Crysis and HL2 are the best showcases for Havok's CPU physics, and they still don't compare to UT3, GRAW 2, Warmonger, CellFactor and some of the other games.

I'm not saying they're bad, it's just that they can't do what a PPU can. You actually have to own a PPU and try these games with it to see how much more immersed you get. It's not just destroying whole buildings or blasting through walls to cut through a linear level; it's heat waves, weather, trajectories, blood and collisions, all happening at once. A PPU can handle around 5,000 physics calculations a second, and multicore CPUs can't do it to that extent.

I get around 10 fps more in UT3 with a PPU than without, plus you get the heavily Ageia-modded servers with a tornado ripping through and tearing up metal. I honestly would never give mine up.
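
To put the offloading idea in concrete terms, here's a minimal C++ sketch (made-up types and numbers, not Ageia's actual SDK): the physics step runs on its own worker thread so the main game loop never stalls on it, which is the same division of labour a PPU gives you in dedicated hardware.

// Minimal sketch (made-up types, not Ageia's SDK): the physics step runs on its
// own worker so the main game loop never blocks on it.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// A toy "physics step": integrate velocities and positions under gravity.
void stepPhysics(std::vector<Particle>& world, float dt) {
    for (Particle& p : world) {
        p.vy -= 9.81f * dt;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

int main() {
    std::vector<Particle> world(5000);   // roughly the object count quoted above
    std::atomic<bool> running{true};

    // Dedicated physics worker, stepping at ~60 Hz independently of rendering.
    std::thread physicsWorker([&world, &running] {
        while (running.load()) {
            stepPhysics(world, 1.0f / 60.0f);
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    });

    // Stand-in for the render/game loop (about two seconds of "frames").
    for (int frame = 0; frame < 120; ++frame)
        std::this_thread::sleep_for(std::chrono::milliseconds(16));

    running.store(false);
    physicsWorker.join();
    std::printf("simulated %zu particles on a dedicated worker\n", world.size());
    return 0;
}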
#6 karasill
Member since 2007 • 3155 Posts

[QUOTE="Jd1680a"]With multicore CPUs and soon there will be dual GPU cores, it doesnt many any sense to get a physics chip. My only guess with Nvidia buying Ageia was to decrease a potential competitive revival in the graphic card market. Ageia would have the technical ability to switch over to graphic card market.9mmSpliff


Multicor eCPUs can not produce the physics calculation a dedicated PPU can do. The reason for the PPU is to take the stresss off the CPU and let the PPU do the work. Crysis and HL2 are the best games to show off Havok CPU Physics and they dont even compare to UT3, GRAW 2, Warmonger, CellFactor and some of the other games.

Im not saying their bad, its just they cant do what they can. You acutally have to own a PPU and try these games with it to see what how much more immersed you get. Its not just destroying full buildings, or blasting through walls to cut through a linear level. Its about heat waves, weather, trajectory, blood, collision and doing it all at once. A PPU can handle around 5000 physics calculations a second. Where multicore CPUs cant do it to that extent.

I get around 10fps more in UT3 with a PPU then without, plus you get the heavy AGeia modded servers with a tornado ripping through and tearing metal. I honestly would never give mine up.

A CPU is very much capable of delivering the same physics experience as a PPU. The only reason why we haven't seen more advanced physics in non Ageia supported games is that Devs can't assume everyone has a dual core or quad core CPU, so they have to compromise obviously.

Give it a few more years when quad cores become the standard and octo-cores are also in the hands of the puclic, you'll be seeing some amazing physics without the need of a PPU. Not to mention Graphics cards can also aid in physics calculations.

Also, have you seen the physics in Alan Wake? Look up the demonstration of Intel showing off their quad core cpu's, it's very impressive and you'll soon realize that if games take advantage of quad core CPU's for Physics you don't need a PPU.
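
For what it's worth, spreading the work across cores is straightforward in principle. The sketch below is generic C++ (not tied to any engine or to Intel's demo); it splits one physics integration step over however many hardware threads the CPU reports.

// Minimal sketch (generic code, no particular engine): split one physics
// integration step across every core the CPU reports.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Body { float x, y, z, vx, vy, vz; };

// Integrate the slice [begin, end) under gravity for dt seconds.
void integrateSlice(std::vector<Body>& world, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        Body& b = world[i];
        b.vy -= 9.81f * dt;
        b.x += b.vx * dt;
        b.y += b.vy * dt;
        b.z += b.vz * dt;
    }
}

int main() {
    std::vector<Body> world(100000);                       // arbitrary object count
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    // One worker per core; each owns a disjoint slice, so no locking is needed.
    std::vector<std::thread> workers;
    std::size_t chunk = world.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end   = (c + 1 == cores) ? world.size() : begin + chunk;
        workers.emplace_back(integrateSlice, std::ref(world), begin, end, 1.0f / 60.0f);
    }
    for (std::thread& t : workers) t.join();

    std::printf("stepped %zu bodies on %u cores\n", world.size(), cores);
    return 0;
}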

#7 codezer0
Member since 2004 • 15898 Posts
This could give Ageia the exposure and assistance it sorely needed to get into the market on good footing. It could also mean a future NVIDIA graphics card with an integrated PPU for accelerated physics processing. Let's not kid ourselves: Havok has done a fine job with physics in computer gaming for many years, but in software alone it simply cannot handle the full scope of physics calculations nearly as efficiently as a dedicated PPU with supporting software can. And UT3 (and any future game built on UE3) goes to show that you can get a performance increase from PPU silicon in your system.
#8 9mmSpliff
Member since 2005 • 21751 Posts

[QUOTE="9mmSpliff"][QUOTE="Jd1680a"]With multicore CPUs and soon there will be dual GPU cores, it doesnt many any sense to get a physics chip. My only guess with Nvidia buying Ageia was to decrease a potential competitive revival in the graphic card market. Ageia would have the technical ability to switch over to graphic card market.karasill



Multicor eCPUs can not produce the physics calculation a dedicated PPU can do. The reason for the PPU is to take the stresss off the CPU and let the PPU do the work. Crysis and HL2 are the best games to show off Havok CPU Physics and they dont even compare to UT3, GRAW 2, Warmonger, CellFactor and some of the other games.

Im not saying their bad, its just they cant do what they can. You acutally have to own a PPU and try these games with it to see what how much more immersed you get. Its not just destroying full buildings, or blasting through walls to cut through a linear level. Its about heat waves, weather, trajectory, blood, collision and doing it all at once. A PPU can handle around 5000 physics calculations a second. Where multicore CPUs cant do it to that extent.

I get around 10fps more in UT3 with a PPU then without, plus you get the heavy AGeia modded servers with a tornado ripping through and tearing metal. I honestly would never give mine up.

A CPU is very much capable of delivering the same physics experience as a PPU. The only reason why we haven't seen more advanced physics in non Ageia supported games is that Devs can't assume everyone has a dual core or quad core CPU, so they have to compromise obviously.

Give it a few more years when quad cores become the standard and octo-cores are also in the hands of the puclic, you'll be seeing some amazing physics without the need of a PPU. Not to mention Graphics cards can also aid in physics calculations.

Also, have you seen the physics in Alan Wake? Look up the demonstration of Intel showing off their quad core cpu's, it's very impressive and you'll soon realize that if games take advantage of quad core CPU's for Physics you don't need a PPU.



If I had the article from Ageia. They have proven that a Dual Core CPU can not handle the physics calculations a PPU can. Just dual cores, cause that is all that was out at the time. It was from when they launched the PPU. The Dual Core CPU got crushed in physics calculations. Yes AlanWake has great physics, but they still cant compete with Ageia. You are not an Ageia owner, so you do not know how intense the physics are in games like. Crumblimg 7 storey buildings, etc. The GPU also was proven through Agiea to not be able to handle the same phsyics a PPU can. The GPU portion was shown to only be able to provide cosmetic physics. More blood, heat waves, blowing bushes, fires, small collisions. This was all in the same article. Havok tried to do what Ageia has done and failed, which is why Quantam physics will not go through with nVidia from Havok.

Im not knocking Havok, theyre the grandfather of Physics and rock at it. But they couldnt do the PPU. I personally love the physics in Source, they classic, same with Crysis. But you can not compare them to GRAW 2 or UT3. The Unreal 3 engine is fully Ageia Licensed, so that was the best hting they could have done.
#9 karasill
Member since 2007 • 3155 Posts
[QUOTE="karasill"]

[QUOTE="9mmSpliff"][QUOTE="Jd1680a"]With multicore CPUs and soon there will be dual GPU cores, it doesnt many any sense to get a physics chip. My only guess with Nvidia buying Ageia was to decrease a potential competitive revival in the graphic card market. Ageia would have the technical ability to switch over to graphic card market.9mmSpliff



Multicor eCPUs can not produce the physics calculation a dedicated PPU can do. The reason for the PPU is to take the stresss off the CPU and let the PPU do the work. Crysis and HL2 are the best games to show off Havok CPU Physics and they dont even compare to UT3, GRAW 2, Warmonger, CellFactor and some of the other games.

Im not saying their bad, its just they cant do what they can. You acutally have to own a PPU and try these games with it to see what how much more immersed you get. Its not just destroying full buildings, or blasting through walls to cut through a linear level. Its about heat waves, weather, trajectory, blood, collision and doing it all at once. A PPU can handle around 5000 physics calculations a second. Where multicore CPUs cant do it to that extent.

I get around 10fps more in UT3 with a PPU then without, plus you get the heavy AGeia modded servers with a tornado ripping through and tearing metal. I honestly would never give mine up.

A CPU is very much capable of delivering the same physics experience as a PPU. The only reason why we haven't seen more advanced physics in non Ageia supported games is that Devs can't assume everyone has a dual core or quad core CPU, so they have to compromise obviously.

Give it a few more years when quad cores become the standard and octo-cores are also in the hands of the puclic, you'll be seeing some amazing physics without the need of a PPU. Not to mention Graphics cards can also aid in physics calculations.

Also, have you seen the physics in Alan Wake? Look up the demonstration of Intel showing off their quad core cpu's, it's very impressive and you'll soon realize that if games take advantage of quad core CPU's for Physics you don't need a PPU.



If I had the article from Ageia. They have proven that a Dual Core CPU can not handle the physics calculations a PPU can. Just dual cores, cause that is all that was out at the time. It was from when they launched the PPU. The Dual Core CPU got crushed in physics calculations. Yes AlanWake has great physics, but they still cant compete with Ageia. You are not an Ageia owner, so you do not know how intense the physics are in games like. Crumblimg 7 storey buildings, etc. The GPU also was proven through Agiea to not be able to handle the same phsyics a PPU can. The GPU portion was shown to only be able to provide cosmetic physics. More blood, heat waves, blowing bushes, fires, small collisions. This was all in the same article. Havok tried to do what Ageia has done and failed, which is why Quantam physics will not go through with nVidia from Havok.

Im not knocking Havok, theyre the grandfather of Physics and rock at it. But they couldnt do the PPU. I personally love the physics in Source, they ****c, same with Crysis. But you can not compare them to GRAW 2 or UT3. The Unreal 3 engine is fully Ageia Licensed, so that was the best hting they could have done.

I hate the idea of having to buy a seperate card just for enhanced physics. This is something that CPU's should be doing. As CPU's get faster and programing gets a little more advanced there is no reason not to see the same level of physics as what a PPU can provide. All a PPU is, is a processor that specializes in physics calculations and takes some of the load away from the CPU.

However I just now saw this and it turns out a CPU is very capable of doing the same level of physics calculations as a PPU. The only thing a PPU does is increase performance by a bit, it's not like a CPU is unable to render the same level of physics. However this example for instance shows a dual CPU vs a PPU and the CPU actually ran the physics demo faster then the PPU. However the cost was that the CPU had to have one of it's cores work at full load to beat it. http://www.youtube.com/watch?v=a7hIuytdRKw&feature=related

Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.

#10 codezer0
Member since 2004 • 15898 Posts

[QUOTE="karasill"]Like I said, the PPU won't be needed as quad-core CPUs get faster and into more people's hands. Give it 2-3 years... The PPU is a great idea, but I don't see it going anywhere.[/QUOTE]
And that's why you're sitting where you are, making the money you make, instead of in a job where a single paycheck is what you currently make in a year.
#11 karasill
Member since 2007 • 3155 Posts
[QUOTE="karasill"]

Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.

codezer0

And that's why you are sitting where you are making the amount of money you make instead of in a job making what you currently make in a year in a single paycheck.

I own a business, I make more money in one month then you in a year. Please quit trying to start a dumb argument with me.

And WTF does your post have to do with what I said? You really love to start s*** don't you? :roll:

#12 codezer0
Member since 2004 • 15898 Posts
[QUOTE="codezer0"][QUOTE="karasill"]

Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.

karasill

And that's why you are sitting where you are making the amount of money you make instead of in a job making what you currently make in a year in a single paycheck.

I own a business, I make more money in one month then you in a year. Please quit trying to start a dumb argument with me.

And WTF does your post have to do with what I said? You really love to start s*** don't you? :roll:

You? Own a company? Oh god, I hope you don't do your own accounting. :lol: The reality is that even the most efficiently coded real-time physics APIs and software will be limited in scope compared with a piece of silicon whose sole purpose is physics. Remember the GPU? Graphics used to be rendered purely by the CPU too, and back then people also argued that a separate piece of silicon for video was unnecessary. Now you can't even get a desktop computer to boot without one, and the few machines that still do are rack-mounted server blades that receive their .ini and config files and quietly crunch away in some basement corner.

Even the basic PhysX PPU that's out now can accurately process data for thousands more objects (and their interactions with the environment, the players and each other) far faster than even the most ridiculously overclocked quad-core could, even if you could dedicate all four cores to the job (highly unlikely). What this move does is give NVIDIA a way to work Ageia's tech into a GPU with an integrated PPU, so gamers who want advanced physics could buy a high-end, PPU-integrated model, enjoy the eye candy, and offload the physics work onto it, leaving the CPU to handle the A.I. (which it handles better than any other component in the computer) and the background and menial work the OS gives it.

To actually be in the business of making games and pushing the technology envelope, you have to be a bit more forward-thinking than your typical Timmy gamer who's a slave to his Wintendo box.
#13 karasill
Member since 2007 • 3155 Posts
[QUOTE="karasill"][QUOTE="codezer0"][QUOTE="karasill"]

Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.

codezer0

And that's why you are sitting where you are making the amount of money you make instead of in a job making what you currently make in a year in a single paycheck.

I own a business, I make more money in one month then you in a year. Please quit trying to start a dumb argument with me.

And WTF does your post have to do with what I said? You really love to start s*** don't you? :roll:

You? own a company? Oh god, I hope you don't do your own accounting. :lol: The reality is that even the most efficient coding of real-time physics API's and software is simply going to be limited in scope to what a piece of silicon whose sole purpose isphysics is capable of doing. You know the GPU? Used to be that stuff was rendered purely by the CPU as well, and people also reasoned back then that a separate piece of silicon for video was unnecessary. Now you can't even get a desktop computer to boot up without one, and the few that do anymore are server blades meant to be rack-mounted, receive their .ini and config files and quietly crunch away in some basement corner somewhere. Even the basic PhysX PPU that is out now is able to accurately process data for thousands more objects (and the interactions of said objects with the environment, the player(s) and each other) far faster and more accurately than even the most ridiculously overclocked quad-core would be able to if you could dedicate all four cores to the job (highly unlikely). What this move does is it grants NVIDIA a way to work in Ageia's tech to creating or integrating a GPU with a PPU, so that gamers who would want the advanced physics could then buy a high-end, PPU-integrated model, and not only enjoy good eye candy, but also offload the physics work onto it, leaving the CPU to handle the A.I. elements (to which it is unpeered at handling over any other component in a computer), and the background/menial stuff the OS tasks it to do. In order to actually be in the business of making games and/or pushing the technology envelope, you do have to be a bit more forward thinking than your typical Timmy gamer who's a slave to their Wintendo box to actually make the games happen.

Yes, please try to insult me; it's only to be expected from someone like you who can't even show some common courtesy. Maybe the act of it is just too hard for you. For all I know I'm arguing with someone with a mental disability.

The day I have to buy a PPU just to play games is the day I quit PC gaming. I suppose we should just ignore what we've seen with the likes of Alan Wake and what Intel was showing last year, huh? I wasn't talking about current quad-core CPUs; I was talking about future CPUs that would be at least four times faster than the best currently out there. By your way of thinking, I suppose we should also have a dedicated AI processor, a dedicated HDR processor, etc., just to run a game.

Maybe we should ignore all CPU-based physics because it obviously can't handle the complex offerings of the almighty PPU, despite the link I provided. But hey, let's ignore the truth, which is that a CPU is fully capable of calculating physics. A PPU is a nice idea and it does take some workload off the CPU, but it doesn't necessarily provide physics that are any more advanced or realistic-looking; a CPU can do all of that as well (maybe you didn't look at the link I provided).

#14 codezer0
Member since 2004 • 15898 Posts
I never said Havok was worthless. Quit **** putting words in my mouth. :evil: I'm saying that the reality is that software-based APIs still have to contend with an otherwise very occupied processor. Even while you're running the game, the CPU has to keep track of everything the OS wants it to do: interrupts from hardware and software wanting attention, the antivirus and firewall programs you should be running if you're using Windows, communication with drivers for every device connected to the machine, the I/O for USB devices like your keyboard, mouse and headset, and so on.

There's no need for a dedicated A.I. processor; by your logic, the CPU already is one, and the ALU that has been part of CPU architectures since before the Intel 8086 is proof of that. By the same token there's no need for an HDR accelerator, because HDR is a byproduct of graphics work, and the GPU has already proven it handles that brilliantly.

I'm also looking forward to Alan Wake, and for more than purely technical reasons; I admire what Remedy has shown is possible so far with Havok FX and the like. At the same time, I believe the game could run just as well, if not a whole lot better, if it had a way to talk to a PPU. Given what they say is involved in the game's physics, you could easily make do with a dual-core, or possibly get away with a fast enough single-core CPU, if Havok would try (or were allowed) to make their API run on Ageia's PPU. At the moment, though, they choose not to. And now Intel has snatched them up to help with their Larrabee project.
#15 karasill
Member since 2007 • 3155 Posts

[QUOTE="codezer0"]I never said Havok was worthless. Quit **** putting words in my mouth. :evil: I'm saying that the reality is that software-based APIs still have to contend with an otherwise very occupied processor. Even while you're running the game, the CPU has to keep track of everything the OS wants it to do: interrupts from hardware and software wanting attention, the antivirus and firewall programs you should be running if you're using Windows, communication with drivers for every device connected to the machine, the I/O for USB devices like your keyboard, mouse and headset, and so on.

There's no need for a dedicated A.I. processor; by your logic, the CPU already is one, and the ALU that has been part of CPU architectures since before the Intel 8086 is proof of that. By the same token there's no need for an HDR accelerator, because HDR is a byproduct of graphics work, and the GPU has already proven it handles that brilliantly.

I'm also looking forward to Alan Wake, and for more than purely technical reasons; I admire what Remedy has shown is possible so far with Havok FX and the like. At the same time, I believe the game could run just as well, if not a whole lot better, if it had a way to talk to a PPU. Given what they say is involved in the game's physics, you could easily make do with a dual-core, or possibly get away with a fast enough single-core CPU, if Havok would try (or were allowed) to make their API run on Ageia's PPU. At the moment, though, they choose not to. And now Intel has snatched them up to help with their Larrabee project.[/QUOTE]
My computer's CPU usage is about 1-5 percent with Windows background apps, anti-virus and so forth running. Yeah, the CPU is really occupied. :roll: Even when running Crysis my CPU usage never goes above 60% per core, so there are a lot of untapped resources right there. Again, you don't need a PPU to have impressive physics when the CPU is perfectly capable of supplying the same experience.

Have fun with the few games that actually support the PPU. BTW, I've seen what it can do, and yes, the CellFactor demo was impressive, but I saw nothing in it that couldn't have been done on a CPU, especially after the tornado portion of the Alan Wake demo.
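
For anyone who wants to check that kind of number themselves rather than eyeballing Task Manager, here's a minimal Windows C++ sketch (generic code; the one-second sample window is an arbitrary choice) that measures overall CPU utilization with GetSystemTimes.

// Minimal sketch: sample overall CPU utilization on Windows with GetSystemTimes,
// roughly the number Task Manager shows. Not tied to any particular game.
#include <windows.h>
#include <cstdio>

static unsigned long long toU64(const FILETIME& ft) {
    ULARGE_INTEGER v;
    v.LowPart  = ft.dwLowDateTime;
    v.HighPart = ft.dwHighDateTime;
    return v.QuadPart;
}

int main() {
    FILETIME idleA, kernelA, userA, idleB, kernelB, userB;
    GetSystemTimes(&idleA, &kernelA, &userA);
    Sleep(1000);                                    // sample over one second
    GetSystemTimes(&idleB, &kernelB, &userB);

    unsigned long long idle   = toU64(idleB)   - toU64(idleA);
    unsigned long long kernel = toU64(kernelB) - toU64(kernelA);  // includes idle time
    unsigned long long user   = toU64(userB)   - toU64(userA);
    unsigned long long busy   = (kernel - idle) + user;
    unsigned long long total  = kernel + user;

    std::printf("CPU utilization: %.1f%%\n", total ? 100.0 * busy / total : 0.0);
    return 0;
}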

#16 Nkemjo
Member since 2005 • 585 Posts

[QUOTE="codezer0"]I never said Havok was worthless. Quit **** putting words in my mouth. :evil: I'm saying that the reality is that software-based APIs still have to contend with an otherwise very occupied processor. Even while you're running the game, the CPU has to keep track of everything the OS wants it to do: interrupts from hardware and software wanting attention, the antivirus and firewall programs you should be running if you're using Windows, communication with drivers for every device connected to the machine, the I/O for USB devices like your keyboard, mouse and headset, and so on.

There's no need for a dedicated A.I. processor; by your logic, the CPU already is one, and the ALU that has been part of CPU architectures since before the Intel 8086 is proof of that. By the same token there's no need for an HDR accelerator, because HDR is a byproduct of graphics work, and the GPU has already proven it handles that brilliantly.

I'm also looking forward to Alan Wake, and for more than purely technical reasons; I admire what Remedy has shown is possible so far with Havok FX and the like. At the same time, I believe the game could run just as well, if not a whole lot better, if it had a way to talk to a PPU. Given what they say is involved in the game's physics, you could easily make do with a dual-core, or possibly get away with a fast enough single-core CPU, if Havok would try (or were allowed) to make their API run on Ageia's PPU. At the moment, though, they choose not to. And now Intel has snatched them up to help with their Larrabee project.[/QUOTE]

No offense, but you're coming off as a bit too aggro. Why did you start insulting him personally?

#17 codezer0
Member since 2004 • 15898 Posts

[QUOTE="Nkemjo"]No offense, but you're coming off as a bit too aggro. Why did you start insulting him personally?[/QUOTE]
I know this **** from before. This isn't the first time he's come off as either completely ignorant or flat-out wrong, and he's repeatedly proven himself to be the kind of person who likes to get kicked in the head a few times before any lesson sinks in.
#18 Thinker_145
Member since 2007 • 2546 Posts
[QUOTE="codezer0"][QUOTE="karasill"][QUOTE="codezer0"][QUOTE="karasill"]

Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.

karasill

And that's why you are sitting where you are making the amount of money you make instead of in a job making what you currently make in a year in a single paycheck.

I own a business, I make more money in one month then you in a year. Please quit trying to start a dumb argument with me.

And WTF does your post have to do with what I said? You really love to start s*** don't you? :roll:

You? own a company? Oh god, I hope you don't do your own accounting. :lol: The reality is that even the most efficient coding of real-time physics API's and software is simply going to be limited in scope to what a piece of silicon whose sole purpose isphysics is capable of doing. You know the GPU? Used to be that stuff was rendered purely by the CPU as well, and people also reasoned back then that a separate piece of silicon for video was unnecessary. Now you can't even get a desktop computer to boot up without one, and the few that do anymore are server blades meant to be rack-mounted, receive their .ini and config files and quietly crunch away in some basement corner somewhere. Even the basic PhysX PPU that is out now is able to accurately process data for thousands more objects (and the interactions of said objects with the environment, the player(s) and each other) far faster and more accurately than even the most ridiculously overclocked quad-core would be able to if you could dedicate all four cores to the job (highly unlikely). What this move does is it grants NVIDIA a way to work in Ageia's tech to creating or integrating a GPU with a PPU, so that gamers who would want the advanced physics could then buy a high-end, PPU-integrated model, and not only enjoy good eye candy, but also offload the physics work onto it, leaving the CPU to handle the A.I. elements (to which it is unpeered at handling over any other component in a computer), and the background/menial stuff the OS tasks it to do. In order to actually be in the business of making games and/or pushing the technology envelope, you do have to be a bit more forward thinking than your typical Timmy gamer who's a slave to their Wintendo box to actually make the games happen.

Yes, please try to insult me, it's only expected from someone like you who can't even show some common courtesy, maybe the act of it is just too hard for you to do. For all I know I'm arguing with someone with a mental disibility.

The day when I'd have to buy a PPU just to play games is the day I quit PC gaming. I suppose we should just ignore what we've seen with the likes of Alan Wake and what Intel was showing last year huh? I was not talking about current quad core cpu's, I was talking about future cpus which would be a minimum of 4 times faster then the best that's currently out there.. By your method of thinking I suppose we should also have a dedicated AI processor, dedicated HDR processor, etc just to run a game.

Maybe we should ignore all CPU based physics because they obviously can't handle the complex offerings of the all mighty PPU despite the link I provided. But hey, let's ignore the truth, and is that a CPU is fully capable of calculating physics. A PPU is a nice idea and it does take some workload off the cpu, but it doesn't necessarily provide physics that are more advanced or realistic looking as a CPU can do all of that as well. (maybe you didn't look at the link I provided)

Yeah, I somewhat agree with that.
#19 karasill
Member since 2007 • 3155 Posts
[QUOTE="Nkemjo"]

No offense but you're coming off as a bit too aggro. Why did you start insulting him personally?

codezer0

I know this **** from before. This isn't the first time he's come off as either completely ignorant or flat out wrong. and repeatedly has proven himself that he's the kind of person that likes to get kicked in the head a few times before any lesson will sink in.

No, you've always been a jerk from the get-go, and not just to me but to most people. You don't know the meaning of being nice, and no, I've proven you wrong many times. Don't get confused. ;)

BTW, you didn't reply to my post. Maybe that's because you're the one who was wrong to say my CPU would be very occupied and that a CPU is incapable of advanced physics. It's almost as if you're ignoring the CPU-based physics in Alan Wake because they're as impressive as anything in CellFactor.

Maybe you feel the need to defend the PPU because you own one and don't want to feel like you made a bad call in purchasing it. Is the PPU really worth the few games and maps that actually support it? I think not.

#22 codezer0
Member since 2004 • 15898 Posts
[QUOTE="karasill"]I didn't bother reading anything past the first few sentences because you're wasting my time....[/QUOTE]
Your words, not mine. Why should I show you any courtesy or respect when you continue to delude yourself in the same way you accuse me of doing? Pot. Kettle. Black.
#23 karasill
Member since 2007 • 3155 Posts

You know, codezer0, I'm tired of arguing with you. You are a thorn in my side. How about you don't quote anything I say, I don't quote anything you say, and we just leave each other alone? After today I'm just going to ignore everything you write. I don't log onto this forum just to argue, especially with thick-headed people who defend their purchase as if it were their own flesh and blood.

If you love the PPU, that's cool, but even you should see that today's CPUs (not to mention CPUs 2-4 years from now) are powerful enough to run advanced physics without the aid of a PPU if the game is coded correctly.

#24 G013M
Member since 2006 • 6424 Posts
[QUOTE="codezer0"]DirectX 10 cannot natively handle DX9 (or earlier) software calls, and needs an emulation wrapper to be able to process them at all. And if your game is OpenGL, and the display drivers did it the way Microsoft wanted, the OpenGL code would have to be put in a wrapper to be passed to the DX9 wrapper, which then must be retranslated again for DirectX 10. And this is before the game is even running![/QUOTE]

DirectX 9 doesn't get emulated into DX10 calls. That's why there's a WDDM version of DirectX 9 on all Vista installs, and also why you can download and install the traditional DX9 runtime from the Microsoft Download site.

There's no performance decrease through emulation for running DX9 apps.

And there's no emulation for OpenGL either; it's handled exactly the same way as in XP. I got this from the OpenGL website. Just like in XP, if the drivers don't provide an ICD then it's all done in software, but it's up to the video card manufacturers to supply the ICD to run OpenGL in hardware.

"OpenGL hardware acceleration is handled in exactly the same way in Windows XP and Windows Vista - through an Installable Client Driver (ICD) provided by graphics card manufacturers. "

http://www.opengl.org/pipeline/article/vol003_9/
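
An easy way to check which path a machine is actually on is to query the renderer string from a live GL context. The sketch below is plain Win32/WGL (no particular game or engine assumed; link against opengl32, gdi32 and user32); if it prints "GDI Generic", no vendor ICD is loaded and OpenGL is falling back to Microsoft's software renderer, which would explain that kind of slowdown.

// Minimal sketch: create a throwaway WGL context and print the renderer string.
// "GDI Generic" means no vendor ICD is installed and OpenGL runs in software.
#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

int main() {
    // A hidden window is enough; we only need a device context to bind GL to.
    WNDCLASSA wc = {};
    wc.lpfnWndProc   = DefWindowProcA;
    wc.hInstance     = GetModuleHandleA(nullptr);
    wc.lpszClassName = "gl_probe";
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("gl_probe", "probe", WS_OVERLAPPEDWINDOW,
                              0, 0, 64, 64, nullptr, nullptr, wc.hInstance, nullptr);
    HDC hdc = GetDC(hwnd);

    // Ask for a basic RGBA pixel format that supports OpenGL.
    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);

    HGLRC ctx = wglCreateContext(hdc);
    wglMakeCurrent(hdc, ctx);

    std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));

    wglMakeCurrent(nullptr, nullptr);
    wglDeleteContext(ctx);
    ReleaseDC(hwnd, hdc);
    DestroyWindow(hwnd);
    return 0;
}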

#25 codezer0
Member since 2004 • 15898 Posts
I don't own a PPU yet. You again ASSume that I purchased one. No; I'm simply waiting until I can snap one up for $100 or less (without an MIR to deal with), or, if possible, for NVIDIA to ship a graphics card that includes a PhysX PPU on the PCB. Why should we have to wait 2-4 years (as you say) for these kinds of advanced physics? If an extra piece of silicon can grant that ability now, or shorten the wait significantly, why not take advantage?
#26 karasill
Member since 2007 • 3155 Posts
[QUOTE="codezer0"]I don't own a PPU yet. You again ASSume that I purchased one. No; I'm simply waiting until I can snap one up for $100 or less (without an MIR to deal with), or, if possible, for NVIDIA to ship a graphics card that includes a PhysX PPU on the PCB. Why should we have to wait 2-4 years (as you say) for these kinds of advanced physics? If an extra piece of silicon can grant that ability now, or shorten the wait significantly, why not take advantage?[/QUOTE]
CPUs are very capable of doing this now. I'm just further clarifying that, as time goes on, the need for a PPU diminishes greatly. Do I have to keep bringing up the Alan Wake demo? Did you even watch the tornado scene? It was about as impressive as the CellFactor demo, and I noticed zero impact on the framerate.
#27 codezer0
Member since 2004 • 15898 Posts

[QUOTE="G013M"]And there's no emulation for OpenGL either; it's handled exactly the same way as in XP. I got this from the OpenGL website. Just like in XP, if the drivers don't provide an ICD then it's all done in software, but it's up to the video card manufacturers to supply the ICD to run OpenGL in hardware.

"OpenGL hardware acceleration is handled in exactly the same way in Windows XP and Windows Vista - through an Installable Client Driver (ICD) provided by graphics card manufacturers."

http://www.opengl.org/pipeline/article/vol003_9/[/QUOTE]

Then why is it that every OpenGL game I've tried on Vista-based machines so far has either run like ass or not at all, when they ran like butter on Windows XP and even 2000? If it's implemented the same way, that doesn't make any sense. :?
#28 G013M
Member since 2006 • 6424 Posts
[QUOTE="G013M"]

And there's no emulation for OpenGL either, it's handled exactly the same way as it is in XP. I got this from the OpenGL website. Just like in XP if there isn't an ICD provided in the drivers, then it's all done in software, but it's up to the Video Card manufacturers to provide the ICD to do OpenGL in hardware.

"OpenGL hardware acceleration is handled in exactly the same way in Windows XP and Windows Vista - through an Installable Client Driver (ICD) provided by graphics card manufacturers. "

http://www.opengl.org/pipeline/article/vol003_9/

codezer0

Then why is it that every OpenGL game I've tried on Vista-based machines thus far have either run like ass or not at all, where they ran like butter on Windows XP and even 2000? If it's implemented the same way, that doesn't make any sense. :?

It's probably an Nvidia problem. Just as SLI support under Vista isn't the greatest, OpenGL probably has its problems at the moment as well.

I'm just speculating here, but the implementation for XP and Vista is pretty much the same.

#29 codezer0
Member since 2004 • 15898 Posts

[QUOTE="G013M"]It's probably an Nvidia problem. Just as SLI support under Vista isn't the greatest, OpenGL probably has its problems at the moment as well.

I'm just speculating here, but the implementation for XP and Vista is pretty much the same.[/QUOTE]
It wasn't my own machine, but a system similar to mine belonging to one of my classmates. Then again, it could come down to drivers: my playability in the Crysis SP demo improved significantly just from going from the 162 to the 169 drivers. I stand corrected, at least as far as how OpenGL is handled in Vista. It still doesn't explain the performance discrepancy, especially for a v.N+1 change.