http://www.nvidia.com/object/io_1202161567170.html
If they can come up with something good this could really hurt AMD/ATI, especially when you add in the Intel/Havok buyout.
With multicore CPUs, and dual-core GPUs on the way, it doesn't make any sense to get a physics chip. My only guess is that Nvidia bought Ageia to head off a potential competitive revival in the graphics card market. Ageia would have the technical ability to move into the graphics card market.Jd1680a
[QUOTE="Jd1680a"]With multicore CPUs and soon there will be dual GPU cores, it doesnt many any sense to get a physics chip. My only guess with Nvidia buying Ageia was to decrease a potential competitive revival in the graphic card market. Ageia would have the technical ability to switch over to graphic card market.9mmSpliff
Give it a few more years, when quad cores become the standard and octo-cores are also in the hands of the public, and you'll be seeing some amazing physics without the need for a PPU. Not to mention graphics cards can also aid in physics calculations.
Also, have you seen the physics in Alan Wake? Look up the demonstration of Intel showing off their quad-core CPUs; it's very impressive, and you'll soon realize that if games take advantage of quad-core CPUs for physics, you don't need a PPU.
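To put the multi-core argument in concrete terms, here's a minimal sketch of splitting one physics step across however many cores the CPU exposes. This is an illustration only: the Body struct, the gravity-only integration, and the body count are made up, not taken from any engine.

[code]
// Illustration: split one physics step across all available cores.
// Body layout, gravity-only integration and counts are made up.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Body { float x, y, z, vx, vy, vz; };

// Integrate one contiguous slice of bodies; each worker owns its slice,
// so no locking is needed for this (collision-free) step.
void integrateSlice(std::vector<Body>& bodies, std::size_t begin,
                    std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].vy -= 9.81f * dt;   // gravity
        bodies[i].x  += bodies[i].vx * dt;
        bodies[i].y  += bodies[i].vy * dt;
        bodies[i].z  += bodies[i].vz * dt;
    }
}

void physicsStep(std::vector<Body>& bodies, float dt) {
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;        // the call may report 0 if unknown
    std::size_t chunk = bodies.size() / cores + 1;
    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end   = std::min(begin + chunk, bodies.size());
        if (begin >= end) break;
        workers.emplace_back(integrateSlice, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers) w.join(); // wait for every slice to finish
}

int main() {
    std::vector<Body> bodies(10000);  // zero-initialized
    for (int frame = 0; frame < 600; ++frame)
        physicsStep(bodies, 1.0f / 60.0f);
}
[/code]

Each core integrates its own slice of the array, so a quad core gets through roughly four times the work per frame; a real engine also has to handle collisions and contacts, which don't split up this cleanly.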
[QUOTE="9mmSpliff"][QUOTE="Jd1680a"]With multicore CPUs and soon there will be dual GPU cores, it doesnt many any sense to get a physics chip. My only guess with Nvidia buying Ageia was to decrease a potential competitive revival in the graphic card market. Ageia would have the technical ability to switch over to graphic card market.karasill
I hate the idea of having to buy a separate card just for enhanced physics. This is something that CPUs should be doing. As CPUs get faster and programming gets a little more advanced, there is no reason not to see the same level of physics as what a PPU can provide. All a PPU is, is a processor that specializes in physics calculations and takes some of the load away from the CPU.
However, I just now saw this, and it turns out a CPU is very capable of doing the same level of physics calculations as a PPU. The only thing a PPU does is increase performance by a bit; it's not like a CPU is unable to render the same level of physics. This example, for instance, shows a dual-core CPU vs. a PPU, and the CPU actually ran the physics demo faster than the PPU. The cost was that the CPU had to have one of its cores working at full load to beat it. http://www.youtube.com/watch?v=a7hIuytdRKw&feature=related
Like I said, the PPU won't be needed as quad-core CPUs become faster and get into more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see it going anywhere.
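For what it's worth, the trade-off in that video (one core pinned at full load on physics so the rest of the machine stays free) looks roughly like this in code. A hedged sketch only: stepPhysics and renderFrame are placeholder stand-ins, not anything from the demo in the link.

[code]
// Sketch: dedicate one core to a fixed-timestep physics loop while the
// main thread renders. stepPhysics/renderFrame are placeholders.
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

void stepPhysics(float /*dt*/) { /* placeholder: advance the simulation */ }
void renderFrame() {             // placeholder: draw the current state
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
}

void physicsLoop() {
    const auto step = std::chrono::milliseconds(16);   // ~60 Hz
    auto next = std::chrono::steady_clock::now();
    while (running.load()) {
        stepPhysics(0.016f);     // this is the core that sits near full load
        next += step;
        std::this_thread::sleep_until(next);
    }
}

int main() {
    std::thread physics(physicsLoop);        // "spend" one core on physics
    for (int frame = 0; frame < 300; ++frame)
        renderFrame();                       // main thread free for graphics/AI
    running.store(false);
    physics.join();
}
[/code]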
And that's why you are sitting where you are, making the amount of money you make, instead of in a job where a single paycheck is what you currently make in a year.Like I said, the PPU won't be needed as quad-core CPUs become faster and get into more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see it going anywhere.
karasill
[QUOTE="karasill"]And that's why you are sitting where you are making the amount of money you make instead of in a job making what you currently make in a year in a single paycheck. I own a business, I make more money in one month then you in a year. Please quit trying to start a dumb argument with me.Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.
codezer0
And WTF does your post have to do with what I said? You really love to start s*** don't you? :roll:
[QUOTE="codezer0"][QUOTE="karasill"]And that's why you are sitting where you are making the amount of money you make instead of in a job making what you currently make in a year in a single paycheck. I own a business, I make more money in one month then you in a year. Please quit trying to start a dumb argument with me.Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.
karasill
And WTF does your post have to do with what I said? You really love to start s*** don't you? :roll:
You? Own a company? Oh god, I hope you don't do your own accounting. :lol:
The reality is that even the most efficient coding of real-time physics APIs and software is simply going to be limited in scope compared to what a piece of silicon whose sole purpose is physics is capable of doing. You know the GPU? That stuff used to be rendered purely by the CPU as well, and people also reasoned back then that a separate piece of silicon for video was unnecessary. Now you can't even get a desktop computer to boot up without one, and the few that do anymore are server blades meant to be rack-mounted, receive their .ini and config files, and quietly crunch away in some basement corner somewhere.
Even the basic PhysX PPU that is out now is able to accurately process data for thousands more objects (and the interactions of those objects with the environment, the player(s) and each other) far faster and more accurately than even the most ridiculously overclocked quad-core would be able to, even if you could dedicate all four cores to the job (highly unlikely).
What this move does is grant NVIDIA a way to work Ageia's tech into creating or integrating a GPU with a PPU, so that gamers who want the advanced physics could buy a high-end, PPU-integrated model and not only enjoy good eye candy but also offload the physics work onto it, leaving the CPU to handle the A.I. elements (which it is unmatched at handling, over any other component in a computer) and the background/menial stuff the OS tasks it to do.
To actually be in the business of making games and/or pushing the technology envelope, you have to be a bit more forward-thinking than your typical Timmy gamer who's a slave to his Wintendo box.
codezer0
Yes, please try to insult me; it's only expected from someone like you, who can't even show some common courtesy. Maybe the act of it is just too hard for you to do. For all I know I'm arguing with someone with a mental disability. The day I'd have to buy a PPU just to play games is the day I quit PC gaming. I suppose we should just ignore what we've seen with the likes of Alan Wake and what Intel was showing last year, huh? I was not talking about current quad-core CPUs; I was talking about future CPUs, which would be a minimum of four times faster than the best that's currently out there. By your method of thinking, I suppose we should also have a dedicated AI processor, a dedicated HDR processor, etc., just to run a game.
Maybe we should ignore all CPU-based physics because they obviously can't handle the complex offerings of the almighty PPU, despite the link I provided. But hey, let's ignore the truth, which is that a CPU is fully capable of calculating physics. A PPU is a nice idea, and it does take some workload off the CPU, but it doesn't necessarily provide physics that are more advanced or realistic-looking, as a CPU can do all of that as well. (Maybe you didn't look at the link I provided.)
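It's easy to sanity-check the claim that a CPU can push thousands of objects: time a naive integration loop. The numbers below are made up for illustration, and a real engine adds collision detection and constraint solving on top, but the raw integration cost is tiny.

[code]
// Quick sanity check: time a naive integration of 50,000 bodies.
// Counts and math are illustrative, not from any real engine.
#include <chrono>
#include <cstdio>
#include <vector>

struct Body { float y = 100.0f, vy = 0.0f; };

int main() {
    std::vector<Body> bodies(50000);
    const float dt = 1.0f / 60.0f;

    auto t0 = std::chrono::steady_clock::now();
    for (int frame = 0; frame < 600; ++frame) {        // 10 simulated seconds
        for (auto& b : bodies) {
            b.vy -= 9.81f * dt;                        // gravity
            b.y  += b.vy * dt;
            if (b.y < 0.0f) { b.y = 0.0f; b.vy = -0.5f * b.vy; }  // bounce
        }
    }
    auto t1 = std::chrono::steady_clock::now();
    double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
    std::printf("600 frames x 50000 bodies: %.1f ms (%.3f ms/frame)\n",
                ms, ms / 600.0);
}
[/code]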
I never said HavoK was worthless. Quit **** putting words in my mouth. :evil: I'm saying that the reality is that software-based APIs still have to contend with the fact that they'll be dealing with an otherwise very occupied processor. Even as you're running the game, the CPU has to keep track of everything the OS wants it to do, interrupts from hardware devices and software wanting attention, the antivirus and firewall programs you should be running if you're using Windows, communicating with drivers for all devices connected to the machine, performing the I/O ops for USB devices like your keyboard, mouse, headset, etc. The list goes on. There is no need for a dedicated A.I. processor; by your logic, the CPU already is one - the ALU that has been part of CPU architectures since before the Intel 8086 is proof of this. By the same token, there's no need for an HDR accelerator, because that's a by-product of graphical elements, and the GPU has already proven its role in doing those brilliantly. I also am looking forward to Alan Wake, and for more than purely technical reasons at that. I admire what Remedy has shown off thus far with their usage of HavoK FX and the like. At the same time, though, I do believe this game could run just as well, if not a whole lot better, if it had a way to communicate with a PPU. Given what they say is involved in the physics element of the game, you could easily make do with a dual-core, or possibly get away with a fast enough single-core CPU, if HavoK would try (or were allowed) to code their API to run on Ageia's PPU. At the moment, though, they choose not to. And now Intel has snatched them up to help with their Larrabee project.codezer0My computer's CPU usage is about 1-5 percent with Windows background apps, anti-virus and so forth running. Yeah, the CPU is very occupied. :roll: Even when running Crysis my CPU usage never goes above 60% per core, so there are a lot of untapped resources right there. Again, you don't need a PPU to have some impressive physics when the CPU is very capable of supplying the same experience.
Have fun with the few games that actually support the PPU. BTW, I've seen what it can do, and yes, the CellFactor demo was impressive, but at the same time I saw nothing that couldn't have been done on a CPU, especially after the tornado portion of the Alan Wake demo.
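For anyone who wants to check the "untapped resources" figure on their own box, here's a hedged sketch using the Win32 GetSystemTimes call. Note it reports system-wide load, not the per-core numbers a tool like Task Manager shows.

[code]
// Windows-only sketch: sample overall CPU load over one second.
#include <windows.h>
#include <cstdio>

static unsigned long long toU64(const FILETIME& ft) {
    return (static_cast<unsigned long long>(ft.dwHighDateTime) << 32)
         | ft.dwLowDateTime;
}

int main() {
    FILETIME idle0, kern0, user0, idle1, kern1, user1;
    GetSystemTimes(&idle0, &kern0, &user0);
    Sleep(1000);                              // sample over one second
    GetSystemTimes(&idle1, &kern1, &user1);

    // Kernel time includes idle time, so busy = total - idle.
    unsigned long long idle  = toU64(idle1) - toU64(idle0);
    unsigned long long total = (toU64(kern1) - toU64(kern0))
                             + (toU64(user1) - toU64(user0));
    if (total > 0)
        std::printf("CPU busy: %.1f%%\n", 100.0 * (total - idle) / total);
}
[/code]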
No offense but you're coming off as a bit too aggro. Why did you start insulting him personally?
I know this **** from before. This isn't the first time he's come off as either completely ignorant or flat-out wrong, and he has repeatedly proven he's the kind of person who needs to get kicked in the head a few times before any lesson will sink in.
Nkemjo
[QUOTE="codezer0"][QUOTE="karasill"][QUOTE="codezer0"][QUOTE="karasill"]And that's why you are sitting where you are making the amount of money you make instead of in a job making what you currently make in a year in a single paycheck. I own a business, I make more money in one month then you in a year. Please quit trying to start a dumb argument with me.Like I said, the PPU won't be needed as quad core cpu's become faster and in more people's hands. Give it 2-3 years... The PPU is a great idea but I don't see going anywhere.
karasill
And WTF does your post have to do with what I said? You really love to start s*** don't you? :roll:
You? own a company? Oh god, I hope you don't do your own accounting. :lol: The reality is that even the most efficient coding of real-time physics API's and software is simply going to be limited in scope to what a piece of silicon whose sole purpose isphysics is capable of doing. You know the GPU? Used to be that stuff was rendered purely by the CPU as well, and people also reasoned back then that a separate piece of silicon for video was unnecessary. Now you can't even get a desktop computer to boot up without one, and the few that do anymore are server blades meant to be rack-mounted, receive their .ini and config files and quietly crunch away in some basement corner somewhere. Even the basic PhysX PPU that is out now is able to accurately process data for thousands more objects (and the interactions of said objects with the environment, the player(s) and each other) far faster and more accurately than even the most ridiculously overclocked quad-core would be able to if you could dedicate all four cores to the job (highly unlikely). What this move does is it grants NVIDIA a way to work in Ageia's tech to creating or integrating a GPU with a PPU, so that gamers who would want the advanced physics could then buy a high-end, PPU-integrated model, and not only enjoy good eye candy, but also offload the physics work onto it, leaving the CPU to handle the A.I. elements (to which it is unpeered at handling over any other component in a computer), and the background/menial stuff the OS tasks it to do. In order to actually be in the business of making games and/or pushing the technology envelope, you do have to be a bit more forward thinking than your typical Timmy gamer who's a slave to their Wintendo box to actually make the games happen. Yes, please try to insult me, it's only expected from someone like you who can't even show some common courtesy, maybe the act of it is just too hard for you to do. For all I know I'm arguing with someone with a mental disibility.The day when I'd have to buy a PPU just to play games is the day I quit PC gaming. I suppose we should just ignore what we've seen with the likes of Alan Wake and what Intel was showing last year huh? I was not talking about current quad core cpu's, I was talking about future cpus which would be a minimum of 4 times faster then the best that's currently out there.. By your method of thinking I suppose we should also have a dedicated AI processor, dedicated HDR processor, etc just to run a game.
Maybe we should ignore all CPU based physics because they obviously can't handle the complex offerings of the all mighty PPU despite the link I provided. But hey, let's ignore the truth, and is that a CPU is fully capable of calculating physics. A PPU is a nice idea and it does take some workload off the cpu, but it doesn't necessarily provide physics that are more advanced or realistic looking as a CPU can do all of that as well. (maybe you didn't look at the link I provided)
Ya, I somewhat agree with that.[QUOTE="Nkemjo"]I know this **** from before. This isn't the first time he's come off as either completely ignorant or flat-out wrong. No, you've always been a jerk from the get-go, and not just to me, but to most people. You don't know the meaning of being nice, and no, I've proven you wrong many times. Don't get confused ;)
codezer0
BTW, you didn't reply to my post; maybe it's because you're the one who was in the wrong, saying my CPU would be very occupied and that a CPU is incapable of doing some advanced physics. Almost as if you're ignoring the CPU-based physics in Alan Wake because they are as impressive as anything in CellFactor.
Maybe you feel the need to defend the PPU because you own one and don't want to feel like you made a bad call when purchasing it. Is the PPU really worth the few games and maps that actually support it? I think not.
I didn't bother reading anything past the first few sentences because you're wasting my time....karasill
Your words, not mine. Why should I show you any courtesy or respect when you continue to delude yourself in the same manner you accuse me of? Pot. Kettle. Black.
You know, codezer0, I'm tired of arguing with you. You are the thorn in my side. How about you don't quote anything I say, and I don't quote anything you say, and we just leave each other alone? Because after today I'm just going to ignore everything you write. I don't log onto this forum just to argue, especially with thick-headed people who have to defend their purchase as if it were their own flesh and blood.
If you love the PPU then that's cool, but even you should see that the CPUs of today (not to mention CPUs from 2-4 years from now) are powerful enough to run advanced physics without the aid of a PPU, if the game is coded correctly.
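On the "coded correctly" point, a lot of CPU physics performance comes down to data layout. A minimal sketch (illustrative only): keeping particle fields in plain contiguous arrays (structure-of-arrays) gives the compiler a clean inner loop to auto-vectorize.

[code]
// Illustration of cache/SIMD-friendly layout: one array per field
// (structure-of-arrays) instead of one struct per particle.
#include <cstddef>
#include <vector>

struct Particles {
    std::vector<float> y, vy;     // contiguous per-field storage
};

void integrate(Particles& p, float dt) {
    const std::size_t n = p.y.size();
    for (std::size_t i = 0; i < n; ++i) {  // clean loop: compilers vectorize this
        p.vy[i] -= 9.81f * dt;
        p.y[i]  += p.vy[i] * dt;
    }
}

int main() {
    Particles p;
    p.y.assign(100000, 100.0f);   // illustrative particle count
    p.vy.assign(100000, 0.0f);
    for (int frame = 0; frame < 600; ++frame)
        integrate(p, 1.0f / 60.0f);
}
[/code]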
- DirectX 10 cannot natively handle DX9 (or earlier) software calls, and needs an emulation wrapper to be able to process them at all. And if your game is OpenGL, and the display drivers did it the way Microsoft wanted them to, OpenGL code would have to be put in a wrapper to be passed to the DX9 wrapper, which then must be retranslated again for DirectX 10. And this is before the game is even running!
codezer0
DirectX 9 doesn't get emulated into DX10 calls. That's why there's a WDDM version of DirectX 9 on all Vista installs, and also why you can download and install the traditional DX9 runtime from the Microsoft Download site.
There's no performance decrease from emulation when running DX9 apps.
And there's no emulation for OpenGL either; it's handled exactly the same way as it is in XP. I got this from the OpenGL website. Just like in XP, if there isn't an ICD provided in the drivers then it's all done in software, but it's up to the video card manufacturers to provide the ICD to do OpenGL in hardware.
"OpenGL hardware acceleration is handled in exactly the same way in Windows XP and Windows Vista - through an Installable Client Driver (ICD) provided by graphics card manufacturers. "
http://www.opengl.org/pipeline/article/vol003_9/
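As a practical aside, you can tell at runtime whether you actually got an ICD: once a context is current, Microsoft's no-ICD software fallback identifies its renderer as "GDI Generic". A sketch, assuming your windowing code has already created and bound a context:

[code]
// Sketch: detect whether OpenGL fell back to Microsoft's software
// implementation. Assumes a GL context is already current; link against
// opengl32. "GDI Generic" is the renderer string of the no-ICD fallback.
#include <GL/gl.h>
#include <cstdio>
#include <cstring>

void reportRenderer() {
    const char* renderer =
        reinterpret_cast<const char*>(glGetString(GL_RENDERER));
    if (!renderer)
        std::printf("glGetString failed - is a context current?\n");
    else if (std::strcmp(renderer, "GDI Generic") == 0)
        std::printf("No ICD: software OpenGL fallback is in use.\n");
    else
        std::printf("Hardware OpenGL via the vendor's ICD: %s\n", renderer);
}
[/code]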
I don't own a PPU yet. You again ASSume that I purchased one. No; I'm simply waiting until I can snap one up for $100 or less (without an MIR to deal with), or, if possible, for NVIDIA to provide a graphics card that includes a PhysX PPU on the PCB. Why should we have to wait 2-4 years (as you say) for these kinds of advanced physics? If an extra piece of silicon is capable of granting that ability now, or of shortening that wait significantly, why not take advantage?codezer0CPUs are very capable of doing this now. I'm just further clarifying that, as time goes on, the need for a PPU diminishes greatly. Do I have to keep bringing up the Alan Wake demo? Did you even watch the tornado scene? It was about as impressive as the CellFactor demo, and I noticed zero impact on the framerate.
Then why is it that every OpenGL game I've tried on Vista-based machines thus far has either run like ass or not run at all, where they ran like butter on Windows XP and even 2000? If it's implemented the same way, that doesn't make any sense. :?
G013M
[QUOTE="G013M"]Then why is it that every OpenGL game I've tried on Vista-based machines thus far have either run like ass or not at all, where they ran like butter on Windows XP and even 2000? If it's implemented the same way, that doesn't make any sense. :?And there's no emulation for OpenGL either, it's handled exactly the same way as it is in XP. I got this from the OpenGL website. Just like in XP if there isn't an ICD provided in the drivers, then it's all done in software, but it's up to the Video Card manufacturers to provide the ICD to do OpenGL in hardware.
"OpenGL hardware acceleration is handled in exactly the same way in Windows XP and Windows Vista - through an Installable Client Driver (ICD) provided by graphics card manufacturers. "
http://www.opengl.org/pipeline/article/vol003_9/
codezer0
It's probably an Nvidia problem. Just like SLI support under Vista isn't the greatest, OpenGL probably has its problems at the moment as well.
I'm just speculating here, but the implementation for XP and Vista is pretty much the same.
Wasn't my own machine, but it was a similar system to mine, belonging to one of my classmates. Then again, it could be a matter of how my playability in the Crysis SP demo improved significantly just from going from the 162 to the 169 device drivers. I stand corrected, at least as far as how OpenGL is handled in Vista. It still doesn't explain the performance discrepancy, especially for a v.N+1 change.
G013M