GeForce 8 graphics processors to gain PhysX support

This topic is locked from further discussion.


#1 --Ryu--
Member since 2008 • 232 Posts

Article

During its fourth-quarter financial results conference call, Nvidia shed a little more light on its acquisition of Ageia and what it plans to do with the firm's PhysX technology. Nvidia CEO Jen-Hsun Huang made no announcements regarding the deal until asked in the question-and-answer session, but he was happy to divulge a decent number of details.

Source: http://techreport.com/discussions.x/14147

Huang revealed that Nvidia's strategy is to take the PhysX engine and port it onto CUDA. For those not in the know, CUDA stands for Compute Unified Device Architecture, and it's a C-like application programming interface Nvidia developed to let programmers write general-purpose applications that can run on GPUs. All of Nvidia's existing GeForce 8 graphics processors already support CUDA, and Huang confirmed that the cards will be able to run PhysX.
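
For readers wondering what "runs CUDA" looks like in practice, here is a minimal, hypothetical sketch of a CUDA program: a C-like kernel launched across thousands of GPU threads. This is illustrative only, not Ageia or Nvidia code; it just scales an array on the GPU.

[code]
// Minimal, hypothetical CUDA sketch: a C-like kernel launched across
// thousands of GPU threads. Not Ageia/Nvidia code; just illustrative.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread scales one array element; the GPU runs many threads at once.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;                 // about a million elements
    float *d_data;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    // Launch enough 256-thread blocks to cover all n elements.
    scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);
    cudaDeviceSynchronize();

    printf("kernel status: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(d_data);
    return 0;
}
[/code]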

We're working toward the physics-engine-to-CUDA port as we speak. And we intend to throw a lot of resources at it. You know, I wouldn't be surprised if it helps our GPU sales even in advance of [the port's completion]. The reason is, [it's] just gonna be a software download. Every single GPU that is CUDA-enabled will be able to run the physics engine when it comes. . . . Every one of our GeForce 8-series GPUs runs CUDA.

Huang thinks the integration will encourage people to spend more on graphics processing hardware, as well:

Our expectation is that this is gonna encourage people to buy even better GPUs. It might—and probably will—encourage people to buy a second GPU for their SLI slot. And for the highest-end gamer, it will encourage them to buy three GPUs. Potentially two for graphics and one for physics, or one for graphics and two for physics.

Last, but not least, Huang said developers are "really excited" about the PhysX-to-CUDA port. "Finally they're able to get a physics engine accelerated into a very large population of gamers," he explained. Huang was unwilling to get into a time frame for the release of the first PhysX port. However, considering this will be purely a software implementation and Nvidia now has Ageia engineers on its payroll, the port may not take too long to complete.


#2 dayaccus007
Member since 2007 • 4349 Posts
That collaboration between Nvidia and PhysX will be very important for the future of video card technology. Soon Nvidia will crush ATI

#3 Indestructible2
Member since 2007 • 5935 Posts
Guess I should get an XFX 8800GS XXX after all for my Lanbox PC :P

#4 artur79
Member since 2005 • 4679 Posts

I don't get it: is PhysX going to be an integrated part of future cards? Can you software-upgrade an 8800 GTX to be PhysX compatible? There is very little info in that article.

It seems that you have to buy an extra GPU to run physics. I don't understand why this would be better than just buying an Ageia PhysX card. Graphics cards are the most expensive part of a PC nowadays; I don't see people spending $400 on physics when octa-cores are around the corner.

Edit: the only positive thing about this is that if you upgrade your GPU, you can download that CUDA software and use your old 8800 card as a PhysX card. I think...
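
For what it's worth, CUDA does let a program see every CUDA-capable card in the machine and pick which one to run on, so the old-8800-as-physics-card idea is at least plausible. A hedged sketch, assuming a modern CUDA runtime; the "last device does physics" policy here is invented purely for illustration:

[code]
// Hypothetical sketch: enumerate CUDA devices and dedicate a secondary
// card to physics work, leaving device 0 for rendering. The policy of
// "last device does physics" is an assumption for illustration only.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        printf("device %d: %s (%d multiprocessors)\n",
               d, prop.name, prop.multiProcessorCount);
    }

    if (count > 1)
        cudaSetDevice(count - 1);   // run physics kernels on the spare card
    else
        printf("only one GPU: physics would share it with rendering\n");
    return 0;
}
[/code]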


#5 solid_mario
Member since 2005 • 3144 Posts
It seems that you have to buy an extra GPU to run physics.
artur79
That's the same drift that I get from the article as well. I hate to crush Nvidia's sweet dreams, but I'm not going to buy a second video card just for physics...

#6 mfsa
Member since 2007 • 3328 Posts

That collaboration between Nvidia and PhysX will be very important for the future of video card technology. Soon Nvidia will crush ATI
dayaccus007

That could well be a bad thing, though. Advances thrive on competition; if one company moves so far ahead that it can't be touched, its incentive to continue advancing at the same speed diminishes, which leads to a general decline. Though I suppose many people may consider that a good thing, since they won't need to upgrade their GPUs quite as often.


#7 solid_mario
Member since 2005 • 3144 Posts

[QUOTE="dayaccus007"]That colaboration between Nvidia and PhysX will very important for the future of video card technology. Soon Nvidia will crush ATImfsa

That could well be a bad thing, though. Advances thrive on competition; if one company moves so far ahead that it can't be touched, its incentive to continue advancing at the same speed diminishes, which leads to a general decline. Though I suppose many people may consider that a good thing, since they won't need to upgrade their GPUs quite as often.

Would a decline be so bad? From reading people's posts, graphics were fine about two years ago and people don't see themselves upgrading. I'd like them to keep going, though. I can't wait to play a movie :P

#8 gs_gear
Member since 2006 • 3237 Posts

I don't get it: is PhysX going to be an integrated part of future cards? Can you software-upgrade an 8800 GTX to be PhysX compatible? There is very little info in that article.

It seems that you have to buy an extra GPU to run physics. I don't understand why this would be better than just buying an Ageia PhysX card. Graphics cards are the most expensive part of a PC nowadays; I don't see people spending $400 on physics when octa-cores are around the corner.

Edit: the only positive thing about this is that if you upgrade your GPU, you can download that CUDA software and use your old 8800 card as a PhysX card. I think...

artur79

They're saying that all you have to do is download some software, something as easy as updating your drivers. They talked about this even before buying Ageia (I didn't know they had until now), about using a second card for physics, but I think they said they'd use the Havok engine as software or something like that.

Well, if you could just do a software update and then run physics in games, with a second card being only optional, then it would be great. I mean, I hope it won't be something like "yes, you can run physics on your 8800, but games will run like crap if you don't have a second card", because then it would be, like you said, just like buying a physics card.

So yeah, what use will be left for eight-core CPUs? :| Will you use them only to run the so-called "A.I." in games?


#9 dnuggs40
Member since 2003 • 10484 Posts

That collaboration between Nvidia and PhysX will be very important for the future of video card technology. Soon Nvidia will crush ATI
dayaccus007

I hope not.

While I have had more Nvidia cards than ATI, the competition between the two is good for consumers.


#10 Colonel_Cool
Member since 2006 • 1335 Posts
While this is definitely pretty cool, I don't ever plan on buying another 8800 just for physics processing. Isn't additional physics processing unnecessary when multicore processors can donate a whole CPU core to the job?

#11 pminooei
Member since 2003 • 1076 Posts

OK, this isn't good news or bad news, as no game today even uses this much physics :P I mean, who in their right mind would buy three 8800 Ultras and use two for physics and one for graphics? lol :P Games today hardly even use physics; the most demanding physics game today can run smoothly on one graphics card and a regular dual core lol :P Crysis in all its glory doesn't require this much physics power lol ;P


#12 osan0
Member since 2004 • 18263 Posts

Sweet. I'm interested to see AMD's response.

I'm even more interested to see what Intel will be doing. There are whispers that a great power is rising in the south, and it's coming to sweep away the peoples of Nvidia and AMD (i.e., it looks like Intel will be taking the GPU market more seriously and competing with high-end Nvidia and ATI chips... just in case there was any confusion about the above :P)


#13 teebeenz
Member since 2006 • 4362 Posts
So Nvidia is putting GPUs on mobos and then expects us to own two additional cards? Yeah, I don't think so.

#14 Swiftstrike5
Member since 2005 • 6950 Posts
PhysX is used in what, two games? Both made by Ageia to showcase their card. Maybe we will see more uses for it in the future, for more realistic debris or something. Besides, CPUs are perfectly capable of doing physics processing, and moving that processing onto the GPU would probably make the GPU run slower (unless I'm mistaken and there's a processor on the GPU that isn't being used).

#15 DJGOON
Member since 2005 • 603 Posts

Nowhere does it say you have to buy another graphics card. What they are saying is that GeForce 8 series graphics cards will be able to run PhysX (well, a port of it). If your card is already being used at maximum capacity, then yeah, it's not really going to work well unless you get another card. However, if your card's utilization is less than 100%, then PhysX could run to some degree on the one card.

I think it's a great move. At the moment there are three roads we could go down: a solely dedicated physics card, using the extra cores on a CPU, or using the GPU.

The problem with the first solution is that it's yet another completely different chipset that developers have to look after, and it makes the industry even more complicated. The second option is valid, but a CPU is not as good as the GPU at physics processing. At least with the third, if a game does not use physics much, the cards will be dedicated to graphics, so it's not like they're sitting there doing nothing (and the GPU is really great for doing physics calculations fast).
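
To illustrate that last point: game physics is often thousands of independent little calculations, one per particle or piece of debris, which is exactly the shape of work a GPU eats up, one thread per object. A hedged sketch with invented names and a toy Euler step, not any real engine's code:

[code]
// Hypothetical sketch of why physics maps well onto a GPU: each of
// thousands of debris particles gets its own thread. Toy Euler step
// with gravity and a ground-plane bounce; not real engine code.
#include <cuda_runtime.h>

__global__ void stepParticles(float3 *pos, float3 *vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    vel[i].y -= 9.8f * dt;          // gravity
    pos[i].x += vel[i].x * dt;      // integrate position
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;

    if (pos[i].y < 0.0f) {          // hit the ground plane
        pos[i].y = 0.0f;
        vel[i].y *= -0.5f;          // bounce, losing half the speed
    }
}

int main()
{
    const int n = 100000;
    float3 *pos, *vel;
    cudaMalloc(&pos, n * sizeof(float3));
    cudaMalloc(&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    // One 60 Hz frame's worth of physics for 100,000 particles.
    stepParticles<<<(n + 255) / 256, 256>>>(pos, vel, 1.0f / 60.0f, n);
    cudaDeviceSynchronize();

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
[/code]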


#16 G013M
Member since 2006 • 6424 Posts

PhysX is used in what, two games? Both made by Ageia to showcase their card. Maybe we will see more uses for it in the future, for more realistic debris or something. Besides, CPUs are perfectly capable of doing physics processing, and moving that processing onto the GPU would probably make the GPU run slower (unless I'm mistaken and there's a processor on the GPU that isn't being used).
Swiftstrike5

Don't forget that UT3 also features PhysX support.

And they'd probably dedicate an entire GPU to just doing physics calculations.

"Our expectation is that this is gonna encourage people to buy even better GPUs. It might—and probably will—encourage people to buy a second GPU for their SLI slot. And for the highest-end gamer, it will encourage them to buy three GPUs. Potentially two for graphics and one for physics, or one for graphics and two for physics."

Also, just to note for next time, this would be better suited for the Hardware forum.


#17 Swiftstrike5
Member since 2005 • 6950 Posts

[QUOTE="Swiftstrike5"]PhysX is used in what? 2 games? both made by Ageia to showcase their card. Maybe we will see more uses for it in the future for more realistic debris or something. Although, CPUs are perfectly capable of doing physics processing and moving that processing onto the GPU would probably make the GPU run slower (unless I'm mistaken and they have a processor on GPU that isn't being used).G013M

Don't forget that UT3 also features PhysX support.

And they'd probably dedicate an entire GPU to just do Physics calculations.

"Our expectation is that this is gonna encourage people to buy even better GPUs. It might—and probably will—encourage people to buy a second GPU for their SLI slot. And for the highest-end gamer, it will encourage them to buy three GPUs. Potentially two for graphics and one for physics, or one for graphics and two for physics."

I don't know where Nvidia is going with this. People buy graphics cards to render games. The only way I would buy a GPU for physics is if it had a processor dedicated to processing physics... and even then it costs more money, which I would rather spend on rendering power. I believe the GPU is running at 100% all the time; that is why you see more FPS when there's not as much on the screen and fewer when there's more (V-Sync would be the only thing keeping the GPU from running at 100%). So you would have to render less in order to process physics, which is not what most people want.


#18 OoSuperMarioO
Member since 2005 • 6539 Posts
I think this was a great move from Nvidia.

#19 DJGOON
Member since 2005 • 603 Posts
[QUOTE="G013M"]

[QUOTE="Swiftstrike5"]PhysX is used in what? 2 games? both made by Ageia to showcase their card. Maybe we will see more uses for it in the future for more realistic debris or something. Although, CPUs are perfectly capable of doing physics processing and moving that processing onto the GPU would probably make the GPU run slower (unless I'm mistaken and they have a processor on GPU that isn't being used).Swiftstrike5

Don't forget that UT3 also features PhysX support.

And they'd probably dedicate an entire GPU to just doing physics calculations.

"Our expectation is that this is gonna encourage people to buy even better GPUs. It might—and probably will—encourage people to buy a second GPU for their SLI slot. And for the highest-end gamer, it will encourage them to buy three GPUs. Potentially two for graphics and one for physics, or one for graphics and two for physics."

I don't know where Nvidia is going with this. People buy graphics cards to render games. The only way I would buy a GPU for physics is if it had a processor dedicated to processing physics... and even then it costs more money, which I would rather spend on rendering power. I believe the GPU is running at 100% all the time; that is why you see more FPS when there's not as much on the screen and fewer when there's more (V-Sync would be the only thing keeping the GPU from running at 100%). So you would have to render less in order to process physics, which is not what most people want.

Yes, with one GPU it's a trade-off between physics and graphics (depending on whether it's a GPU- or CPU-bottlenecked game). However, I prefer gameplay > graphics, and I see physics as a gameplay component, so I would be OK with sacrificing graphics quality. That being said, if you have a quad core, then using one or two cores for physics would be a better move (even though, as I said before, GPU > CPU for physics).


#20 justforlotr2004
Member since 2004 • 10935 Posts

PhysX is used in what, two games? Both made by Ageia to showcase their card. Maybe we will see more uses for it in the future, for more realistic debris or something. Besides, CPUs are perfectly capable of doing physics processing, and moving that processing onto the GPU would probably make the GPU run slower (unless I'm mistaken and there's a processor on the GPU that isn't being used).
Swiftstrike5

That's the whole point of putting it on the GPUs: you don't have to waste money on a card that only does physics. Now the 8-series Nvidia cards will all be capable of doing what the Ageia cards do while at the same time serving as the GPU. With that, more developers will start creating games that incorporate the tech. The tech lets people with SLI decide if they want to use one card for physics.


#21 Swiftstrike5
Member since 2005 • 6950 Posts

[QUOTE="Swiftstrike5"]PhysX is used in what? 2 games? both made by Ageia to showcase their card. Maybe we will see more uses for it in the future for more realistic debris or something. Although, CPUs are perfectly capable of doing physics processing and moving that processing onto the GPU would probably make the GPU run slower (unless I'm mistaken and they have a processor on GPU that isn't being used).justforlotr2004

That's the whole point of putting it on the GPUs: you don't have to waste money on a card that only does physics. Now the 8-series Nvidia cards will all be capable of doing what the Ageia cards do while at the same time serving as the GPU. With that, more developers will start creating games that incorporate the tech. The tech lets people with SLI decide if they want to use one card for physics.

So would you want your $300 GPU computing ONLY physics (it could switch, but it can only run physics or render, from what I've read)? I don't know many games where you wouldn't use both cards. You'd pretty much have to scale back on the AA/resolution so that your GPU doesn't have to render as much (so that you can disable SLI to compute physics). Personally, I hate Ageia. They put code into tech demos that causes poor performance if you don't have an Ageia card (someone removed some lines of code and Ageia games ran almost 300% better). They have almost as bad a reputation as EA, imo.


#22 biggest_loser
Member since 2007 • 24508 Posts
What does PhysX actually do?

#23 G013M
Member since 2006 • 6424 Posts

What does PhysX actually do?
biggest_loser

Pretty much, it's a physics accelerator, in the same way a GeForce or a Radeon is a graphics accelerator: specially designed to just do physics, and as such faster at physics than the CPU.

The only problem is that games need to be coded to use it, which hasn't really been the case.


#24 biggest_loser
Member since 2007 • 24508 Posts

Do you know any games coming out that would actually use it?

Since it speeds up the rate of physics and so on, would it increase performance and frame rates?


#25 G013M
Member since 2006 • 6424 Posts

Well, UT3 can use it, and it's more a case of it allowing smoother performance (so you won't hit dips in FPS when there are a lot of physics events happening) and more extravagant physics than what is available at the moment with CPU-driven physics.

I'm not too sure about future games, though.
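
Mechanically, that smoothing could come from the physics work being issued asynchronously, so a burst of physics events overlaps with the rest of the frame instead of stalling it. A hedged sketch using a CUDA stream; all names here are invented, and this is not how UT3 or PhysX actually schedules its work:

[code]
// Hypothetical sketch: launch physics work in its own CUDA stream so it
// overlaps with other per-frame work instead of stalling it. Invented
// names; not how PhysX actually schedules its kernels.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void resolveImpulses(float *impulses, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        impulses[i] *= 0.9f;        // stand-in for real contact resolution
}

int main()
{
    const int n = 1 << 18;
    float *impulses;
    cudaMalloc(&impulses, n * sizeof(float));
    cudaMemset(impulses, 0, n * sizeof(float));

    cudaStream_t physics;
    cudaStreamCreate(&physics);

    // The launch returns immediately; the CPU can keep preparing the
    // rest of the frame while the GPU chews through the physics burst.
    resolveImpulses<<<(n + 255) / 256, 256, 0, physics>>>(impulses, n);

    // ... other per-frame work would be issued here ...

    cudaStreamSynchronize(physics); // wait before using the results
    printf("status: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaStreamDestroy(physics);
    cudaFree(impulses);
    return 0;
}
[/code]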


#26 Daytona_178
Member since 2005 • 14962 Posts

[QUOTE="dayaccus007"]That colaboration between Nvidia and PhysX will very important for the future of video card technology. Soon Nvidia will crush ATIdnuggs40

I hope not.

While I have had more Nvidia cards then Ati, the competition between the two is good for consumers.

Agreed! Let's hope ATI gets its hands on some similar technology as well :)


#27 biggest_loser
Member since 2007 • 24508 Posts
Alan Wake will probably use it.

#28 Krall
Member since 2002 • 16463 Posts
More of a hardware topic so I moved it here :)

#29 Taiko88
Member since 2004 • 1854 Posts
This is so funny, because another guy made a thread asking whether or not to get the PhysX PCI card, and based on three people's replies he is now probably getting it, not knowing that he just wasted his money. Hahaha, sorry, it's funny.

#30 RayvinAzn
Member since 2004 • 12552 Posts

Crysis in all its glory doesn't require this much physics power lol ;P

pminooei

Crysis in all its glory has terrible physics compared to its graphics.

Seriously, game physics and AI are still in the bleeding stone age. I think the peasants in Populous might have been smarter than the AI squadmates I've had in some more recent shooting games. And we've still got clipping - how sad is that? A major graphics problem from the original Quake game still haunts us to this day. As for physics, I have yet to see a game that really impressed me in terms of physics - sure, things have improved, but they're still leagues behind graphics. Things like this really need to be fixed, and PhysX is a step in the right direction, and the method Nvidia is implementing is also a step in the right direction. After all, it's hard to justify a PPU. But quite a lot of gamers have old graphics cards lying around (and mark my words, someday the 8800 cards will be old cards lying around) and wouldn't mind putting them to good use.