This topic is locked from further discussion.

#1 Fou4
Member since 2008 • 43 Posts

Yes, I've been meaning to post this for a while, but with nVidia's purchase of AGEIA, will nVidia make some "adjustments" to their PhysX line?


#2 DGFreak
Member since 2003 • 2234 Posts
There will be no more "PhysX" line. nVidia will incorporate that technology into their video cards... or something.

#3 Fou4
Member since 2008 • 43 Posts

[QUOTE="DGFreak"]There will be no more "PhysX" line. nVidia will incorporate that technology into their video cards... or something.[/QUOTE]

iight fo sho


#4 TrooperManaic
Member since 2004 • 3863 Posts

[QUOTE="Fou4"][QUOTE="DGFreak"]There will be no more "PhysX" line. nVidia will incorporate that technology into their video cards... or something.[/QUOTE]

iight fo sho[/QUOTE]

word

#5 Fou4
Member since 2008 • 43 Posts
lol

#6 Baselerd
Member since 2003 • 5104 Posts
basically anyone who bought physx cards is getting the shaft.

#7 TrooperManaic
Member since 2004 • 3863 Posts
[QUOTE="Baselerd"]basically anyone who bought physx cards is getting the shaft.[/QUOTE]
oooohh $%!@ just got real!!

#8 codezer0
Member since 2004 • 15898 Posts
NVIDIA has stated they have no interest in making discrete PPUs at this time. There will still be drivers available for those who did buy the PhysX PPU, but for the moment, NV is more intent on integrating PhysX API processing into its video cards, much like it had initially planned to do with Havok before intel purchased them (Havok, not NV).

#9 IQT786
Member since 2005 • 2604 Posts
Don't the 8800 series already support this? It says so on my 8800GT box :?

#10 -CheeseEater-
Member since 2007 • 5258 Posts
[QUOTE="IQT786"]don't the 8800 series already use this it says on my 8800gt box :?[/QUOTE]
'fraid not.

#11 LordEC911
Member since 2004 • 9972 Posts
Actually all G80+ GPUs should eventually be able to be used as PhysX processors, which Nvidia recently stated in an interview.

#12 codezer0
Member since 2004 • 15898 Posts
[QUOTE="IQT786"]don't the 8800 series already use this it says on my 8800gt box :?[/QUOTE]
It was planned that GeForce 8 cards would be usable for physics acceleration. Problem is, Havok backed out, because their physics implementation simply could not cover the sheer breadth and scope that AGEIA's PhysX API could. But by the time Havok backed out, NV had already committed the design to silicon.

#13 Sistem_42
Member since 2005 • 372 Posts
Maybe nVidia wants to gain an advantage over ATi again. With nVidia holding PhysX and Intel holding Havok, ATi will really be left with a "good" physics solution, too.

#14 codezer0
Member since 2004 • 15898 Posts
[QUOTE="Sistem_42"]maybe the nvidia wants to take a advantage form atia again. with nvidia physx and intel havok the ati will have a "good" physic solution to.[/QUOTE]
It would be in NVIDIA's best interest to either license to or collaborate with ATi to share PhysX GPU acceleration, because it then gives them both a competitive advantage over intel. Right now, an intel next-gen CPU plus an NVIDIA GPU could accelerate both Havok and PhysX, and if ATi/AMD doesn't join up, it won't have either. But if DAAMiT joins with NVIDIA, suddenly there are a lot more people who will have hardware acceleration for PhysX, which makes intel's decision to purchase Havok look all the more stupid. Such a move would also help rebuild the friendly alliance that AMD and NVIDIA shared before they (AMD) purchased ATi.

#15 Sistem_42
Member since 2005 • 372 Posts

Yes, that makes sense, but there is still nVidia's "the way it's meant to be played" attitude to consider. You know what I mean, right?

On one hand, if they do everything right, they could triumph over both ATi and Intel, but that is a lot of risk and they won't take it. It remains to be seen whether nVidia really means to join with ATi, or whether they will "join" in the nVidia "the way it's meant to be played" way (holding most of the apples for themselves).


#16 Munkyman587
Member since 2003 • 2007 Posts
Meh, I thought it was a cool idea to have a separate card for physics. Spreading the load, keeping it off the CPU, seems like a good plan. Since the GPU already gets bogged down by a lot of current games, I don't know if adding even more to its load will be helpful? /shrug

#17 Baselerd
Member since 2003 • 5104 Posts

[QUOTE="Munkyman587"]Meh, I thought it was a cool idea to have a seperate card for physics. Spreading the load, keeping it off the cpu, seems like a good plan. Since the GPU gets bogged down by a lot of the current games, I don't know if adding even more to its load will be helpful? /shrug[/QUOTE]

True, but I would imagine that NVIDIA will use this primarily to sell more multi-GPU configurations. That way, you could buy one GPU for graphics and then add another in SLI for physics (and maybe a little more graphics if there is some overhead left).
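The load-splitting idea discussed in this thread — running physics on a separate processor (a PPU or a second GPU) so the main render path isn't blocked — can be sketched in miniature with a worker thread. This is an illustrative toy, not the actual PhysX API; the `integrate` function and the two-particle setup are made up for the example:

```python
import threading

def integrate(positions, velocities, dt, gravity=-9.8):
    """One naive Euler step for a set of falling particles."""
    for i in range(len(positions)):
        velocities[i] += gravity * dt
        positions[i] += velocities[i] * dt

positions = [100.0, 50.0]
velocities = [0.0, 0.0]

# Hand the physics step to a worker (standing in for a PPU or second GPU),
# leaving the main thread free in the meantime.
worker = threading.Thread(target=integrate, args=(positions, velocities, 1.0 / 60))
worker.start()
# ... main thread would issue draw calls for the previous frame here ...
worker.join()  # sync before drawing the updated state

print(positions[0] < 100.0 and positions[1] < 50.0)  # True: both particles fell
```

The point of the pattern is the overlap: while the worker advances the simulation, the main thread keeps rendering, and the `join` is the synchronization point before the new state is drawn — the same producer/consumer shape whether the "worker" is a thread, a PPU, or a second GPU in SLI.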