Just got some inside info

This topic is locked from further discussion.


#1 chandlerr_360
Member since 2006 • 5078 Posts

I am a member of the Nvidia PartnerForce program for my business, and I just finished an interactive webinar with the Nvidia sales rep a few minutes ago. Overall, some pretty cool stuff. I knew most of it, but there were some pretty awesome ideas/examples they threw out there. A lot of it I can't share, but I can say that they are putting heavy emphasis on the new line-up being more about non-gaming video performance. They said that they are trying to balance the "fastest, most immersive, and most entertaining gaming experience ever..." with "opening new doors to the interactive computing world...".

PicLens, BadaBOOM, and Folding@home were just a few of the programs he gave as examples. There was also mention of 10x faster (yes, 10x faster) HD video transcoding. Another great feature he mentioned is that the entire PhysX engine, instead of being dedicated to its own processing unit, is now on the GPU. This is great, and is definitely convenient for future games. He mentioned a football game coming out soon called "Backbreaker" which, instead of using the normal motion-capture method of tackle animations used by just about every football game to date, is going to use real-time, completely isolated motion. That is really cool.

He also mentioned how awesome the GT 200 series' HybridPower power-saving option is, and how the GTX 280 draws only about 20% of its full power consumption at idle.

The release of the GTX 260 was also confirmed for next Thursday, June 26th, at $399. It performs better than an SLI 9800GTX setup...and that is on ONE GPU, people...damn.

Future GTX 200 cards were also mentioned :P


#2 hofuldig
Member since 2004 • 5126 Posts
I've seen benchmarks, and from what I can tell these cards are less powerful than the 9 series. They're crap, don't buy 'em...

#3 chandlerr_360
Member since 2006 • 5078 Posts
The GTX 280 outperforms any single card on the market. I also think it might be a little overpriced, but it is still a very impressive card.

#4 blackleather223
Member since 2004 • 1569 Posts

They're new cards and haven't been run through the wringer like the past ones have. Give 'em a bit more time. Take Vista, for example: did it work right from the get-go, and did everyone just love it? No, there were a lot of bugs and things like that they had to fix, and over time it has gotten better.

The same thing goes for these cards, and probably the down-the-road ones as well. At least from what I understand, these cards are still very new. There probably aren't many games out there that will use them to their fullest potential.

To me, at least, it is way too early to say whether they are good or not. So, for my part, I'll just see how they do as time goes along. There seem to be some good ideas, but I'll wait and see what really comes.


#5 hofuldig
Member since 2004 • 5126 Posts
Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.

#6 blackleather223
Member since 2004 • 1569 Posts

Honestly, I don't even bother looking at those. If I'm interested in a card, I'll come here and see what you all think; if you say don't buy it, I'll think about not getting it, but if you say yes, then I'll get it.

As I said before it just takes time.


#7 threepac81
Member since 2003 • 3459 Posts

Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
hofuldig

What were you saying...?

http://common.ziffdavisinternet.com/util_get_image/21/0,1425,i=212563,00.gif

Courtesy of ExtremeTech.com


#8 Luminouslight
Member since 2007 • 6397 Posts

Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
hofuldig

They are the best, but the price is certainly unjustified for the performance they produce right now.


#9 kodai
Member since 2003 • 924 Posts

"I've seen benchmarks, and from what I can tell these cards are less powerful than the 9 series. They're crap, don't buy 'em..." - hofuldig

First off, you must not have read the articles that went with the pictures you were looking at. These benchmarks were made with beta drivers on Nvidia's first consumer card that is not specifically meant for gaming. This GPU is a great gamer's card, and in time the drivers will improve its performance by a mile, the same way the rest of Nvidia's lineup has in the past. But the real difference here is that this card is meant to showcase the Nvidia CUDA initiative to the consumer and developer markets. It's Nvidia's chance to come out swinging at Intel. They have been claiming that they can do a better job of replacing the CPU of mainstream computers with their GPUs than Intel and AMD can do by continuing to make their standard, general-purpose CPUs. Of course this model can't replace a Core 2 or Phenom; it's not trying to. It's meant to show off what can be done with it. Apps like Folding@home that are made to run on the 280 itself, versus the top-of-the-line Intel Extreme CPU, will show that Nvidia has the number-crunching edge by a mile.

That's what Nvidia wants people to see. Their plan is to get consumers to demand that developers use their chips. They can't do this by sticking only with the niche gaming market; however, it is that very same niche gaming market that will pave the way for this. In the consumer marketplace, we are the ones who spend the money on more "esoteric" hardware. The average user is quite happy with low-end integrated graphics and sound. Why shouldn't they be? They don't do anything that needs the power of our hardware. So what will happen here is this: we gamers will "subsidize" the CUDA platform for Nvidia. Then, over the next so many years, the average consumer will begin to demand this power and apps that use it. At least, that's the theory anyway. Intel, of course, is fighting back with "Larrabee".

AMD is simply trying to stay afloat for the next few years. They do not have anything on the drawing board that can compete with Nvidia or Intel in the latest power struggle. Instead, they are refining Phenom and trying to take the mid-range graphics market. They also do fairly well in the integrated market, so they should be fine as long as nothing goes really wrong for them.

So the real question will be this: will consumers want the easier-to-code-for power of Larrabee, or the much harder-to-code-for CUDA platform from Nvidia? Nvidia's system looks to have great performance, but is the trouble worth it? Intel's idea seems to really speed up development by using the tried-and-true x86 instruction set directly in the GPU itself; forget that extra year or so of development for a game with this method. But it may end up with lackluster performance. In the end, devs will always tend to want the faster turnover in product, so Larrabee will be of some very real interest to them. But it's we the consumers who will tell them with our cash whether it's good enough. Time will tell.

So in the end, I really don't see how you can say the 280 is "crap" when you A) don't have one, B) have not even seen a performance test with finished drivers, let alone testing of the other abilities it was developed with, and C) make recommendations based on a complete lack of knowledge of the subject at hand. Then again, that's just my opinion based on your very limited statement.


#10 chandlerr_360
Member since 2006 • 5078 Posts

Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
hofuldig

I don't know what benchmarks you are talking about; the only thing on the market right now that can beat a single GTX 280 is a Quad SLI 9800GX2 setup, but that costs more and is much harder to deal with, hardware- and software-wise. Plus, we have not even seen GTX 280 performance with refined drivers yet. The important thing to remember is that this card has a SINGLE, yes ONE (1), GPU. It is extremely impressive.


#11 death1505921
Member since 2004 • 5260 Posts

[QUOTE="hofuldig"]Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
chandlerr_360

I don't know what benchmarks you are talking about; the only thing on the market right now that can beat a single GTX 280 is a Quad SLI 9800GX2 setup, but that costs more and is much harder to deal with, hardware- and software-wise. Plus, we have not even seen GTX 280 performance with refined drivers yet. The important thing to remember is that this card has a SINGLE, yes ONE (1), GPU. It is extremely impressive.

Drivers will only give an increase of 5 FPS max. And I've seen reviews where a SINGLE 9800GX2 beats the GTX 280. Yes, it is impressive that the GTX is only one chip, but the 9800GX2 still runs on a single PCI-E slot, so I feel it is fair to compare the two.

All things considered, the GTX is more expensive than the 9800GX2 by about £150 here in the UK, with the 9800GX2 coming in at a little over £300 and the GTX at a little over £450. For that price it should be handing the 9800GX2 its ass on a silver platter in every benchmark, regardless of drivers.

I'm not disputing it's an impressive chip, but they made a bad move here. They need to refine the production methods before they start bringing out new chips like this. The only logical reason I can see for Nvidia doing this is to test their fan base.


#12 iBP_Rickochet
Member since 2008 • 163 Posts

In an attempt to put a stop to the "GTX 280 sux" argument,

3dMark Vantage GTX 280 Tri-SLI overclocked: 22337
3dMark Vantage 9800 GX2 Quad-SLI overclocked: 18113

To put this in perspective,

Dual 9600GT overclocked Vantage score: 8339
Triple 9800GTX overclocked Vantage: 16280
(The 9800GTX and 9600GT were benchmarked on a 780i with DDR2, not a 790i with DDR3 like the first two, so they're not directly comparable as just video cards.)

3x GTX 280 > everything, performance- and price-wise.

If you have the $$$, nothing beats GTX 280.

I'm working on a longer post with pictures and stuff.

P.S. So much for NDAs :) inside info, pshaw.


#13 Jamiemydearx3
Member since 2008 • 4062 Posts

We need some official GTX 2x0 vs. HD 40xx benchmarks...

I've seen SO many mixed reviews so far, ranging from

HD 4850 > GTX 260 with AA

8800 GT SLI > GTX 280

HD 4850 Crossfire >> GTX 280

HD 4850 = 9800GTX

GTX 280 > GOD

Ugh... lol, this is insane.


#14 iBP_Rickochet
Member since 2008 • 163 Posts
The Radeon 4000 series has not been launched yet. Triple GTX 280 cards will beat everything available on the market today.

#15 X360PS3AMD05
Member since 2005 • 36320 Posts
I can't wait until the 4870 crushes these cards with price and efficiency. :D

#16 death1505921
Member since 2004 • 5260 Posts

I can't wait until the 4870 crushes these cards with price and efficiency. :D
X360PS3AMD05

I hope so too. Let's just hope they don't flop like Nvidia's did. They must be feeling the pressure atm.


#17 gp556by45
Member since 2005 • 3375 Posts
[QUOTE="chandlerr_360"]

[QUOTE="hofuldig"]Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
death1505921

I don't know what benchmarks you are talking about; the only thing on the market right now that can beat a single GTX 280 is a Quad SLI 9800GX2 setup, but that costs more and is much harder to deal with, hardware- and software-wise. Plus, we have not even seen GTX 280 performance with refined drivers yet. The important thing to remember is that this card has a SINGLE, yes ONE (1), GPU. It is extremely impressive.

Drivers will only give an increase of 5 FPS max. And I've seen reviews where a SINGLE 9800GX2 beats the GTX 280. Yes, it is impressive that the GTX is only one chip, but the 9800GX2 still runs on a single PCI-E slot, so I feel it is fair to compare the two.

All things considered, the GTX is more expensive than the 9800GX2 by about £150 here in the UK, with the 9800GX2 coming in at a little over £300 and the GTX at a little over £450. For that price it should be handing the 9800GX2 its ass on a silver platter in every benchmark, regardless of drivers.

I'm not disputing it's an impressive chip, but they made a bad move here. They need to refine the production methods before they start bringing out new chips like this. The only logical reason I can see for Nvidia doing this is to test their fan base.

The amount of ignorance in the first paragraph is amazing.

#18 RayvinAzn
Member since 2004 • 12552 Posts

In an attempt to put a stop to the "GTX 280 sux" argument,

3dMark Vantage GTX 280 Tri-SLI overclocked: 22337
3dMark Vantage 9800 GX2 Quad-SLI overclocked: 18113

To put this in perspective,

Dual 9600GT overclocked Vantage score: 8339
Triple 9800GTX overclocked Vantage: 16280
(9800GTX and 9600GT benchmarked on a 780i with DDR2, not 790i with DDR3 like the first two, so they're not directly comparable as just video cards)

3x GTX 280 > everything performance and price wise.

If you have the $$$, nothing beats GTX 280.

I'm working on a longer post with pictures and stuff.

P.S. So much for NDAs :) inside info, pshaw.

iBP_Rickochet

3DMark performance is nice and all, but it's not games, and it doesn't necessarily indicate real-world performance. Yes, tri-SLI may be the most powerful graphics configuration available right now, but that's not how you win a graphics war. You must offer the best cards at ideal price points, something Nvidia pretty much threw out the window.


#19 death1505921
Member since 2004 • 5260 Posts
[QUOTE="death1505921"][QUOTE="chandlerr_360"]

[QUOTE="hofuldig"]Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
gp556by45

I don't know what benchmarks you are talking about; the only thing on the market right now that can beat a single GTX 280 is a Quad SLI 9800GX2 setup, but that costs more and is much harder to deal with, hardware- and software-wise. Plus, we have not even seen GTX 280 performance with refined drivers yet. The important thing to remember is that this card has a SINGLE, yes ONE (1), GPU. It is extremely impressive.

Drivers will only give an increase of 5 FPS max. And I've seen reviews where a SINGLE 9800GX2 beats the GTX 280. Yes, it is impressive that the GTX is only one chip, but the 9800GX2 still runs on a single PCI-E slot, so I feel it is fair to compare the two.

All things considered, the GTX is more expensive than the 9800GX2 by about £150 here in the UK, with the 9800GX2 coming in at a little over £300 and the GTX at a little over £450. For that price it should be handing the 9800GX2 its ass on a silver platter in every benchmark, regardless of drivers.

I'm not disputing it's an impressive chip, but they made a bad move here. They need to refine the production methods before they start bringing out new chips like this. The only logical reason I can see for Nvidia doing this is to test their fan base.

The amount of ignorance in the first paragraph is amazing.

Please point out to me where I have been ignorant.


#20 gp556by45
Member since 2005 • 3375 Posts
[QUOTE="gp556by45"][QUOTE="death1505921"][QUOTE="chandlerr_360"]

[QUOTE="hofuldig"]Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
death1505921

I don't know what benchmarks you are talking about; the only thing on the market right now that can beat a single GTX 280 is a Quad SLI 9800GX2 setup, but that costs more and is much harder to deal with, hardware- and software-wise. Plus, we have not even seen GTX 280 performance with refined drivers yet. The important thing to remember is that this card has a SINGLE, yes ONE (1), GPU. It is extremely impressive.

Drivers will only give an increase of 5 FPS max. And I've seen reviews where a SINGLE 9800GX2 beats the GTX 280. Yes, it is impressive that the GTX is only one chip, but the 9800GX2 still runs on a single PCI-E slot, so I feel it is fair to compare the two.

All things considered, the GTX is more expensive than the 9800GX2 by about £150 here in the UK, with the 9800GX2 coming in at a little over £300 and the GTX at a little over £450. For that price it should be handing the 9800GX2 its ass on a silver platter in every benchmark, regardless of drivers.

I'm not disputing it's an impressive chip, but they made a bad move here. They need to refine the production methods before they start bringing out new chips like this. The only logical reason I can see for Nvidia doing this is to test their fan base.

The amount of ignorance in the first paragraph is amazing.

Please point out to me where I have been ignorant.

Drivers are very important; they add more than just "5 FPS max." And you have to remember, a 9800GX2 has two GPUs, whereas the 280 has only one. Besides, I don't know what reviews you have been looking at, but in every single 3DMark review I have seen, the 280 beat the 9800 by at least 4,000 points.

#21 Jamiemydearx3
Member since 2008 • 4062 Posts
[QUOTE="death1505921"][QUOTE="gp556by45"][QUOTE="death1505921"][QUOTE="chandlerr_360"]

[QUOTE="hofuldig"]Did you guys see all the benchmarks that people were posting from a bunch of different sites? There were like 10 sites that benched the cards using games and 3DMark, and even the 280 was crap. Use the search feature here on GS and see for yourself.
gp556by45

I don't know what benchmarks you are talking about; the only thing on the market right now that can beat a single GTX 280 is a Quad SLI 9800GX2 setup, but that costs more and is much harder to deal with, hardware- and software-wise. Plus, we have not even seen GTX 280 performance with refined drivers yet. The important thing to remember is that this card has a SINGLE, yes ONE (1), GPU. It is extremely impressive.

Drivers will only give an increase of 5 FPS max. And I've seen reviews where a SINGLE 9800GX2 beats the GTX 280. Yes, it is impressive that the GTX is only one chip, but the 9800GX2 still runs on a single PCI-E slot, so I feel it is fair to compare the two.

All things considered, the GTX is more expensive than the 9800GX2 by about £150 here in the UK, with the 9800GX2 coming in at a little over £300 and the GTX at a little over £450. For that price it should be handing the 9800GX2 its ass on a silver platter in every benchmark, regardless of drivers.

I'm not disputing it's an impressive chip, but they made a bad move here. They need to refine the production methods before they start bringing out new chips like this. The only logical reason I can see for Nvidia doing this is to test their fan base.

The amount of ignorance in the first paragraph is amazing.

Please point out to me where I have been ignorant.

Drivers are very important; they add more than just "5 FPS max." And you have to remember, a 9800GX2 has two GPUs, whereas the 280 has only one. Besides, I don't know what reviews you have been looking at, but in every single 3DMark review I have seen, the 280 beat the 9800 by at least 4,000 points.

You are right. Drivers can add a great deal of performance.

#22 Mtnes
Member since 2008 • 180 Posts

Death1505921 may be wrong about the drivers, but he does seem to make a good point. More trusted sources like Tom's Hardware and AnandTech have shown that the 9800GX2 outclasses the GTX 280 in almost all games, often by a huge margin. Even the most magical of drivers would have a tough time bridging this gap, and even if one did, that would only mean the GTX 280 is on par with the 9800GX2. This in no way justifies the extra $200. The two cards, at the end of the day, each occupy a single slot, so most people won't really care what's inside the card.

Ask most people here and they would tell you they expected the GTX 280 to be nearly 40% faster than the 9800GX2. I admit this would have been extremely difficult, but is that so unreasonable to expect considering the price (which we already knew)? THAT is what SUCKS. So many people were waiting so eagerly for these cards (including myself), especially after Nvidia claimed this would blow everything else away following the general unhappiness with the 9 series, and guess what happens?


#23 broseybrose
Member since 2007 • 248 Posts

Well, I trust AnandTech's reviews more than any site other than Guru3D... and in the AnandTech GTX 280 review they benched all the popular graphical powerhouse games... and the end result was that 2x 8800GT SLI beat the GTX 280 across the board. Considering that I already have an 8800GTX, and the price at which Nvidia is going to release this card... it's pretty obvious that this card would not be a smart buy for me, or anyone else, honestly.

The 8800GTX was the last card to hit the market that impressed me enough to upgrade. The last year and a half of Nvidia and ATI releases have been micro-upgrades for macro money. I really hope ATI brings something worth considering to the table, price/performance-wise. Not that I'm even ready to upgrade my GTX yet, but just for market competition's sake. It would be awesome to see ATI dethrone Nvidia again.


#24 death1505921
Member since 2004 • 5260 Posts

I apologise for the drivers comment; it looks as if I was misinformed. But I stand by the rest of my points. Sure, it is a single GPU, but it's single slot vs. single slot.

Why the hell shouldn't they be compared? Are you honestly telling me you'd go for a lesser card just because it's "a single GPU on a single PCB" vs. "two GPUs on a single PCB"?

And if you REALLY need me to pull up benchies, then I will. But I'd trust that after one look over Google and AnandTech's reviews, you'd see what I mean.


#25 teddyrob
Member since 2004 • 4557 Posts

Well, I trust AnandTech's reviews more than any site other than Guru3D... and in the AnandTech GTX 280 review they benched all the popular graphical powerhouse games... and the end result was that 2x 8800GT SLI beat the GTX 280 across the board. Considering that I already have an 8800GTX, and the price at which Nvidia is going to release this card... it's pretty obvious that this card would not be a smart buy for me, or anyone else, honestly.

broseybrose

Yeah. I've always considered AnandTech to have a large Nvidia bias, but even they don't sing the praises of these new cards.

No directX10.1. High price. There is no way I'll pay £500 on a single card to pay games.


#26 xid32
Member since 2005 • 1132 Posts

The GTX 280 outperforms any single card on the market. I also think it might be a little overpriced, but it is still a very impressive card.
chandlerr_360

You can get a 9800GX2 with 1 GB of RAM for the price of a GTX 280. That's what I'm doing. I myself prefer Nvidia's cards over ATI's.

Although actually getting the 9800GX2 to run will be a bit pricey...


#27 chandlerr_360
Member since 2006 • 5078 Posts
[QUOTE="broseybrose"]

Well, I trust AnandTech's reviews more than any site other than Guru3D... and in the AnandTech GTX 280 review they benched all the popular graphical powerhouse games... and the end result was that 2x 8800GT SLI beat the GTX 280 across the board. Considering that I already have an 8800GTX, and the price at which Nvidia is going to release this card... it's pretty obvious that this card would not be a smart buy for me, or anyone else, honestly.

teddyrob

Yeah. I've always considered AnandTech to have a large Nvidia bias, but even they don't sing the praises of these new cards.

No DirectX 10.1. High price. There is no way I'll pay £500 on a single card to pay games.

You see, that is where the difference is. This card is being marketed more towards NON-GAMING applications. So you're not spending all of that money on a single card to "pay (play, I presume) games"; you are paying for tons of other features.

Research some of the examples that I included in the original post; it is quite amazing what this card will be able to do that the 4000 series will not. That is not to say it may not be a little overpriced, but it is not at all an "epic fail...".


#28 chandlerr_360
Member since 2006 • 5078 Posts

[QUOTE="chandlerr_360"]The GTX 280 outperforms any single card on the market. I also think it might be a little overpriced, but it is still a very impressive card.
xid32

You can get a 9800GX2 with 1 GB of RAM for the price of a GTX 280. That's what I'm doing. I myself prefer Nvidia's cards over ATI's.

Although actually getting the 9800GX2 to run will be a bit pricey...

A GTX 280 will outperform a 9800GX2, though... not to mention that it is more energy- and heat-efficient with its HybridPower mode. The drivers coming down the road are bound to raise its performance even higher, too.


#29 death1505921
Member since 2004 • 5260 Posts
[QUOTE="teddyrob"][QUOTE="broseybrose"]

Well, I trust AnandTech's reviews more than any site other than Guru3D... and in the AnandTech GTX 280 review they benched all the popular graphical powerhouse games... and the end result was that 2x 8800GT SLI beat the GTX 280 across the board. Considering that I already have an 8800GTX, and the price at which Nvidia is going to release this card... it's pretty obvious that this card would not be a smart buy for me, or anyone else, honestly.

chandlerr_360

Yeah. I've always considered AnandTech to have a large Nvidia bias, but even they don't sing the praises of these new cards.

No DirectX 10.1. High price. There is no way I'll pay £500 on a single card to pay games.

You see, that is where the difference is. This card is being marketed more towards NON-GAMING applications. So you're not spending all of that money on a single card to "pay (play, I presume) games"; you are paying for tons of other features.

Research some of the examples that I included in the original post; it is quite amazing what this card will be able to do that the 4000 series will not. That is not to say it may not be a little overpriced, but it is not at all an "epic fail...".

They may, but this is a games forum; anything besides gaming performance on a GPU is generally a moot point here. Sure, it may fold too, but not many people will put Folding@home above gaming performance, especially for an extra £100.

On the PhysX side of things, I have a feeling it's going to flop; at the moment there's not enough support for it. It's not in many games, and not on the table to be in many games either.


#30 CreasianDevaili
Member since 2005 • 4429 Posts

I am a member of the Nvidia PartnerForce program for my business, and I just finished an interactive webinar with the Nvidia sales rep a few minutes ago. Overall, some pretty cool stuff. I knew most of it, but there were some pretty awesome ideas/examples they threw out there. A lot of it I can't share, but I can say that they are putting heavy emphasis on the new line-up being more about non-gaming video performance. They said that they are trying to balance the "fastest, most immersive, and most entertaining gaming experience ever..." with "opening new doors to the interactive computing world...".

PicLens, BadaBOOM, and Folding@home were just a few of the programs he gave as examples. There was also mention of 10x faster (yes, 10x faster) HD video transcoding. Another great feature he mentioned is that the entire PhysX engine, instead of being dedicated to its own processing unit, is now on the GPU. This is great, and is definitely convenient for future games. He mentioned a football game coming out soon called "Backbreaker" which, instead of using the normal motion-capture method of tackle animations used by just about every football game to date, is going to use real-time, completely isolated motion. That is really cool.

He also mentioned how awesome the GT 200 series' HybridPower power-saving option is, and how the GTX 280 draws only about 20% of its full power consumption at idle.

The release of the GTX 260 was also confirmed for next Thursday, June 26th, at $399. It performs better than an SLI 9800GTX setup...and that is on ONE GPU, people...damn.

Future GTX 200 cards were also mentioned :P

chandlerr_360

Source: http://forums.nvidia.com/index.php?showtopic=36286

What is the maximum kernel execution time?

On Windows, individual GPU program launches have a maximum run time of around 5 seconds. Exceeding this time limit usually will cause a launch failure reported through the CUDA driver or the CUDA runtime, but in some cases can hang the entire machine, requiring a hard reset.

This is caused by the Windows "watchdog" timer that causes programs using the primary graphics adapter to time out if they run longer than the maximum allowed time.

For this reason it is recommended that CUDA is run on a GPU that is NOT attached to a display and does not have the Windows desktop extended onto it. In this case, the system must contain at least one NVIDIA GPU that serves as the primary graphics adapter.

^

Wouldn't that limit users without an SLI-capable motherboard? I.e., unless you SLI, you don't get the "non-game benefits"?

I am not sure. Just throwing a wrench into the mix here.
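For what it's worth, the watchdog behavior quoted above can be observed from host code. Here is a minimal sketch (the kernel name and iteration count are made up for illustration, and this is old-FAQ-era behavior, not a definitive description of current drivers): a kernel that runs past the Windows timeout on a display-attached GPU surfaces as an error when you synchronize, rather than completing.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical long-running kernel, just to burn GPU time.
__global__ void spin(long long iters, int *out)
{
    long long acc = 0;
    for (long long i = 0; i < iters; ++i)
        acc += i & 1;          // trivial work the compiler can't drop
    *out = (int)acc;
}

int main()
{
    int *d_out = nullptr;
    cudaMalloc(&d_out, sizeof(int));

    // Large enough that a single thread runs well past ~5 seconds.
    spin<<<1, 1>>>(1LL << 40, d_out);

    // Kernel launches are asynchronous; the watchdog timeout only
    // becomes visible when we synchronize and check the error state.
    cudaError_t err = cudaDeviceSynchronize();
    if (err != cudaSuccess)
        // On a display GPU this would typically be a launch-timeout error.
        printf("kernel failed: %s\n", cudaGetErrorString(err));
    else
        printf("kernel completed\n");

    cudaFree(d_out);
    return 0;
}
```

On a dedicated, non-display GPU (as the FAQ recommends) the same kernel is simply allowed to run to completion, which is why a second card, not necessarily SLI, is what the recommendation is really about.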