X2900XT vs 8800GTX (ATI delivers best bang in a big way!)


This topic is locked from further discussion.

#1 SirWrinkles
Member since 2007 • 218 Posts

Keep in mind the X2900XT launched with very immature drivers; things will likely get much better in time, as they did for the 8800, which saw huge gains from driver updates shortly after launch.

From Tom's Hardware

These are only a few games and hardly thorough benchmarks, but even with ATI's alpha-stage drivers, the X2900XT's price makes it the best bang for the buck, period.

Tom's Hardware shows the X2900XT putting up some impressive numbers at high resolutions.

At 2560x1600, 4xAA/8xAF

DOOM 3

8800GTX 59.2 fps
X2900XT 62.4 fps
8800GTS 39.4 fps

Not only does the X2900XT completely crush the GTS here, it beats the GTX, which costs $200 more!

FEAR (same settings)

8800GTX 34 fps
X2900XT 32 fps
8800GTS 17 fps

Here the X2900XT is right behind the $200-more card and completely crushes its competition, the GTS.

OBLIVION (no AA/AF)

8800GTX 22.67 fps
X2900XT 18.21 fps
8800GTS 15.91 fps

The GTX wins here, and the X2900XT beats its competitor the GTS, but doesn't crush it like in the other tests.

Pretty impressive considering how bad the drivers are.
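The "bang for the buck" claim above can be put in numbers. A minimal sketch, assuming street prices of roughly $400 for the X2900XT and $600 for the 8800GTX (the prices are this example's assumptions; only the $200 gap and the Doom 3 fps come from the post):

```python
# Fps per $100 for the Doom 3 run above, under assumed prices.
cards = {
    "X2900XT": (62.4, 400),  # fps, assumed street price in USD
    "8800GTX": (59.2, 600),
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price * 100:.2f} fps per $100")
```

On those assumptions the XT delivers about 15.60 fps per $100 against the GTX's 9.87, which is the sense in which "best bang for the buck" is meant.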

#2 Empirefrtw
Member since 2006 • 1324 Posts
Wow, that looks great, much better than the other review. I can't wait to get that card.
#3 SirWrinkles
Member since 2007 • 218 Posts

From Guru3D

2560x1600, 4xAA/8xAF

Prey

8800GTX 32 fps
X2900XT 40 fps
8800GTS 32 fps

Here the X2900XT smokes the GTX, which costs $200 more.

STALKER (same settings, but no AA; the game doesn't support it)

8800GTX 136 fps
X2900XT 93 fps
8800GTS 51 fps

Guru3D shows much different results than the GameSpot test, which had the X2900 doing poorly. Here it doesn't beat the $200-more GTX, but it flat out smokes its competitor with almost double the frame rate.

FEAR

8800GTX 25 fps
X2900XT 32 fps
8800GTS 17 fps

I already showed Tom's results for FEAR; these are roughly the same, only here the X2900XT actually wins by an even greater margin, soundly beating the more expensive card.

BATTLEFIELD 2

8800GTX 66 fps
X2900XT 55 fps
8800GTS 61 fps

ATI cards have never been too fond of BF2; the GTS actually manages to edge out the XT here by a squeak.

GHOST RECON ADVANCED WARFIGHTER

8800GTX 32 fps
X2900XT 33 fps
8800GTS 24 fps

Here the X2900XT yet again manages to just topple the almighty $200-more GTX, with its competitor the GTS left eating dust.

Splinter Cell 3

8800GTX 53 fps
X2900XT 56 fps
8800GTS 37 fps

Amazingly, the X2900XT AGAIN manages to edge out the $200-more GTX and crush its GTS competition.

Another amazing thing Guru3D showed: in GRAW, FEAR, BF2, and STALKER, the X2900XT showed little to no loss when running under Vista; in fact, FEAR actually runs FASTER under Vista on the 2900!

 

#4 blurb1324
Member since 2004 • 4551 Posts

Are these confirmed and legit?

I've only read about 2 or 3 reviews so far, and they point to the X2900XT being comparable to the 8800GTS, and getting utterly smashed by the GTX. They've also said that the GTS is a better value since it draws some 100W less and is cheaper. 

#5 LordEC911
Member since 2004 • 9972 Posts

Are these confirmed and legit?

I've only read about 2 or 3 reviews so far, and they point to the X2900XT being comparable to the 8800GTS, and getting utterly smashed by the GTX. They've also said that the GTS is a better value since it draws some 100W less and is cheaper.

blurb1324

100W less? Wow, more like 2/3 of that. The 2900XT needs about 60W more than the GTS, since there is about a 40W gap between the GTS and GTX and a 20W gap between the GTX and XT.
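Taking the poster's gap figures at face value (they're rough estimates, not measured draws), the arithmetic is just:

```python
# Power-gap arithmetic from the post above; the deltas are the
# poster's rough estimates, not measured figures.
gts_to_gtx_w = 40   # GTX draws ~40 W more than the GTS
gtx_to_xt_w = 20    # 2900 XT draws ~20 W more than the GTX

xt_over_gts_w = gts_to_gtx_w + gtx_to_xt_w
print(xt_over_gts_w)  # -> 60, well short of the ~100 W claimed earlier
```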

#6 Empirefrtw
Member since 2006 • 1324 Posts
Hopefully with better drivers these will improve even more; it's going to be great owning one of these cards :D.
#7 Random__Guy
Member since 2007 • 1047 Posts
Whoa, easy there SirWrinkles, you're gonna upset the GeForce kids.
#8 SirWrinkles
Member since 2007 • 218 Posts

I think we can trust Tom's Hardware and Guru3D more than we can some of the other sites.

I mean, look at HardOCP: they posted right on their front page that the X2900XT is a "FLOP." Even IF that were true, that's extremely unprofessional; clearly they have no credibility.

Anandtech shows results in line with HardOCP, so you can't really trust them either.

If you read Tom's Hardware's introduction, you will see them talking about how many of the other sites show bias, and how they show it.

#9 9mmSpliff
Member since 2005 • 21751 Posts
Personally, I would rather listen to an Anandtech review than a Tom's Hardware one. But they're both good.
#10 PixelChief
Member since 2006 • 25 Posts

I think we can trust Tom's Hardware and Guru3D more than we can some of the other sites.

I mean, look at HardOCP: they posted right on their front page that the X2900XT is a "FLOP." Even IF that were true, that's extremely unprofessional; clearly they have no credibility.

Anandtech shows results in line with HardOCP, so you can't really trust them either.

If you read Tom's Hardware's introduction, you will see them talking about how many of the other sites show bias, and how they show it.

SirWrinkles

Yeah, clearly any site that tested the X2900XT and said it didn't perform as well is "biased." :roll:
#11 9mmSpliff
Member since 2005 • 21751 Posts
[QUOTE="SirWrinkles"]

I think we can trust Tom's Hardware and Guru3D more than we can some of the other sites.

I mean, look at HardOCP: they posted right on their front page that the X2900XT is a "FLOP." Even IF that were true, that's extremely unprofessional; clearly they have no credibility.

Anandtech shows results in line with HardOCP, so you can't really trust them either.

If you read Tom's Hardware's introduction, you will see them talking about how many of the other sites show bias, and how they show it.

PixelChief

Yeah, clearly any site that tested the X2900XT and said it didn't perform as well is "biased." :roll:

He's saying it because they said "flop," and that's not a professional way for a business to put it. That's what he's getting at, smartass.

#12 SirWrinkles
Member since 2007 • 218 Posts
They actually get even worse in the conclusion: they say the X2900XT is ATI's version of the GeForce 5 series, that the card is basically hopeless, and to just avoid it. They don't even mention the possibility that it might be the BETA DRIVERS, and that ATI has SAID they will be releasing drivers shortly that will significantly boost performance.
#13 blazethe1
Member since 2004 • 1238 Posts
I'm gonna wait for the 2600. All it has to do is beat out the 8600 by a good margin, and I think the world has a new preferred card.
#14 LordEC911
Member since 2004 • 9972 Posts

I'm gonna wait for the 2600. All it has to do is beat out the 8600 by a good margin, and I think the world has a new preferred card.

blazethe1

The RV6*0 is already the world's preferred card; AMD is planning on shipping 100 million R600-based GPUs by the end of the year. These new cards are the perfect solution for OEMs: super low power consumption, small profile, low heat, a great video/audio solution, HDCP compliant, and the list goes on.

#15 SirWrinkles
Member since 2007 • 218 Posts
bump
#16 Bebi_vegeta
Member since 2003 • 13558 Posts

Keep in mind the X2900XT launched with very immature drivers; things will likely get much better in time, as they did for the 8800, which saw huge gains from driver updates shortly after launch.

From Tom's Hardware

These are only a few games and hardly thorough benchmarks, but even with ATI's alpha-stage drivers, the X2900XT's price makes it the best bang for the buck, period.

Tom's Hardware shows the X2900XT putting up some impressive numbers at high resolutions.

At 2560x1600, 4xAA/8xAF

DOOM 3

8800GTX 59.2 fps
X2900XT 62.4 fps
8800GTS 39.4 fps

Not only does the X2900XT completely crush the GTS here, it beats the GTX, which costs $200 more!

FEAR (same settings)

8800GTX 34 fps
X2900XT 32 fps
8800GTS 17 fps

Here the X2900XT is right behind the $200-more card and completely crushes its competition, the GTS.

OBLIVION (no AA/AF)

8800GTX 22.67 fps
X2900XT 18.21 fps
8800GTS 15.91 fps

The GTX wins here, and the X2900XT beats its competitor the GTS, but doesn't crush it like in the other tests.

Pretty impressive considering how bad the drivers are.

SirWrinkles

Interesting indeed, but who plays at that resolution? It must be only a few people.

Nobody wants to play at 20-30 frames per second...

#17 SirWrinkles
Member since 2007 • 218 Posts

Then turn down other settings! What do you mean, who plays at that res? The people who own these monster cards are probably just as few and far between as those who have 30-inch monitors.

The bottom line really is that the X2900XT gets more impressive the higher the resolution, and my guess is most people who buy monster cards are more likely to play on a monster monitor than a 17-incher. Maybe not 30-inch big, but honestly you're really just wasting your money if you buy any of these cards for a resolution south of 1600x1200.

#18 hofuldig
Member since 2004 • 5126 Posts
[QUOTE="SirWrinkles"]

Keep in mind the X2900XT launched with very immature drivers; things will likely get much better in time, as they did for the 8800, which saw huge gains from driver updates shortly after launch.

From Tom's Hardware

These are only a few games and hardly thorough benchmarks, but even with ATI's alpha-stage drivers, the X2900XT's price makes it the best bang for the buck, period.

Tom's Hardware shows the X2900XT putting up some impressive numbers at high resolutions.

At 2560x1600, 4xAA/8xAF

DOOM 3

8800GTX 59.2 fps
X2900XT 62.4 fps
8800GTS 39.4 fps

Not only does the X2900XT completely crush the GTS here, it beats the GTX, which costs $200 more!

FEAR (same settings)

8800GTX 34 fps
X2900XT 32 fps
8800GTS 17 fps

Here the X2900XT is right behind the $200-more card and completely crushes its competition, the GTS.

OBLIVION (no AA/AF)

8800GTX 22.67 fps
X2900XT 18.21 fps
8800GTS 15.91 fps

The GTX wins here, and the X2900XT beats its competitor the GTS, but doesn't crush it like in the other tests.

Pretty impressive considering how bad the drivers are.

Bebi_vegeta

Interesting indeed, but who plays at that resolution? It must be only a few people.

Nobody wants to play at 20-30 frames per second...

Well, I sure as hell don't. I could, but I don't see the point when you could just add a little AA and take less of a performance hit.

I have a 7300GT (beastly little card); I can play Oblivion with high textures (no HDR) and view distance maxed, with 8xAA and 16xAF. It may be a little slow, but it still plays fine. Oh, my point is that even though I can play at a higher resolution, I choose not to, because I think it's kinda stupid. 1024x768 RULZ TEH WORLD!!!!

#19 Bebi_vegeta
Member since 2003 • 13558 Posts

Then turn down other settings! What do you mean, who plays at that res? The people who own these monster cards are probably just as few and far between as those who have 30-inch monitors.

The bottom line really is that the X2900XT gets more impressive the higher the resolution, and my guess is most people who buy monster cards are more likely to play on a monster monitor than a 17-incher. Maybe not 30-inch big, but honestly you're really just wasting your money if you buy any of these cards for a resolution south of 1600x1200.

SirWrinkles

Have you played the latest games? They require lots of power if you play with all the eye candy on at 1600x1200.

If I were to game at 2560x1600, I would at least get SLI or Crossfire.

Splinter Cell, Rainbow Six, Oblivion, GRAW, CnC3, FEAR? Playing in the 30-ish frames/sec range is not an option for me.

And I am very far from wasting my money, since I'll keep this card longer... hopefully lasting me another 2 years, just like my 7800GTX did.

#20 SirWrinkles
Member since 2007 • 218 Posts

I think anyone who has a 30-incher also has a smaller monitor for games their card can't handle at that res.

Sure, the few top games might need SLI/Crossfire to max out at the super high res, but there are plenty of PC games you could max out just fine like that, including CnC3, Stalker, and Splinter Cell, all games you mentioned.

#21 Empirefrtw
Member since 2006 • 1324 Posts
I have to agree. I only play at 1280x1024 or something like that, and I plan to get one of these cards. Why? Because one of them will last me a long time, and some of the newer games do require a lot of power to turn all the eye candy on, even at a lower resolution.
#22 SirWrinkles
Member since 2007 • 218 Posts
I think if you play at a lower res, you're better off buying a $200 card every 8 months or so than buying a $400 card every year and a half.
#23 Bebi_vegeta
Member since 2003 • 13558 Posts

I think if you play at a lower res, you're better off buying a $200 card every 8 months or so than buying a $400 card every year and a half.

SirWrinkles

When I buy something, I want to settle for the best and be comfortable for a few years, without the need to overclock for some time.

The less trouble and worry, the better.

#24 SirWrinkles
Member since 2007 • 218 Posts
In this day of rapidly advancing technology, I don't think you can buy a card to last that long unless you're willing to run the later games at minimum settings :?
#25 Subjekt83
Member since 2006 • 25 Posts
Keep in mind that Nvidia's Vista drivers aren't exactly optimized yet either; lots of stuff is still disabled in the Vista driver, or crashes the game if you use it.
#26 Bebi_vegeta
Member since 2003 • 13558 Posts

In this day of rapidly advancing technology, I don't think you can buy a card to last that long unless you're willing to run the later games at minimum settings :?

SirWrinkles

My 7800GTX lasted me almost 2 years and could still handle the resolution I play at, 1280x1024. I had to give it up for my other computer build.

#27 Bebi_vegeta
Member since 2003 • 13558 Posts

Keep in mind that Nvidia's Vista drivers aren't exactly optimized yet either; lots of stuff is still disabled in the Vista driver, or crashes the game if you use it.

Subjekt83

I have no clue why people would want to game on Vista now, since there are no DX10 games and there are a lot of bugs.

#28 jfelisario
Member since 2006 • 2753 Posts

Keep in mind that Nvidia's Vista drivers aren't exactly optimized yet either; lots of stuff is still disabled in the Vista driver, or crashes the game if you use it.

Subjekt83

New Vista drivers for you, my friend...

#29 Empirefrtw
Member since 2006 • 1324 Posts

You may be right, but I'm sort of tired of trying to keep up with the tech and buying something every few months, so I want something that will last me as long as possible. Of course, I do have a budget, or I might be buying an 8800GTX.

#30 450tantrum
Member since 2005 • 158 Posts

Anyone know if the R600s hold up well on a P5W DH Deluxe (Crossfire Ready), as far as Crossfire goes?

And is it true that it adopted a bridge connector, like Nvidia's SLI?

 

#31 jfelisario
Member since 2006 • 2753 Posts

Anyone know if the R600s hold up well on a P5W DH Deluxe (Crossfire Ready), as far as Crossfire goes?

And is it true that it adopted a bridge connector, like Nvidia's SLI?

450tantrum

I don't see why it wouldn't work with your mobo, and yes, it's internal Crossfire now, with the Crossfire bridge between the two cards.

#32 SirWrinkles
Member since 2007 • 218 Posts

Anyone know if the R600s hold up well on a P5W DH Deluxe (Crossfire Ready), as far as Crossfire goes?

And is it true that it adopted a bridge connector, like Nvidia's SLI?

450tantrum

P5W? Does that stand for 500 watts? I think they said they recommend 700 watts for X2900 Crossfire :O
#33 Ospi
Member since 2006 • 570 Posts
After reading about this, I can't say I'm disappointed with getting a GTS instead; the 2900 is pretty much on the same level as far as bang for buck goes and slots nicely in between the GTS and GTX. However, if Nvidia drops their prices a bit, that might change. Nevertheless, it's a card that will be great once the driver support is up to scratch, but I certainly wouldn't be selling my GTS to get one.
#34 SirWrinkles
Member since 2007 • 218 Posts
I wonder why the benchmarks out there show such varied results. Meh, I think the more credible sites are the ones showing the X2900XT shine, like Tom's Hardware.
#35 jfelisario
Member since 2006 • 2753 Posts

I wonder why the benchmarks out there show such varied results. Meh, I think the more credible sites are the ones showing the X2900XT shine, like Tom's Hardware.

SirWrinkles

They are using many different drivers, either 8.36 or 8.37, and even different increments of the same driver, e.g. 8.37.4, 8.37.4.2, 8.37.4.3, and so on.

#36 Baselerd
Member since 2003 • 5104 Posts

I really want the HD 2900XT to work, but here's what I see...

Its benchmarks show inconsistent performance. That leads me to believe the potential is there but the drivers may be bad. Hopefully they can bring all of those benchmarks up to be consistently nipping at the GTX's heels.

If not, what a shame :-S. I think I'll wait till midsummer to see what happens. The truth of the matter is, I don't see performance skyrocketing after a driver release.

On another note, do you guys think my PSU will run this card? I would imagine so, since it has 40A on the 12V rail, but just making sure.

#37 frost_mourne13
Member since 2006 • 1615 Posts
I think it will run it fine. Your rails are fine, and Corsair PSUs are all high quality.
#38 jfelisario
Member since 2006 • 2753 Posts

I really want the HD 2900XT to work, but here's what I see...

Its benchmarks show inconsistent performance. That leads me to believe the potential is there but the drivers may be bad. Hopefully they can bring all of those benchmarks up to be consistently nipping at the GTX's heels.

If not, what a shame :-S. I think I'll wait till midsummer to see what happens. The truth of the matter is, I don't see performance skyrocketing after a driver release.

On another note, do you guys think my PSU will run this card? I would imagine so, since it has 40A on the 12V rail, but just making sure.

Baselerd

Theoretically, the max headroom you need is 250W (75W from the PCI-E slot + 75W from the six-pin PCI-E connector + 100W from the 8-pin connector), or about 20.83A, just for the card; or 225W (75W from the PCI-E slot + 75W from a six-pin PCI-E connector x2), or 18.75A, without the Overdrive feature. In practice the card runs well below this max headroom, so your 40A PSU will do just fine.
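The amperage figures in the reply above are just watts divided by the 12 V rail voltage. A quick check, using the connector wattages as stated in the post:

```python
# Worst-case connector budget for the HD 2900 XT, per the post above.
RAIL_V = 12.0
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 100  # figures from the post

with_overdrive_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 250 W total
without_overdrive_w = SLOT_W + 2 * SIX_PIN_W          # 225 W total

print(f"{with_overdrive_w / RAIL_V:.2f} A")    # -> 20.83 A
print(f"{without_overdrive_w / RAIL_V:.2f} A") # -> 18.75 A
```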

#39 450tantrum
Member since 2005 • 158 Posts

The 2900 is pretty much on the same level as far as bang for buck goes and slots nicely in between the GTS and GTX. However, if Nvidia drops their prices a bit, that might change.

Ospi

Nvidia plans on dropping the price of the current GTS 640 to the $350 area and releasing new GTS 640 SuperClocked cards to compete at the $400 level.

Sorry, I have no link to a source, but it's in a thread on XtremeSystems; I just can't remember which one.

#40 Staryoshi87
Member since 2003 • 12760 Posts

BATTLEFIELD 2

8800GTX 66 fps
X2900XT 55 fps
8800GTS 61 fps

ATI cards have never been too fond of BF2; the GTS actually manages to edge out the XT here by a squeak.

You act as though 2 fps is a difference when the X2900 does better, but you say 6 fps is a "squeak." You're clearly biased... just putting that out there.
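The point above is easy to quantify. Using fps figures quoted earlier in this thread, the gap that gets celebrated as a win is relatively smaller than the one that gets called a "squeak":

```python
def pct_lead(winner_fps: float, loser_fps: float) -> float:
    """Winner's lead over the loser, as a percentage of the loser's fps."""
    return round(100 * (winner_fps - loser_fps) / loser_fps, 1)

print(pct_lead(33, 32))  # GRAW: XT over GTX -> 3.1 (%), hailed as a win
print(pct_lead(61, 55))  # BF2: GTS over XT -> 10.9 (%), called a "squeak"
```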

#41 jfelisario
Member since 2006 • 2753 Posts

[QUOTE="Ospi"]The 2900 is pretty much on the same level as far as bang for buck goes and slots nicely in between the GTS and GTX. However, if Nvidia drops their prices a bit, that might change.

450tantrum

Nvidia plans on dropping the price of the current GTS 640 to the $350 area and releasing new GTS 640 SuperClocked cards to compete at the $400 level.

Sorry, I have no link to a source, but it's in a thread on XtremeSystems; I just can't remember which one.

SuperClocked? Like eVGA's SuperClocked? That's their trademark, I believe...

#42 jamesgj
Member since 2005 • 1190 Posts
The max resolution setting in Doom 3 is 1600x1200...
#43 saifiii
Member since 2006 • 274 Posts

Hey, the 2900 in Crossfire kills the 8800 Ultra at the same price.

Now don't fire at me by saying that the Ultra is not even a proper card. Just saying how the 2900 is the best bang.

#44 SirWrinkles
Member since 2007 • 218 Posts

The max resolution setting in Doom 3 is 1600x1200...

jamesgj

lmfao, a lot of games, like BF2, also don't natively support super high resolutions, but if you go into the config file you can manually input them.

#45 jfelisario
Member since 2006 • 2753 Posts

Hey, the 2900 in Crossfire kills the 8800 Ultra at the same price.

Now don't fire at me by saying that the Ultra is not even a proper card. Just saying how the 2900 is the best bang.

saifiii

If they could take it out with a single card, I'd be amazed, but when you take the single GTX against the single 2900 XT... yeah...

#46 r3351925
Member since 2006 • 1728 Posts

Hey, look guys, up until now I'm really mixed up:

R600 = X?????

The ATI name for the 8800GTS's competitor = ?2900?

Sorry for the noobish question, but I'm really lost.

And logically the 320 is really large, and from what I read, ATI has really crappy drivers right now.

Supposedly, the R600 is now running at 2/5 of its original performance????

#47 jfelisario
Member since 2006 • 2753 Posts

Hey, look guys, up until now I'm really mixed up:

R600 = X?????

The ATI name for the 8800GTS's competitor = ?2900?

Sorry for the noobish question, but I'm really lost.

And logically the 320 is really large, and from what I read, ATI has really crappy drivers right now.

Supposedly, the R600 is now running at 2/5 of its original performance????

r3351925

Well, firstly, AMD dropped the "X" prefix, so it's now officially the HD 2900 XT. 320? As in stream processors? Well, actually it's 64 shader units, each Vec5 (5 math operations per clock), so 64 x 5 = 320. Yeah...

As for the drivers, driver version 8.38 is supposedly rolling out soon, so that should help the card's performance a bit.
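The "320" in the question above is just the product of the unit count and the VLIW width, as the reply explains:

```python
# R600 shader count: 64 five-wide (Vec5) units.
simd_units = 64
ops_per_unit = 5  # each unit issues up to 5 math operations per clock

print(simd_units * ops_per_unit)  # -> 320 stream processors
```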

#48 X360PS3AMD05
Member since 2005 • 36320 Posts
R600 is the code name for the GPU that has now come out as the HD 2900XT. The GeForce 8800 (G80) is the high-end offering from Nvidia. Drivers will mature; how much performance will improve, we have to see.
#49 r3351925
Member since 2006 • 1728 Posts
[QUOTE="r3351925"]

Hey, look guys, up until now I'm really mixed up:

R600 = X?????

The ATI name for the 8800GTS's competitor = ?2900?

Sorry for the noobish question, but I'm really lost.

And logically the 320 is really large, and from what I read, ATI has really crappy drivers right now.

Supposedly, the R600 is now running at 2/5 of its original performance????

jfelisario

Well, firstly, AMD dropped the "X" prefix, so it's now officially the HD 2900 XT. 320? As in stream processors? Well, actually it's 64 shader units, each Vec5 (5 math operations per clock), so 64 x 5 = 320. Yeah...

As for the drivers, driver version 8.38 is supposedly rolling out soon, so that should help the card's performance a bit.

Thanks. :lol: Your avatar got freaked out.