ATI HD 2900 XT review!! (ATI dropped the ball)

This topic is locked from further discussion.

#1 darkmagician06
Member since 2003 • 6060 Posts

link

 

The 8800 series is still king, and the GeForce 8800 GTS 640 MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.



 

#2 domke13
Member since 2006 • 2891 Posts
I know that already. But thanks anyway.
#3 Taijiquan
Member since 2002 • 7431 Posts

The Bottom Line

"A day late and a dollar short." Cliché but accurate. The Radeon HD 2900 XT is late to the party and unfortunately is bringing with it performance that cannot compete. The GeForce 8800 GTS 640 MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.

This is as good as it is going to get for a while from ATI. The GeForce 8800 GTX will still dominate at the high end of the video card market. Of course we do not know about DX10 games yet, and there is no way to make any predictions about how that comparison will turn out. As it stands right now the Radeon HD 2900 XT, in our opinion, is a flop. ATI needs to get its act together quickly. It needs to push out the mainstream cards soon and it needs to deliver a high-end card that can actually compete at the high end of the market.

I have been constantly saying this.

#4 jfelisario
Member since 2006 • 2753 Posts
Uh, WOW... like the million-and-first thread about this... go to page 2 of PC hardware and, lo and behold, HD 2900 XT discussions!! Here! Here! Here! Aha!! Oh, and more reviews!!! Oh, and a LIST OF REVIEWS of that card too!! Sorry... this has gotten a bit old, I guess... sorry for all the incessant bantering, but yeah, you are late to the party.
#5 RayvinAzn
Member since 2004 • 12552 Posts

As soon as I see some sort of consistency to the reviews (ExtremeTech loved it, HardOCP hated it, Guru of 3D thought it was decent, etc.) I'll make a call. As it stands right now, there are way too many different sites saying way too many different things.

#6 jfelisario
Member since 2006 • 2753 Posts

As soon as I see some sort of consistency to the reviews (ExtremeTech loved it, HardOCP hated it, Guru of 3D thought it was decent, etc.) I'll make a call. As it stands right now, there are way too many different sites saying way too many different things.

RayvinAzn

QFT... though do you think they'll have a second round through it? I'd prefer sticking to the most reliable sites among the bunch. I know HardOCP is going for a second run-through, but with the 8800 GTS 320 MB version (lol, and I'm not implying that HardOCP is my choice of reviewers, quite the contrary imho), so a little wait is in order for second opinions and look-backs.

#7 jfelisario
Member since 2006 • 2753 Posts
oh.. kinda OT, but can you guys see my new sig and avatar?
#8 RayvinAzn
Member since 2004 • 12552 Posts

QFT... though do you think they'll have a second round through it? I'd prefer sticking to the most reliable sites among the bunch. I know HardOCP is going for a second run-through, but with the 8800 GTS 320 MB version (lol, and I'm not implying that HardOCP is my choice of reviewers, quite the contrary imho), so a little wait is in order for second opinions and look-backs.

jfelisario

It's interesting to note that all the sites seem to agree on one thing: The HD2900XT generally tops the 8800GTS in 3DMark 06. 

#9 jfelisario
Member since 2006 • 2753 Posts
[QUOTE="jfelisario"]

QFT... though do you think they'll have a second round through it? I'd prefer sticking to the most reliable sites among the bunch. I know HardOCP is going for a second run-through, but with the 8800 GTS 320 MB version (lol, and I'm not implying that HardOCP is my choice of reviewers, quite the contrary imho), so a little wait is in order for second opinions and look-backs.

RayvinAzn

It's interesting to note that all the sites seem to agree on one thing: The HD2900XT generally tops the 8800GTS in 3DMark 06.

General consensus, but as with all benchmarks, it doesn't always translate to corresponding real-world performance, where all sorts of factors come into play.

#10 RayvinAzn
Member since 2004 • 12552 Posts

General consensus, but as with all benchmarks, it doesn't always translate to corresponding real-world performance, where all sorts of factors come into play. jfelisario

I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver-related than anything else.

#11 jfelisario
Member since 2006 • 2753 Posts

[QUOTE="jfelisario"]general consensus, but as with all benchmarks doesn't always translate to their corresponding real-world performances, where all sorts of factors come to play. RayvinAzn

I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else.

Most definitely. The varying results can be attributed to reviewers using two common driver versions, 8.36 and 8.37, when there is also 8.38 (which should be the retail version; it's in beta right now) that hasn't been benchmarked by the tech sites yet, and AMD is expecting at least a 15% increase in performance from it, plus more fixes to what appear to be buggy early drivers. The fluctuations are off the hook and we are seeing weird results, e.g. better performance with AA+AF on rather than off, better fps as you go higher in resolution, and weird increases and inconsistent dips in fps.

#12 RayvinAzn
Member since 2004 • 12552 Posts
Still, it is pretty embarrassing for ATI to come to the party late and still not have its drivers ready (or at least a little more ready than they are).
#13 jfelisario
Member since 2006 • 2753 Posts

Still, it is pretty embarrassing for ATI to come to the party late and still not have its drivers ready (or at least a little more ready than they are). RayvinAzn

True, though I think you saw the whole debacle with Nvidia's "Vista-ready" WHQL 8800 drivers when Vista came out, right? There were lawsuits for crying out loud!!! lol. The thing is, I believe these cards were previewed with pre-retail drivers (pre-8.38); I just wonder which driver version the guys buying the card at retail will get.

#14 jfelisario
Member since 2006 • 2753 Posts
You wanna know the funny thing? You can make either the 8800 GTS or the HD 2900 XT the winner in a benchmark, given the buggy drivers. As long as you fiddle with the settings, you can make the 2900 lose out to the GTS, or pick the "right" IQ settings to win over the GTS... lol... holy flabbergasting inconsistencies, Batman!
#15 domke13
Member since 2006 • 2891 Posts
So you think that the 2900 XT will beat the 8800 GTX with new drivers??
#16 jfelisario
Member since 2006 • 2753 Posts

So you think that the 2900 XT will beat the 8800 GTX with new drivers?? domke13

Probably not, at least not consistently.

#17 TheDarthvader
Member since 2002 • 7916 Posts
Drivers alone will not give this card a big enough boost to beat a GTX. Heck, even AMD admits that!
#18 RayvinAzn
Member since 2004 • 12552 Posts

So you think that the 2900 XT will beat the 8800 GTX with new drivers?? domke13

Drivers alone aren't going to get this card past the 8800GTX - at least I severely doubt it. Maybe after a move to 65nm and with some intense overclocking. 

#19 Makari
Member since 2003 • 15250 Posts
Still, it is pretty embarrassing for ATI to come to the party late and still not have its drivers ready (or at least a little more ready than they are). RayvinAzn
Honestly, after all the harping I did on nVidia for their few months of bad drivers, this launch with pretty obviously bad drivers makes me feel both ironic and stupid. :D Was [H] the one site that also called the 8600 GTS better than the X1950 Pro?
#20 Empirefrtw
Member since 2006 • 1324 Posts
Phew, only 550 watts. I'm planning on getting one of these and I was afraid that my power supply wouldn't support it.
#21 Bibbidy
Member since 2006 • 636 Posts

I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. RayvinAzn

It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.

#22 JimJackJose
Member since 2002 • 2937 Posts
It can also mean that AMD/ATI has spent its time optimizing for 3dmark as it knows that will be one of the main benchmarks used against its card.
#23 Bibbidy
Member since 2006 • 636 Posts

After reading around-

I'm unimpressed. The 2900 XT is competitive with the 8800 GTS on a performance level, but soaks up too much power and generates too much heat. That's not a good sign considering how much older the 8800 GTS is.

Really, what it looks like happened here is that ATI came out with a poor design, and tried to get around it by overclocking the hell out of it. The fact that they couldn't come up with anything better in the 6 months this thing has been delayed leads me to believe that there's a more fundamental limitation than bad drivers. It certainly doesn't signal much room for growth. Nvidia launched the 8800s six months ago - is ATI going to be able to tweak this architecture enough, before the second generation of Nvidia cards based on the 8000 series hits, to make it even remotely competitive? This is the GeForce FX line part II - a product line delayed past the point where it was relevant.

Right now, if you're in the $400 range, it's a decent option. 8800 prices are already falling though, and once the 8800GTS falls out of that price range, it'll just be obsolete hardware.

 

#24 LordEC911
Member since 2004 • 9972 Posts
[QUOTE="RayvinAzn"]I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. Bibbidy

It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.

It can also mean that AMD/ATI has spent its time optimizing for 3dmark as it knows that will be one of the main benchmarks used against its card.JimJackJose

Wow at both of you. Their architecture is fantastic; it will just take them a little while longer to get the dispatcher's algorithms worked out. To think that it is a poor architecture is ridiculous, and you might want to spend some more time studying the R600 architecture.

This is news how?
At least AMD was able to get all the games working at decent FPS without glitches/errors; the same couldn't be said of Nvidia's drivers and G80 cards for about 6 months...

#25 9mmSpliff
Member since 2005 • 21751 Posts

Lol, I love people choosing HardForum's nVidia-biased reviews... but they are well known, so I don't blame them totally. Right now, as it stands, this card is the best price/performance for $400, and it will only get better over the next month with newer drivers... like 8.38, which is due shortly.

I'm glad to see ATI shipped it working with the games, which still can't be said for lots of nVidia drivers and games. Also the architecture... far, far, far away from being poor; it's much more sophisticated than the 8800 series, and the card just offers a lot more than the 8800 series, from graphics to video, etc.

#27 X360PS3AMD05
Member since 2005 • 36320 Posts
It's awesome because it means prices should come down on the 8800GTS :D
#28 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bibbidy"][QUOTE="RayvinAzn"]I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. LordEC911

It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.

It can also mean that AMD/ATI has spent its time optimizing for 3dmark as it knows that will be one of the main benchmarks used against its card.JimJackJose

Wow at both of you. Their architecture is fantastic; it will just take them a little while longer to get the dispatcher's algorithms worked out. To think that it is a poor architecture is ridiculous, and you might want to spend some more time studying the R600 architecture.

This is news how?
At least AMD was able to get all the games working at decent FPS without glitches/errors, the same can't be said for about 6 months with Nvidia's drivers and G80 cards...

 

In 6 months you can do a lot of "error correction"; why do you think ATI waited 6 months? At least Nvidia released their card... unlike the XTX, which is still in the lab.

#29 darkmagician06
Member since 2003 • 6060 Posts
I wonder how the HD 2600 will compare to the 8600 series... since I'm more likely to get that level of card anyway...
#30 X360PS3AMD05
Member since 2005 • 36320 Posts

[QUOTE="LordEC911"]To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.Bibbidy

Quick note, kid:
Actual performance is a better indicator of good or poor architecture than how many neato words and graphs they have on a website describing it.

Or it means drivers need tweaking.
#31 LordEC911
Member since 2004 • 9972 Posts
[QUOTE="LordEC911"]To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.Bibbidy

Quick note, kid:
Actual performance is a better indicator of good or poor architecture than how many neato words and graphs they have on a website describing it.

Not really. Better architecture leaves more potential. Right now that potential is getting the dispatcher to keep 5 parallel tasks going for each of the 64 shaders. That takes a lot more time/tweaking than coding drivers for Nvidia's architecture. I would like to see Nvidia's driver team have to code for these much more complex architectures; they probably still wouldn't have drivers out.
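To put rough numbers on that dispatcher point, here is a quick back-of-the-envelope sketch (not from any of the reviews discussed here). It assumes the commonly quoted R600 layout of 64 five-wide VLIW shader units, the HD 2900 XT's 742 MHz reference core clock, and counts one multiply-add (2 flops) per filled slot; the point is simply how strongly throughput depends on how many of the 5 slots the driver's compiler/dispatcher manages to fill.

# Back-of-the-envelope R600 shader throughput vs. VLIW slot packing (Python sketch, assumed figures)
VLIW_UNITS = 64        # five-wide shader units in R600 (64 x 5 = 320 stream processors)
SLOTS_PER_UNIT = 5     # independent ops the dispatcher must pack per unit, per clock
CORE_CLOCK_HZ = 742e6  # HD 2900 XT reference core clock (742 MHz)

def effective_gflops(avg_slots_filled):
    # One multiply-add (2 flops) per filled slot, per unit, per clock.
    flops_per_clock = VLIW_UNITS * avg_slots_filled * 2
    return flops_per_clock * CORE_CLOCK_HZ / 1e9

for filled in (2, 3, SLOTS_PER_UNIT):  # roughly 2/5, 3/5, and perfect packing
    print(f"{filled}/{SLOTS_PER_UNIT} slots filled -> ~{effective_gflops(filled):.0f} GFLOPS")

With all five slots filled this lands on the ~475 GFLOPS peak figure usually quoted for the card, while 2-3 filled slots gives roughly 190-285 GFLOPS, which is the sense in which driver-side packing decides how much of the hardware you actually see.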

In 6 months you can do a lot of "error correction"; why do you think ATI waited 6 months? At least Nvidia released their card... unlike the XTX, which is still in the lab. Bebi_vegeta

You can't redo an architecture that has already been taped out for the past year or so...
Just like Nvidia couldn't magically add more shaders to their G80 architecture for the Ultra; sooo many people were spreading FUD about that.

#32 Bibbidy
Member since 2006 • 636 Posts

Not really.

Yes, really. Believe it or not, the real world does not bend and reshape itself just because you fantasise about "potential" that the card's performance shows no sign of actually having.

#33 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bibbidy"][QUOTE="LordEC911"]To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.LordEC911

Quick note, kid:
Actual performance is a better indicator of good or poor architecture than how many neato words and graphs they have on a website describing it.

Not really. Better architecture leaves more potential. Right now that potential is having the dispatcher able to have 5 parrel tasks for each of the 64 shaders. That takes a lot more time/tweaking then coding drivers for Nvidia's architecture. I would like to see Nvidia's driver team have to code for these much more complex architecture's, they probably still wouldn't have drivers out.

In 6 month you can do alot of "error correction", why do you thnk ATI waited 6 month? . Atleast Nvidia realeased there card ... unlike XTX is still in the lab. Bebi_vegeta

You can't redo an architecture that has already been taped out for the past year or so...
Just like Nvidia couldn't magically add more shader's into their G80 architecture for the Ultra, sooo many people were spreading Fud about that.

I guess this is why there is no XTX... but this doesn't explain the driver performance issue for the XT.

#34 LordEC911
Member since 2004 • 9972 Posts

Yes, really. Believe it or not, the real world does not bend and reshape itself just because you fantasise about "potential" that the card's performance shows no sign of actually having. Bibbidy

You seem not to be able to read all my posts...
Potential, as in we are only seeing about 2/5-3/5 of the R600's performance because of the immature dispatcher.
These things take time to work on and mature; 8.38 has already increased performance a bit, and let's see what the 2900's performance is like in a month.

 

#35 frost_mourne13
Member since 2006 • 1615 Posts
Are they (sites) going to wait a month for better drivers and retest again?
#36 LordEC911
Member since 2004 • 9972 Posts

Are they (sites) going to wait a month for better drivers and retest again?frost_mourne13

We should see some of the more reliable sites doing a retest.

#37 Bebi_vegeta
Member since 2003 • 13558 Posts

[QUOTE="Bibbidy"]Yes, really. Believe it or not, the real world does not bend and reshape itself just because you fantasise about "potential" that the card's performance shows no sign of actually existing. LordEC911

You seem to not be able to read all my posts...
Potential, as in we are only seeing about 2/5-3/5 of the R600's performance because of the immature dispatcher.
These things take time to work on and mature, 8.38 has already increased performance a bit and let's see what the 2900's is like in a month.

 

 

LOL, you seem to read nothing of my posts.

How many times has the R600 been delayed?

What were they doing in the meantime?

You think this driver issue couldn't have been fixed by then?

I'll believe this improvement when I see it in benchmarks, and I'm talking about 8.38... not 8.36 to 8.37.

 

 

 

#38 LordEC911
Member since 2004 • 9972 Posts
LOL, you seem to read nothing of my posts.

How many times has the R600 been delayed?

What were they doing in the meantime?

You think this driver issue couldn't have been fixed by then?

I'll believe this improvement when I see it in benchmarks, and I'm talking about 8.38... not 8.36 to 8.37. Bebi_vegeta

I read your posts and I have constantly answered them, especially your repeated questions.
The R600 was delayed once, technically twice.
They were busy working on all the other projects they have going on; if you really want me to list them all, I will.
The driver issue? At least AMD has all the games working with their drivers. Reliability and stability come first, then performance.

Well, you could look at some of the individual reviews done by unbiased sources at Xtremesystems.com, but I'm sure you don't believe them...

#39 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="Bebi_vegeta"]LOL, you seem to read nothing of my post.

How many time have R600 being delayed?

What were they doing in the mean time?

You think this driver issue couldn't of been fixed by then?

I'll beleive this increasement when i'll see it in benchmarks and i'm talking about 8.38... not 8.36 to 8.37LordEC911

I read your posts and I have constantly answered them, especially your repeated questions.
The R600 was delayed once, technically twice.
They were busy working on all the other projects they have going on; if you really want me to list them all, I will.
The driver issue? At least AMD has all the games working with their drivers. Reliability and stability come first, then performance.

Well, you could look at some of the individual reviews done by unbiased sources at Xtremesystems.com, but I'm sure you don't believe them...

 

Delayed because of other projects, LOL... Oh please!

And where's the XTX... c'mon man.

Yes, every game works; they had 6 months to make sure of it!

If Nvidia had released the G80 on May 14th, every game would work too.

I believe Guru3D has a good review. I am not impressed with the R600 yet. I was waiting for the R600 to come out and it got delayed, and their high-end card is not even out... disappointed indeed!

#40 9mmSpliff
Member since 2005 • 21751 Posts
Actually, not every game would work, because Splinter Cell still doesn't work. The XTX needs to be on 65nm because it bleeds too much power on 80nm. With the drop to 65nm, ATI can push the core to 1 GHz while producing a less expensive card.

AMD wants to do a full-on 65nm launch with CPUs too.
#41 Bebi_vegeta
Member since 2003 • 13558 Posts

Actually, not every game would work, because Splinter Cell still doesn't work. The XTX needs to be on 65nm because it bleeds too much power on 80nm. With the drop to 65nm, ATI can push the core to 1 GHz while producing a less expensive card.

AMD wants to do a full-on 65nm launch with CPUs too. 9mmSpliff

That's weird, the XT is on 80nm... and I'm suspecting it had problems with 65nm...

Splinter Cell 3 is Vista's fault... http://www.guru3d.com/article/Videocards/431/22/

No Vista results here sorry, when I tried to install Splinter Cell CT the copy protection consistently crashes in Vista; yes of course ... It's Starforce copy protection on the move again. Somebody should throw a bomb on that company.

 

#42 JimJackJose
Member since 2002 • 2937 Posts
[QUOTE="Bibbidy"][QUOTE="RayvinAzn"]I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. LordEC911

It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.

It can also mean that AMD/ATI has spent its time optimizing for 3dmark as it knows that will be one of the main benchmarks used against its card.JimJackJose

Wow at both of you. Their architecture is fantastic; it will just take them a little while longer to get the dispatcher's algorithms worked out. To think that it is a poor architecture is ridiculous, and you might want to spend some more time studying the R600 architecture.

This is news how?
At least AMD was able to get all the games working at decent FPS without glitches/errors, the same can't be said for about 6 months with Nvidia's drivers and G80 cards...

 

Excuse me? How, from my comment, can you mystically determine I have not "studied the card's architecture" enough? Furthermore, what does that even have to do with my comment? What I said was in no way a stretch of the imagination. How else would you describe ATI posting huge numbers in the synthetic benchmark while showing numbers on par with the other cards in its category on all the other tests?

If the card has such "fantastic architecture," why does it consume so much more power while showing equal performance when compared to other cards in its category? Why do you find it so hard to believe ATI may have optimized for 3DMark, when it is something both companies have been busted for in the past?

I would think the varying numbers between real performance and synthetic benchmarks would be a huge red flag to any educated, non-fanboy consumer.

#43 LordEC911
Member since 2004 • 9972 Posts
If the card has such "fantastic architecture," why does it consume so much more power while showing equal performance when compared to other cards in its category? Why do you find it so hard to believe ATI may have optimized for 3DMark, when it is something both companies have been busted for in the past?

I would think the varying numbers between real performance and synthetic benchmarks would be a huge red flag to any educated, non-fanboy consumer. JimJackJose

Well, its 700 million transistors need power to do their job...
Because they need much better drivers to have consistent high performance.
That is much easier to do with a synthetic benchmark than with entire games.
Heck, the 8.38 drivers offer up about 500 points more in 3DMark06.

What the high 3DMark score shows is that the capability is there; it just needs to be ironed out with drivers.

#44 r3351925
Member since 2006 • 1728 Posts

Uh, WOW... like the million-and-first thread about this... go to page 2 of PC hardware and, lo and behold, HD 2900 XT discussions!! Here! Here! Here! Aha!! Oh, and more reviews!!! Oh, and a LIST OF REVIEWS of that card too!! Sorry... this has gotten a bit old, I guess... sorry for all the incessant bantering, but yeah, you are late to the party. jfelisario

:lol:

#46 r3351925
Member since 2006 • 1728 Posts
[QUOTE="r3351925"]

[QUOTE="jfelisario"]uh WOW... like the millionth and one thread about this...... go to page 2 of pc hardware and woe and behold.... HD 2900 XT discussions!! Here ! Here ! Here ! Aha !! Oh and more reviews !!! Oh and a LIST OF REVIEWS of that card too!! sorry.... this has gone a bit too old i guess.... sorry for all that incessant bantering, but yeah you are late to the party.jfelisario

:lol:

It's part of the phase of going into this new avatar; I was boring as a robot with my old avatar, lawlercakes!!!1

lol, I never liked that one.

#47 domke13
Member since 2006 • 2891 Posts
I think that Nvidia is thinking too much about being king on those graphs. They are making faster cards that cost almost $1000, while they totally forget about mid-range cards, which will decide the winner. And Nvidia's mid-range cards will probably be pwned by ATI's mid-range cards. It is obvious that Nvidia just wants to be the king; they want to have the fastest GPU on earth.
#48 Bebi_vegeta
Member since 2003 • 13558 Posts

I think that Nvidia is thinking too much about being king on those graphs. They are making faster cards that cost almost $1000, while they totally forget about mid-range cards, which will decide the winner. And Nvidia's mid-range cards will probably be pwned by ATI's mid-range cards. It is obvious that Nvidia just wants to be the king; they want to have the fastest GPU on earth. domke13

 

I like the way you say it... PROBABLY

I think an 8800 GTS 320 MB is a great deal.