The 8800 series is still king, and the GeForce 8800 GTS 640 MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.
The Bottom Line
"A day late and a dollar short." Cliché but accurate. The Radeon HD 2900 XT is late to the party and unfortunately is bringing with it performance that cannot compete. The GeForce 8800 GTS 640 MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.
This is as good as it is going to get for a while from ATI. The GeForce 8800 GTX will still dominate at the high end of the video card market. Of course we do not know about DX10 games yet, and there is no way to make any predictions about how that comparison will turn out. As it stands right now the Radeon HD 2900 XT, in our opinion, is a flop. ATI needs to get its act together quickly. It needs to push out the mainstream cards soon and it needs to deliver a high-end card that can actually compete at the high end of the market.
I have been constantly saying this.
As soon as I see some sort of consistency to the reviews (ExtremeTech loved it, HardOCP hated it, Guru3D thought it was decent, etc.) I'll make a call. As it stands right now, there are way too many different sites saying way too many different things.
RayvinAzn
QFT... though do you think they'd have a second round through it? I'd prefer sticking to the most reliable sites among the bunch, though. I know HardOCP is going for a second run-through, but with the 8800 GTS 320 MB version (lol, and I'm not implying that HardOCP is my choice of reviewers, quite the contrary imho), so a little wait is in order for second opinions and look-backs.
jfelisario
It's interesting to note that all the sites seem to agree on one thing: The HD2900XT generally tops the 8800GTS in 3DMark 06.
[QUOTE="jfelisario"]QFT... though do you think they'd have second round through it? I'd prefer sticking to the most reliable sites among the bunch though. I know HardOCP is going for a second run through but with the 8800 gts 320 mb version though (lol and i'm not implying that HardOCP is my choice of reviewers, quite the contrary imho), so a little wait is in order for second opinions and look-backs.
RayvinAzn
General consensus, but as with all benchmarks it doesn't always translate to corresponding real-world performance, where all sorts of factors come into play.
jfelisario
I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else.
[QUOTE="jfelisario"]general consensus, but as with all benchmarks doesn't always translate to their corresponding real-world performances, where all sorts of factors come to play. RayvinAzn
I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else.
Most definitely. The varying results can be attributed to reviewers using two common driver versions, 8.36 and 8.37, when there is the 8.38 (which should be the retail version; it's in beta right now) that hasn't been benchmarked by the tech sites yet, and AMD is expecting at least a 15% increase in performance from it, plus obviously more fixes to what appear to be buggy early drivers. The fluctuations are off the hook and we are seeing weird results, e.g. better performance with AA+AF on rather than off, better fps as you go up in resolution, and weird increases and inconsistent dips in fps.
Still, it is pretty embarrassing for ATI to come to the party late and still not have its drivers ready (or at least a little more ready than they are). RayvinAzn
True, though I think you've seen the whole debacle with Nvidia's "Vista-ready" WHQL 8800 drivers when Vista came out, right? There were lawsuits in the works, for crying out loud!!! lol. The thing is, I believe these cards were previewed with the pre-retail drivers (pre-8.38); I just wonder which driver version the guys who are buying the card at retail will get.
So you think that the 2900XT will beat the 8800 GTX with new drivers?? domke13
Probably not, at least not consistently.
Still, it is pretty embarrassing for ATI to come to the party late and still not have its drivers ready (or at least a little more ready than they are). RayvinAzn
Honestly, after all the harping I did on nVidia for going through those few months of bad drivers, this launch with pretty obviously bad drivers makes me feel both ironic and stupid. :D Was [H] the one site that also called the 8600GTS better than the X1950Pro?
I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. RayvinAzn
It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.
After reading around-
I'm unimpressed. The 2900XT is competitive with the 8800GTS on a performance level, but soaks up too much power and generates too much heat. That's not a good sign considering how much older the 8800GTS is.
Really, what it looks like happened here is that ATI came out with a poor design and tried to get around it by overclocking the hell out of it. The fact that they couldn't come up with anything better in the 6 months this thing has been delayed leads me to believe that there's a more fundamental limitation than bad drivers. It certainly doesn't signal much room for growth. Nvidia launched the 8800s six months ago; is ATI going to be able to tweak this architecture enough to make it even remotely competitive before the second generation of Nvidia cards based off the 8000 series hits? This is the GeForce FX line part II: a product line delayed past the point where it was relevant.
Right now, if you're in the $400 range, it's a decent option. 8800 prices are already falling though, and once the 8800GTS falls out of that price range, it'll just be obsolete hardware.
[QUOTE="RayvinAzn"]I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. Bibbidy
It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.
It can also mean that AMD/ATI has spent its time optimizing for 3DMark as it knows that will be one of the main benchmarks used against its card. JimJackJose
Wow at both of you. Their architecture is fantastic; it will just take them a little while longer to get the dispatcher's algorithms worked out. To think that it is a poor architecture is ridiculous, and you might want to spend some more time studying the R600 architecture.
This is news how?
At least AMD was able to get all the games working at decent FPS without glitches/errors; the same couldn't be said for about 6 months with Nvidia's drivers and G80 cards...
Lol, I love people choosing HardForum's nVidia-biased reviews... but they are well known, so I don't blame them totally. Right now, as it stands, this card is the best price/performance at $400, and it will only get better over the next month with newer drivers... like 8.38, which is due shortly.
I'm glad to see ATI shipped it working with the games, which still can't be said for lots of nVidia drivers and games. Also the architecture... far, far away from being poor; it's much more sophisticated than the 8800 series, and the card just offers a lot more than the 8800 series, from graphics to video, etc.
[QUOTE="Bibbidy"][QUOTE="RayvinAzn"]I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. LordEC911
It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.
It can also mean that AMD/ATI has spent its time optimizing for 3dmark as it knows that will be one of the main benchmarks used against its card.JimJackJose
Wow at both of you. Their architecture is fantastic, it will just take them a little while longer to get the dispatcher's algorithms worked out. To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.
This is news how?
At least AMD was able to get all the games working at decent FPS without glitches/errors, the same can't be said for about 6 months with Nvidia's drivers and G80 cards...
In 6 months you can do a lot of "error correction"; why do you think ATI waited 6 months? At least Nvidia released its card... unlike the XTX, which is still in the lab.
[QUOTE="LordEC911"]To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.Bibbidy
Quick note, kid:
Actual performance is a better indicator of good or poor architecture than how many neato words and graphs they have on a website describing it.
[QUOTE="LordEC911"]To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.Bibbidy
Quick note, kid:
Actual performance is a better indicator of good or poor architecture than how many neato words and graphs they have on a website describing it.
Not really. Better architecture leaves more potential. Right now that potential is having the dispatcher able to issue 5 parallel tasks for each of the 64 shaders. That takes a lot more time/tweaking than coding drivers for Nvidia's architecture. I would like to see Nvidia's driver team have to code for these much more complex architectures; they probably still wouldn't have drivers out.
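To make the dispatcher point more concrete, here is a minimal, hypothetical sketch (toy Python with a made-up pack_bundles helper; it is not ATI or Nvidia driver code) of why a 5-wide unit like the R600's is only as fast as the scheduler feeding it: if the driver's compiler cannot find five independent operations per shader clock, most ALU slots sit idle, while fully independent work fills them easily.

[code]
# Toy illustration of the scheduling problem being argued about: R600-style
# shader units are 5 ALUs wide, so the driver has to pack up to 5 independent
# operations into each issue slot, while a scalar design issues one ready op
# at a time and needs no such packing.

def pack_bundles(deps, width=5):
    """Greedy packer. deps[i] is the set of instruction indices whose results
    instruction i needs. Instructions share a bundle only if all of their
    dependencies were completed in earlier bundles."""
    remaining = list(range(len(deps)))
    done = set()
    bundles = []
    while remaining:
        bundle = []
        for i in list(remaining):
            if deps[i] <= done and len(bundle) < width:
                bundle.append(i)
                remaining.remove(i)
        done.update(bundle)
        bundles.append(bundle)
    return bundles

# A dependent chain: every op waits on the previous one (worst case for a wide unit).
chain = [set()] + [{i} for i in range(9)]
# Ten independent ops: the ideal case a synthetic benchmark tends to resemble.
independent = [set() for _ in range(10)]

for name, prog in [("dependent chain", chain), ("independent ops", independent)]:
    width = 5
    bundles = pack_bundles(prog, width)
    filled = sum(len(b) for b in bundles)
    print(f"{name}: {len(bundles)} bundles, "
          f"{filled / (len(bundles) * width):.0%} of ALU slots filled")
[/code]

The sketch only illustrates why "better drivers" and "better architecture" are hard to separate on a design like this; whether the 2900 XT is dispatcher-limited or genuinely architecture-limited is exactly what the thread is arguing about.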
In 6 months you can do a lot of "error correction"; why do you think ATI waited 6 months? At least Nvidia released its card... unlike the XTX, which is still in the lab. Bebi_vegeta
You can't redo an architecture that has already been taped out for the past year or so...
Just like Nvidia couldn't magically add more shaders to its G80 architecture for the Ultra; sooo many people were spreading FUD about that.
[QUOTE="Bibbidy"][QUOTE="LordEC911"]To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.LordEC911
Quick note, kid:
Actual performance is a better indicator of good or poor architecture than how many neato words and graphs they have on a website describing it.
Not really. Better architecture leaves more potential. Right now that potential is having the dispatcher able to have 5 parrel tasks for each of the 64 shaders. That takes a lot more time/tweaking then coding drivers for Nvidia's architecture. I would like to see Nvidia's driver team have to code for these much more complex architecture's, they probably still wouldn't have drivers out.
In 6 month you can do alot of "error correction", why do you thnk ATI waited 6 month? . Atleast Nvidia realeased there card ... unlike XTX is still in the lab. Bebi_vegeta
You can't redo an architecture that has already been taped out for the past year or so...
Just like Nvidia couldn't magically add more shader's into their G80 architecture for the Ultra, sooo many people were spreading Fud about that.
I guess this is why there is no XTX... but this doesn't explain the driver performance issue for the XT.
Yes, really. Believe it or not, the real world does not bend and reshape itself just because you fantasise about "potential" that the card's performance gives no sign actually exists. Bibbidy
You seem to not be able to read all my posts...
Potential, as in we are only seeing about 2/5-3/5 of the R600's performance because of the immature dispatcher.
These things take time to work on and mature; 8.38 has already increased performance a bit, and let's see what the 2900's performance is like in a month.
[QUOTE="Bibbidy"]Yes, really. Believe it or not, the real world does not bend and reshape itself just because you fantasise about "potential" that the card's performance shows no sign of actually existing. LordEC911
You seem to not be able to read all my posts...
Potential, as in we are only seeing about 2/5-3/5 of the R600's performance because of the immature dispatcher.
These things take time to work on and mature, 8.38 has already increased performance a bit and let's see what the 2900's is like in a month.
LOL, you seem to have read nothing of my post.
How many times has the R600 been delayed?
What were they doing in the meantime?
You think this driver issue couldn't have been fixed by then?
I'll believe this increase when I see it in benchmarks, and I'm talking about 8.38... not 8.36 to 8.37.
Bebi_vegeta
I read your posts and I have answered them every time, especially your repeated questions.
The R600 was delayed once, technically twice.
They were working on all the other projects they have going on; if you really want me to list them all, I will.
The driver issue? At least AMD has all the games working with its drivers. Reliability and stability come first, then performance.
Well, you could look at some of the individual reviews done by unbiased sources at Xtremesystems.com, but I'm sure you don't believe them...
[QUOTE="Bebi_vegeta"]LOL, you seem to read nothing of my post.How many time have R600 being delayed?
What were they doing in the mean time?
You think this driver issue couldn't of been fixed by then?
I'll beleive this increasement when i'll see it in benchmarks and i'm talking about 8.38... not 8.36 to 8.37LordEC911
I read your posts and I have constantly answered them, especially your repeating of questions.
The R600 was delayed once, technically twice.
They were only working on all the other projects they have going on, if you really want me to list them all I will.
The driver issue? At least AMD has all the games working with their drivers. Reliability and stability comes first, then performance.
Well you could look at some of the individual reviews done by unbias sources at Xtremesystems.com but I'm sure you don't believe them...
Delayed because of other projects, LOL... oh please!
And where's the XTX... c'mon man.
Yes, every game works; they had 6 months to make sure of it!
If Nvidia had released the G80 on May 14th, every game would work too.
I believe Guru3D has a good review; I am not impressed with the R600 yet. I was waiting for the R600 to come out and it got delayed, and their high-end card is not even out... deceived indeed!
Actually, not every game would work, because Splinter Cell still doesn't work. The XTX needs to be on 65nm because it bleeds too much on 80nm. With it dropping to 65nm, ATI can push the core to 1GHz while producing a less expensive card.
AMD wants to do a full-on 65nm launch with CPUs too. 9mmSpliff
That's weird, the XT is on 80nm... and I'm suspecting it had problems with 65nm...
Splinter Cell 3: that's Vista's fault... http://www.guru3d.com/article/Videocards/431/22/ "No Vista results here, sorry; when I tried to install Splinter Cell CT the copy protection consistently crashes in Vista; yes, of course... It's StarForce copy protection on the move again. Somebody should throw a bomb on that company."
[QUOTE="Bibbidy"][QUOTE="RayvinAzn"]I do realize that, but when all the gaming benchmarks are all over the place, yet the synthetic benchmark stays around the same, it makes me scratch my head a bit and wonder if this problem is more driver related than anything else. LordEC911
It can just as easily mean poor architecture. Synthetics tend to offer "ideal" conditions, and an architecture that is more designed around ideal conditions than real-world conditions can perform well in the synthetics, but work very poorly under certain conditions. If your testing meets those conditions, you get a poor result.
It can also mean that AMD/ATI has spent its time optimizing for 3dmark as it knows that will be one of the main benchmarks used against its card.JimJackJose
Wow at both of you. Their architecture is fantastic, it will just take them a little while longer to get the dispatcher's algorithms worked out. To think that it is a poor achitecture is ridiculous and you might want to spend some more time studying the R600 architecture.
This is news how?
At least AMD was able to get all the games working at decent FPS without glitches/errors, the same can't be said for about 6 months with Nvidia's drivers and G80 cards...
Excuse me? How is it from my comment you can mystically determine I have not "studied the card's architecture" enough? Furthermore, what does that even have to do with my comment? What I said was in no way a stretch of the imagination. How else would you describe ATI posting huge numbers in the synthetic benchmark while showing numbers on par with the other cards in its category on all the other tests?
If the card has such "fantastic architecture", why does it consume so much more power while showing equal performance when compared to other cards in its category? Why is it you find it so hard to believe ATI may have optimized for 3DMark, when it is something both companies have been busted for in the past?
I would think the varying numbers between real performance and synthetic benchmarks would be a huge red flag to any educated / non-fanboy consumer.
JimJackJose
Well, its 700 million transistors need power to do their job...
Because they need much better drivers to have consistently high performance.
That is much easier to do with a synthetic benchmark than with entire games.
Heck, the 8.38 drivers offer up about 500 points more in 3DMark06.
What the high 3DMark score shows is that the capability is there; it just needs to be ironed out with drivers.
uh WOW... like the millionth and one thread about this...... go to page 2 of PC Hardware and lo and behold.... HD 2900 XT discussions!! Here! Here! Here! Aha!! Oh and more reviews!!! Oh and a LIST OF REVIEWS of that card too!! sorry.... this has gone a bit too old i guess.... sorry for all that incessant bantering, but yeah you are late to the party. jfelisario
:lol:
[QUOTE="r3351925"][QUOTE="jfelisario"]uh WOW... like the millionth and one thread about this...... go to page 2 of pc hardware and woe and behold.... HD 2900 XT discussions!! Here ! Here ! Here ! Aha !! Oh and more reviews !!! Oh and a LIST OF REVIEWS of that card too!! sorry.... this has gone a bit too old i guess.... sorry for all that incessant bantering, but yeah you are late to the party.jfelisario
:lol:
It's part of the phase going into this new avatar; I was boring as a robot with my old avatar, lawlercakes!!!1
lol, I never liked that one.
I think that Nvidia is thinking too much about being king on those graphs. They are making faster cards that cost almost $1000, while they totally forget about mid-range cards, which will decide the winner. And Nvidia's mid-range cards will probably be pwned by ATI's mid-range cards. It is obvious that Nvidia just wants to be the king; they want to have the fastest GPU on earth. domke13
I like the way you say it... PROBABLY
I think an 8800 GTS 320 MB is a great deal.