HD2900XT 512MB or 8800GTS 640MB


This topic is locked from further discussion.

#1 SLI_Yoshi
Member since 2006 • 461 Posts
I'm having trouble deciding between these two graphics cards because some reviews say the ATI HD2900XT is better while other reviews say the NVIDIA 8800GTS is better. The power and heat of the ATI aren't a problem for me because I will be getting a good PSU and a good case with lots of cooling. So, which one will have better performance in present and future games? I will probably have 2x to 4x AA and 8x to 16x AF enabled in every game I play, if the card allows it, obviously. Most of the benchmarks are pretty close; however, the NVIDIA seems more stable. On the other hand, ATI's drivers aren't matured yet. So which should I buy? The ATI or the NVIDIA? Image quality counts also. And could you link me some benchmarks of the ATI with its latest drivers? Thanks!
#2 lettuceman44
Member since 2005 • 7971 Posts
If power and cooling aren't an issue, go with the 2900XT.
#3 GamingMonkeyPC
Member since 2005 • 3576 Posts
I don't mind recommending the HD2900XT over my own 8800 GTS 640 MB. I've been seeing more benchmarks saying that the HD2900XT does better than the 8800 GTS in R6: Vegas, so I'm taking a guess that most of the UE3-engine-based games (UT3, BioShock, Gears of War, *maybe* Mass Effect, etc.) will run better on the HD2900XT - but who knows. Both are great cards; you can't go wrong with one or the other.
#4 lettuceman44
Member since 2005 • 7971 Posts
I don't mind recommending the HD2900XT over my own 8800 GTS 640 MB. I've been seeing more benchmarks saying that the HD2900XT does better than the 8800 GTS in R6: Vegas, so I'm taking a guess that most of the UE3-engine-based games (UT3, BioShock, Gears of War, *maybe* Mass Effect, etc.) will run better on the HD2900XT - but who knows. Both are great cards; you can't go wrong with one or the other.GamingMonkeyPC
Indeed. You won't go wrong either way.
#5 TrailorParkBoy
Member since 2006 • 2922 Posts
http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 On that benchmark, in DirectX 10 the 8600 GTS is equal to the HD2900XT (the drivers are more optimized for Nvidia and the game has been optimized for Nvidia's cards, but come on, an 8600 should never beat an HD2900XT). I would recommend the 8800 GTS 640, but it's your call. Also, be sure to read all of that article; it's interesting (that link is to the middle of the article, so you might wanna read the whole thing).
#6 lettuceman44
Member since 2005 • 7971 Posts
http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 On that benchmark, in DirectX 10 the 8600 GTS is equal to the HD2900XT (the drivers are more optimized for Nvidia and the game has been optimized for Nvidia's cards, but come on, an 8600 should never beat an HD2900XT). I would recommend the 8800 GTS 640, but it's your call. Also, be sure to read all of that article; it's interesting (that link is to the middle of the article, so you might wanna read the whole thing).TrailorParkBoy
um, no. What ATI drivers were they? And you should not judge the card based on just that one benchmark...
#7 TrailorParkBoy
Member since 2006 • 2922 Posts
[QUOTE="TrailorParkBoy"]http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 On that benchmark, in DirectX 10 the 8600 GTS is equal to the HD2900XT (the drivers are more optimized for Nvidia and the game has been optimized for Nvidia's cards, but come on, an 8600 should never beat an HD2900XT). I would recommend the 8800 GTS 640, but it's your call. Also, be sure to read all of that article; it's interesting (that link is to the middle of the article, so you might wanna read the whole thing).lettuceman44
um, no. What ATI drivers were they? And you should not judge the card based on just that one benchmark...

Um... yeah? They used 7.6; isn't that the newest one? And in DirectX 10 the 8600 GTS performed on par with the HD2900XT, just like I said (sometimes the 8600 GTS won, sometimes it didn't). And like I said, it's his call; he can take that benchmark with a grain of salt for all I care.
#8 achilles614
Member since 2005 • 5310 Posts
What res are you gaming at?
#9 SLI_Yoshi
Member since 2006 • 461 Posts
1680x1050.
#10 LordEC911
Member since 2004 • 9972 Posts

[QUOTE="lettuceman44"][QUOTE="TrailorParkBoy"]http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 On that benchmark, in DirectX 10 the 8600 GTS is equal to the HD2900XT (the drivers are more optimized for Nvidia and the game has been optimized for Nvidia's cards, but come on, an 8600 should never beat an HD2900XT). I would recommend the 8800 GTS 640, but it's your call. Also, be sure to read all of that article; it's interesting (that link is to the middle of the article, so you might wanna read the whole thing).TrailorParkBoy
um, no. What ATI drivers were they? And you should not judge the card based on just that one benchmark...

Um... yeah? They used 7.6; isn't that the newest one? And in DirectX 10 the 8600 GTS performed on par with the HD2900XT, just like I said (sometimes the 8600 GTS won, sometimes it didn't). And like I said, it's his call; he can take that benchmark with a grain of salt for all I care.

Ummm... ATI didn't get WiC until the end of June...
So how would they have driver support for the game if the driver was already sent to MS in mid-June?

Expect a nice performance increase with WiC in DX10 with an HD2900XT when 7.7 comes out, or whatever the July release is numbered.

#11 SLI_Yoshi
Member since 2006 • 461 Posts

[QUOTE="TrailorParkBoy"][QUOTE="lettuceman44"][QUOTE="TrailorParkBoy"]http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 On that benchmark, in DirectX 10 the 8600 GTS is equal to the HD2900XT (the drivers are more optimized for Nvidia and the game has been optimized for Nvidia's cards, but come on, an 8600 should never beat an HD2900XT). I would recommend the 8800 GTS 640, but it's your call. Also, be sure to read all of that article; it's interesting (that link is to the middle of the article, so you might wanna read the whole thing).LordEC911

um, no. What ATI drivers were they? And you should not judge the card based on just that one benchmark...

Um... yeah? They used 7.6; isn't that the newest one? And in DirectX 10 the 8600 GTS performed on par with the HD2900XT, just like I said (sometimes the 8600 GTS won, sometimes it didn't). And like I said, it's his call; he can take that benchmark with a grain of salt for all I care.

Ummm... ATI didn't get WiC until the end of June...
So how would they have driver support for the game if the driver was already sent to MS in mid-June?

Expect a nice performance increase with WiC in DX10 with an HD2900XT when 7.7 comes out, or whatever the July release is numbered.

LordEC911, you seem to be the leading expert on all things AMD/ATI, so I'm gonna ask you a couple of questions. There are varied results in all the benchmarks I have seen with these two so far; they each take turns coming out on top, but for the most part ATI seems to be ahead. And the drivers are still immature, which is a good thing, because won't the performance only get greater with better drivers? (That's what I'm thinking, not sure though.) And if the performance does increase with drivers, then the 2900XT should be ahead of the 8800GTS very soon. People have also said the ATI is more future-proof, which is another good thing IMO. But remember that I am going to be running probably 4x AA and at least 8x AF in every game I play. Knowing that, which should I buy? I've been an NVIDIA guy all my life, starting with the GeForce 2 series, but I think it's time for a little change since ATI seems to be superior for the most part to the GTS. Thanks.

#12 LordEC911
Member since 2004 • 9972 Posts

LordEC911, you seem to be the leading expert on all things AMD/ATI, so I'm gonna ask you a couple of questions. There are varied results in all the benchmarks I have seen with these two so far; they each take turns coming out on top, but for the most part ATI seems to be ahead. And the drivers are still immature, which is a good thing, because won't the performance only get greater with better drivers? (That's what I'm thinking, not sure though.) And if the performance does increase with drivers, then the 2900XT should be ahead of the 8800GTS very soon. People have also said the ATI is more future-proof, which is another good thing IMO. But remember that I am going to be running probably 4x AA and at least 8x AF in every game I play. Knowing that, which should I buy? I've been an NVIDIA guy all my life, starting with the GeForce 2 series, but I think it's time for a little change since ATI seems to be superior for the most part to the GTS. Thanks.SLI_Yoshi

Well, my personal opinion is that it all comes down to pricing. If you can find an HD2900XT for under $400 I would personally get that, but if it is over $400 I would just grab either GTS, 640MB or 320MB.

The drivers will get better over time, not by huge leaps but they should definitely crawl upwards a bit.

As for the AA, the folks at AMD decided to take a different route and have the shaders do the AA which seems to be where future games are heading. So the AA performance will definitely get better with future games.
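For anyone wondering what "have the shaders do the AA" actually means, here's a toy Python sketch (an illustration only, not real shader code; the sample values and weights are made up): a hardware ROP resolve is a fixed box filter over a pixel's subsamples, while a shader-based resolve is programmable and can apply arbitrary weights, roughly what the HD2900XT's CFAA tent-filter modes do.

```python
def box_resolve(samples):
    """Fixed-function (ROP-style) MSAA resolve: plain average of a
    pixel's subsample colors."""
    return sum(samples) / len(samples)

def weighted_resolve(samples, weights):
    """Shader-based resolve: a programmable filter that can weight each
    subsample differently (a tent filter, CFAA-style, favors samples
    nearer the pixel center)."""
    return sum(w * s for w, s in zip(weights, samples)) / sum(weights)

# 4x MSAA subsample colors for one edge pixel (grayscale, made-up values)
samples = [0.0, 1.0, 1.0, 1.0]

print(box_resolve(samples))                     # 0.75
print(weighted_resolve(samples, [1, 2, 2, 1]))  # ~0.833
```

The point is only that once the resolve runs on the shader core, the filter is whatever the driver or game supplies, at the cost of shader cycles, which is part of why R600 AA behavior can shift from driver to driver.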

#13 Wesker776
Member since 2005 • 7004 Posts

As for the AA, the folks at AMD decided to take a different route and have the shaders do the AA which seems to be where future games are heading. So the AA performance will definitely get better with future games.

Not if NVIDIA slips MS/Devs with a healthy sum of money to continue to adopt AA through the ROPs.

Like how they most likely paid off MS over DX10 guidelines, because the G80 can't do memory virtualisation (R600 can because ATi/AMD followed DX10 guidelines).

#14 Manly-manly-man
Member since 2006 • 3477 Posts
The HD2900XT for sure. It performs as well as an 8800GTX in Rainbow Six: Vegas, which means it should also perform as well as a GTX in other UE3 games. UE3 is a very popular engine, and a lot of upcoming games use it: Brothers in Arms: Hell's Highway, Call of Duty 4, Kane and Lynch, BioShock, and others. Also, in most games the HD2900XT performs as well, slightly worse, or slightly better up to 1600x1200 and 2xAA, and if you enable the CFAA it will look as good as it gets aliasing-wise.
#15 LordEC911
Member since 2004 • 9972 Posts

As for the AA, the folks at AMD decided to take a different route and have the shaders do the AA which seems to be where future games are heading. So the AA performance will definitely get better with future games. Wesker776

Not if NVIDIA slips MS/Devs with a healthy sum of money to continue to adopt AA through the ROPs.

Like how they most likely paid off MS over DX10 guidelines, because the G80 can't do memory virtualisation (R600 can because ATi/AMD followed DX10 guidelines).

They can't, at least not for the near-future games into 2008, since those engines are already in development and had to be cleared by MS.

#16 Manly-manly-man
Member since 2006 • 3477 Posts
And, as others have said, newer games will start using the HD2900XT and other R600 cards' new features with DX10, so it will continue to improve performance-wise. But even with these relatively immature drivers, it performs as well as the 8800GTS in most games. But really, what I am emphasizing is: with all of the upcoming UE3 games, the HD2900XT will compete with the 8800GTX.
#17 Wesker776
Member since 2005 • 7004 Posts
[QUOTE="Wesker776"]

As for the AA, the folks at AMD decided to take a different route and have the shaders do the AA which seems to be where future games are heading. So the AA performance will definitely get better with future games. LordEC911

Not if NVIDIA slips MS/Devs with a healthy sum of money to continue to adopt AA through the ROPs.

Like how they most likely paid off MS over DX10 guidelines, because the G80 can't do memory virtualisation (R600 can because ATi/AMD followed DX10 guidelines).

They can't, at least not for the near-future games into 2008, since those engines are already in development and had to be cleared by MS.

That's good to hear.

I'm still on the fence on whether or not I should wait for R650. My X1900 is still running strong, but I really wouldn't mind going R600 now with my Q6600 (yes, I pre-ordered yesterday :D ).

#18 LordEC911
Member since 2004 • 9972 Posts
That's good to hear.

I'm still on the fence on whether or not I should wait for R650. My X1900 is still running strong, but I really wouldn't mind going R600 now with my Q6600 (yes, I pre-ordered yesterday :D ).Wesker776

R650 as in the high-end XTX or XT model? If I were you I would just wait for the R700; that's what I plan to do. Or grab an RV670 this fall, if the refresh is still going to happen.

#19 SDxSnOOpZ
Member since 2006 • 225 Posts
Nvidia - The Way It's Meant to be Played
#20 Alpha_Omega69
Member since 2004 • 11840 Posts

Nvidia - The Way It's Meant to be PlayedSDxSnOOpZ

...That's all you could come up with? :|

Anyways, yeah, if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.

#21 Wesker776
Member since 2005 • 7004 Posts
[QUOTE="Wesker776"]That's good to hear.

I'm still on the fence on whether or not I should wait for R650. My X1900 is still running strong, but I really wouldn't mind going R600 now with my Q6600 (yes, I pre-ordered yesterday :D ).LordEC911

R650 as in the high-end XTX or XT model? If I were you I would just wait for the R700; that's what I plan to do. Or grab an RV670 this fall, if the refresh is still going to happen.

R650 as in 2950 XTX (?).

I might be able to wait for R700, but a $500 (AUD) 2900 XT is pretty tempting at the moment. I dunno, I'll buy a new video card when I get an X38 board (for possible CrossFire), whichever is king of the hill then.

#22 SDxSnOOpZ
Member since 2006 • 225 Posts
[QUOTE="LordEC911"][QUOTE="Wesker776"]That's good to hear.

I'm still on the fence on whether or not I should wait for R650. My X1900 is still running strong, but I really wouldn't mind going R600 now with my Q6600 (yes, I pre-ordered yesterday :D ).Wesker776

R650 as in the high end XTX or XT model? If I were you I would just wait for the R700, that's what I plan to do. Or grab a RV670 this fall, if the refresh is still going to happen.

R650 as in 2950 XTX (?).

I might be able to wait for R700, but a $500 (AUD) 2900 XT is pretty tempting at the moment. I dunno, I'll buy a new video card when I get an X38 board (for possible CrossFire), whichever is king of the hill then.

So why can't you get whatever is king of the hill now?

#23 SDxSnOOpZ
Member since 2006 • 225 Posts

[QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedAlpha_Omega69

...That's all you could come up with? :|

Anyways, yeah, if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.

How do you know that the R600 is more future-proof in DX10? Are you a psychic?

#24 Wesker776
Member since 2005 • 7004 Posts

Because:

- I'm concerned about NVIDIA's hastiness with DX10.
- The R600's power consumption/heat dissipation is way too high, and the 65/55nm shrink should fix this.
- A good-quality GTX (XFX/EVGA) is nearly twice as expensive as a 2900 XT here.
- My X1900 XTX has yet to fail me in a DX9 game.

#25 Alpha_Omega69
Member since 2004 • 11840 Posts
[QUOTE="Alpha_Omega69"]

[QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedSDxSnOOpZ

...That's all you could come up with? :|

Anyways, yeah, if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.

Why do you think that the R600 is more future-proof?

Because more devs will lean toward doing AA through stream processing, which would give huge performance hits to the G80s when AA is on, unless they can figure out a way to do AA both ways. And there's that thing the G80s don't have that the R600s do: memory virtualisation. I don't know what it does, but it sounds important :lol:

#26 Wesker776
Member since 2005 • 7004 Posts
[QUOTE="Alpha_Omega69"]

[QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedSDxSnOOpZ

...Thats all you could come up with? :|

Anyways yeah if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.

How do you know that the R600 is more future-proof in DX10? Are you a psychic?

ATi actually followed MS's exact DX10 guidelines.

Why do you think ATi went with a packed superscalar architecture (320 SPs)?
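As a quick back-of-the-envelope sketch of what that "packed" 320-SP figure means (Python, arithmetic only; the slot counts match the public R600 specs, the fill rates are hypothetical examples): the 320 stream processors are really 64 units that are 5 ALUs wide, so how many of them do useful work depends on how well the shader compiler packs independent operations into those 5 slots.

```python
# R600: 64 shader units, each 5 ALUs wide (VLIW-style packing)
UNITS = 64
SLOTS_PER_UNIT = 5

TOTAL_SPS = UNITS * SLOTS_PER_UNIT  # the advertised 320

def effective_sps(avg_slots_filled):
    """ALUs doing useful work per cycle if the compiler fills, on
    average, this many of the 5 slots per unit with independent ops."""
    return UNITS * avg_slots_filled

print(TOTAL_SPS)          # 320
print(effective_sps(5))   # 320 - ideally packed shader code
print(effective_sps(2))   # 128 - dependency-heavy code leaves slots idle
```

That packing trade-off is part of the "drivers will improve" argument in this thread: as the shader compiler gets better, the same hardware extracts more of its peak.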

#27 SDxSnOOpZ
Member since 2006 • 225 Posts
I remember all the hype before the R600 was released, all the talk about how it would kick the crap out of the G80... then we saw real benchmarks of its performance and we were not so impressed. Anyway, my point is, you won't know anything until we see it in action.
#28 Alpha_Omega69
Member since 2004 • 11840 Posts

I remember all the hype before the R600 was released, all the talk about how it would kick the crap out of the G80... then we saw real benchmarks of its performance and we were not so impressed. Anyway, my point is, you won't know anything until we see it in action.SDxSnOOpZ

Lol, but it's already in action? Were you in some cave for this past month? In some benchmarks the HD2900XT murdered the 8800GTS and caught up to the 8800GTX. Though that was with AA disabled; when AA was turned on it dropped. Look, I own 2 8800 GTXs and I know what the R600s are capable of. So get it through your thick fanboy skull.

#29 SDxSnOOpZ
Member since 2006 • 225 Posts
[QUOTE="SDxSnOOpZ"][QUOTE="Alpha_Omega69"]

[QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedWesker776

...That's all you could come up with? :|

Anyways, yeah, if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.

How do you know that the R600 is more future-proof in DX10? Are you a psychic?

ATi actually followed MS's exact DX10 guidelines.

Why do you think ATi went with a packed superscalar architecture (320 SPs)?

nVidia doesn't follow the standard DX9 or DX10 guidelines because they want to help developers optimize the code for their GPUs, hence the motto "The Way It's Meant to be Played". And since ATi actually followed the guidelines... there are no optimizations, or it's even deoptimized.

#30 SDxSnOOpZ
Member since 2006 • 225 Posts

[QUOTE="SDxSnOOpZ"]I remember all the hype before the R600 was released, all the talk about how it would kick the crap out of the G80... then we saw real benchmarks of its performance and we were not so impressed. Anyway, my point is, you won't know anything until we see it in action.Alpha_Omega69

Lol, but it's already in action? Were you in some cave for this past month? In some benchmarks the HD2900XT murdered the 8800GTS and caught up to the 8800GTX. Though that was with AA disabled; when AA was turned on it dropped. Look, I own 2 8800 GTXs and I know what the R600s are capable of. So get it through your thick fanboy skull.

I've seen more 2900XT vs 8800 benchmarks than you ever will, don't think that I don't know. But I'm just going by the facts: in all the benchmarks that I've seen, the 2900XT got its ass whooped. I thought it was supposed to be way faster than the 8800GTX. Stop assuming things and thinking you know what the R600 is capable of... you don't know jack.

#31 Wesker776
Member since 2005 • 7004 Posts
[QUOTE="Wesker776"][QUOTE="SDxSnOOpZ"][QUOTE="Alpha_Omega69"]

[QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedSDxSnOOpZ

...That's all you could come up with? :|

Anyways, yeah, if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.

How do you know that the R600 is more future-proof in DX10? Are you a psychic?

ATi actually followed MS's exact DX10 guidelines.

Why do you think ATi went with a packed superscalar architecture (320 SPs)?

nVidia doesn't follow the standard DX9 or DX10 guidelines because they want to help developers optimize the code for their GPUs, hence the motto "The Way It's Meant to be Played". And since ATi actually followed the guidelines... there are no optimizations, or it's even deoptimized.

:|

Quite possibly the most ridiculous understanding of guidelines I've ever heard.

#32 TrailorParkBoy
Member since 2006 • 2922 Posts

[QUOTE="SDxSnOOpZ"]I remember all the hype before the R600 was released, all the talk about how it would kick the crap out of the G80... then we saw real benchmarks of its performance and we were not so impressed. Anyway, my point is, you won't know anything until we see it in action.Alpha_Omega69

Lol, but it's already in action? Were you in some cave for this past month? In some benchmarks the HD2900XT murdered the 8800GTS and caught up to the 8800GTX. Though that was with AA disabled; when AA was turned on it dropped. Look, I own 2 8800 GTXs and I know what the R600s are capable of. So get it through your thick fanboy skull.

We were all led to believe that the HD2900XT was going to pwn the day it was out, but now all I see is a card that blows at AA. Who knows, maybe it will get a lot better and I am wrong. All I am pointing out is that the card has already disappointed me once, and I am not going to hype it again.
#33 Alpha_Omega69
Member since 2004 • 11840 Posts

I've seen more 2900XT vs 8800 benchmarks than you ever will, don't think that I don't know. But I'm just going by the facts: in all the benchmarks that I've seen, the 2900XT got its ass whooped. I thought it was supposed to be way faster than the 8800GTX. Stop assuming things and thinking you know what the R600 is capable of... you don't know jack.

SDxSnOOpZ

Lol, the R600 was never meant to be better than the GTX, MO-RON. It was meant to go against the GTS. The fact that it DOES catch up to the GTX in some benchmarks makes it a better card. And would you care to post these so-called benchmarks that show the HD2900XT getting its ass whooped?

#34 SDxSnOOpZ
Member since 2006 • 225 Posts

Why don't you post links to the benchmarks where the 2900XT "murdered" the 8800GTS?

#35 Alpha_Omega69
Member since 2004 • 11840 Posts

Here are a few from ExtremeTech:

http://www.extremetech.com/article2/0,1697,2129298,00.asp

http://www.extremetech.com/article2/0,1697,2129299,00.asp

As you can see, the 2900 is on most if not all occasions better than the 8800GTS. How about you post some of yours then, Snoopz?

#36 Wesker776
Member since 2005 • 7004 Posts

Why don't you post links to the benchmarks where the 2900XT "murdered" the 8800GTS?

SDxSnOOpZ

http://www.firingsquad.com/hardware/diamond_radeon_2900_xt_1gb/page1.asp

But to be honest, there are times when the 2900 XT slips to GTS levels (mainly when HDR + 4xAA + 16xAF is enabled), but all single cards usually choke under those conditions (1920x1200 resolution).

#37 SDxSnOOpZ
Member since 2006 • 225 Posts

http://firingsquad.com/hardware/amd_ati_radeon_hd_2900_xt_performance_preview/default.asp

http://www.hardocp.com/article.html?art=MTM1MSwsLGhlbnRodXNpYXN0

http://www.anandtech.com/video/showdoc.aspx?i=2988

#38 LordEC911
Member since 2004 • 9972 Posts
http://firingsquad.com/hardware/amd_ati_radeon_hd_2900_xt_performance_preview/default.asp

http://www.hardocp.com/article.html?art=MTM1MSwsLGhlbnRodXNpYXN0

http://www.anandtech.com/video/showdoc.aspx?i=2988
SDxSnOOpZ

Thank you for making my guess come true...
When I saw your earlier post saying you have read more HD2900XT benches than Alpha, I thought to myself, "Hmm... I have a feeling he is going to post the HardOCP bench."
And look what happened...

Kyle and Brent had already decided the card was a flop way before they ever got one in their hands.

#39 Bebi_vegeta
Member since 2003 • 13558 Posts

[QUOTE="SDxSnOOpZ"]

I've seen more 2900XT vs 8800 benchmarks than you ever will, don't think that I don't know. But I'm just going by the facts: in all the benchmarks that I've seen, the 2900XT got its ass whooped. I thought it was supposed to be way faster than the 8800GTX. Stop assuming things and thinking you know what the R600 is capable of... you don't know jack.

Alpha_Omega69

Lol, the R600 was never meant to be better than the GTX, MO-RON. It was meant to go against the GTS. The fact that it DOES catch up to the GTX in some benchmarks makes it a better card. And would you care to post these so-called benchmarks that show the HD2900XT getting its ass whooped?

And I'm still waiting for the 2900XTX... unless it's the 1GB version?

#40 domke13
Member since 2006 • 2891 Posts

Will Crysis use that AA through shaders??? Will UT3 use it?? Will BioShock use it?? Will World in Conflict use it??? Will ET: QW use it??? I think not. We will see AA through shaders when the first true (!!) DX10 titles come - titles that will only support DX10 and Vista, not XP and DX9. That's when, IMO, we will see AA through shaders. We will see how the cards play games like Crysis and BioShock, and that's when we will see which card is better. It's pointless to argue here on the forums about how the 2900 or 8800 will suck in DX10 games compared to the opposite. Maybe the R600 has tessellation and AA through shaders; the G80s also have quantum effects. And IMO, by the time AA through shaders is used, we will all have newer cards. Wesker said that the 1900 has yet to disappoint him in DX9 games. Has the 7900GTX disappointed anyone in DX9 games yet??

Also, I couldn't laugh enough when I found out that the R600 doesn't use that "superior HD video decoding" like ATI told us.

I even saw commercials where they were pointing that thing out as "one of the main advantages of R600" compared to the G80.

And now we find out that the R600 uses the old video decoder from the 1900. The 8800 series have better image quality, produce less heat, use less energy, and are better with AA on.

For now that tessellation and AA through shaders means nothing and the 8800s are the better buy. That could change in the future when/if developers start to use AA through shaders and tessellation, but by then I think we will have much better cards out.

#41 LordEC911
Member since 2004 • 9972 Posts
Also, I couldn't laugh enough when I found out that the R600 doesn't use that "superior HD video decoding" like ATI told us.

I even saw commercials where they were pointing that thing out as "one of the main advantages of R600" compared to the G80.

And now we find out that the R600 uses the old video decoder from the 1900. The 8800 series have better image quality, produce less heat, use less energy, and are better with AA on.

For now that tessellation and AA through shaders means nothing and the 8800s are the better buy. That could change in the future when/if developers start to use AA through shaders and tessellation, but by then I think we will have much better cards out.domke13

You saw a commercial? I highly doubt that.

The R600 does have superior HD video decoding...
It does NOT use the same decoder as from the 1900...
The G80 does NOT have better IQ, if anything they are equal... (less heat = less energy btw)

So if you could go back, would you have bought an X1800XT or a 7800GTX?
The 7800GTX was better at launch, but the X1800XT improved past it over time.
So the question is: do you want slightly more performance now, or more performance a year from now?

Avatar image for Bebi_vegeta
Bebi_vegeta

13558

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#42 Bebi_vegeta
Member since 2003 • 13558 Posts
[QUOTE="domke13"]Also, I couldn't laugh hard enough when I found out that the R600 doesn't use that "superior HD video decoding" ATI told us about.

I even saw advertisements pointing that out as one of the main advantages of the R600 over the G80.

And now we find out that the R600 uses the old video decoder from the X1900. The 8800 series has better image quality, produces less heat, uses less energy, and does better with AA on.

For now, tessellation and AA through shaders mean nothing, and the 8800s are the better buy. That could change in the future when/if developers start using AA through shaders and tessellation, but by then I think we will have much better cards out.LordEC911

You saw a commercial? I highly doubt that.

The R600 does have superior HD video decoding...
It does NOT use the same decoder as the X1900...
The G80 does NOT have better IQ; if anything, they are equal... (less heat = less energy, btw)

So if you could go back, would you have bought an X1800XT or a 7800GTX?
The 7800GTX was better at launch, but the X1800XT improved past it over time.
So the question is: do you want slightly more performance now, or more performance a year from now?

The revised 7800GTX 512MB is equal, and most of the time better.

Avatar image for guyver23
guyver23

877

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#43 guyver23
Member since 2007 • 877 Posts

I say it's a coin-toss situation.

Avatar image for domke13
domke13

2891

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#44 domke13
Member since 2006 • 2891 Posts
[QUOTE="domke13"]Also, I couldn't laugh hard enough when I found out that the R600 doesn't use that "superior HD video decoding" ATI told us about.

I even saw advertisements pointing that out as one of the main advantages of the R600 over the G80.

And now we find out that the R600 uses the old video decoder from the X1900. The 8800 series has better image quality, produces less heat, uses less energy, and does better with AA on.

For now, tessellation and AA through shaders mean nothing, and the 8800s are the better buy. That could change in the future when/if developers start using AA through shaders and tessellation, but by then I think we will have much better cards out.LordEC911

You saw a commercial? I highly doubt that.

The R600 does have superior HD video decoding...
It does NOT use the same decoder as the X1900...
The G80 does NOT have better IQ; if anything, they are equal... (less heat = less energy, btw)

So if you could go back, would you have bought an X1800XT or a 7800GTX?
The 7800GTX was better at launch, but the X1800XT improved past it over time.
So the question is: do you want slightly more performance now, or more performance a year from now?

I would get the 7800GTX. That way I'd enjoy the better card for a year, and by the time it got beaten by the X1800XT I would probably have a new high-end card, since I have a summer job now and make enough money. And even by the time the X1800 got better, it would be too late for it to run most new games on max. Hmm, will you say you are smarter than the hardware reviewer from the PC magazine I read? That guy goes to all the conferences; he was at the ATI launch event, and I seriously doubt you know more than him about the video decoder. Or I might be wrong; maybe he has no clue about GPUs and works for the magazine just because he has nothing else to do. About IQ: the 8800 series has better filtering. I even saw some comparison pics on the web. I'll try to find them for you.

Avatar image for SDxSnOOpZ
SDxSnOOpZ

225

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#45 SDxSnOOpZ
Member since 2006 • 225 Posts
Both are great cards; they're both future-proof and will run DX10 games flawlessly. You can't go wrong with either one. As long as you can get above 30 FPS in games, you'll be happy.
Avatar image for domke13
domke13

2891

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#46 domke13
Member since 2006 • 2891 Posts

Both are great cards; they're both future-proof and will run DX10 games flawlessly. You can't go wrong with either one. As long as you can get above 30 FPS in games, you'll be happy.SDxSnOOpZ

Agreed.

Avatar image for Samulies
Samulies

1658

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#47 Samulies
Member since 2005 • 1658 Posts

I'm happy with a Mountain Dew, but I do like my games to run smooth...

I'm grabbing a 2900XT in a few days, personally. My choice.

BTW: does anyone know anything about XpertVision? Good brand?

EDIT: BTW, I hear that all games that display "Games for Windows" will have support for ATI's style of shading, and that shows in the results in Company of Heroes, a Games for Windows title.

Avatar image for Wesker776
Wesker776

7004

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#48 Wesker776
Member since 2005 • 7004 Posts

I'm happy with a Mountain Dew, but I do like my games to run smooth...

I'm grabbing a 2900XT in a few days, personally. My choice.

BTW: does anyone know anything about XpertVision? Good brand?

EDIT: BTW, I hear that all games that display "Games for Windows" will have support for ATI's style of shading, and that shows in the results in Company of Heroes, a Games for Windows title.

Samulies

XpertVision is pretty dodgy IMO.

You're Aussie, right?

Go to umart.com.au and buy the $500 Jetway 2900 XT rather than the XpertVision.

EDIT: As for the superior decoder, R610 and R630 win.

Avatar image for yoyo462001
yoyo462001

7535

Forum Posts

0

Wiki Points

0

Followers

Reviews: 10

User Lists: 0

#49 yoyo462001
Member since 2005 • 7535 Posts
I would personally go for the 2900XT; to be honest, I think both cards have the same performance.
Avatar image for Wesker776
Wesker776

7004

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#50 Wesker776
Member since 2005 • 7004 Posts

Will Crysis use AA through shaders? Will UT3? Will BioShock? Will World in Conflict? Will ET: QW? I think not. We will see AA through shaders when the first true DX10 titles come.

Those games use the DX10 API, yes?

Then yes, we will see AA through shaders. :| It's a fricken guideline, not an option, in the DX10 API.
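For anyone wondering what "AA through shaders" actually means: on DX10-class hardware a pixel shader can read the individual MSAA samples of a pixel and combine them itself, instead of relying on the fixed-function box-filter resolve. Here's a toy Python sketch of that idea (not real shader code or any actual driver behaviour; the function and names are made up purely for illustration):

```python
# Toy illustration of a programmable MSAA resolve, the core of "AA through
# shaders": instead of the hardware averaging the samples with a fixed box
# filter, the shader reads each sample and can weight them however it wants.

def resolve_pixel(samples, weights=None):
    """Combine one pixel's MSAA samples into a final colour.

    samples: list of (r, g, b) tuples, one per MSAA sample.
    weights: optional per-sample weights; defaults to equal weights,
             i.e. the plain box filter a fixed-function resolve uses.
    """
    n = len(samples)
    if weights is None:
        weights = [1.0 / n] * n  # box filter = ordinary MSAA resolve
    return tuple(
        sum(w * s[c] for w, s in zip(weights, samples))
        for c in range(3)
    )

# 4x MSAA edge pixel: two samples hit a white triangle, two hit black background.
pixel = [(1.0, 1.0, 1.0), (1.0, 1.0, 1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_pixel(pixel))  # -> (0.5, 0.5, 0.5), a mid-grey edge pixel
```

A custom resolve like this is what lets engines with deferred rendering or HDR apply AA correctly, which is why it matters for the DX10 games being argued about above.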