[QUOTE="GamingMonkeyPC"]I don't mind recommending the HD2900XT over my own 8800 GTS 640MB. I've been seeing more benchmarks showing the HD2900XT beating the 8800 GTS in R6: Vegas, so my guess is that most Unreal Engine 3 games (UT3, BioShock, Gears of War, *maybe* Mass Effect, etc.) will also run better on the HD2900XT - but who knows. Both are great cards; you can't go wrong with one or the other.[/QUOTE]
Indeed. You won't go wrong either way.
[QUOTE="TrailorParkBoy"]http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 In that DirectX 10 benchmark the 8600 GTS is equal to the HD2900XT (the drivers are more optimised for Nvidia and the game has been optimised for Nvidia's cards, but come on, an 8600 should never beat a HD2900XT). I would recommend the 8800 GTS 640, but it's your call. Also, be sure to read the whole article; it's interesting (that link goes to the middle of it).[/QUOTE]
Um, no. What ATI drivers were they? And you should not judge the card based on that one benchmark.
[QUOTE="TrailorParkBoy"]http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 On that benchmark in Direct X 10 the 8600 GTS is equal to the HD2900XT (the drivers are more optimised for Nvidia and the game has been optimized for nvidias cards but come on, a 8600 should never beat a HD2900XT). I would recommend the 8800 GTS 640 but its your call. Also be sure to read all of that article its interesting (that link is to the middle of the article so you might wanna read the hole thing).lettuceman44um, no. What ATI drivers were they? And you should not judge the card just based on that benchmark..........
[QUOTE="lettuceman44"][QUOTE="TrailorParkBoy"]http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=432&Itemid=29&limit=1&limitstart=2 On that benchmark in Direct X 10 the 8600 GTS is equal to the HD2900XT (the drivers are more optimised for Nvidia and the game has been optimized for nvidias cards but come on, a 8600 should never beat a HD2900XT). I would recommend the 8800 GTS 640 but its your call. Also be sure to read all of that article its interesting (that link is to the middle of the article so you might wanna read the hole thing).TrailorParkBoyum, no. What ATI drivers were they? And you should not judge the card just based on that benchmark..........
Ummm... ATI didn't get WiC until the end of June...
So how would they have driver support for the game if the driver had already been sent to MS in mid-June?
Expect a nice performance increase in WiC under DX10 on the HD2900XT when Catalyst 7.7 comes out, or whatever the July release is numbered.
LordEC911, you seem to be the leading expert on all things AMD/ATI, so I'm going to ask you a couple of questions. There are varied results in all the benchmarks I have seen so far; the two cards take turns coming out on top, but for the most part ATI seems to be ahead. The drivers are also still immature, which is a good thing, because won't performance only improve as the drivers mature? (That's what I'm thinking, though I'm not sure.) And if performance does increase with drivers, then the 2900 XT should be ahead of the 8800 GTS very soon. People have also said the ATI card is more future-proof, which is another plus in my opinion. But remember that I'm going to be running probably 4x AA and at least 8x AF in every game I play. Knowing that, which should I buy? I've been an NVIDIA guy all my life, starting with the GeForce 2 series, but I think it's time for a little change, since ATI seems superior to the GTS for the most part. Thanks.
Well, my personal opinion is that it all comes down to pricing. If you can find an HD2900XT for under $400, I would personally get that, but if it's over $400 I would just grab either GTS, 640MB or 320MB.
The drivers will get better over time, not by huge leaps but they should definitely crawl upwards a bit.
As for the AA, the folks at AMD decided to take a different route and have the shaders do the AA, which seems to be where future games are heading. So the AA performance will definitely get better with future games.
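To make the ROP-vs-shader distinction concrete: a fixed-function resolve is a plain box average baked into the ROP hardware, while a shader resolve is just a program over the raw samples, so it can do things like tone-map before averaging (which matters for HDR). Here is a minimal C++ sketch of the idea - a toy model only, with a made-up sample count and tone curve, not actual hardware or driver code:
[code]
#include <array>
#include <cstdio>

// One pixel's multisample data: four color samples (4x MSAA), one channel shown.
using Samples = std::array<float, 4>;

// Fixed-function "ROP" resolve: the hardware can only box-average the samples.
float resolve_rop(const Samples& s) {
    return (s[0] + s[1] + s[2] + s[3]) / 4.0f;
}

float tonemap(float c) { return c / (1.0f + c); }  // simple Reinhard-style curve

// Shader-based resolve: an ordinary shader reads each sample individually,
// so it can apply any filter, e.g. tone-map *before* averaging. A fixed
// resolve must average first, which produces visibly wrong HDR edges.
float resolve_shader(const Samples& s) {
    float out = 0.0f;
    for (float sample : s) out += tonemap(sample);
    return out / 4.0f;
}

int main() {
    Samples edge = {8.0f, 8.0f, 0.1f, 0.1f};  // a bright/dark HDR edge pixel
    std::printf("ROP resolve, then tonemap: %.3f\n", tonemap(resolve_rop(edge)));
    std::printf("shader resolve:            %.3f\n", resolve_shader(edge));
}
[/code]
The flip side, and the likely reason the 2900 XT takes a bigger hit with AA enabled today, is that every resolve now costs shader cycles that dedicated ROP hardware would otherwise do for free.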
[QUOTE="LordEC911"]So the AA performance will definitely get better with future games.[/QUOTE]
Not if NVIDIA slips MS/devs a healthy sum of money to keep doing AA through the ROPs.
Like how they most likely paid off MS over the DX10 guidelines, because the G80 can't do memory virtualisation (the R600 can, because ATi/AMD followed the DX10 guidelines).
As for the AA, the folks at AMD decided to take a different route and have the shaders do the AA which seems to be where future games are heading. So the AA performance will definitely get better with future games. Wesker776
Not if NVIDIA slips MS/Devs with a healthy sum of money to continue to adopt AA through the ROPs.
Like how they most likely paid off MS over DX10 guidelines, because the G80 can't do memory virtualisation (R600 can because ATi/AMD followed DX10 guidelines).
They can't, at least not for the near-future games coming into 2008, since those engines are already in development and had to be cleared by MS.
[QUOTE="Wesker776"]As for the AA, the folks at AMD decided to take a different route and have the shaders do the AA which seems to be where future games are heading. So the AA performance will definitely get better with future games. LordEC911
Not if NVIDIA slips MS/Devs with a healthy sum of money to continue to adopt AA through the ROPs.
Like how they most likely paid off MS over DX10 guidelines, because the G80 can't do memory virtualisation (R600 can because ATi/AMD followed DX10 guidelines).
They can't, well at least the near future games into 2008, since they have already started on the engines and had to be cleared by MS.
That's good to hear.
I'm still on the fence about whether I should wait for the R650. My X1900 is still running strong, but I really wouldn't mind going R600 now with my Q6600 (yes, I pre-ordered yesterday :D ).
R650 as in the high-end XTX or XT model? If I were you I would just wait for the R700; that's what I plan to do. Or grab an RV670 this fall, if the refresh is still going to happen.
[QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be Played[/QUOTE]
...That's all you could come up with? :|
Anyway, yeah, if you just have to have AA in current titles then go with the 8800 GTS, but if you want more future-proofing in DX10 go with the R600.
[QUOTE="Wesker776"]That's good to hear.I'm still on the fence on whether or not I should wait for R650. My X1900 is still running strong, but I really wouldn't mind going R600 now with my Q6600 (yes, I pre-ordered yesterday :D ).LordEC911
R650 as in the high end XTX or XT model? If I were you I would just wait for the R700, that's what I plan to do. Or grab a RV670 this fall, if the refresh is still going to happen.
R650 as in the 2950 XTX (?).
I might be able to wait for the R700, but a $500 (AUD) 2900 XT is pretty tempting at the moment. I dunno; I'll buy a new video card when I get an X38 board (for possible CrossFire), whichever card is king of the hill then.
[QUOTE="LordEC911"][QUOTE="Wesker776"]That's good to hear.I'm still on the fence on whether or not I should wait for R650. My X1900 is still running strong, but I really wouldn't mind going R600 now with my Q6600 (yes, I pre-ordered yesterday :D ).Wesker776
R650 as in the high end XTX or XT model? If I were you I would just wait for the R700, that's what I plan to do. Or grab a RV670 this fall, if the refresh is still going to happen.
R650 as in 2950 XTX (?).
I might be able to wait for R700, but a $500 (AUD) 2900 XT is pretty tempting at the moment. I dunno, I'll buy a new video card when I get an X38 board (for possible CrossFire), whichever is king of the hill then.
So why can't you get whatever is king of the hill now?
[QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedAlpha_Omega69
...Thats all you could come up with? :|
Anyways yeah if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.
How do you know that the R600 is more future-proof in DX10? Are you psychic?
[QUOTE]So why can't you get whatever is king of the hill now?[/QUOTE]
Because:
- I'm concerned about NVIDIA's hastiness to DX10.
- R600's power consumption/heat dissipation is way too high and the 65/55nm shrink should fix this.
- A good quality GTX (XFX/EVGA) is nearly twice as expensive as a 2900 XT here.
- My X1900 XTX has yet to fail me in a DX9 game.
[QUOTE="Alpha_Omega69"][QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedSDxSnOOpZ
...Thats all you could come up with? :|
Anyways yeah if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.
Why do you think the R600 is more future-proof?
Because more devs will lean toward doing AA through the stream processors, which will mean huge performance hits for the G80s with AA on, unless they can figure out a way to do AA both ways. And there's that thing the G80s don't have that the R600s do: memory virtualisation. I don't know what it does, but it sounds important :lol:
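For what it's worth, "memory virtualisation" in the DX10/WDDM sense roughly means that GPU resources live in a virtual address space and get paged between VRAM and system RAM on demand, rather than every allocation having to fit in VRAM up front. A toy C++ sketch of the idea - the page size, layout, and lookup are all made up for illustration; this is not how any real driver is written:
[code]
#include <cstdio>
#include <unordered_map>

// Toy model of GPU memory virtualisation: a page table maps virtual pages
// to a backing store, so a resource can be partly resident in VRAM and
// partly spilled to system RAM instead of failing to fit.
enum class Backing { VRAM, SYSTEM_RAM };
struct PageEntry { Backing where; long long physical_page; };

int main() {
    const long long PAGE = 64 * 1024;            // hypothetical 64 KB pages
    std::unordered_map<long long, PageEntry> page_table;
    page_table[0] = {Backing::VRAM,       42};   // hot texture page: in VRAM
    page_table[1] = {Backing::SYSTEM_RAM,  7};   // cold page: spilled over PCIe

    long long vaddr = 1 * PAGE + 512;            // an address inside virtual page 1
    const PageEntry& e = page_table.at(vaddr / PAGE);
    std::printf("virtual %lld -> %s, physical page %lld, offset %lld\n",
                vaddr, e.where == Backing::VRAM ? "VRAM" : "system RAM",
                e.physical_page, vaddr % PAGE);
}
[/code]
Whether the R600 exposing this buys anything in practice depends entirely on whether games and Vista's driver model actually lean on it, which is exactly what this thread is arguing about.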
[QUOTE="Alpha_Omega69"][QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedSDxSnOOpZ
...Thats all you could come up with? :|
Anyways yeah if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.
how do youknow that the R600 is more future-proof in dx10, are you a phsycic?
ATi actually followed MS's exact DX10 guidelines.
Why do you think ATi went with a packed superscalar architecture (320 SPs)?
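For context on that "320 SPs" figure: the R600's count comes from 64 shader units that are each five ALUs wide (a VLIW-style design), so its real throughput depends on how many independent scalar operations the compiler can pack into each five-slot instruction word, while the G80's 128 scalar SPs run at a separate, higher shader clock and are always fully used. A back-of-the-envelope C++ sketch - the clocks are the published reference clocks, and one op per SP per clock is a deliberate simplification that ignores dual-issue and MAD details on both sides:
[code]
#include <cstdio>

// Rough model of why "320 SPs" (R600) vs "128 SPs" (G80) is not a direct
// comparison: the R600 number only translates into throughput when the
// compiler can fill the five VLIW slots with independent work.
int main() {
    const int    r600_units = 64;           // 64 VLIW units x 5 ALUs = 320 SPs
    const double r600_clock = 0.742;        // HD 2900 XT core/shader, GHz
    const int    g80_sps = 128;
    const double g80_clock = 1.35;          // 8800 GTX shader domain, GHz

    const double g80_gops = g80_sps * g80_clock;   // scalar SPs: always packed

    for (int packed = 2; packed <= 5; ++packed) {  // avg slots filled out of 5
        double r600_gops = r600_units * packed * r600_clock;
        std::printf("%d/5 slots packed: R600 ~%.0f Gops/s, G80 ~%.0f Gops/s\n",
                    packed, r600_gops, g80_gops);
    }
}
[/code]
At full packing the R600 looks much wider on paper; with mostly serial shader code it can land at or below G80 throughput, which is one plausible reason the benchmarks in this thread swing both ways.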
[QUOTE="SDxSnOOpZ"]I remember all the hype before the R600 was released, all the talk about how it would kick the crap out of the G80... then we saw real benchmarks of its performance and we weren't so impressed. Anyway, my point is, you won't know anything until we see it in action.[/QUOTE]
Lol, but it's already in action? Were you in some cave this past month? In some benchmarks the HD2900XT murdered the 8800 GTS and caught up to the 8800 GTX - though that was with AA disabled; when AA was turned on it dropped. Look, I own two 8800 GTXs and I know what the R600s are capable of. So get it through your thick fanboy skull.
[QUOTE="SDxSnOOpZ"][QUOTE="Alpha_Omega69"][QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedWesker776
...Thats all you could come up with? :|
Anyways yeah if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.
how do youknow that the R600 is more future-proof in dx10, are you a phsycic?
ATi actually followed MS's exact DX10 guidelines.
Why do you think ATi went with a packed super scalar architecture (320 SPs)?
nVidia doesn't follow the standard DX9 or DX10 guidelines because they want to help developers optimise the code for their GPUs, hence the motto "The Way It's Meant to be Played". And since ATi actually followed the guidelines... there are no optimisations, or it's even de-optimised.
[QUOTE="SDxSnOOpZ"]i remember all the hype before the R600 was released, all the talks about how it will kick thecrap outof G80..thenwe saw real benchmarks of its performance and how were not so impressed. anyway, my point is, you won't know anything untilwe see it in action.Alpha_Omega69
Lol but its already in action? Were you in some cave for this past month? In some benchmarks the HD2900XT murdered the 8800GTS and cought up to the 8800GTX. Though it was with AA disbaled, when AA was turned on it dropped. Look I own 2 8800 GTx's and I know what the R600's are capable of. So get it through your thick fanboy skull.
I've seen more 2900 XT vs 8800 benchmarks than you ever will, so don't think I don't know. I'm just going by the facts: in all the benchmarks I've seen, the 2900 XT got its ass whooped. I thought it was supposed to be way faster than the 8800 GTX. Stop assuming things and thinking you know what the R600 is capable of... you don't know jack.
[QUOTE="Wesker776"][QUOTE="SDxSnOOpZ"][QUOTE="Alpha_Omega69"][QUOTE="SDxSnOOpZ"]Nvidia - The Way It's Meant to be PlayedSDxSnOOpZ
...Thats all you could come up with? :|
Anyways yeah if you just have to have AA on previous titles then go with the 8800GTS, but if you want more future-proofing in DX10 go with the R600.
how do youknow that the R600 is more future-proof in dx10, are you a phsycic?
ATi actually followed MS's exact DX10 guidelines.
Why do you think ATi went with a packed super scalar architecture (320 SPs)?
nVidia doesn't follow the standard DX9 or DX10 guideline because they want to help developers to optimize the code for their GPU, hence the motto "The way its meant to be played", and since ATi actually followed the guideline...there are no optimization, or even its deoptimized.
:|
Quite possibly the most ridiculous understanding of guidelines I've ever heard.
[QUOTE="SDxSnOOpZ"]i remember all the hype before the R600 was released, all the talks about how it will kick thecrap outof G80..thenwe saw real benchmarks of its performance and how were not so impressed. anyway, my point is, you won't know anything untilwe see it in action.Alpha_Omega69
Lol but its already in action? Were you in some cave for this past month? In some benchmarks the HD2900XT murdered the 8800GTS and cought up to the 8800GTX. Though it was with AA disbaled, when AA was turned on it dropped. Look I own 2 8800 GTx's and I know what the R600's are capable of. So get it through your thick fanboy skull.
We were all led to believe that the HD2900XT was going to pwn from the day it was out, but now all I see is a card that struggles with AA. Who knows, maybe it will get a lot better and I'm wrong. All I'm pointing out is that the card has already disappointed me once, and I'm not going to hype it again.
Lol, the R600 was never meant to be better than the GTX, MO-RON. It was meant to go up against the GTS. The fact that it DOES catch up to the GTX in some benchmarks makes it a better card. And would you care to post these so-called benchmarks that show the HD2900XT getting its ass whooped?
Here are a few from ExtremeTech:
http://www.extremetech.com/article2/0,1697,2129298,00.asp
http://www.extremetech.com/article2/0,1697,2129299,00.asp
As you can see, the 2900 is on most if not all occasions better than the 8800 GTS. How about you post some of yours then, Snoopz?
[QUOTE="SDxSnOOpZ"]Why don't you post up links to the benchmarks where the 2900 XT "murdered" the 8800 GTS?[/QUOTE]
http://www.firingsquad.com/hardware/diamond_radeon_2900_xt_1gb/page1.asp
But to be honest, there are times when the 2900 XT slips to GTS levels (mainly when HDR + 4x AA + 16x AF is enabled), but all single cards usually choke under those conditions at 1920x1200.
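Some quick arithmetic shows why that particular combination is brutal: an FP16 HDR render target at 1920x1200 with 4x MSAA carries four 8-byte color samples plus depth per pixel. A small C++ estimate of the raw surface sizes - the depth format and the absence of compression are assumptions here; real hardware compresses both:
[code]
#include <cstdio>

int main() {
    const long long w = 1920, h = 1200;
    const int samples = 4;          // 4x MSAA
    const int color_bytes = 8;      // FP16 RGBA (HDR): 4 channels x 2 bytes
    const int depth_bytes = 4;      // assuming a typical D24S8 depth/stencil

    const double MB = 1024.0 * 1024.0;
    long long color = w * h * samples * color_bytes;
    long long depth = w * h * samples * depth_bytes;
    std::printf("color: %.1f MB, depth: %.1f MB, total: %.1f MB\n",
                color / MB, depth / MB, (color + depth) / MB);
    // ~70 MB of color plus ~35 MB of depth; every overdrawn pixel re-touches
    // these surfaces, so memory bandwidth, not shader power, becomes the wall.
}
[/code]
That is why every single card of this generation "chokes" at those settings: the framebuffer traffic alone saturates the memory bus long before the shaders run out of work.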
http://firingsquad.com/hardware/amd_ati_radeon_hd_2900_xt_performance_preview/default.asp
http://www.hardocp.com/article.html?art=MTM1MSwsLGhlbnRodXNpYXN0
http://www.anandtech.com/video/showdoc.aspx?i=2988
Thank you for making my guess come true...
When I saw your earlier post saying you have read more HD2900XT benches than Alpha, I thought to myself, "Hmm... I have a feeling he is going to post the HardOCP bench."
And look what happens...
Kyle and Brent had already decided the card was a flop well before they ever got one in their hands.
[QUOTE="SDxSnOOpZ"]
i've seen more 2900xt vs 8800 benchmarks then you ever will, dont think that idont know, but i'm just basing fromall the facts, all the benchmarks thatI've seen, the 2900xt got it ass wooped. I thought it was suppose to be wayfaster than 8800gtx. Stop assuming things andthink you knowwhat ther600is capable of..you dont know jack.
Alpha_Omega69
Lol, the R600 was never meant to be better than the GTX MO-RON. It was meant to go against the GTS. The fact that it DOES catch up to the GTX in some benchmarks makes it a better card. And would you care to post these so called benchmarks that show the HD2900XT getting its ass wooped?
And I'm still waiting for the 2900 XTX... unless it's the 1GB version?
Will Crysis use that AA through shaders? Will UT3 use it? Will BioShock use it? Will World in Conflict use it? Will ET:QW use it? I think not. We will see AA through shaders when the first true DX10 titles come - titles that only support DX10 and Vista, not XP and DX9. That's when, IMO, we will see AA through shaders. We will see how the cards play games like Crysis and BioShock, and that's when we will see which card is better. It's pointless to argue here on the forums about how the 2900 or the 8800 will suck in DX10 games compared to the other. Maybe the R600 has tessellation and AA through shaders; the G80 also has Quantum Effects. And IMO, by the time AA through shaders is used, we will all have newer cards. Wesker said his X1900 has yet to disappoint him in DX9 games. Has the 7900GTX disappointed anyone in DX9 games yet?
Also, I couldn't laugh enough when I found out that the R600 doesn't use that "superior HD video decoding" like ATI told us.
I even saw commercials where they pointed that out as one of the main advantages of the R600 compared to the G80.
And now we find out that the R600 uses the old video decoder from the X1900. The 8800 series has better image quality, produces less heat, uses less energy, and is better with AA on.
For now that tessellation and AA through shaders mean nothing and the 8800s are the better buy. That could change in the future when/if developers start to use AA through shaders and tessellation, but by then I think we will have much better cards out.
You saw a commercial? I highly doubt that.
The R600 does have superior HD video decoding...
It does NOT use the same decoder as the X1900...
The G80 does NOT have better IQ; if anything they are equal... (and less heat = less energy, btw)
So if you could go back, would you have bought an X1800XT or a 7800GTX?
The 7800GTX was better at launch, but the X1800XT improved past it over time.
So the question is, do you want slightly more performance now or more performance a year from now?
[QUOTE="domke13"]Also i couldnt laughed enough when i found out that R600 doesnt use that "superior" HD video decoding" like ATI has told us.I even saw comercials where they were poitnintg that thing out as "one of main advantages of R600" compared to G80.
And now we found out that R600 uses old video decoder from 1900. 8800 series have better image quality, produce less heat, use less energy, and are better whit AA on.
For now that tesselation and AA through shaders means nothing and 8800 are better buy. That could change in future when/if developers will start to use AA through shaders and tesselation, but by than i think we will have much better cards out.LordEC911
You saw a commercial? I highly doubt that.
The R600 does have superior HD video decoding...
It does NOT use the same decoder as from the 1900...
The G80 does NOT have better IQ, if anything they are equal... (less heat = less energy btw)
So if you could go back, would you have bought a X1800XT or a 7800GTX?
The 7800GTX was better during the launch but the X1800XT improved past it over time.
So the question is, do you want slightly more performance now or more performance a year from now?
The revised 7800 GTX 512MB is equal, and most of the time better.
[QUOTE="domke13"]Also i couldnt laughed enough when i found out that R600 doesnt use that "superior" HD video decoding" like ATI has told us.I even saw comercials where they were poitnintg that thing out as "one of main advantages of R600" compared to G80.
And now we found out that R600 uses old video decoder from 1900. 8800 series have better image quality, produce less heat, use less energy, and are better whit AA on.
For now that tesselation and AA through shaders means nothing and 8800 are better buy. That could change in future when/if developers will start to use AA through shaders and tesselation, but by than i think we will have much better cards out.LordEC911
You saw a commercial? I highly doubt that.
The R600 does have superior HD video decoding...
It does NOT use the same decoder as from the 1900...
The G80 does NOT have better IQ, if anything they are equal... (less heat = less energy btw)
So if you could go back, would you have bought a X1800XT or a 7800GTX?
The 7800GTX was better during the launch but the X1800XT improved past it over time.
So the question is, do you want slightly more performance now or more performance a year from now?
I would get the 7800GTX. I'd enjoy the better card for a year, and by the time it got beaten by the X1800XT I would probably have a new high-end card anyway, since I have a summer job now and make enough money. And even by the time the X1800 got better, it couldn't run most new games on max; it would be too late for it. Hm, will you say you are smarter than the hardware reviewer from the PC magazine I read? That guy goes to all the conferences, he was at the ATi launch event, and I seriously doubt you know more than he does about the video decoder. Or I might be wrong; maybe he has no clue about GPUs and works for the magazine just because he has nothing else to do. About IQ: the 8800 series has better filtering. I even saw some comparison pics on the web. I'll try to find them for you.
I'm happy with a Mountain Dew. But I do like my games to run smooth...
I'm grabbing a 2900 XT in a few days, personally. My choice.
BTW: does anyone know anything about XpertVision? Good brand?
EDIT: BTW, I hear that all games that display "Games for Windows" will have support for ATi's style of shading, and that shows in the results in Company of Heroes, a Games for Windows title.
XpertVision is pretty dodgy IMO.
You're Aussie, right?
Go to umart.com.au and buy the $500 Jetway 2900 XT rather than the XpertVision.
EDIT: As for the superior decoder, the RV610 and RV630 win.
[QUOTE="domke13"]Will Crysis use that AA through shaders? Will UT3 use it? Will BioShock use it? Will World in Conflict use it? Will ET:QW use it? I think not. We will see AA through shaders when the first true DX10 titles come.[/QUOTE]
Those games use the DX10 API, yes?
Then yes, we will see AA through shaders. :| It's a fricken guideline, not an option, in the DX10 API.