HD 2900 XT tidbits

This topic is locked from further discussion.

#1 jfelisario
Member since 2006 • 2753 Posts
Seems like Fudzilla got their hands on the card's user manual, and it reveals quite a lot, such as the recommended power supply of 750W (!). Yeah, I kid you not, just check out the manual yourself. Eat your hearts out, power supply companies, haha... (though a quality 400-500W unit should suffice... emphasis on quality)
#2 353535355353535
Member since 2005 • 4424 Posts
Damn. All these crappy mistakes are eventually gonna put AMD out of business.
#3 DirkVDV01
Member since 2004 • 20155 Posts
I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...
#4 353535355353535
Member since 2005 • 4424 Posts
If it needs a 750-watt PSU, it better cost like 150 dollars, 'cuz the 8800 GTX only needs a 450-watt PSU.
#5 Taijiquan
Member since 2002 • 7431 Posts

I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...DirkVDV01

 

I have been saying this for months and even lately have only gotten ridiculed and called names like a school kid. I just do not see this one coming out positive for AMD. This card is a complete failure IMO unless it comes out and destroys the 8800 GTX. There is no reason it shouldn't; they have had over half a year.

#6 jfelisario
Member since 2006 • 2753 Posts

[QUOTE="DirkVDV01"]I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...Taijiquan

 

I have been saying this for months and even lately have only gotten ridiculed and called names like a school kid. I just do not see this one coming out positive for AMD. This card is a complete failure IMO unless it comes out and destroys the 8800 GTX. There is no reason it shouldn't; they have had over half a year.

For the record, I have never called you any names :) 

#7 Empirefrtw
Member since 2006 • 1324 Posts
Bah, and I was waiting to get this card; I guess I'll have to get the 8800 GTS. ATI made a big mistake on this card: most people buy 750-watt power supplies for SLI, and this one requires one just for itself, no SLI. :?
#8 Taijiquan
Member since 2002 • 7431 Posts
[QUOTE="Taijiquan"]

[QUOTE="DirkVDV01"]I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...jfelisario

 

I have been saying this for months and even lately have only gotten ridiculed and called names like a school kid. I just do not see this one coming out positive for AMD. This card is a complete failure IMO unless it comes out and destroys the 8800 GTX. There is no reason it shouldn't; they have had over half a year.

For the record, I have never called you any names :) 

 

You have been very open minded and I have enjoyed our discussions.

#9 DirkVDV01
Member since 2004 • 20155 Posts
[QUOTE="Taijiquan"]

[QUOTE="DirkVDV01"]I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...jfelisario

I have been saying this for months and even lately have only gotten ridiculed and called names like a school kid. I just do not see this one coming out positive for AMD. This card is a complete failure IMO unless it comes out and destroys the 8800 GTX. There is no reason it shouldn't; they have had over half a year.

For the record, I have never called you any names :)

He's not talking about you, jfelisario, but I'm NOT going to say which user Taijiquan talks about. Let's just say it's a user never worth mentioning. :)
#10 DirkVDV01
Member since 2004 • 20155 Posts
[QUOTE="jfelisario"][QUOTE="Taijiquan"]

[QUOTE="DirkVDV01"]I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...Taijiquan

I have been saying this for months and even lately have only gotten ridiculed and called names like a school kid. I just do not see this one coming out positive for AMD. This card is a complete failure IMO unless it comes out and destroys the 8800 GTX. There is no reason it shouldn't; they have had over half a year.

For the record, I have never called you any names :)

You have been very open minded and I have enjoyed our discussions.

And indeed, that's how it should be here, any time. Open discussions conducted in a civilised manner are encouraged here (as in any forum). Just acting like you are trying to shove your truth down somebody's throat isn't going to work, but some have a hard time figuring that out. So as far as I'm concerned, discussions are allowed and encouraged here, as long as everybody keeps acting mature, that is. :)
#11 DirkVDV01
Member since 2004 • 20155 Posts
OK, back on-topic. Sorry about the slight thread-derail. I've read the document. Nice, I do find it a good thing that it supports almost anything you can throw at it: DVI, S-Video, DVI to DVI, HDMI, HDTV, Crossfire possibilities, physics (only if you aren't using CF) and ATI Overdrive. It's nice to see, but I can imagine you need a good or very specific PSU to have the connectors. You either need 2x 6-pin connectors to power the GPU, or 1x 6-pin and 1x 8-pin connector if you want to use the Overdrive functions. And 150W of solid power should be provided in case it runs everything at full load. The power needed for this is high. No mention of the needed amps. :| But the AMD/ATI Certified Components website already mentions that PSUs in the 750W or higher range are recommended, so I can imagine you need one of those for good functionality and stability. The one from the list (which oddly enough doesn't include the new GPUs yet) that catches my attention is the PCP&C Silencer 750 Quad. I think I'm going to go and recommend this beast to everyone who dives into the HD 2900 XT :D
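Dirk's connector and wattage breakdown is essentially a PSU-sizing exercise, which can be sketched in a few lines. All of the wattage figures below are illustrative assumptions, not numbers from the manual; the point is the headroom arithmetic behind vendor recommendations like "750W or higher":

```python
# Rough PSU-sizing sketch. The per-component wattages here are
# illustrative assumptions (not from the HD 2900 XT manual); the
# idea is that vendors recommend a PSU rated well above the actual
# worst-case system draw.

def recommended_psu_watts(component_watts, headroom=0.4):
    """Sum worst-case component draw, then add headroom, since PSUs
    run most efficiently and reliably well below their rated output."""
    total = sum(component_watts.values())
    return total * (1 + headroom)

build = {
    "HD 2900 XT (peak, assumed)": 215,
    "quad-core CPU (assumed)": 130,
    "motherboard + RAM (assumed)": 60,
    "drives + fans (assumed)": 45,
}

print(round(recommended_psu_watts(build)))  # ~450W draw -> 630W recommended
```

With a ~450W worst-case draw and 40% headroom you land around 630W, which is roughly where the 625W demo system and the 750W recommendation discussed in this thread both sit.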
#12 jfelisario
Member since 2006 • 2753 Posts

OK, back on-topic. Sorry about the slight thread-derail. I've read the document. Nice, I do find it a good thing that it supports almost anything you can throw at it: DVI, S-Video, DVI to DVI, HDMI, HDTV, Crossfire possibilities, physics (only if you aren't using CF) and ATI Overdrive. It's nice to see, but I can imagine you need a good or very specific PSU to have the connectors. You either need 2x 6-pin connectors to power the GPU, or 1x 6-pin and 1x 8-pin connector if you want to use the Overdrive functions. And 150W of solid power should be provided in case it runs everything at full load. The power needed for this is high. No mention of the needed amps. :| But the AMD/ATI Certified Components website already mentions that PSUs in the 750W or higher range are recommended, so I can imagine you need one of those for good functionality and stability. The one from the list (which oddly enough doesn't include the new GPUs yet) that catches my attention is the PCP&C Silencer 750 Quad. I think I'm going to go and recommend this beast to everyone who dives into the HD 2900 XT :DDirkVDV01

Booyah!!! We have our first customer here: suggest it to him :)

#13 dbowman
Member since 2005 • 6836 Posts
HA HA! 750 watts! That's enough to power a whole house.
#14 Staryoshi87
Member since 2003 • 12760 Posts

[QUOTE="DirkVDV01"]I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...Taijiquan

 

I have been saying this for months and even lately have only gotten ridiculed and called names like a school kid. I just do not see this one coming out positive for AMD. This card is a complete failure IMO unless it comes out and destroys the 8800 GTX. There is no reason it shouldn't; they have had over half a year.

If it falls short of raping the 8800GTX, it's a failure ;) More than half a year to retaliate apparently wasn't enough. It just shows you that nVidia didn't "rush" their cards... AMD "unrushed" theirs =P

Edit: I don't care about HDMI either :)

#15 Taijiquan
Member since 2002 • 7431 Posts
[QUOTE="Taijiquan"]

[QUOTE="DirkVDV01"]I'm having a hard time not laughing from what I first read about all the AMD/ATI news. After the various delays, you really couldn't stop thinking that all the R600 news and posts here were beginning to look more like wishful thinking. Now, reality is starting to kick in...Staryoshi87

 

I have been saying this for months and even lately have only gotten ridiculed and called names like a school kid. I just do not see this one coming out positive for AMD. This card is a complete failure IMO unless it comes out and destroys the 8800 GTX. There is no reason it shouldn't; they have had over half a year.

If it falls short of raping the 8800GTX, it's a failure ;) More than half a year to retaliate apparently wasn't enough. It just shows you that nVidia didn't "rush" their cards... AMD "unrushed" theirs =P

Edit: I don't care about HDMI either :)

I completely agree. Now I know a certain GameSpot user that will come in here and try to hit you with everything in the book, "G80 was supposed to be a DirectX 9 card" and garbage like this. But I think the time frame alone tells you the G80 is simply no joke. I really hope the AMD solution is competitive, which I am sure it will be. Just not how it should be, considering how long it has taken. I have a buddy that will have his Monday morning (overnighted from ewiz on Friday). I am going to shoot over there to see what is real if benchmarks are not officially out yet.

#16 Goosestophe
Member since 2005 • 227 Posts
Well, the demo system worked with a 625W power supply; maybe the manual had a spelling mistake? I mean, I highly doubt this is the final edition of the manual.
#17 Staryoshi87
Member since 2003 • 12760 Posts
You let us know, Tajiq. :) As long as it's enough to bring the price of an 8800GTS down (I could use another one around Crysis time...) I'll be happy ;)
#18 DirkVDV01
Member since 2004 • 20155 Posts
Well, the demo system worked with a 625W power supply; maybe the manual had a spelling mistake? I mean, I highly doubt this is the final edition of the manual. Goosestophe
625 or 750W, that is a big spelling mistake if you ask me. ;)
#19 LordEC911
Member since 2004 • 9972 Posts

Thanks for the kind words, Dirk; glad to see you are somehow a mod and still stoop to low levels insulting other users...

I completely agree. Now I know a certain GameSpot user that will come in here and try to hit you with everything in the book, "G80 was supposed to be a DirectX 9 card" and garbage like this. But I think the time frame alone tells you the G80 is simply no joke. I really hope the AMD solution is competitive, which I am sure it will be. Just not how it should be, considering how long it has taken. I have a buddy that will have his Monday morning (overnighted from ewiz on Friday). I am going to shoot over there to see what is real if benchmarks are not officially out yet. Taijiquan

So you are stating that the G80 isn't a great DX9 card?
We will see how "good" it is at DX10 shortly...

I don't see how the HD 2900 XT is a failure...
It is at a $400 price point, overclocks ridiculously, consumes ~20W more than the GTX, is on par performance-wise with the GTX at high resolutions and max settings, has better IQ and more features, and is just shy of the 500 GFLOP mark at stock settings.

So please tell me how that is a failure...
Oh yeah, since you are all so "open-minded," you think that a card that performs well enough in DX9 games, should dominate in DX10 games, and is 6 months "late" when there are no games that need DX10, is a failure.

Keep up the good work guys, you make me proud...:roll:
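As an aside, the "just shy of the 500 GFLOP mark" figure can be sanity-checked from the commonly reported R600 specs (320 stream processors at a 742 MHz core clock), counting a multiply-add as 2 FLOPs per clock. A quick sketch:

```python
def peak_gflops(stream_processors, clock_mhz, flops_per_clock=2):
    # Peak throughput: each unit issues one multiply-add (2 FLOPs) per clock.
    return stream_processors * clock_mhz * flops_per_clock / 1000.0

# HD 2900 XT: 320 stream processors at 742 MHz -> 474.88 GFLOPs
print(peak_gflops(320, 742))

# 8800 GTX for comparison: 128 SPs at the 1350 MHz shader clock -> 345.6
# (counting the MADD only; Nvidia's marketing also counted a co-issued MUL)
print(peak_gflops(128, 1350))
```

So the claim is consistent with the paper specs; whether that theoretical throughput shows up in games is exactly what the rest of this thread is arguing about.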

 

#20 Taijiquan
Member since 2002 • 7431 Posts

Thanks for the kind words, Dirk; glad to see you are somehow a mod and still stoop to low levels insulting other users...

[QUOTE="Taijiquan"]I completely agree. Now I know a certain GameSpot user that will come in here and try to hit you with everything in the book, "G80 was supposed to be a DirectX 9 card" and garbage like this. But I think the time frame alone tells you the G80 is simply no joke. I really hope the AMD solution is competitive, which I am sure it will be. Just not how it should be, considering how long it has taken. I have a buddy that will have his Monday morning (overnighted from ewiz on Friday). I am going to shoot over there to see what is real if benchmarks are not officially out yet. LordEC911

So you are stating that the G80 isn't a great DX9 card?
We will see how "good" it is at DX10 shortly...

I don't see how the HD 2900 XT is a failure...
It is at a $400 price point, overclocks ridiculously, consumes ~20W more than the GTX, is on par performance-wise with the GTX at high resolutions and max settings, has better IQ and more features, and is just shy of the 500 GFLOP mark at stock settings.

So please tell me how that is a failure...
Oh yeah, since you are all so "open-minded," you think that a card that performs well enough in DX9 games, should dominate in DX10 games, and is 6 months "late" when there are no games that need DX10, is a failure.

Keep up the good work guys, you make me proud...:roll:

 

 

The fact that it is so late and only "competes" in DirectX 9 makes it a failure. Especially after all the AMD marketing/publicity spins.

In games that require raw power it does not look like it is on par with the 8800 GTX. We will see tomorrow, I guess, although I am sure there will always be beta drivers. :roll:

Here is the cycle:

The AMD faithful bash the Nvidia G80 as long and hard as possible, because the new R600, which will save baby kittens from any disaster possible, is right around the corner (only it hadn't been around the corner until recently).

Benchmarks start coming out, and the faithful only choose the ones that meet their preconceived expectations of the card. Site X will be biased, and this, that and the other thing.

When the card is finally derailed from the powerhouse the loyal have been making it out to be, people will hang on to whatever they can. For example: "Just wait until DirectX 10, where it will dominate!!" Only by that time, more than likely, Nvidia will have spent their huge lead focusing on the next thing, which will make this all a forgotten memory.

These comments are not directed at you personally, but this is almost a carbon copy of the Nvidia 5xxx launch. This part is intended for you, though: I know you love AMD/ATI, but imagine if the R600 had been released instead of the G80 way back when, and the G80 were being released with the numbers it delivers tomorrow. I am sure you would be laughing.

Kind regards,

~B  

#21 LordEC911
Member since 2004 • 9972 Posts
The fact that it is so late and only "competes" in DirectX 9 makes it a failure. Especially after all the AMD marketing/publicity spins.

In games that require raw power it does not look like it is on par with the 8800 GTX. We will see tomorrow, I guess, although I am sure there will always be beta drivers. :roll:

Here is the cycle:

The AMD faithful bash the Nvidia G80 as long and hard as possible, because the new R600, which will save baby kittens from any disaster possible, is right around the corner (only it hadn't been around the corner until recently).

Benchmarks start coming out, and the faithful only choose the ones that meet their preconceived expectations of the card. Site X will be biased, and this, that and the other thing.

When the card is finally derailed from the powerhouse the loyal have been making it out to be, people will hang on to whatever they can. For example: "Just wait until DirectX 10, where it will dominate!!" Only by that time, more than likely, Nvidia will have spent their huge lead focusing on the next thing, which will make this all a forgotten memory.

These comments are not directed at you personally, but this is almost a carbon copy of the Nvidia 5xxx launch. This part is intended for you, though: I know you love AMD/ATI, but imagine if the R600 had been released instead of the G80 way back when, and the G80 were being released with the numbers it delivers tomorrow. I am sure you would be laughing.

Kind regards,

~BTaijiquan

You are somehow implying that a G80/R600 isn't enough for DX9 performance? We need more than a 50-60 FPS average at high resolutions? That is a joke...

Raw power? How do you define raw power?

I have never "bashed" the G80; I have merely pointed out some of its underlying features that might be its own downfall. You probably don't remember, but I have praised the G80's DX9 performance and its DX10 compatibility since launch. It has been the undeniable GPU leader; even CF X1950 XTXs could only occasionally beat an 8800 GTS. Once the ATi launch drew to within a month, I started telling people to wait if they could, since obviously competition is a good thing; either way the prices were going to drop, and AMD might have an awesome performer on its hands.

I do admit, I was expecting at least a 15% performance lead by the XT over the GTX, due to its specs. Upon reading some much more knowledgeable posts in other forums, I quickly realized that these cards might hit a wall in DX9 performance due to their features/designs. That quickly brought up the question of why we need cards that perform higher than we need them to in old games. The answer: we don't.

You also forgot to mention the fact that the "benchmarks" you are referring to were FAKE. While I would like to believe that the VR-Zone review is fake, it is too similar to the performance I have seen in the independent benchmarks of unbiased users on other forums. But it is obvious there is a driver hiccup, which is solved with the latest release.

We are hanging on to nothing; it is common sense that AMD/ATi cards age much better than Nvidia cards. Nvidia wants the performance crown of the here and now. AMD/ATi, on the other hand, are much more forward-looking and want their cards to not only keep up with future games but also increase performance if possible. With so much raw power in the R600, it is impossible to look at DX9 benchmarks and simply state, "Yep, these look correct," while comparing those results to the specs. DX10 devs have also commented that the R600 XT is currently ahead of the GTX in DX10 performance, and that was on old drivers.

You may care about the past/present; I am more worried about how these cards will hold up by the end of this year.

#22 Taijiquan
Member since 2002 • 7431 Posts

You are somehow implying that a G80/R600 isn't enough for DX9 performance? We need more than a 50-60 FPS average at high resolutions? That is a joke...

Raw power? How do you define raw power?

I have never "bashed" the G80; I have merely pointed out some of its underlying features that might be its own downfall. You probably don't remember, but I have praised the G80's DX9 performance and its DX10 compatibility since launch. It has been the undeniable GPU leader; even CF X1950 XTXs could only occasionally beat an 8800 GTS. Once the ATi launch drew to within a month, I started telling people to wait if they could, since obviously competition is a good thing; either way the prices were going to drop, and AMD might have an awesome performer on its hands.

I do admit, I was expecting at least a 15% performance lead by the XT over the GTX, due to its specs. Upon reading some much more knowledgeable posts in other forums, I quickly realized that these cards might hit a wall in DX9 performance due to their features/designs. That quickly brought up the question of why we need cards that perform higher than we need them to in old games. The answer: we don't.

You also forgot to mention the fact that the "benchmarks" you are referring to were FAKE. While I would like to believe that the VR-Zone review is fake, it is too similar to the performance I have seen in the independent benchmarks of unbiased users on other forums. But it is obvious there is a driver hiccup, which is solved with the latest release.

We are hanging on to nothing; it is common sense that AMD/ATi cards age much better than Nvidia cards. Nvidia wants the performance crown of the here and now. AMD/ATi, on the other hand, are much more forward-looking and want their cards to not only keep up with future games but also increase performance if possible. With so much raw power in the R600, it is impossible to look at DX9 benchmarks and simply state, "Yep, these look correct," while comparing those results to the specs. DX10 devs have also commented that the R600 XT is currently ahead of the GTX in DX10 performance, and that was on old drivers.

You may care about the past/present; I am more worried about how these cards will hold up by the end of this year.

LordEC911

I am not making any implications about any cards being enough for DX9. Who said you bashed the G80? As far as card performance and needing the next thing to perform better, it is hard to argue what each user needs. The guy running 2560 resolutions may have a completely different opinion. I didn't buy a GTX because I game at 1366x768. Could you be more specific on exactly what benchmarks I am referring to? I just do not remember referring to any benchmarks in this thread. Regardless, how do you know who is biased and who isn't?

"We are hanging on to nothing; it is common sense that AMD/ATi cards age much better than Nvidia cards. Nvidia wants the performance crown of the here and now. AMD/ATi, on the other hand, are much more forward-looking and want their cards to not only keep up with future games but also increase performance if possible. With so much raw power in the R600, it is impossible to look at DX9 benchmarks and simply state, "Yep, these look correct," while comparing those results to the specs. DX10 devs have also commented that the R600 XT is currently ahead of the GTX in DX10 performance, and that was on old drivers."

What? Who are we? Are you speaking for every hardcore AMD/ATI fan? Common sense that AMD/ATI cards age better?? For enthusiasts age is no object, because there are a lot of people who won't hold onto a card for longer than 6-10 months anyway. You claim AMD/ATI is more forward-looking? The G80 has been the performance leader for probably the longest span in competitive GPU history. There is a VERY real possibility that Nvidia will continue to be the performance leader at this point. This indicates just the opposite of what you just said. Besides, can you link where Nvidia wants just the short-term (here and now) performance crown (I am trying to be fair, since clearly they have held it for the long term)? What developer site shows the R600 leading in DX10? Most games were built on the G80, considering it has been out longer. You get so worked up, man. You gotta relax some. No need to be hostile in your response, ok?

Respectfully,

B 

#23 LordEC911
Member since 2004 • 9972 Posts
I am not making any implications about any cards being enough for DX9. Who said you bashed the G80? As far as card performance and needing the next thing to perform better, it is hard to argue what each user needs. The guy running 2560 resolutions may have a completely different opinion. I didn't buy a GTX because I game at 1366x768. Could you be more specific on exactly what benchmarks I am referring to? I just do not remember referring to any benchmarks in this thread. Regardless, how do you know who is biased and who isn't?

"We are hanging on to nothing; it is common sense that AMD/ATi cards age much better than Nvidia cards. Nvidia wants the performance crown of the here and now. AMD/ATi, on the other hand, are much more forward-looking and want their cards to not only keep up with future games but also increase performance if possible. With so much raw power in the R600, it is impossible to look at DX9 benchmarks and simply state, "Yep, these look correct," while comparing those results to the specs. DX10 devs have also commented that the R600 XT is currently ahead of the GTX in DX10 performance, and that was on old drivers."

What? Who are we? Are you speaking for every hardcore AMD/ATI fan? Common sense that AMD/ATI cards age better?? For enthusiasts age is no object, because there are a lot of people who won't hold onto a card for longer than 6-10 months anyway. You claim AMD/ATI is more forward-looking? The G80 has been the performance leader for probably the longest span in competitive GPU history. There is a VERY real possibility that Nvidia will continue to be the performance leader at this point. This indicates just the opposite of what you just said. Besides, can you link where Nvidia wants just the short-term (here and now) performance crown (I am trying to be fair, since clearly they have held it for the long term)? What developer site shows the R600 leading in DX10? Most games were built on the G80, considering it has been out longer. You get so worked up, man. You gotta relax some. No need to be hostile in your response, ok?Taijiquan

How am I worked up? I am just discussing something here with you.
I'm sorry if you take it personally or feel like I am insulting you, because I'm not.

I guess I will just stop here since you can't seem to take my posts for some reason.... 

#24 duelen
Member since 2003 • 96 Posts
I thought the XT was made to compete with the GTS, and the XTX, which isn't out yet, was made to compete with the GTX.
#25 9mmSpliff
Member since 2005 • 21751 Posts
Another topic... again, 2 pages back.
#26 jfelisario
Member since 2006 • 2753 Posts

another topic....again 2 pages back9mmSpliff

Lol, this was just a post about the manual from Fudzilla, from yesterday, hehe... now that all the reviews are popping up, discussions are heating up... oh well...

#27 9mmSpliff
Member since 2005 • 21751 Posts
Well, just showing the guy in that other topic... if you look 2-3 pages back you will see 5 topics on the 2900 XT.
#28 Bebi_vegeta
Member since 2003 • 13558 Posts

well just showing the guy in that other topic...you look 2-3 pages backa nd you will see 5 topics on 2900xt.9mmSpliff

Well, it could be worse, LOL... ever been to System Wars? LOL