Nvidia is greater than ATI


This topic is locked from further discussion.


#1 TekkenMaster606
Member since 2006 • 10980 Posts

http://www.dailytech.com/article.aspx?newsid=7052

 

Their new card is already flopping. :lol: It's the Radeon 8500 all over again.

 

*if this is old, whatever*

 

 


#2 Zaistev_basic
Member since 2002 • 2975 Posts
regardless, NVidia invested in the wrong console (PS3)

#3 iunderstand
Member since 2006 • 3201 Posts
And 6 is greater than 5.

#4 smokeydabear076
Member since 2004 • 22109 Posts
It is news to me.

#5 venture00
Member since 2004 • 1060 Posts
When is the R-600 coming out?

#6 NobuoMusicMaker
Member since 2005 • 6628 Posts

Using current drivers, vs an overclocked board.

And they are talking about the XTX board. 


#7 TekkenMaster606
Member since 2006 • 10980 Posts

Using current drivers, vs an overclocked board.

And they are talking about the XTX board.

NobuoMusicMaker

 

Still not a good sign.  


#8 haols
Member since 2005 • 2348 Posts
regardless, NVidia invested in the wrong console (PS3)Zaistev_basic
They didn't invest :|

They were paid to make a part.
To invest you risk money to win or lose.
They are paid regardless.

#10 venture00
Member since 2004 • 1060 Posts
When is the R-600 coming out?venture00


sorry again..

#11 muscleserge
Member since 2005 • 3307 Posts
regardless, NVidia invested in the wrong console (PS3)Zaistev_basic
Nvidia is still the market leader.

#12 TOAO_Cyrus1
Member since 2004 • 2895 Posts
These numbers are highly questionable. The R600's technical specs point to performance at least a lot better than the X1950 XTX, so there has got to be something weird going on. I guarantee you they will be a lot better when real reviews come out in two to three weeks.

#13 Zaistev_basic
Member since 2002 • 2975 Posts

[QUOTE="Zaistev_basic"]regardless, NVidia invested in the wrong console (PS3)haols
They didn't invest :|

They were paid to make a part.
To invest you risk money to win or lose.
They are paid regardless.

Paid they are. But of course they also lose potential sales, because they get something for every PS3 sold. Fewer PS3s sold means less profit for NVidia from the PS3. ATI got lucky that they have two winning consoles.


#14 Zaistev_basic
Member since 2002 • 2975 Posts

[QUOTE="Zaistev_basic"]regardless, NVidia invested in the wrong console (PS3)muscleserge
Nvidia is still the market leader.

No one is denying that, but they chose the wrong side and stand to lose potential profit from PS3 sales. Again, fewer PS3s sold, less profit (not a loss) for NVidia. It's not about losing profit, it's about not gaining the potential profit that NVidia expected from the PS3 (so much for that 100 million PS3 sales prediction).


#15 Zenkuso
Member since 2006 • 4090 Posts
They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.

#16 Hoobinator
Member since 2006 • 6899 Posts
I've always seen ATi as being more innovative in the GPU wars; heck, the next-gen NVidia GPUs are being built upon ATi philosophies. But whatever, it'll all change again in 6 months.

#17 Sicknic
Member since 2004 • 4241 Posts
Too bad their Vista drivers suck...

#18 dgsag
Member since 2005 • 6760 Posts

They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.Zenkuso

The XTX does have GDDR4. :D

 

Face it, ATI got owned. 


#19 Makari
Member since 2003 • 15250 Posts

[QUOTE="Zenkuso"]They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.dgsag

The XTX does have GDDR4. :D

Face it, ATI got owned.

When the 2900XT outscores the 2900XTX, you know there's something fundamentally broken somewhere.

#20 NobuoMusicMaker
Member since 2005 • 6628 Posts
[QUOTE="dgsag"]

[QUOTE="Zenkuso"]They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.Makari

The XTX does have GDDR4. :D

 

Face it, ATI got owned.

When the 2900XT outscores the 2900XTX, you know there's something fundamentally broken somewhere.

Drivers most likely. 


#21 Marka1700
Member since 2003 • 7500 Posts
ATI usually makes the better card; people just don't want to deal with their terrible drivers.

#22 joeblak
Member since 2005 • 5474 Posts

ATI usually makes the better card; people just don't want to deal with their terrible drivers.Marka1700

 

Looks like the roles have switched....... 


#23 Makari
Member since 2003 • 15250 Posts

[QUOTE="Marka1700"]ATI usually makes the better card; people just don't want to deal with their terrible drivers.joeblak

Looks like the roles have switched.......

I was going to say... people still give ATi a hard time for this, while Nvidia's facing a (mildly stupid, to be fair) class-action lawsuit for having had completely broken Vista drivers for the last 6 months. :P

#24 TekkenMaster606
Member since 2006 • 10980 Posts
[QUOTE="Makari"][QUOTE="dgsag"]

[QUOTE="Zenkuso"]They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.NobuoMusicMaker

The XTX does have GDDR4. :D

 

Face it, ATI got owned.

When the 2900XT outscores the 2900XTX, you know there's something fundamentally broken somewhere.

Drivers most likely.

 

We'll see, but for a chipset that's hitting the market way later than Nvidia's, ATI should be worried...

 

Don't you guys remember the Geforce 3 Ti series...And then ATI countered with the Radeon 8500...And then BLAM, Geforce 4 Ti series.  


#25 noIinteam
Member since 2004 • 1644 Posts

These numbers are highly questionable. The R600's technical specs point to performance at least a lot better than the X1950 XTX, so there has got to be something weird going on. I guarantee you they will be a lot better when real reviews come out in two to three weeks.TOAO_Cyrus1

 

Someone posted this at Dailytech:

"The R600 has 64x5 Vec5D units, which makes each unit handle a maximum of 320 stream ops per second, but makes its worst-case scenario a lot worse, with 64 stream ops per second (for lack of a better term). You can think of that in the same manner as you see that SIMD units in our current processors can deliver huge amounts of processing power if, and only if, used correctly and optimized accordingly; otherwise we see no gains.



In my opinion AMD/ATI made a design compromise, they used this approach as it could prove to be way better in the dx10 world, and in a much more interesting way, in the GPGPU world.



Think about it, if you open up your architecture with CTM and give the people the power of 64x5 vec5d units you end up with an amazing amount of processing power. That's where I think they are focusing.



Nvidia has a much more favorable place in the gaming world. If you have 128 scalar units, in a worst case scenario you'd still issue 128 stream ops (all else constant, and given you have the bandwidth). But your best case scenario isn't that good.



I believe they delayed it because they were expecting dx10 games (of course, this is just speculation on my part). And I hope, for their sake, that it performs a lot better in that world.



Still, if I am somewhat right, drivers could provide better optimization for shader programs that aren't made with a simd architecture in mind, but then again, I could be entirely wrong.

http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2039097&frmKeyword=&STARTPAGE=8&FTVAR_FORUMVIEWTMP=Linear 

 


#26 jwcyclone15
Member since 2005 • 1149 Posts

[QUOTE="TOAO_Cyrus1"]These numbers are highly questionable. The R600's technical specs point to performance at least a lot better than the X1950 XTX, so there has got to be something weird going on. I guarantee you they will be a lot better when real reviews come out in two to three weeks.noIinteam

 


 

Wow... very interesting... so in a DX9 environment it uses only 64 shader ops, while in DX10 it can use up to 320 shader ops a sec...


#27 dilmos
Member since 2006 • 763 Posts

[QUOTE="Zaistev_basic"]regardless, NVidia invested in the wrong console (PS3)muscleserge
Nvidia is still the market leader.

 

So it is OK to use that logic if you are a PS3 fanboy, but when Wii lovers mention sales, it doesn't count, right???


#28 PS3_3DO
Member since 2006 • 10976 Posts

Those are some BS tests. I know for a fact ATI cards do better in HL2. I will wait for better tests and drivers to be released.

 


#29 Pro_wrestler
Member since 2002 • 7880 Posts

[QUOTE="muscleserge"][QUOTE="Zaistev_basic"]regardless, NVidia invested in the wrong console (PS3)dilmos

Nvidia is still the market leader.

 

So it is OK to use that logic if you are a PS3 fanboy, but when Wii lovers mention sales, it doesn't count, right???

Haha, "Wii lovers." But you make a good point. It's sad, really.


#30 TOAO_Cyrus1
Member since 2004 • 2895 Posts
[QUOTE="noIinteam"]

[QUOTE="TOAO_Cyrus1"]These numbers are highly questionable. The R600's technical specs point to performance at least a lot better than the X1950 XTX, so there has got to be something weird going on. I guarantee you they will be a lot better when real reviews come out in two to three weeks.jwcyclone15

 


 

Wow... very interesting... so in a DX9 environment it uses only 64 shader ops, while in DX10 it can use up to 320 shader ops a sec...

 

Uh no. Either way it's 64 shader ops. The ATi card can do 64 vector operations per clock. If an instruction is scalar it uses up a whole vector ALU, which is inefficient. Nvidia has 128 scalar ALUs which are always 100% utilized. If there is a vector op, the compiler has to break it down into three scalar ops, which uses up 3 SPs. DX10 shaders will be more complex, giving ATi's extra power the advantage over the efficiency of Nvidia.

Still, the performance difference is way too big. They should be about even in DX9. The card they tested was the 12-inch OEM-only card, so it could really be just the OEM XT. That would fit the rumors that ATI is going to launch with just the HD 2900 XT (better than an 8800 GTS for the same price) and release the XTX version later with higher clock speeds and 65nm.
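The shader-throughput argument in the last couple of posts can be sketched as a toy model. The 64 Vec5D and 128 scalar unit counts come from the posts above; everything else (the per-clock framing, the function names, the assumption that throughput is purely ALU-bound) is illustrative, not a claim about the real hardware:

```python
def r600_issue_rate(avg_components: float, units: int = 64, width: int = 5) -> float:
    """Scalar-op throughput per clock for a vector (Vec5D) design.

    Each of the `units` ALUs issues one vector instruction per clock,
    executing up to `width` useful scalar components; unused lanes are
    wasted. avg_components = 1.0 models purely scalar shader code,
    5.0 models perfectly packed vec5 code.
    """
    return units * min(avg_components, width)


def g80_issue_rate(avg_components: float, units: int = 128) -> float:
    """Scalar-op throughput per clock for a scalar design: a vec-N
    instruction is split into N scalar ops, so every lane does useful
    work and throughput is flat regardless of the instruction mix."""
    return units


# Purely scalar code: the vector design wastes 4 of its 5 lanes.
print(r600_issue_rate(1.0), g80_issue_rate(1.0))  # 64.0 128
# Perfectly packed vec5 code: the vector design's best case wins.
print(r600_issue_rate(5.0), g80_issue_rate(5.0))  # 320.0 128
```

On this toy model the two designs break even when shader code averages two useful components per instruction; below that the scalar design wins, above it the vector design does, which is exactly the DX9-versus-DX10 bet being argued over.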


#31 tramp
Member since 2003 • 2110 Posts

ATI usually makes the better card; people just don't want to deal with their terrible drivers.Marka1700

At this point in time the 8800s still don't even have any official drivers, 6 months after release.


#32 tramp
Member since 2003 • 2110 Posts
[QUOTE="NobuoMusicMaker"][QUOTE="Makari"][QUOTE="dgsag"]

[QUOTE="Zenkuso"]They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.TekkenMaster606

The XTX does have GDDR4. :D

 

Face it, ATI got owned.

When the 2900XT outscores the 2900XTX, you know there's something fundamentally broken somewhere.

Drivers most likely.

 

We'll see, but for a chipset that's hitting the market way later than Nvidia's, ATI should be worried...

 

Don't you guys remember the Geforce 3 Ti series...And then ATI countered with the Radeon 8500...And then BLAM, Geforce 4 Ti series.  

Yeh, but I also remember the 9700/9800s smoking the GF4s, the super-flop 5800, and the 5900s.

 

 

Anyway

Their article of 24/04/07 for the AMD ATI Radeon HD 2900 XT:

Gaming: Maximum Quality, 1280x1024

Call of Duty 2 73.5 FPS
Company of Heroes 92.1 FPS
F.E.A.R. 84.0 FPS
Half Life 2: Episode 1 112.0
Oblivion 47.9 FPS
3DMark06 11447

Article on 26/04/07 (XTX first, XT second):

Frames per second 1280x1024

Company of Heroes 97.1 na
F.E.A.R. 84 79
Half Life 2: Episode 1 117.9 118.9
Elder Scrolls IV: Oblivion 100.3 101.4

So not only is the XT a faster card than the XTX, but the XT's FPS has dropped since they reviewed it two days ago, which makes no sense given the resolution is the same and quality is maxed.....

Dailytech looks like a load of bollox, tbh...... their benchmarks make no sense, and the editor is a bit of a laughing stock if he can get away with stuff like that....

Personally I couldn't give a toss which of AMD or Nvidia is faster; as long as I buy the faster of the two, that's all that matters. I was hoping to give AMD a try this time round, but if these scores are correct (unlikely) then I'll be getting a GTX...
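The discrepancy described above can be checked with a few lines, using the FPS figures exactly as quoted from the two DailyTech articles (only the two games both articles report at comparable settings; the dictionary names are mine):

```python
# HD 2900 XT / XTX figures as quoted above (24/04 vs 26/04 articles,
# both at 1280x1024, maximum quality).
xt_apr24  = {"F.E.A.R.": 84.0, "HL2: Episode 1": 112.0}
xt_apr26  = {"F.E.A.R.": 79.0, "HL2: Episode 1": 118.9}
xtx_apr26 = {"F.E.A.R.": 84.0, "HL2: Episode 1": 117.9}

for game in xt_apr24:
    drift = xt_apr26[game] - xt_apr24[game]   # same card, two days apart
    gap = xtx_apr26[game] - xt_apr26[game]    # "bigger" XTX minus XT
    print(f"{game}: XT drifted {drift:+.1f} FPS; XTX-XT gap {gap:+.1f} FPS")
```

A negative gap means the cheaper XT beat the XTX, and a nonzero drift means the same card scored differently two days apart at identical settings; both show up in these numbers, which is the poster's point.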


#33 TOAO_Cyrus1
Member since 2004 • 2895 Posts
Yeah, and they had the performance crown with the X800 series but were behind in features. They also won last generation if you don't count the dual-GPU cards. It's been neck and neck since 2002.

#34 TekkenMaster606
Member since 2006 • 10980 Posts

First of all, I really don't care who wears the crown when it comes to video cards. Being a fanboy of one specific company is the dumbest thing ever. Everyone knows that it's back and forth.

 

I'll take this with a grain of salt until some other benchmarks come out. But for now, the GeForce that's on the market right now is looking better than an ATI card we have yet to see hit the shelves.


#35 TOAO_Cyrus1
Member since 2004 • 2895 Posts
The thing is, though, ATi could dominate from the lower high end downwards. The XT trounces the GTS, and Nvidia's 8600s and 8400s sucked. ATi has a big chance here.

#36 Radeon_X1950XTX
Member since 2006 • 1055 Posts

[QUOTE="Zenkuso"]They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.dgsag

The XTX does have GDDR4. :D

 

Face it, ATI got owned. 

The card they used for ATI was not even finished yet.


#37 BioShockOwnz
Member since 2006 • 52901 Posts
I just bought the 8800GTX, so I hope NVIDIA is better.

#38 TOAO_Cyrus1
Member since 2004 • 2895 Posts

http://dailytech.com/DailyTech+Digest+Making+Sense+of+R600/article7079.htm

 

Dailytech says you will never see this card. They overclocked the XT to 845 MHz, which almost equaled a GTX. I smell an XTX with at least a 900 MHz core clock.

Remember, G80's ALUs are clocked at 1400 MHz, so ATi stands to gain the most from a clock increase. They already have more than enough bandwidth for DX9 games.
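As a rough sketch of why a clock bump matters here: assume performance scales linearly with core clock (a naive, best-case assumption) and a stock XT clock of roughly 740 MHz (a commonly rumored figure, not stated in the post). The 845 MHz overclock and the speculative 900 MHz XTX then land at:

```python
def scaled_perf(base_perf: float, base_mhz: float, new_mhz: float) -> float:
    """Naive linear-scaling estimate: treats performance as purely
    core-clock bound (ignores memory bandwidth, CPU limits, etc.)."""
    return base_perf * (new_mhz / base_mhz)

# Hypothetical: stock XT normalized to 100 arbitrary units at ~740 MHz.
print(round(scaled_perf(100.0, 740.0, 845.0)))  # 114
print(round(scaled_perf(100.0, 740.0, 900.0)))  # 122
```

So on this (optimistic) model, a 900 MHz XTX would be roughly 20% faster than a stock XT, which is the kind of gap the post is speculating about.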


#39 grassdream
Member since 2006 • 1902 Posts
I prefer the ATI + AMD combo.

#40 Makari
Member since 2003 • 15250 Posts
I just bought the 8800GTX, so I hope NVIDIA is better.BioShockOwnz
Buying a card now is a pretty bad idea; however the competition turns out, prices should be going down.

#41 TyrantDragon55
Member since 2004 • 6851 Posts
I've always preferred ATI myself, even though my current PC is using an Nvidia card.

#42 TOAO_Cyrus1
Member since 2004 • 2895 Posts

AMD is hurting right now; they need support. We are screwed if we don't have competition in high-end graphics and CPUs.


#43 miss_kitt3n
Member since 2006 • 2717 Posts

Check out the leaked 8800 Ultra specs.

 

teh powa 

 

*wets self* 


#44 Hammerofjustice
Member since 2006 • 2685 Posts
Those benchmarks are trash; I'll wait for a reliable site.

#45 ChiChiMonKilla
Member since 2007 • 2339 Posts
Stock with alpha drivers vs. an OC'd card is in no way a fair bench; also, the two ATI cards show no FPS difference, so something is way off.

#46 wok7
Member since 2003 • 2034 Posts
Topic = Duh. Also, Consoles > PC, since a PC video card costs more than a console but doesn't perform as well, lolz.

#47 TekkenMaster606
Member since 2006 • 10980 Posts

Topic = Duh. Also, Consoles > PC, since a PC video card costs more than a console but doesn't perform as well, lolz.wok7

 

Got a link for that claim? Show me where either the PS3 or 360 outperforms the current DX10 lineups.


#48 venture00
Member since 2004 • 1060 Posts
when is the R-600 coming out -_- ???

#49 Einhanderkiller
Member since 2003 • 13259 Posts

Those benchmarks are trash; I'll wait for a reliable site.Hammerofjustice

Dailytech is reliable. They are affiliated with Anandtech, one of the most reliable and trustworthy tech sites on the net.

Those are some BS tests. I know for a fact ATI cards do better in HL2. I will wait for better tests and drivers to be released. PS3_3DO

It's not about the brand; it's about the card.

[QUOTE="Makari"][QUOTE="dgsag"]

[QUOTE="Zenkuso"]They should have put GDDR4 in it; it would have snowed under the 8800s, but they went cheap and it's dragged them down.NobuoMusicMaker

The XTX does have GDDR4. :D

 

Face it, ATI got owned.

When the 2900XT outscores the 2900XTX, you know there's something fundamentally broken somewhere.

Drivers most likely.

Could be. The drivers might be in beta just like NVIDIA's. However, the drivers used to test this card were the retail drivers that will be shipping with the card.

when is the R-600 coming out -_- ???venture00

May.

The NDA lifts May 2nd, I believe. 

 

 


#50 Bebi_vegeta
Member since 2003 • 13558 Posts

Topic = Duh. Also, Consoles > PC, since a PC video card costs more than a console but doesn't perform as well, lolz.wok7

 

LOL, only an ignorant person could say that!