[QUOTE="osan0"]nope. firstly the 360s GPU is much much older than that card. its older than a HD 2000 series GPU and that was a DX10 chip secondly, the 360s GPU is not even remotely close to be powerful enough to run DX 10 code. its only got 48 shaders running at 500MHz. the card you have linked to has 320 and the latest cards have 800. thirdly....you seem to be going under the assumption that the 360 has the best graphics. thats very very debateable. games like killzone2 and uncharted 2 make a very strong case for the PS3. gears 2 is also a great looking game...but is it better looking? ooo im not going to get into that here. but yea....the 360s GPU is not even remotely close to being up to DX10.1 spec. it has nowhere near enough shaders to get the job done.Brownesque
http://www.nvidia.com/object/product_geforce_8300_mgpu_us.html
This is a GeForce 8300. It can run DirectX 10 code.
It's dramatically slower than the 8500 GT:
http://www.nvidia.com/object/geforce_8500.html
Which also runs DirectX 10 code, and yet only has 16 processor cores, a third of Xenos' count of unified shaders. Granted, neither of these cards runs DX10 code very well; they don't even run DX9 code very well.
But even the 9800 GT, which is essentially just a rebadged 8800 GT, has only 112 processor cores.
This is the GTX 280, a monstrosity that runs well over $250 and yet only has 240 stream processors:
http://www.nvidia.com/object/product_geforce_gtx_280_us.html
So to begin with, it's not necessary to have a lot of processor cores to run DX10 code. Second, modern GPUs can chew through DX10 code pretty well, despite the fact that they only have a fraction of the stream processors you mentioned.
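To make the distinction between supporting DX10 and running it fast concrete: whether a chip counts as "a DX10 part" comes down to whether a hardware Direct3D 10 device can be created on it at all, which is a yes/no capability question and not a core-count question. Here's a minimal sketch of that check on Windows (my own illustration, not anything from this thread, using the standard D3D10CreateDevice call); a successful call says nothing about how quickly the card will actually chew through DX10 workloads.

[code]
// Minimal sketch: does the installed GPU/driver expose a hardware Direct3D 10 device?
// Success here only proves DX10 capability; it says nothing about performance.
#include <d3d10.h>
#include <cstdio>
#pragma comment(lib, "d3d10.lib")

int main()
{
    ID3D10Device* device = nullptr;
    HRESULT hr = D3D10CreateDevice(
        nullptr,                     // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // ask for a real hardware device, not a software fallback
        nullptr, 0,
        D3D10_SDK_VERSION,
        &device);

    if (SUCCEEDED(hr))
    {
        std::printf("Hardware D3D10 device created: the card supports DX10.\n");
        device->Release();
    }
    else
    {
        std::printf("No hardware D3D10 device available (hr = 0x%08lx).\n",
                    static_cast<unsigned long>(hr));
    }
    return 0;
}
[/code]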
Granted, ATI DX10 cards seem to generally have more stream processors than their Nvidia counterparts. However, this doesn't seem to help their performance. Nvidia cards generally outperform DX10 ATI cards.
You're comparing GeForce cards to Radeon cards, which is pointless. They work very differently; their architectures are completely different. A GeForce 8300 or 8500 is also an absolutely, positively dreadful card... absolutely woeful. They were woeful when they were released: nice for a media center PC, but for games, not worth the cost of the cardboard they were packed with. As you said, they can technically run DX10... but good luck to any developer that tries it. It gets very messy so I won't go into detail, but let's just say that one Nvidia shader is not equal to one ATI shader. The way they are arranged at the hardware level and the way data is dealt with are also completely different.

A more apt comparison is the 360's GPU vs. an HD 2000 series Radeon, since that was based on 360 tech. Again, technically they could run DX10 code (even the really low-end ones), but it wasn't a good idea. The 2000 series will not be fondly remembered, especially for its DX10 performance. The latest Radeons have 800 shaders (no, that doesn't mean they're better than Nvidia cards) and they do run DX10 code well. But that architecture has been greatly refined and improved since the 360, and it has roughly 16 times as many shaders (800 vs. 48).

It would actually be interesting to see if the 360's GPU could run DX10 code at all. It does have a unified shader architecture, so I think it's possible. But as far as games are concerned, the GPU is a non-runner for DX10; it just doesn't have the necessary grunt. If a high-end HD 2000 series card with 320 shaders struggled with DX10, what chance has an older version of that chip with 48 shaders got at running DX10 code and DX10 effects in an actual game? It may be technically capable of running DX10, but in reality no dev is going to do it. They would end up with a pretty slideshow.
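To put the raw numbers quoted in this thread side by side, here's a throwaway sketch (mine, not from either poster) that compares each cited shader count against Xenos' 48. As noted above, an Nvidia "core" and an ATI "stream processor" are not equivalent units, so the ratios only mean anything within a vendor, but they do make the scale of the gap obvious.

[code]
// Back-of-the-envelope comparison using only the shader counts quoted in this thread.
// Deliberately naive: it ignores clock speed and architecture, and cross-vendor
// shader counts are not comparable units, which is exactly the point being made.
#include <cstdio>

int main()
{
    struct Gpu { const char* name; int shaders; };
    const Gpu gpus[] = {
        { "Xenos (Xbox 360)",          48  },
        { "GeForce 8500 GT",           16  },
        { "GeForce 9800 GT",           112 },
        { "GeForce GTX 280",           240 },
        { "Radeon HD 2000 (high end)", 320 },
        { "Latest Radeons",            800 },
    };

    const double xenos = 48.0;
    for (const Gpu& g : gpus)
        std::printf("%-28s %4d shaders  (%.1fx Xenos)\n",
                    g.name, g.shaders, g.shaders / xenos);
    return 0;
}
[/code]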