I'm confused, are you talking about the Xbox 360? PC definitely kicks its ass! I'm not sure what those specs on the bottom are, but they suck!halflife2fan
Those are mine! :lol:
First of all, an Xbox 360 is a mediocre system at best compared to a high-end computer, I mean the Xbox 360 is sooo 2005, and by the way this is not a System Wars forum, it's PC hardware, not 360 crap.deniiiii21
ugh. I am talking about the technology. The 360 uses a much better method for its graphics card. If a PC used that method, PCs could have MUCH better graphics than they do now.
I am talking about the technology of the 360's graphics compared to PC graphics. I wonder if computers in the future will use the 360's graphics technology. jed-at-war
The 360's video card uses one of the three 3.2GHz cores of the 360 for its processor instead of its own. Imagine one of your quad core's cores powering an 8800GTX to over three times as fast as it is with its own processor.jed-at-war
[QUOTE="deniiiii21"]First of all a xbox360 is a mediocre system at best compared to a high end computer, I mean xbox 360 is sooo 2005, and by the way this is not a system wars forum its pc hardawre not 360 crap.jed-at-war
ugh. I am talking about the technology. The 360 uses a much better method for its graphic card. If a pc used that method, PC's could have MUST better graphics then they do now.
PC is already better in terms of graphics.
What the poster is trying to say is, wouldn't it be great to have a graphics card with a quad-core CPU devoted to games? A good PC uses its CPU, RAM, and video card to blow away a 360 in terms of graphics.roulettethedog
Thank you! Yes, the 8800GTX does not even have a 1GHz clock speed. If an 8800GTX were powered by one of the 2.66GHz cores of the Q6600, it would KILL every PC on the market.
The 360 processor is in no way superior to a good Core 2 Duo. Its graphics card is a modified X1900 XT.inyourface_12
It is modified by being given a 3.2GHz clock speed. One of the 3.2GHz cores is used to power the X1900 XT.
[QUOTE="inyourface_12"]The 360 processor is in no way superior to a good Core 2 Duo. Its graphics card is a modified X1900 XT.jed-at-war
It is modified by being given a 3.2GHz clock speed. One of the 3.2GHz cores is used to power the X1900 XT.
That is not true. It only uses it to help accomplish smaller tasks by offloading them from the video processor, and it actually does that because they knew it would have to keep up longer than that graphics card can really compete. A computer's architecture is entirely different, hardly even comparable.
[QUOTE="jed-at-war"][QUOTE="inyourface_12"]the 360 processor in no way superior to a good core 2 duo. its graphics card is a modified x1900xtinyourface_12
It is modified by being given a 3.2GHz clock speed. One of the 3.2GHz cores is used to power the x1900xt.
that is not true it only uses it to help accomplish smaller tasks by offloading them from the video processor and actually does that because they knew it would have to keep up longer than that graphics card can really compete and a computers architecture is entirely different hardly even comparable
It could be. My point is that it has boosted the card's performance. With all the multi-core cpus coming out, wouldn't you like to have one of your four cores boosting your 8800GTX dramatically?
[QUOTE="jed-at-war"]I didn't become active in these forums until this year.inyourface_12
your point?
I missed previous discussions on the 360's tech.
Well, pretty much the Xenos chip in the Xbox 360 itself has sort of been the "daddy" to the HD 2900 XT. None of the CPU cores in the 360 are dedicated to graphics, because there is a GPU specifically for graphics.
The Xenos basically works like a motherboard chipset with an integrated HD 2900-class GPU; its architecture already allows for most of what DirectX 10 promises for Windows (Vista) users and OpenGL 3.0 users on any other OS.
The initial Xbox 360s topped out at VGA, since the DAC basically only supported output in analog video formats; likely because chances were that's all it would ever have been doing (Composite, S-Video, Component video, and VGA). The 360 Elite and the new-revision 360 Premiums added a new video chip for HDMI output, which of course allows for a digital video signal (DVI video + up to 7.1 audio = HDMI). With newer firmware updates, the Xbox 360 can now rescale existing games to output at up to 1080p, and new games now pretty much all natively support up to 1080p as well.
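Side note, and purely a rough sketch rather than anything official: if you want to check whether a given Windows box actually exposes the Direct3D 10 feature set being talked about here, the quick test is just trying to create a hardware D3D10 device. This assumes the DirectX SDK headers are installed and d3d10.lib gets linked.
[code]
// Rough sketch: does this machine expose a Direct3D 10 hardware device?
// Assumes <d3d10.h> is available and d3d10.lib is linked.
#include <d3d10.h>
#include <cstdio>

#pragma comment(lib, "d3d10.lib")

int main()
{
    ID3D10Device* device = nullptr;

    // Ask for a hardware (not reference) device on the default adapter.
    HRESULT hr = D3D10CreateDevice(
        nullptr,                      // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,   // real GPU
        nullptr,                      // no software rasterizer DLL
        0,                            // no creation flags
        D3D10_SDK_VERSION,
        &device);

    if (SUCCEEDED(hr) && device != nullptr)
    {
        std::printf("Direct3D 10 hardware device created.\n");
        device->Release();
    }
    else
    {
        std::printf("No Direct3D 10 hardware support here.\n");
    }
    return 0;
}
[/code]
If that call fails, the machine is stuck on the DirectX 9 path, which is the feature gap the post above is talking about.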
Do you live under a rock? We've been past the 360 in terms of graphical technology for at least 1-2 years.K_r_a_u_s_e_rSo the 8800s came out in 2005?
Has there been a new PC game on the market yet that runs as fast and looks as good as Gears of War, or better? And last I recall, Vista by itself couldn't run on a system with only 512MB of (dedicated) system RAM. Yet the 360, with 512MB of RAM shared between CPU and GPU, seems to more than hold its own.
Really, it uses one of its cores? Well, guess what: mine uses an 8800 GTX and it blows it away.K_r_a_u_s_e_rTried running a recently released game? FS X would kill any system out now.
Every Ubisoft game released in the last year (GRAW 2 and R6: Vegas in particular) chugs like **** on anything other than an 8800 Ultra SLI or HD 2900 (1GB) CrossFire.
Even with my system, the Overlord demo on Windows didn't look all that much better than the 360 demo version, apart from being able to use my monitor's native resolution.
System Wars are that way ---------> ExitK_r_a_u_s_e_r:roll: Everyone, look out. I think we have the next Chris Rock on our thread here. :lol:
The 360 has so much technology! I mean, it's just so overpowering that even Crysis couldn't run on it, AMIRITE?K_r_a_u_s_e_rWho said the Xbox 360 can't? Other than pure laziness, considering that Crytek is making it on 't3h directX10', it should be extremely easy for them to port the game to the Xbox 360 so that players on there can enjoy it too. What, did you forget? XNA Game Studio Express is freely available, and allows code compiled in it to run just as fluidly on both an Xbox 360 and Windows.
Or did you also forget how many of the elements in DirectX 10 that replaced older elements from DirectX 9 and earlier (XInput replacing DirectInput, OpenAL for DirectSound3D/EAX) came straight from the Microsoft Xbox 360 SDK? Yes sir, I believe you did. :|
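For anyone who hasn't actually used it, this is roughly what the XInput side of that looks like; a minimal sketch assuming a Windows machine with the Platform/DirectX SDK installed and xinput.lib linked, polling controller slot 0 the XInput way instead of going through DirectInput:
[code]
// Minimal XInput polling sketch -- assumes Windows with xinput.lib linked.
#include <windows.h>
#include <Xinput.h>
#include <cstdio>

#pragma comment(lib, "xinput.lib")

int main()
{
    XINPUT_STATE state;
    ZeroMemory(&state, sizeof(XINPUT_STATE));

    // Poll controller slot 0; ERROR_SUCCESS means a pad is plugged in.
    if (XInputGetState(0, &state) == ERROR_SUCCESS)
    {
        // wButtons is a bitmask of the digital buttons.
        const bool aDown = (state.Gamepad.wButtons & XINPUT_GAMEPAD_A) != 0;
        std::printf("Controller 0 connected, A button is %s\n", aDown ? "down" : "up");
    }
    else
    {
        std::printf("No controller in slot 0\n");
    }
    return 0;
}
[/code]
That same handful of calls is how the 360 pad is driven on Windows, which is the whole "came from the Xbox 360 SDK" point.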
Right, if it were THAT easy, why hasn't Intel done it already?K_r_a_u_s_e_rBecause Intel has not wanted to compete with the likes of nVidia and ATi.
Exhibit A: the Intel GMA video chip. :lol: A video adapter so bad it should come with a surgeon general's warning that gaming on it is an exercise in futility.
[QUOTE="inyourface_12"][QUOTE="jed-at-war"][QUOTE="inyourface_12"]the 360 processor in no way superior to a good core 2 duo. its graphics card is a modified x1900xtjed-at-war
It is modified by being given a 3.2GHz clock speed. One of the 3.2GHz cores is used to power the x1900xt.
that is not true it only uses it to help accomplish smaller tasks by offloading them from the video processor and actually does that because they knew it would have to keep up longer than that graphics card can really compete and a computers architecture is entirely different hardly even comparable
It could be. My point is that it has boosted the card's performance. With all the multi-core cpus coming out, wouldn't you like to have one of your four cores boosting your 8800GTX dramatically?
I'd rather have the CPU working on the physics and ai.
And wouldn't it just make more sense to make gpus multi core?
I'm sure engineers already thought of many different things to do, but I guess to them, the way we are doing it now is probably the most efficient with the technology we have.
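Just to make the physics/AI-on-its-own-core idea concrete, here's a purely hypothetical sketch (not how any shipping engine actually does it) where one thread ticks the simulation while the main thread stays free to feed the GPU:
[code]
// Hypothetical split: physics/AI tick on one CPU core, rendering on another.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

std::atomic<bool> running{true};

// Stand-in for a physics/AI update step (placeholder, not a real engine).
void simulation_loop()
{
    while (running)
    {
        // ...integrate rigid bodies, run AI decisions, etc...
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 Hz tick
    }
}

int main()
{
    std::thread sim(simulation_loop); // physics/AI get a core to themselves

    // Main thread: stand-in for the render loop that feeds the GPU.
    for (int frame = 0; frame < 300; ++frame)
    {
        // ...build and submit draw calls here...
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    running = false; // tell the simulation thread to stop
    sim.join();
    std::printf("done\n");
    return 0;
}
[/code]
The point is only that a spare core buys you simulation headroom without the video card having to give anything up.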