IF YOU POST A REPLY, BE SURE TO READ THE POST FIRST (I can't wait for sheep to post "but gameplay > graphics!!!!1111")
OK, there has been a lot of "debate" here about which console has the best graphics and the best graphical hardware. I am about to lay this issue to rest once and for all by doing what nobody here ever seems to do: looking up the facts. I will then delve a bit into the history of how these facts came to be.
First, it is important to understand where graphics come from. Many cows seem to think that graphics are generated by the CPU, and that the Cell is therefore an advantage. Unfortunately for them, this is basically wrong. Video game graphics are made of triangles, which are drawn by the graphics processing unit (GPU). The GPU is a completely separate, distinct piece of hardware from the CPU, and for bandwidth reasons triangle drawing cannot be offloaded to the CPU.
The polygon rate (often loosely called the fill rate, though fill rate strictly measures pixels per second) is the theoretical maximum number of triangles the GPU can process per second. The higher this number, the more complex a 3D scene the console (or PC) can render at real-time rates (roughly 30 FPS and up).
Now here are the numbers:
Xbox 360: 500 million triangles/sec (source)
PlayStation 3: 275 million triangles/sec (source)
Nintendo Wii*: 30 million triangles/sec (source)
A quick note first: for ownage reasons, Nintendo does not publish the Wii's polygon rate. In fact, Sony tried to keep the PS3's number secret for a long time for the same reason. So for the Wii I used the Xbox 1's figure, because a lot of people on this board (probably more than 60%) think the Xbox 1 has better graphics than the Wii anyway.
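To put those peak numbers in perspective, here is a quick back-of-the-envelope sketch of what they mean per frame. Note these are the marketing peak rates quoted above; real games hit only a fraction of them once shading, lighting, and bandwidth come into play:

```python
# Peak polygon rates quoted above (triangles per second).
PEAK_RATES = {
    "Xbox 360": 500_000_000,
    "PlayStation 3": 275_000_000,
    "Wii (Xbox 1 figure)": 30_000_000,
}

def triangles_per_frame(rate_per_sec, fps=30):
    """Peak triangle budget per frame at a given frame rate."""
    return rate_per_sec // fps

for console, rate in PEAK_RATES.items():
    print(f"{console}: {triangles_per_frame(rate):,} triangles/frame at 30 FPS")
```

At 30 FPS the Xbox 360's peak works out to roughly 16.7 million triangles per frame, versus about 9.2 million for the PS3 and 1 million for the Wii figure used here.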
The conclusion is pretty obvious: the Xbox 360 has better graphics than the other two consoles because it has a much, much more powerful video card. It is nearly twice as powerful as the PS3's Reality Synthesizer (RSX) and more than ten times as powerful as the Wii's Hollywood.
But let's take a step back and understand why this is. Why would Sony, obviously trying to build the most powerful console known to man, put a slow graphics chip inside it? The reason is that this wasn't the plan. When Sony began the Cell project, there were in fact two Cell projects: one Cell was designed to function as a CPU (what became the Cell Broadband Engine in today's PS3), and another was designed to be a GPU. That GPU was designed in-house and was meant to be the successor to the Emotion Engine.
Unfortunately, developing a modern graphics processor from scratch is very difficult, and to make matters worse, ATI and NVIDIA hold a plethora of patents in this area, meaning any new chip has to skirt (or license) all of them. Ultimately, the Cell GPU project fell far behind schedule, and two years before the PS3's launch Sony made a hard decision: kill it. Sony then licensed a new chip from NVIDIA to be used in the PS3.
The chip they licensed is neither a 6800 nor a 7800, because it was developed by a separate team working in parallel with the NVIDIA team that ultimately produced the 6800 and now the 7800 and 8800. It seems this team may have been smaller, underfunded, or perhaps neglected by NVIDIA, because what they came up with - the RSX - is actually not very fast. What is more likely, though, is that Sony had decided to include a hard drive in every unit and was trying to find ways to cut costs. And so they whacked one of the most important components of a game console: the GPU.
So, cows, now you know why your console has significantly worse graphics than the Xbox 360, and will continue to for the rest of the console's life. And sheep now know how much slower their console is than the other two, and hopefully will stop posting pointless threads like "the Wii has pixel shaders after all!!!!".