or for roughly 7x that amount you can buy a PS3 and have graphics that really are better (laughs)
CRAZIE_GUY

which raises the question: why can consoles do what they do with their specs? 256 MB of RAM on the PS3 matches an 8-year-old computer... what makes consoles so special that they can outperform the games being released for 2 GB+ RAM rigs?
CRAZIE_GUY
Oh, well that's relatively easy to explain.
First off, not everyone has top-of-the-line hardware, therefore most PC game developers try to keep the required specs well below the curve.
Secondly, on consoles you have a fixed hardware specification to design your game around. By contrast, on the PC the specifications vary widely from system to system (Intel Core 2 vs AMD Athlon, Nvidia GeForce vs ATI Radeon, etc.), so you have to be more generic and go through standard APIs (DirectX/OpenGL) to ensure broad compatibility.
Finally, when running a game on a PC, the game isn't the only thing occupying space in RAM. You've got Windows (or Linux, Mac OS, whatever) and all its little programs... you probably have an antivirus too, and maybe some IMs running. Point is, the game has to share. Whereas on a console, there's very little besides the game itself running in memory.
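To put that sharing point in rough numbers, here's a quick sketch; every background-memory figure in it is a made-up illustrative assumption, not a measurement:

```python
# Rough illustration of why a PC game gets less of the installed RAM than a
# console game does. All background-memory figures are illustrative
# assumptions, not measured values.

def free_for_game(total_mb, background_mb):
    """Memory left for the game once everything else is resident."""
    return total_mb - sum(background_mb.values())

pc_background = {
    "Windows + services": 400,  # assumed
    "antivirus": 100,           # assumed
    "IM clients, misc": 50,     # assumed
}
console_background = {
    "system software": 32,      # assumed; console OSes reserve far less
}

print(free_for_game(2048, pc_background))      # ~1498 MB left on a 2 GB PC
print(free_for_game(256, console_background))  # ~224 MB left on the console
```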
I don't understand why some even bothered keeping their Wii when they're second-guessing it and dwelling on its capabilities compared to the 360 or PS3. "It's the gameplay, it's the gameplay." Apparently not. And what controversy are we speaking of? Nintendo was open from the beginning about its performance; it's not like they pulled one over on everyone like Sony...
I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is Havok up to now?) and AIs.
ThePlothole
The Wii is believed, and partially proven, to have hardware extended from the GameCube. But the G3 architecture (if the Wii really has a G3 PPC CPU) is more efficient per clock than the Pentium III architecture seen in the Xbox. From what I've read it seems a fair estimate that the Wii's CPU is roughly comparable to the Xbox's P3 clocked at 1.2 GHz. Compared to the Xbox's specs, that's a nice deal of extra processing power. And for all I know, the CPU in the Wii could be even more powerful. Like I said previously, the PS2 and Xbox, while relatively weak compared to PC CPUs, could pull off some nice physics. Havok may be up to version 4.0 about now, but 2.0 would be sufficient for Wii games, as there are far fewer points of articulation and detail to consider. Engine-wise, the Wii does run Unreal Engine 2.5 in the form of Red Steel, which graphically and engine-wise (considering physics and all) wasn't a bad-looking nor bad-running game (only the controls really sucked). I'm pretty sure a scaled-back Unreal Engine 3 is on the way (but not really necessary unless it has the benefit of greater efficiency for the same processes as UE2.5).
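As a rough sanity check on that estimate, here's the arithmetic it implies (the ~729 MHz Wii clock is the commonly reported figure, not an official spec, and the per-clock ratio is just back-derived from the 1.2 GHz guess):

```python
# Back-of-the-envelope check of the CPU estimate above. The 729 MHz Wii
# clock is the commonly reported figure (an assumption here, not an official
# spec); the per-clock ratio is simply back-derived from the 1.2 GHz guess.

wii_clock_mhz = 729    # commonly reported Broadway clock (assumption)
p3_equiv_mhz = 1200    # the "comparable to a 1.2 GHz P3" estimate above
xbox_clock_mhz = 733   # the Xbox's P3-class CPU clock

print(f"Implied per-clock advantage over a P3: {p3_equiv_mhz / wii_clock_mhz:.2f}x")  # ~1.65x
print(f"Headroom over the Xbox CPU:           {p3_equiv_mhz / xbox_clock_mhz:.2f}x")  # ~1.64x
```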
When it comes to the graphics side of things, the Wii can probably dish out much more than what we've had the privilege of witnessing. Rumors state that the Hollywood GPU in the Wii is essentially a "doubled-up" extension of the Flipper GPU from the GameCube. Double the pixel units, texture units, TEVs and rasterization capability, add a roughly 50% higher clock speed (162 to 239 MHz), and we can pretty much theorize a tripling of capabilities in the Wii compared to the GameCube: 8 pixel pipelines, 8 texture units, and 2 TEVs. Basically, I think the Wii will be comparable in performance to an Nvidia GeForce 6200, which is enough horsepower to run a more than decent version of Half-Life 2, Call of Duty 2 on PC (in DX7 mode), or Battlefield 2 on PC. Of course, that's assuming the Wii can handle shaders similar to the DirectX 7/8/9 ones used in those games and has the vertex/polygon rendering power as well. Now how does the supposed Wii GPU stack up against the Xbox's GPU? Let us compare, and take note that these are raw theoretical numbers, not taking shader programming capabilities into account nor the rendering API being used, both of which would affect real-world performance.
                   Xbox GPU                        GameCube GPU                    Wii GPU
Clock speed        233 MHz                         162 MHz                         239 MHz
Pixel pipelines    4                               4                               8
Texture units      8 (2 per pixel pipe)            4 (1 per pixel pipe)            8 (1 per pixel pipe)
Pixel fill rate    4 x 233 MHz = 932 MPixels/s     4 x 162 MHz = 648 MPixels/s     8 x 239 MHz = 1912 MPixels/s
Texel fill rate    8 x 233 MHz = 1864 MTexels/s    4 x 162 MHz = 648 MTexels/s     8 x 239 MHz = 1912 MTexels/s
So basically, and theoretically, the Wii should possess twice the pixel processing capability of the Xbox, which should give the Wii a considerable edge in per-pixel work like bump maps, normal maps and other shader effects. Its texture throughput is only barely higher than the Xbox's, but it is well matched to its pixel throughput. We should see much more shadowing, bump mapping, normal mapping and other shader and texture effects than on the GameCube, where such effects were rarely used, much as on the PS2, which also possessed such capabilities even though they weren't as good or as prominently known.
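For anyone who wants to check the arithmetic, here's the same fill-rate math as a short script; the Wii entries are the rumored figures from the table, so its output is a theoretical peak at best:

```python
# Theoretical peak fill rates from the figures in the table above. The Wii
# numbers (8 pipes, 8 texture units, 239 MHz) are the rumored ones, so treat
# its row as an assumption rather than a confirmed spec.

gpus = {
    # name: (clock in MHz, pixel pipelines, texture units)
    "Xbox (NV2A)":            (233, 4, 8),
    "GameCube (Flipper)":     (162, 4, 4),
    "Wii (Hollywood, rumored)": (239, 8, 8),
}

for name, (clock, pipes, tex_units) in gpus.items():
    print(f"{name}: {pipes * clock} MPixels/s, {tex_units * clock} MTexels/s")

# Xbox:     932 MPixels/s, 1864 MTexels/s
# GameCube: 648 MPixels/s,  648 MTexels/s
# Wii:     1912 MPixels/s, 1912 MTexels/s  (about 2x the Xbox's pixel rate
#                                           and 3x the GameCube's)
```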
I have a GeForce 6200 and I can run Oblivion on a little more than minimal settings so the Wii can't be that bad.
[QUOTE="ThePlothole"]I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is HAVOK up to now?) and AIs.TacticalElefant
The Wii is believed and partially proven to have hardware extended from the Gamecube. But the G3 (if the Wii really has a G3 PPC CPU) architecture is more efficient per clock than Pentium 3 CPU architecture like seen in the Xbox. From what I've read it seems to be a fair estimate that the Wii's CPU is in the realm of being comparable to the Xbox's P3 clocked at 1.2 GHz. In comparison to the Xbox's specs that's a nice deal extra processing power. And for all I know, the CPU in the Wii could be even more powerful. Like I said previously, the PS2 and Xbox with relatively weak in comparison to PC CPUs, could pull of some nice physics. Havok may be up to version 4.0 about now, but 2.0 would be sufficient for Wii games, as there are much less points of articulation and detail to consider. Game engine wise, the Wii does run Unreal 2.5 in the form of Red Steel which graphically and game engine wise (considering physics and all) wasn't a bad looking nor bad running game (only the controls really sucked). I'm pretty sure a scaled back Unreal 3 is on the way (but not really necessary unless it has the benefit of greater efficiency for the same processes as UR2.5).
When it comes to the graphics side of things, the Wii can probably dish out much more than what we've had the privelage of witnessing. Rumors state that the Hollywood GPU in the Wii is essentially a "doubled-up" extension of the Flipper GPU from the Gamecube. Double the pixel units, texture units, TEV, rasterization capabilities and add a 50% higher clock speed (162 to 239 MHz) and we can pretty much theorize a tripling of capabilities in the Wii in comparison to the Gamecube. 8 pixel pipelines, 8 texture units, and 2 TEVs. Basically the Wii I think will be comparable to the performance of an Nvidia GeForce 6200, which is enough horsepower to run a more than decent version of Half Life 2, Call of Duty 2 on PC (in DX7 mode), or Battlefield 2 PC. Of course these are granted that the Wii can handle similar shaders as the Direct X 7/8/9 ones used in the mentioned games and has the vertex/polygon rendering power as well. Now how does the supposed Wii GPU stack up in comparison to the Xbox's GPU? Let us compare, and take note that this is raw theoreticals here, not taking shader programming capabilities into account nor the render API being used which would affect real world performance.
Xbox GPU Gamecube GPUWii GPU
233 MHz clock speed 162 MHz clock speed 239 MHz clock speed
4 pixel pipelines 4 pixel pipelines 8 pixel pipelines
8 texture units (2 per pixel pipe) 4 texture units (1 per pixel pipe) 8 texture units (1 per pixel pipe)
4 PP x 233 MHz = 932 MPixels/Sec 4 PP x 162 = 648 MPixels/Sec 8 x 239 MHz = 1912 MPixels/Sec
8 TU x 233 MHz = 1864 MTexels/Sec 4 TU x 162 = 648 MTexels/Sec 8 x 239 MHz = 1912 MTexels/sec
So basically and theoretically the Wii should possess twice the pixel processing capabilities of the Xbox, which should considerable give the Wii a real edge in pixel processes like bump maps, normal maps and other shader effects. Despite that the Wii possess barely higher texture processing but it matches well with it's ability to process pixels. We should see much more shadowing, bumpmapping, normal mapping and other various shader and texture effects as compared to the Gamecube where such effects were rarely implemented in the same regard as the PS2 which did possess such capabilities as well despite not being as good or prominently known.
Actually, the GPU clock speed is rumored to be 243 MHz, not 239 MHz, according to Wikipedia and IGN.
PCs require much more power due to what others have said: OS, antivirus, and several other running programs. However, consoles are adding these things as well; the Wii, Xbox 360 and PS3 all basically have operating systems now, with background programs running for internet and other things. The biggest factor, though, is that you have to remember the Wii only has to worry about running games at 640x480. Knock any PC game down to that and you can see how low a resolution it is. The Xbox 360 and PS3 can run up to 1080i, which is a much higher resolution (I don't know the exact dimensions).
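For reference, here are the pixel counts behind that argument (assuming 1080i corresponds to a 1920x1080 frame, and taking 640x480 as the Wii-class output described above):

```python
# Pixel counts behind the resolution argument above. 1080i is assumed to
# correspond to a 1920x1080 frame; 640x480 stands in for the Wii-class
# standard-definition output being described.

resolutions = {
    "640x480 (Wii-class SD)": (640, 480),
    "1280x720 (720p)":        (1280, 720),
    "1920x1080 (1080i/p)":    (1920, 1080),
}

base = 640 * 480
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x the SD frame)")

# 640x480:    307,200 pixels (1.0x)
# 1280x720:   921,600 pixels (3.0x)
# 1920x1080: 2,073,600 pixels (6.8x)
```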
The Wii will look pretty nice eventually, but I don't see it blowing anyone away. It's a fun system, and we need some lasting games that aren't party games.
[QUOTE="TacticalElefant"][QUOTE="ThePlothole"]I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is HAVOK up to now?) and AIs.komdosina
The Wii is believed and partially proven to have hardware extended from the Gamecube. But the G3 (if the Wii really has a G3 PPC CPU) architecture is more efficient per clock than Pentium 3 CPU architecture like seen in the Xbox. From what I've read it seems to be a fair estimate that the Wii's CPU is in the realm of being comparable to the Xbox's P3 clocked at 1.2 GHz. In comparison to the Xbox's specs that's a nice deal extra processing power. And for all I know, the CPU in the Wii could be even more powerful. Like I said previously, the PS2 and Xbox with relatively weak in comparison to PC CPUs, could pull of some nice physics. Havok may be up to version 4.0 about now, but 2.0 would be sufficient for Wii games, as there are much less points of articulation and detail to consider. Game engine wise, the Wii does run Unreal 2.5 in the form of Red Steel which graphically and game engine wise (considering physics and all) wasn't a bad looking nor bad running game (only the controls really sucked). I'm pretty sure a scaled back Unreal 3 is on the way (but not really necessary unless it has the benefit of greater efficiency for the same processes as UR2.5).
When it comes to the graphics side of things, the Wii can probably dish out much more than what we've had the privelage of witnessing. Rumors state that the Hollywood GPU in the Wii is essentially a "doubled-up" extension of the Flipper GPU from the Gamecube. Double the pixel units, texture units, TEV, rasterization capabilities and add a 50% higher clock speed (162 to 239 MHz) and we can pretty much theorize a tripling of capabilities in the Wii in comparison to the Gamecube. 8 pixel pipelines, 8 texture units, and 2 TEVs. Basically the Wii I think will be comparable to the performance of an Nvidia GeForce 6200, which is enough horsepower to run a more than decent version of Half Life 2, Call of Duty 2 on PC (in DX7 mode), or Battlefield 2 PC. Of course these are granted that the Wii can handle similar shaders as the Direct X 7/8/9 ones used in the mentioned games and has the vertex/polygon rendering power as well. Now how does the supposed Wii GPU stack up in comparison to the Xbox's GPU? Let us compare, and take note that this is raw theoreticals here, not taking shader programming capabilities into account nor the render API being used which would affect real world performance.
Xbox GPU Gamecube GPUWii GPU
233 MHz clock speed 162 MHz clock speed 239 MHz clock speed
4 pixel pipelines 4 pixel pipelines 8 pixel pipelines
8 texture units (2 per pixel pipe) 4 texture units (1 per pixel pipe) 8 texture units (1 per pixel pipe)
4 PP x 233 MHz = 932 MPixels/Sec 4 PP x 162 = 648 MPixels/Sec 8 x 239 MHz = 1912 MPixels/Sec
8 TU x 233 MHz = 1864 MTexels/Sec 4 TU x 162 = 648 MTexels/Sec 8 x 239 MHz = 1912 MTexels/sec
So basically and theoretically the Wii should possess twice the pixel processing capabilities of the Xbox, which should considerable give the Wii a real edge in pixel processes like bump maps, normal maps and other shader effects. Despite that the Wii possess barely higher texture processing but it matches well with it's ability to process pixels. We should see much more shadowing, bumpmapping, normal mapping and other various shader and texture effects as compared to the Gamecube where such effects were rarely implemented in the same regard as the PS2 which did possess such capabilities as well despite not being as good or prominently known.
[QUOTE="komdosina"]Actually, the GPU clock speed is rumored to be 243 MHz, not 239 MHz, according to Wikipedia and IGN.[/QUOTE]
I mean the Wii GPU.
The Wii is strange; getting great graphics out of it is like getting blood from a stone.
But it's been proven possible with Metroid Prime and RE4... now let me just sit here with my knife and this rock...
I have a GeForce 6200 and I can run Oblivion on a little more than minimal settings so the Wii can't be that bad.
Chirag6
Min settings on that game look worse than Morrowind, the game before it. I have a Radeon X1300 and I can run it on highest at a good, constant 20-30 FPS, so that game's graphical prowess has often been overestimated.
That was one of the best breakdowns of the Wii's graphical capabilities.
I found that wall of text so interesting, thanks man for taking the time to write that.
You should post this on SW; it would bring an interesting new twist to the debate about the Wii's graphics.
Personally, with the whole "developers being lazy" thing, I think it's basically a repeat of what happened with the PS2: they aren't willing to spend the time fine-tuning the games, and making money from casuals is a big priority. Ubisoft themselves have acknowledged this mistake.
Okay, I was wondering why you were using the number 239 MHz. Whether it's 243 MHz or 239 MHz, it's still faster than the Xbox's 233 MHz GPU, but 243 MHz sounds better :). As for Far Cry, I would have been just as happy had it looked like the Xbox version, with the same graphics and AI, running at 30 fps with no slowdown (maybe with four-player split screen added instead of just two): a perfectly acceptable graphical situation for a first-generation Wii game. Of course, Ubisoft chose the cheap and rushed option rather than put in the roughly 6 or 7 extra months to make a proper port of the Xbox version of Far Cry, which I would rather have had than that barely-PS2-quality version.
[QUOTE="TacticalElefant"][QUOTE="sonic_spark"]That was one of the best breakdowns of the Wii's graphical capabilities.
I found that wall of text so interesting, thanks man for taking the time to write that.
Â
komdosina
Okay, I was wondering why you where using the number 239 MHz, whether its 243 MHz or 239 MHz it still is faster then Xbox's 233 MHz GPU, but 243 MHz sounds better :). As for Far Cry, I would have been just as happy had it looked like the Xbox version with the same graphics and AI running at 30fps with no slowdown (maybe added four player split screen not just two), a perfectly acceptable graphical situation for a first generation Wii game. Of course UBI Soft chose the cheap and rushed option rather then put the roughly extra 6 or 7 months to make proper Xbox Far Cry port, which I would rather have then that berley PS2 version.
Â
There is a reason why the GameCube and PS2 versions of Far Cry Instincts were cancelled. Even so, I can say that a PS2 or GameCube version could look much better than Vengeance. Vengeance is one of the worst-case scenarios of a rushed game. Oh, and the reason the GameCube at least didn't get FC Instincts, I'm sure, was RAM limitations. The PS2 version, I'm sure, had the same issue, plus the overall graphics fill rate probably failed to deliver enough.