The whole controversy of the Wii and its capabilities...

This topic is locked from further discussion.


#1 TacticalElefant
Member since 2007 • 900 Posts
...makes me want to get my hands on a devkit and learn how to make games for it, to see how powerful the Wii really is. It would also be a challenge creating certain games that depend on a clean, crisp presentation. Personally, I think it would be really cool to translate Rainbow Six: Vegas to the Wii and still make it look good. Granted, a Wii version of the game would lack many effects and graphical features, but I think everything else would be translatable.

So yeah, I am a graphics whore, but boy oh boy I'd like a challenge, and the Wii would certainly be interesting to program and develop for. The devkits are certainly attractive at under $2,000.

#3 TacticalElefant
Member since 2007 • 900 Posts
[QUOTE="CRAZIE_GUY"]Or for roughly 7x that amount you can buy a PS3 and have graphics that really are better. (laughs)[/QUOTE]


Or use my PC, which can outdo a PS3 and 360 combined. I do like the Wii controls, and I'm a hardware geek. Solving problems and challenges within certain GPU/CPU and system capabilities is always an interesting conundrum. I'm the kind of person who will play a game and ask questions like "why didn't this game have bump maps, and how would it look if it did?" or "How much of a processing hit would I be looking at if I implemented this feature along with this one?"

#4 ThePlothole
Member since 2007 • 11515 Posts
I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is Havok up to now?) and AI.

#5 CRAZIE_GUY
Member since 2005 • 737 Posts
Which raises the question of why consoles can do what they do with their specs. 256 MB of RAM in the PS3 matches an 8-year-old computer... what makes consoles so special that they can outperform games being released for 2 GB+ RAM rigs?

#6 Optusnet
Member since 2003 • 11065 Posts
I would love a Wii devkit. I'd get some dudes who are nerdier than me, I'd be the creative director, and we'd deliver a $5 pixel-perfect FPS coming to a VC near you!

#7 ThePlothole
Member since 2007 • 11515 Posts

[QUOTE="CRAZIE_GUY"]What makes consoles so special that they can outperform games being released for 2 GB+ RAM rigs?[/QUOTE]

Oh, well, that's relatively easy to explain.

First off, not everyone has top-of-the-line hardware, therefore most PC game developers try to keep the required specs well below the curve.

Secondly, a console gives you fixed hardware specifications to design your game around. By contrast, on the PC the specifications vary so widely from system to system (Intel Core 2 vs. AMD Athlon, Nvidia GeForce vs. ATI Radeon, etc.) that you have to be more generic and go through abstraction layers like DirectX or OpenGL to ensure broad compatibility.

Finally, when running a game on a PC, the game isn't the only thing occupying space in RAM. You've got Windows (or Linux, Mac OS, whatever) and all its little programs... you probably have an antivirus too, and maybe some IMs running. The point is, it has to share. With a console, very little besides the game itself is running in memory.
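To make that fixed-spec point concrete, here's a minimal sketch with made-up budget numbers (nothing here comes from a real engine): a PC title has to pick its budgets at runtime for whatever hardware it detects, while a console title can hard-code them for the one machine it will ever run on.

[code]
# Minimal sketch of the fixed-spec point; all numbers are invented.

def pc_texture_budget_mb(detected_vram_mb: int) -> int:
    """PC path: choose a conservative budget from whatever card we find."""
    if detected_vram_mb >= 256:
        return 192
    if detected_vram_mb >= 128:
        return 96
    return 48  # keep minimum-spec machines playable

# Console path: one known machine, so the budget is a hand-tuned constant.
CONSOLE_TEXTURE_BUDGET_MB = 16

print(pc_texture_budget_mb(128))   # 96 -- generic, leaves headroom
print(CONSOLE_TEXTURE_BUDGET_MB)   # 16 -- tuned to the exact hardware
[/code]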


#8 MedicMike66
Member since 2007 • 886 Posts

I don't understand why some even bothered keeping their Wii when they're second-guessing it and dwelling on its capabilities compared to the 360 or PS3. "It's the gameplay, it's the gameplay." Apparently not. And what controversy are we speaking of? Nintendo was open from the beginning about its performance; it's not like they pulled one over on everyone like Sony...


#9 TacticalElefant
Member since 2007 • 900 Posts

[QUOTE="ThePlothole"]I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is Havok up to now?) and AI.[/QUOTE]


Well, I think people don't give the Wii credit for what it's worth in terms of capabilities. They are not extremely high like the PS3's and the 360's, but in comparison I do think the Wii is considerably more powerful than the Gamecube, the PS2, and even the Xbox.

The Xbox, while having a somewhat weak CPU, a Pentium III at 733 MHz, could pull off some nice physics, and even the PS2 could do some surprisingly nice ragdoll physics, for example in Area 51 and Psi-Ops: The Mindgate Conspiracy. What the PS2 really lacked in comparison to Microsoft's beast was graphics horsepower and RAM, which is what set the Xbox apart. The Gamecube, though still weaker, posts much more comparable real-world benchmarks against the Xbox than the PS2 does. Now, where will all my gibber-jabber take this? Let me explain.

The Wii is believed, and partially proven, to have hardware extended from the Gamecube. But the G3 architecture (if the Wii really has a G3 PPC CPU) is more efficient per clock than the Pentium III architecture seen in the Xbox. From what I've read, it seems a fair estimate that the Wii's CPU is in the realm of the Xbox's P3 clocked at 1.2 GHz. Compared to the Xbox's specs, that's a nice deal of extra processing power, and for all I know the CPU in the Wii could be even more powerful. Like I said previously, the PS2 and Xbox, while relatively weak compared to PC CPUs, could pull off some nice physics. Havok may be up to version 4.0 about now, but 2.0 would be sufficient for Wii games, as there are far fewer points of articulation and detail to consider. Engine-wise, the Wii does run Unreal Engine 2.5 in the form of Red Steel, which graphically and engine-wise (considering physics and all) wasn't a bad-looking or bad-running game (only the controls really sucked). I'm pretty sure a scaled-back Unreal Engine 3 is on the way (though it isn't really necessary unless it brings greater efficiency for the same processes as UE2.5).
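As a back-of-the-envelope check on that CPU estimate (a sketch, not a benchmark: the 729 MHz Broadway clock is the commonly cited figure rather than something stated in this thread, and the per-clock factor is just what the "P3 at 1.2 GHz" claim implies):

[code]
# Rough arithmetic behind the CPU comparison; assumptions noted inline.

XBOX_P3_MHZ = 733
WII_CPU_MHZ = 729                  # commonly cited Broadway clock (assumed)
IPC_RATIO = 1200 / WII_CPU_MHZ     # implied by "comparable to a P3 at 1.2 GHz"

effective_mhz = WII_CPU_MHZ * IPC_RATIO
print(f"Wii CPU ~ P3 @ {effective_mhz:.0f} MHz, "
      f"about {effective_mhz / XBOX_P3_MHZ:.2f}x the Xbox CPU")
[/code]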

When it comes to the graphics side of things, the Wii can probably dish out much more than what we've had the privilege of witnessing. Rumors state that the Hollywood GPU in the Wii is essentially a "doubled-up" extension of the Flipper GPU from the Gamecube. Double the pixel units, texture units, TEV, and rasterization capabilities, add a roughly 50% higher clock speed (162 to 239 MHz), and we can theorize about a tripling of capabilities compared to the Gamecube: 8 pixel pipelines, 8 texture units, and 2 TEVs. Basically, I think the Wii will be comparable in performance to an Nvidia GeForce 6200, which is enough horsepower to run a more-than-decent version of Half-Life 2, Call of Duty 2 on PC (in DX7 mode), or Battlefield 2 on PC. Of course, this is granted that the Wii can handle shaders similar to the DirectX 7/8/9 ones used in the mentioned games and has the vertex/polygon rendering power as well. Now, how does the supposed Wii GPU stack up against the Xbox's GPU? Let us compare, and take note that these are raw theoretical numbers, not accounting for shader programming capabilities or the rendering API in use, both of which would affect real-world performance.

                  Xbox GPU       Gamecube GPU    Wii GPU
Clock speed       233 MHz        162 MHz         239 MHz
Pixel pipelines   4              4               8
Texture units     8 (2/pipe)     4 (1/pipe)      8 (1/pipe)
MPixels/s         932            648             1912
MTexels/s         1864           648             1912

(fill rate = pixel pipelines or texture units x clock speed in MHz)

So, basically and theoretically, the Wii should possess twice the pixel-processing capability of the Xbox, which should give it a considerable edge in pixel processes like bump maps, normal maps, and other shader effects. The Wii's texture throughput is only barely higher, but it matches well with its ability to process pixels. We should see much more shadowing, bump mapping, normal mapping, and other shader and texture effects compared to the Gamecube, where such effects were rarely implemented, much as on the PS2, which also possessed such capabilities despite them being neither as good nor as prominently known.
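If you want to check the table's arithmetic yourself, the theoretical fill rates are just units times clock; something like this reproduces every number above (peak figures only, as the post cautions):

[code]
# Theoretical fill rates = pipelines (or texture units) x clock in MHz.

GPUS = {
    "Xbox (NV2A)":        {"mhz": 233, "pixel_pipes": 4, "tex_units": 8},
    "Gamecube (Flipper)": {"mhz": 162, "pixel_pipes": 4, "tex_units": 4},
    "Wii (Hollywood)":    {"mhz": 239, "pixel_pipes": 8, "tex_units": 8},
}

for name, g in GPUS.items():
    print(f"{name:20s} {g['pixel_pipes'] * g['mhz']:5d} MPixels/s"
          f"  {g['tex_units'] * g['mhz']:5d} MTexels/s")
[/code]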


#10 TacticalElefant
Member since 2007 • 900 Posts

[QUOTE="MedicMike66"]I don't understand why some even bothered keeping their Wii when they're second-guessing it... Nintendo was open from the beginning about its performance; it's not like they pulled one over on everyone like Sony...[/QUOTE]


The problem is Nintendo has been so hush-hush about it. Sure, it "isn't about the graphics," but a decent jump from the previous generation is a desirable outcome. A game like Metroid should get the benefits of a better controller and such, but why not have better graphics to further immerse the player in the experience? And graphics technology is a personal interest of mine, so seeing what the Wii can do is an object of interest to me.

Oh, and yeah: if I can get Rainbow Six Vegas to run at minimal settings on my laptop, which has a GeForce Go 7200 (take note that R6:V has very little in the way of graphics settings), I'm sure the Wii could run it if the game had its textures dumbed down a bit, its polygon counts lessened considerably, and much lighter physics orchestration. I'd really like to see a Vegas-like FPS on the Wii, with cover and everything.

#11 Chirag6
Member since 2006 • 666 Posts

[QUOTE="ThePlothole"]I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is HAVOK up to now?) and AIs.TacticalElefant



Well I think people don't give the Wii what it's worth in terms of capabilities. They are not extremely high like the PS3 and the 360 but in comparison I do think the Wii is considerably more powerful than the Gamecube, PS2, and even the Xbox.

The Xbox while having a somewhat weak CPU, a Pentium III 733 MHz, could pull off some nice physics, and even the PS2 could do some surprisingly nice ragdoll physics, for example in Area 51 and PSI-Ops: The Mindgate Conspiracy. What the PS2 really lacked in comparison to Microsoft's beast was graphics horsepower and RAM, which was to set the Xbox apart. The Gamecube though while still being weaker has much more comparible real world benchies against the Xbox as opposed to the PS2. Now where will all my gibber jabber take this? Let me explain.

The Wii is believed and partially proven to have hardware extended from the Gamecube. But the G3 (if the Wii really has a G3 PPC CPU) architecture is more efficient per clock than Pentium 3 CPU architecture like seen in the Xbox. From what I've read it seems to be a fair estimate that the Wii's CPU is in the realm of being comparable to the Xbox's P3 clocked at 1.2 GHz. In comparison to the Xbox's specs that's a nice deal extra processing power. And for all I know, the CPU in the Wii could be even more powerful. Like I said previously, the PS2 and Xbox with relatively weak in comparison to PC CPUs, could pull of some nice physics. Havok may be up to version 4.0 about now, but 2.0 would be sufficient for Wii games, as there are much less points of articulation and detail to consider. Game engine wise, the Wii does run Unreal 2.5 in the form of Red Steel which graphically and game engine wise (considering physics and all) wasn't a bad looking nor bad running game (only the controls really sucked). I'm pretty sure a scaled back Unreal 3 is on the way (but not really necessary unless it has the benefit of greater efficiency for the same processes as UR2.5).

When it comes to the graphics side of things, the Wii can probably dish out much more than what we've had the privelage of witnessing. Rumors state that the Hollywood GPU in the Wii is essentially a "doubled-up" extension of the Flipper GPU from the Gamecube. Double the pixel units, texture units, TEV, rasterization capabilities and add a 50% higher clock speed (162 to 239 MHz) and we can pretty much theorize a tripling of capabilities in the Wii in comparison to the Gamecube. 8 pixel pipelines, 8 texture units, and 2 TEVs. Basically the Wii I think will be comparable to the performance of an Nvidia GeForce 6200, which is enough horsepower to run a more than decent version of Half Life 2, Call of Duty 2 on PC (in DX7 mode), or Battlefield 2 PC. Of course these are granted that the Wii can handle similar shaders as the Direct X 7/8/9 ones used in the mentioned games and has the vertex/polygon rendering power as well. Now how does the supposed Wii GPU stack up in comparison to the Xbox's GPU? Let us compare, and take note that this is raw theoreticals here, not taking shader programming capabilities into account nor the render API being used which would affect real world performance.

Xbox GPU
Gamecube GPUWii GPU
233 MHz clock speed
162 MHz clock speed 239 MHz clock speed
4 pixel pipelines
4 pixel pipelines 8 pixel pipelines
8 texture units (2 per pixel pipe)
4 texture units (1 per pixel pipe) 8 texture units (1 per pixel pipe)
4 PP x 233 MHz = 932 MPixels/Sec
4 PP x 162 = 648 MPixels/Sec 8 x 239 MHz = 1912 MPixels/Sec
8 TU x 233 MHz = 1864 MTexels/Sec
4 TU x 162 = 648 MTexels/Sec 8 x 239 MHz = 1912 MTexels/sec

So basically and theoretically the Wii should possess twice the pixel processing capabilities of the Xbox, which should considerable give the Wii a real edge in pixel processes like bump maps, normal maps and other shader effects. Despite that the Wii possess barely higher texture processing but it matches well with it's ability to process pixels. We should see much more shadowing, bumpmapping, normal mapping and other various shader and texture effects as compared to the Gamecube where such effects were rarely implemented in the same regard as the PS2 which did possess such capabilities as well despite not being as good or prominently known.

I have a GeForce 6200 and I can run Oblivion on a little more than minimal settings, so the Wii can't be that bad.


#12 komdosina
Member since 2003 • 4972 Posts

[QUOTE="ThePlothole"]I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is HAVOK up to now?) and AIs.TacticalElefant



Well I think people don't give the Wii what it's worth in terms of capabilities. They are not extremely high like the PS3 and the 360 but in comparison I do think the Wii is considerably more powerful than the Gamecube, PS2, and even the Xbox.

The Xbox while having a somewhat weak CPU, a Pentium III 733 MHz, could pull off some nice physics, and even the PS2 could do some surprisingly nice ragdoll physics, for example in Area 51 and PSI-Ops: The Mindgate Conspiracy. What the PS2 really lacked in comparison to Microsoft's beast was graphics horsepower and RAM, which was to set the Xbox apart. The Gamecube though while still being weaker has much more comparible real world benchies against the Xbox as opposed to the PS2. Now where will all my gibber jabber take this? Let me explain.

The Wii is believed and partially proven to have hardware extended from the Gamecube. But the G3 (if the Wii really has a G3 PPC CPU) architecture is more efficient per clock than Pentium 3 CPU architecture like seen in the Xbox. From what I've read it seems to be a fair estimate that the Wii's CPU is in the realm of being comparable to the Xbox's P3 clocked at 1.2 GHz. In comparison to the Xbox's specs that's a nice deal extra processing power. And for all I know, the CPU in the Wii could be even more powerful. Like I said previously, the PS2 and Xbox with relatively weak in comparison to PC CPUs, could pull of some nice physics. Havok may be up to version 4.0 about now, but 2.0 would be sufficient for Wii games, as there are much less points of articulation and detail to consider. Game engine wise, the Wii does run Unreal 2.5 in the form of Red Steel which graphically and game engine wise (considering physics and all) wasn't a bad looking nor bad running game (only the controls really sucked). I'm pretty sure a scaled back Unreal 3 is on the way (but not really necessary unless it has the benefit of greater efficiency for the same processes as UR2.5).

When it comes to the graphics side of things, the Wii can probably dish out much more than what we've had the privelage of witnessing. Rumors state that the Hollywood GPU in the Wii is essentially a "doubled-up" extension of the Flipper GPU from the Gamecube. Double the pixel units, texture units, TEV, rasterization capabilities and add a 50% higher clock speed (162 to 239 MHz) and we can pretty much theorize a tripling of capabilities in the Wii in comparison to the Gamecube. 8 pixel pipelines, 8 texture units, and 2 TEVs. Basically the Wii I think will be comparable to the performance of an Nvidia GeForce 6200, which is enough horsepower to run a more than decent version of Half Life 2, Call of Duty 2 on PC (in DX7 mode), or Battlefield 2 PC. Of course these are granted that the Wii can handle similar shaders as the Direct X 7/8/9 ones used in the mentioned games and has the vertex/polygon rendering power as well. Now how does the supposed Wii GPU stack up in comparison to the Xbox's GPU? Let us compare, and take note that this is raw theoreticals here, not taking shader programming capabilities into account nor the render API being used which would affect real world performance.

Xbox GPU
Gamecube GPUWii GPU
233 MHz clock speed
162 MHz clock speed 239 MHz clock speed
4 pixel pipelines
4 pixel pipelines 8 pixel pipelines
8 texture units (2 per pixel pipe)
4 texture units (1 per pixel pipe) 8 texture units (1 per pixel pipe)
4 PP x 233 MHz = 932 MPixels/Sec
4 PP x 162 = 648 MPixels/Sec 8 x 239 MHz = 1912 MPixels/Sec
8 TU x 233 MHz = 1864 MTexels/Sec
4 TU x 162 = 648 MTexels/Sec 8 x 239 MHz = 1912 MTexels/sec

So basically and theoretically the Wii should possess twice the pixel processing capabilities of the Xbox, which should considerable give the Wii a real edge in pixel processes like bump maps, normal maps and other shader effects. Despite that the Wii possess barely higher texture processing but it matches well with it's ability to process pixels. We should see much more shadowing, bumpmapping, normal mapping and other various shader and texture effects as compared to the Gamecube where such effects were rarely implemented in the same regard as the PS2 which did possess such capabilities as well despite not being as good or prominently known.

Actually, the GPU clock speed is rumored to be 243 MHz, not 239 MHz, according to Wikipedia and IGN.


#13 dieasgrey
Member since 2005 • 676 Posts

PCs require much more power due to what others have said: the OS, antivirus, and several other running programs. However, consoles are adding these things as well; the Wii, 360, and PS3 all basically have operating systems now, with background programs running for internet and other things. The biggest factor, though, is that the Wii only has to worry about running games at 640x480. Knock any PC game down to that and you can see how low a resolution it is. The 360 and PS3 can run at up to 1080i (1920x1080, interlaced), a much higher resolution.
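The raw pixel counts make the gap obvious; a quick check:

[code]
# Pixels per frame: Wii's 640x480 vs. a full 1920x1080 (1080i) frame.

wii_pixels = 640 * 480      # 307,200
hd_pixels = 1920 * 1080     # 2,073,600
print(f"A 1080-line frame has {hd_pixels / wii_pixels:.1f}x "
      f"the pixels of a Wii frame")  # ~6.8x
[/code]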

 

The Wii will look pretty nice eventually, but I don't see it blowing anyone away. It's a fun system, and we need some lasting games that aren't party games.


#14 komdosina
Member since 2003 • 4972 Posts
[QUOTE="TacticalElefant"]

[QUOTE="ThePlothole"]I'm not sure about RS:V, but in the long term developers would have to deal with more than just downgrading the graphics. I imagine the Wii would choke on newer physics engines (what version is HAVOK up to now?) and AIs.komdosina



Well I think people don't give the Wii what it's worth in terms of capabilities. They are not extremely high like the PS3 and the 360 but in comparison I do think the Wii is considerably more powerful than the Gamecube, PS2, and even the Xbox.

The Xbox while having a somewhat weak CPU, a Pentium III 733 MHz, could pull off some nice physics, and even the PS2 could do some surprisingly nice ragdoll physics, for example in Area 51 and PSI-Ops: The Mindgate Conspiracy. What the PS2 really lacked in comparison to Microsoft's beast was graphics horsepower and RAM, which was to set the Xbox apart. The Gamecube though while still being weaker has much more comparible real world benchies against the Xbox as opposed to the PS2. Now where will all my gibber jabber take this? Let me explain.

The Wii is believed and partially proven to have hardware extended from the Gamecube. But the G3 (if the Wii really has a G3 PPC CPU) architecture is more efficient per clock than Pentium 3 CPU architecture like seen in the Xbox. From what I've read it seems to be a fair estimate that the Wii's CPU is in the realm of being comparable to the Xbox's P3 clocked at 1.2 GHz. In comparison to the Xbox's specs that's a nice deal extra processing power. And for all I know, the CPU in the Wii could be even more powerful. Like I said previously, the PS2 and Xbox with relatively weak in comparison to PC CPUs, could pull of some nice physics. Havok may be up to version 4.0 about now, but 2.0 would be sufficient for Wii games, as there are much less points of articulation and detail to consider. Game engine wise, the Wii does run Unreal 2.5 in the form of Red Steel which graphically and game engine wise (considering physics and all) wasn't a bad looking nor bad running game (only the controls really sucked). I'm pretty sure a scaled back Unreal 3 is on the way (but not really necessary unless it has the benefit of greater efficiency for the same processes as UR2.5).

When it comes to the graphics side of things, the Wii can probably dish out much more than what we've had the privelage of witnessing. Rumors state that the Hollywood GPU in the Wii is essentially a "doubled-up" extension of the Flipper GPU from the Gamecube. Double the pixel units, texture units, TEV, rasterization capabilities and add a 50% higher clock speed (162 to 239 MHz) and we can pretty much theorize a tripling of capabilities in the Wii in comparison to the Gamecube. 8 pixel pipelines, 8 texture units, and 2 TEVs. Basically the Wii I think will be comparable to the performance of an Nvidia GeForce 6200, which is enough horsepower to run a more than decent version of Half Life 2, Call of Duty 2 on PC (in DX7 mode), or Battlefield 2 PC. Of course these are granted that the Wii can handle similar shaders as the Direct X 7/8/9 ones used in the mentioned games and has the vertex/polygon rendering power as well. Now how does the supposed Wii GPU stack up in comparison to the Xbox's GPU? Let us compare, and take note that this is raw theoreticals here, not taking shader programming capabilities into account nor the render API being used which would affect real world performance.

Xbox GPU
Gamecube GPUWii GPU
233 MHz clock speed
162 MHz clock speed 239 MHz clock speed
4 pixel pipelines
4 pixel pipelines 8 pixel pipelines
8 texture units (2 per pixel pipe)
4 texture units (1 per pixel pipe) 8 texture units (1 per pixel pipe)
4 PP x 233 MHz = 932 MPixels/Sec
4 PP x 162 = 648 MPixels/Sec 8 x 239 MHz = 1912 MPixels/Sec
8 TU x 233 MHz = 1864 MTexels/Sec
4 TU x 162 = 648 MTexels/Sec 8 x 239 MHz = 1912 MTexels/sec

So basically and theoretically the Wii should possess twice the pixel processing capabilities of the Xbox, which should considerable give the Wii a real edge in pixel processes like bump maps, normal maps and other shader effects. Despite that the Wii possess barely higher texture processing but it matches well with it's ability to process pixels. We should see much more shadowing, bumpmapping, normal mapping and other various shader and texture effects as compared to the Gamecube where such effects were rarely implemented in the same regard as the PS2 which did possess such capabilities as well despite not being as good or prominently known.

Actually the GPU clock speed is rumored to be 243 MHz not 239 MHz according to wikipedia and IGN.

I mean the Wii GPU.


#15 monty_4256
Member since 2004 • 8577 Posts

The Wii is strange; getting great graphics out of it is like getting blood from a stone.

But it's been proven possible with Metroid Prime and RE4... now let me just sit here with my knife and this rock...


#16 monty_4256
Member since 2004 • 8577 Posts

[QUOTE="Chirag6"]I have a GeForce 6200 and I can run Oblivion on a little more than minimal settings, so the Wii can't be that bad.[/QUOTE]

Minimum settings on that game look worse than Morrowind, the game before it. I have a Radeon X1300 and I can run it on highest at a good, constant 20-30 FPS, so that game's graphical prowess has often been overestimated.


#17 sonic_spark
Member since 2003 • 6196 Posts

That was one of the best breakdowns of the Wii's graphical capabilities.

I found that wall of text so interesting; thanks, man, for taking the time to write it.


#18 TacticalElefant
Member since 2007 • 900 Posts

[QUOTE="sonic_spark"]That was one of the best breakdowns of the Wii's graphical capabilities. I found that wall of text so interesting; thanks, man, for taking the time to write it.[/QUOTE]


It should only give you more reason to be pissed at developers. On the clock speed thing, I've seen both 239 MHz and 243 MHz, and I went with the smaller one "just in case," so as not to overestimate the Wii. Also, like I said, you've got to take into account what kind of shader programs the Hollywood GPU can pull off and how efficiently it pulls them off. It's hard to compare capabilities when the GPUs use somewhat different architectures, so it's going to come down to real-world conditions and games to see how they look. Hopefully the whole thing about being able to "emulate" shaders used on the 360 and PS3 is true and fully possible, because I don't want to see another Far Cry: Vengeance on the Wii.
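For what it's worth, that "emulation" would mean re-expressing effects as chains of the TEV's fixed-function combiner stages rather than running shader code directly. A toy model of one such stage (the interface here is invented for illustration, not the real SDK; the arithmetic is the standard lerp-style combine):

[code]
# Toy TEV-style combiner stage: output = clamp(d + lerp(a, b, c)),
# applied per color channel. Interface is hypothetical.

def tev_stage(a, b, c, d):
    return tuple(max(0.0, min(1.0, dd + aa + (bb - aa) * cc))
                 for aa, bb, cc, dd in zip(a, b, c, d))

# "Diffuse texture modulated by light color" fits in a single stage:
texel = (0.8, 0.6, 0.4)
light = (1.0, 0.9, 0.7)
zero = (0.0, 0.0, 0.0)
print(tev_stage(zero, texel, light, zero))  # ~(0.8, 0.54, 0.28)
[/code]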

#19 TacticalElefant
Member since 2007 • 900 Posts

[QUOTE="dieasgrey"](quoted above: the Wii only renders at 640x480, while the 360 and PS3 go up to 1080i)[/QUOTE]


Well, every game system needs an OS (an OS is necessary for interfacing all the components so they work correctly). The real difference in hardware needs between PCs and consoles is RAM, and requiring so much RAM does make it easy for PC games to take full advantage of the system. With such high amounts of RAM, PCs can have much larger or more detailed worlds with less loading (provided the GPU can render all that detail with ease). On consoles, textures and other data generally get streamed into RAM and into the GPU constantly, especially in games like Grand Theft Auto. Also, the VRAM in PCs makes framebuffering very easy. I do think the Hollywood could have had a better texture memory buffer; it's 3 MB, and I think 6 MB or so would have been better, but it can be worked around. The considerable amount of extra RAM, at 88 MB, should definitely help the Wii dish out a lot more, and the fact that the Wii has really good texture compression for storage is a good sign. How so? Why add so much more memory unless your system is a lot more powerful? I honestly think a full 128 MB of RAM would have been a good idea, but 88 MB is enough.
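Some rough framebuffer arithmetic behind that memory point (assuming a generic 32-bit color buffer; the actual GC/Wii embedded framebuffer uses its own formats, so these are illustrative numbers only):

[code]
# Rough framebuffer sizes at the Wii's 640x480 output.

width, height, bytes_per_pixel = 640, 480, 4   # RGBA8 assumption
one_buffer = width * height * bytes_per_pixel
print(f"One 640x480 32-bit buffer: {one_buffer / 2**20:.2f} MB")      # ~1.17 MB
print(f"Double buffer plus Z:      {3 * one_buffer / 2**20:.2f} MB")  # ~3.52 MB
[/code]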

#20 Quofan
Member since 2005 • 1606 Posts

You should post this on SW; it would bring an interesting new twist to the debate about the Wii's graphics.

Personally, on the whole developers-being-lazy thing, I think it's basically a repeat of what happened with the PS2: they aren't willing to spend the time fine-tuning the games, and making money from casuals is a big priority. Ubisoft themselves have acknowledged this mistake.


#21 TacticalElefant
Member since 2007 • 900 Posts

[QUOTE="Quofan"](quoted above: developers aren't willing to spend time fine-tuning Wii games)[/QUOTE]


True. Many multiplatform games last generation did not get special graphics treatment in their Xbox or Gamecube versions, which clearly could have looked better.

#22 komdosina
Member since 2003 • 4972 Posts
[QUOTE="sonic_spark"]

That was one of the best breakdowns of the Wii's graphical capabilities.

I found that wall of text so interesting, thanks man for taking the time to write that.

 

TacticalElefant



It should give only more reason for you to be pissed at developers.  And on the clock speed thing, I've seen both 239 MHz and 243 MHz, and I went ahead with the smaller one "just in case" not to overestimatethe Wii.  Well also like I had said you got to take into account what kind of shader programs the Hollywood GPU can pull off and how efficient it is at pulling them off.  It's hard comparing capabilities when the GPUs use different architecture somewhat, so it's going to come down to real world conditions and games to see how they look.  Hopefully the whole thing about being able to "emulate" shaders used on the 360 and PS3 is true and fully possible, because I don't want to see another Far Cry: Vengence on the Wii. 

Okay, I was wondering why you were using the number 239 MHz. Whether it's 243 MHz or 239 MHz, it's still faster than the Xbox's 233 MHz GPU, but 243 MHz sounds better :). As for Far Cry, I would have been just as happy had it looked like the Xbox version, with the same graphics and AI, running at 30fps with no slowdown (maybe with four-player split screen added instead of just two); that's a perfectly acceptable graphical situation for a first-generation Wii game. Of course, Ubisoft chose the cheap and rushed option rather than putting in the extra 6 or 7 months to make a proper port of the Xbox Far Cry, which I would rather have had than that barely-PS2-quality version.

 


#23 TacticalElefant
Member since 2007 • 900 Posts
[QUOTE="TacticalElefant"][QUOTE="sonic_spark"]

That was one of the best breakdowns of the Wii's graphical capabilities.

I found that wall of text so interesting, thanks man for taking the time to write that.

 

komdosina



It should give only more reason for you to be pissed at developers. And on the clock speed thing, I've seen both 239 MHz and 243 MHz, and I went ahead with the smaller one "just in case" not to overestimatethe Wii. Well also like I had said you got to take into account what kind of shader programs the Hollywood GPU can pull off and how efficient it is at pulling them off. It's hard comparing capabilities when the GPUs use different architecture somewhat, so it's going to come down to real world conditions and games to see how they look. Hopefully the whole thing about being able to "emulate" shaders used on the 360 and PS3 is true and fully possible, because I don't want to see another Far Cry: Vengence on the Wii.

Okay, I was wondering why you where using the number 239 MHz, whether its 243 MHz or 239 MHz it still is faster then Xbox's 233 MHz GPU, but 243 MHz sounds better :). As for Far Cry, I would have been just as happy had it looked like the Xbox version with the same graphics and AI running at 30fps with no slowdown (maybe added four player split screen not just two), a perfectly acceptable graphical situation for a first generation Wii game. Of course UBI Soft chose the cheap and rushed option rather then put the roughly extra 6 or 7 months to make proper Xbox Far Cry port, which I would rather have then that berley PS2 version.

 

There is a reason why the Gamecube and PS2 versions of Far Cry Instincts were cancelled. Even so, I can say that a PS2 or Gamecube version could have looked much better than Vengeance; Vengeance is one of the worst-case scenarios of a rushed game. And the reason the Gamecube, at least, didn't get FC Instincts was, I'm sure, RAM limitations. The PS2 version, I'm sure, had the same issue, plus its overall fill rate probably failed to deliver enough.