The Real Reason Most 3rd Parties Aren't Going with the Wii U



#1 LegatoSkyheart
Member since 2009 • 29733 Posts

http://operationrainfall.com/nintendos-gambit-business-strategy-for-the-wii-u/

The gist of it:

To keep production costs low while still putting out the same performance as today's consoles, if not greater, Nintendo sacrificed the CPU and made the GPU stronger.

The growing trend of AAA developers dying, much like THQ, is due to game production costs climbing from $5-10 million to $20-50 million, and those costs will probably increase again next generation. Nintendo, however, is saying that with a stronger GPU, developers could stop relying so much on the CPU and lean on the GPU instead for better performance at lower production costs.

However, Western 3rd party developers are so used to leaning on the CPU that switching to the GPU isn't going to do the Wii U any favors. It's not as simple as flipping a switch; other work has to be done first, and because of that the Wii U is losing out on a lot of 3rd party games like BioShock Infinite, Tomb Raider, Castlevania: Lords of Shadow 2, and Metro: Last Light.

This explains Nintendo's interest in collaborating with Japanese 3rd parties to release games on the Wii U: it lets Nintendo try to get the Western audience more interested in Eastern-developed games, and it gets those 3rd parties comfortable with GPU programming rather than CPU programming.
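
To make that CPU-vs-GPU idea concrete, here's a minimal C++ sketch (all names and numbers are invented for illustration, not anything Nintendo or the article actually ships): the first function is the kind of uniform, per-element math that maps well onto a GPU's many small cores, while the second is the branchy, stateful game logic that tends to stay on the CPU no matter what.

```cpp
// Minimal sketch (hypothetical names/numbers): the kind of work that suits a GPU
// versus the kind that tends to stay on the CPU.
#include <cstdio>
#include <vector>

struct Particle { float x, y, vx, vy; };

// Uniform, data-parallel math: every element gets the same arithmetic and no
// iteration depends on another. This is what a GPU's many small cores (or a
// compute shader) chew through well.
void integrateParticles(std::vector<Particle>& ps, float dt) {
    for (auto& p : ps) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.vy -= 9.81f * dt;  // simple gravity
    }
}

// Branchy, stateful game logic: each decision depends on game state.
// This kind of code is a poor fit for a GPU and stays CPU-side.
int updateQuestState(int state, bool talkedToNpc, bool bossDefeated) {
    if (state == 0 && talkedToNpc)  return 1;  // quest accepted
    if (state == 1 && bossDefeated) return 2;  // quest complete
    return state;
}

int main() {
    std::vector<Particle> ps(1000, Particle{0.f, 0.f, 1.f, 2.f});
    integrateParticles(ps, 1.0f / 60.0f);
    std::printf("particle0 y = %f, quest state = %d\n",
                ps[0].y, updateQuestState(0, true, false));
}
```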

 

So what do you think, System Warriors? Was Nintendo's switch to GPU-focused games rather than CPU-focused games worth it, or will Nintendo get second-hand scraps like the Wii before it?


#2 clyde46
Member since 2005 • 49061 Posts
I thought it's always been GPU-centered games? I haven't played a game that maxed my CPU out.

#3 DarkLink77
Member since 2004 • 32731 Posts

I thought it's always been GPU-centered games? I haven't played a game that maxed my CPU out. clyde46

Pretty much this. My GPU on my rig maxes out a lot quicker than my CPU.


#4 LegatoSkyheart
Member since 2009 • 29733 Posts

I thought it's always been GPU-centered games? I haven't played a game that maxed my CPU out. clyde46

I'm not sure how game development works. :P I always thought it was the GPU that determines how pretty a game can look and the CPU that determines how fast it can go.

I guess what the article is saying is that the Wii U can run games mostly on the GPU and the CPU doesn't have to be used much? (So maxing out the CPU as well as the GPU would make for a better-performing game? I have no idea.)


#5 nintendoboy16
Member since 2007 • 42195 Posts
Well, if it's an attempt to attract Japanese developers, I can kind of see why. Isn't it said that Japanese gaming is failing? Then no wonder.

#6 SaltyMeatballs
Member since 2009 • 25165 Posts
Could be that last gen (360/PS3) a lot of effort was put into the CPU, especially with the Cell. Maybe the game engines rely on it more; games like GTA IV need a good CPU even on PC.

#7 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

Nonsense. That CPU in the Wii U can do pretty much anything short of Dead Rising. And the real reason they went with such a CPU is BC, and the fact that its architecture was already good enough in comparison to the rest of the system.

Big publishers are just trying to cash in on the Wii U's first year, and then no one will care about multiplats on the console once 3-5x more powerful consoles are ripe for development. Same as the Wii.


#8 YoshiYogurt
Member since 2010 • 6008 Posts
Well, if it's an attempt to attract Japanese developers, I can kind of see why. Isn't it said that Japanese gaming is failing? Then no wonder. nintendoboy16
Japanese gaming failing? I don't think so. In terms of quality, Japanese games have always been better than western games IN MY OPINION.

#9 LegatoSkyheart
Member since 2009 • 29733 Posts

Could be that last gen (360/PS3) a lot of effort was put into the CPU, especially with the Cell. Maybe the game engines rely on it more; games like GTA IV need a good CPU even on PC. SaltyMeatballs

That's right, I can't run GTA IV on my PC simply because of my CPU, despite it running Battlefield 3 and Saints Row the Third like a boss.

 

Nonsense. That CPU in the Wii U can do pretty much anything short of Dead Rising. And the real reason they went with such a CPU is BC, and the fact that its architecture was already good enough in comparison to the rest of the system.

Big publishers are just trying to cash in on the Wii U's first year, and then no one will care about multiplats on the console once 3-5x more powerful consoles are ripe for development. Same as the Wii.

Chozofication

So the reason behind the GPU for the WiiU is so the WiiU could play Wii games?


#10 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="SaltyMeatballs"]Could be that last gen (360/PS3) a lot of effort was put into the CPU, especially with the Cell. Maybe the game engines rely on it more, games like GTA IV even on PC need a good CPU.LegatoSkyheart

That's right, I can't run GTA IV on my PC simply because of my CPU, despite it running Battlefield 3 and Saints Row the Third like a boss.

 

Nonsense. That CPU in the Wii U can do pretty much anything short of Dead Rising. And the real reason they went with such a CPU is BC, and the fact that its architecture was already good enough in comparison to the rest of the system.

Big publishers are just trying to cash in on the Wii U's first year, and then no one will care about multiplats on the console once 3-5x more powerful consoles are ripe for development. Same as the Wii.

Chozofication

So the reason behind the GPU for the WiiU is so the WiiU could play Wii games?

No, it's the reason for the CPU. The CPU is the same architecture as the GameCube's Gekko, which is also the same as the Wii's.

Besides BC though, you'd be surprised: its architecture is actually a lot better than the main units in the Xbox 360/PS3. Those chips sacrificed efficiency for raw speed, like going from Pentium 3 to Pentium 4. And Gekko had a better architecture than the Pentium 3, even.


#11 BrunoBRS
Member since 2005 • 74156 Posts
soooooo... nintendo should start luring PC devs in?

#12 GunSmith1_basic
Member since 2002 • 10548 Posts
The Wii U will see more support once Nintendo makes some custom engines for the system, which Retro is currently doing.

#13 Cherokee_Jack
Member since 2008 • 32198 Posts
Short-sighted move to sacrifice the CPU, with open-world games being so in vogue.

#14 GunSmith1_basic
Member since 2002 • 10548 Posts
Short-sighted move to sacrifice the CPU, with open-world games being so in vogue. Cherokee_Jack
That's what I thought too, although the Wii U is more of a GPGPU architecture, which can take on CPU functions, so who knows?

#15 Rocker6
Member since 2009 • 13358 Posts

[QUOTE="clyde46"]I thoughts its always been GPU centered games? I havent played a game that maxed my CPU out. DarkLink77

Pretty much this. My GPU on my rig maxes out a lot quicker than my CPU.

Yep, most games are GPU-bound. Usually the games where the CPU matters a lot are the ones with lots of AI on screen, like strategy games, or games with huge battles like ARMA (I've seen people bring their CPUs to their knees in level editors by spawning a crazy number of NPCs)...
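
A rough sketch of why those NPC-heavy scenes hammer the CPU (made-up numbers, not from any real engine): if every NPC has to consider every other NPC, the work grows roughly with the square of the count, so spawning a crazy number of units blows up CPU time long before the GPU notices.

```cpp
// Rough sketch with made-up numbers: per-NPC "awareness" checks against every
// other NPC. The work grows as n*n, so doubling the NPC count roughly
// quadruples the CPU time, while the GPU's rendering load barely changes.
#include <cstdio>
#include <vector>

struct Npc { float x, y; };

long countNearbyPairs(const std::vector<Npc>& npcs, float radius) {
    long pairs = 0;
    for (size_t i = 0; i < npcs.size(); ++i) {
        for (size_t j = i + 1; j < npcs.size(); ++j) {
            float dx = npcs[i].x - npcs[j].x;
            float dy = npcs[i].y - npcs[j].y;
            if (dx * dx + dy * dy < radius * radius) ++pairs;
        }
    }
    return pairs;
}

int main() {
    for (int n : {100, 1000, 10000}) {
        std::vector<Npc> npcs(n, Npc{0.f, 0.f});
        std::printf("%5d NPCs -> %ld pairwise checks per update\n",
                    n, (long)n * (n - 1) / 2);
        (void)countNearbyPairs(npcs, 5.0f);  // the real work scales the same way
    }
}
```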


#16 PCgameruk
Member since 2012 • 2273 Posts

With next-gen consoles around the corner, why would devs waste time optimizing engines that probably won't even be used for next-gen games?


#17 DJ-Lafleur
Member since 2007 • 35604 Posts

Nintendo secretly calls other developers fat behind the scenes, and insults their mothers.


#18 LegatoSkyheart
Member since 2009 • 29733 Posts

No, it's the reason for the CPU. The CPU is the same architecture as the GameCube's Gekko, which is also the same as the Wii's.

Besides BC though, you'd be surprised: its architecture is actually a lot better than the main units in the Xbox 360/PS3. Those chips sacrificed efficiency for raw speed, like going from Pentium 3 to Pentium 4. And Gekko had a better architecture than the Pentium 3, even.

Chozofication

Oh, sorry, I thought you mistyped and put CPU instead of GPU. My bad. :P

Well that's interesting.


#19 mysticstryk
Member since 2008 • 1709 Posts

Stupid topic. The reason is simple: 3rd parties have never fully supported Nintendo. Plus, if they were to support them now, we wouldn't see it happen until late 2013/2014. Why? Because games like GTA V, BioShock Infinite, Tomb Raider, and others have been in development for years, since way before final Wii U dev kits were sent out. It would not be cost-effective to do a very late port of a game that far into development.


#20 way2funny
Member since 2003 • 4570 Posts

At this point the CPU is the least important part of the system. 


#21 faizan_faizan
Member since 2009 • 7869 Posts
I thought it's always been GPU-centered games? I haven't played a game that maxed my CPU out. clyde46
Crysis did in 2007.

#22 Slow_Show
Member since 2011 • 2018 Posts

Need Teuf in here to settle this, but that sure sounds like 100% USDA-Prime unsubstantiated bullsh*t. 


#23 Wasdie  Moderator
Member since 2003 • 53622 Posts

How the hell would changing how the game's engine processes data decrease production costs?

It wouldn't. What is this guy smoking?

Engine work may prevent a game from having better graphics, or make it difficult to port, but it's not going to cost you X more to render something one way or another unless you're constantly flip-flopping during development and delaying overall development.

He's talking about offloading CPU work onto the GPU and thinks that's what Nintendo wants to do, which is also wrong. The GPU is a GPU; it's meant to do graphics. Developers don't offload work onto the GPU because their APIs haven't supported it, and because the cores of a GPU are much weaker than a single core of a CPU. It's only really good for mass math calculations, not general CPU work. So while it's great for crunching physics, it's terrible for the core of game engine processing.

Nintendo picked a lower clock speed for the Wii U just to save on heat and power consumption. This allows them to have a smaller console with smaller cooling systems and still be reliable. That's it.

I really don't think this guy knows what he is talking about at all.
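
One back-of-the-envelope way to see this point: even if part of a frame's CPU work can be pushed to the GPU, the serial part that stays behind caps the overall speedup (Amdahl's law). The fractions below are invented purely for illustration.

```cpp
// Back-of-the-envelope Amdahl's-law sketch with invented numbers: if only a
// fraction p of the frame's CPU work can be offloaded to the GPU, the serial
// part that stays behind limits the overall speedup, no matter how fast the
// offloaded part becomes.
#include <cstdio>

double amdahlSpeedup(double p, double offloadSpeedup) {
    return 1.0 / ((1.0 - p) + p / offloadSpeedup);
}

int main() {
    // Suppose 40% of a frame is physics-style math and the GPU runs it 10x faster.
    std::printf("40%% offloadable, 10x faster   -> %.2fx overall\n",
                amdahlSpeedup(0.4, 10.0));    // ~1.56x
    // Even with an absurdly fast GPU, the other 60% still runs on the CPU.
    std::printf("40%% offloadable, 1000x faster -> %.2fx overall\n",
                amdahlSpeedup(0.4, 1000.0));  // approaches 1 / 0.6 = 1.67x
}
```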


#24 LegatoSkyheart
Member since 2009 • 29733 Posts

How the hell would changing how the game's engine processes data decrease production costs?

It wouldn't. What is this guy smoking?

Engine work may prevent a game from having better graphics, or make it difficult to port, but it's not going to cost you X more to render something one way or another unless you're constantly flip-flopping during development and delaying overall development.

He's talking about offloading CPU work onto the GPU and thinks that's what Nintendo wants to do, which is also wrong. The GPU is a GPU; it's meant to do graphics. Developers don't offload work onto the GPU because their APIs haven't supported it, and because the cores of a GPU are much weaker than a single core of a CPU. It's only really good for mass math calculations, not general CPU work. So while it's great for crunching physics, it's terrible for the core of game engine processing.

Nintendo picked a lower clock speed for the Wii U just to save on heat and power consumption. This allows them to have a smaller console with smaller cooling systems and still be reliable. That's it.

I really don't think this guy knows what he is talking about at all.

Wasdie

So I was right about GPU being what governs how pretty a game can look and the CPU is how fast it can go?


#25 GunSmith1_basic
Member since 2002 • 10548 Posts
^ I thought physics engines and such do a lot of the work for a dev though, and work has associated costs. For instance, for many devs it makes more sense to pay Havok to license their physics engine rather than spend the resources themselves. Only really rich devs like Rockstar can build custom engines. Has Unreal Engine 3 even been adapted to the Wii U yet?

#26 fernandmondego_
Member since 2005 • 3170 Posts

[QUOTE="Wasdie"]

How the hell would changing how the game's engine processes data decrease production costs?

It wouldn't. What is this guy smoking?

Engine work may prevent a game from having better graphics, or make it difficult to port, but it's not going to cost you X more to render something one way or another unless you're constantly flip-flopping during development and delaying overall development.

He's talking about offloading CPU work onto the GPU and thinks that's what Nintendo wants to do, which is also wrong. The GPU is a GPU; it's meant to do graphics. Developers don't offload work onto the GPU because their APIs haven't supported it, and because the cores of a GPU are much weaker than a single core of a CPU. It's only really good for mass math calculations, not general CPU work. So while it's great for crunching physics, it's terrible for the core of game engine processing.

Nintendo picked a lower clock speed for the Wii U just to save on heat and power consumption. This allows them to have a smaller console with smaller cooling systems and still be reliable. That's it.

I really don't think this guy knows what he is talking about at all.

LegatoSkyheart

So I was right about GPU being what governs how pretty a game can look and the CPU is how fast it can go?

From my understanding, it's as simple as this. GPU controls what everything looks like, CPU controls what everything is doing.

#27 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Wasdie"]

How the hell would changing how the game's engine processes data decrease production costs?

It wouldn't. What is this guy smoking?

Engine work may prevent a game from having better graphics, or make it difficult to port, but it's not going to cost you X more to render something one way or another unless you're constantly flip-flopping during development and delaying overall development.

He's talking about offloading CPU work onto the GPU and thinks that's what Nintendo wants to do, which is also wrong. The GPU is a GPU; it's meant to do graphics. Developers don't offload work onto the GPU because their APIs haven't supported it, and because the cores of a GPU are much weaker than a single core of a CPU. It's only really good for mass math calculations, not general CPU work. So while it's great for crunching physics, it's terrible for the core of game engine processing.

Nintendo picked a lower clock speed for the Wii U just to save on heat and power consumption. This allows them to have a smaller console with smaller cooling systems and still be reliable. That's it.

I really don't think this guy knows what he is talking about at all.

LegatoSkyheart

So I was right about GPU being what governs how pretty a game can look and the CPU is how fast it can go?

Think of the GPU as processing everything you can see, and the CPU as processing how everything on that screen works and moves. If the GPU can't handle the 4K textures on screen, things will slow down. If it can, but the CPU still can't handle the AI for the 10,000 enemies, things will also slow down, and vice versa.
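
A tiny sketch of that bottleneck idea, with invented timings: assuming the CPU update and GPU render overlap, whichever side takes longer sets the frame time, so speeding one side up only helps until the other becomes the limit.

```cpp
// Tiny sketch with invented timings: assuming the CPU update and GPU render
// overlap (pipeline), the slower of the two sets the frame time, so speeding
// up one side only helps until the other side becomes the bottleneck.
#include <algorithm>
#include <cstdio>

double frameMs(double cpuUpdateMs, double gpuRenderMs) {
    return std::max(cpuUpdateMs, gpuRenderMs);
}

int main() {
    std::printf("CPU 10ms, GPU 20ms -> %.0f ms/frame (GPU-bound, ~50 fps)\n",
                frameMs(10, 20));
    std::printf("CPU 10ms, GPU  8ms -> %.0f ms/frame (CPU-bound, 100 fps)\n",
                frameMs(10, 8));
    // Cutting GPU time further (say, dropping resolution) no longer raises fps:
    std::printf("CPU 10ms, GPU  4ms -> %.0f ms/frame (still CPU-bound)\n",
                frameMs(10, 4));
}
```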


#28 nini200
Member since 2005 • 11484 Posts

Nintendo secretly calls other developers fat behind the scenes, and insults their mothers.

DJ-Lafleur
This, it has to be this

#29 Gue1
Member since 2004 • 12171 Posts

you guys are all wrong. The PS4 will be the most powerful console.


#30 lamprey263
Member since 2006 • 45427 Posts
I think the userbase is the main reason. The Wii U only has a couple million owners at the moment, while the Xbox 360 and PS3 have combined sales of 150 million units; that's far more potential customers for publishers to reach than the couple million on the Wii U. Plus, if last gen was an indicator of anything, it's that people didn't care for 3rd party games there; they got the Wii for Nintendo titles.

#31 Teuf_
Member since 2004 • 30805 Posts

Need Teuf in here to settle this, but that sure sounds like 100% USDA-Prime unsubstantiated bullsh*t. 

Slow_Show



My bullsh*t detector went off as soon as I read the Shakespeare quote. The Wii U is what it is: a console with close-to-current-gen hardware performance and its own unique take on controls and a connected experience. Anyone claiming there's some sort of "hidden power" that's going to emerge later down the line is seriously misguided, and is probably just desperately trying to prove that their console of choice isn't as weak as it is. This one is particularly bad, as the author stumbles through his attempts at throwing in technical specs:

"the console also has 2GB of eDDR4 RAM, about four times as much memory as the Xbox 360 and the PS3 at 512MB. Also the memory transfer speeds in the Wii U are much faster than the competition, as the DDR4 RAM is superior to their DDR3 RAM"

First of all, it's just "DDR4" and not "eDDR4". The Wii U has some embedded DRAM, but the 2GB of main memory is not embedded. Second, the bandwidth of that DDR4 is actually significantly lower than the bandwidth of the memory in the Xbox 360 and PS3. The clock speed of the Wii U's RAM only provides 12.8GB/s, compared to the ~22GB/s provided by the Xbox 360 and PS3 RAM. This is because both of those consoles use GDDR3 memory and not DDR3 memory like the author mistakenly claims (the PS3 also has a pool of XDR memory, which also provides around 20GB/s of bandwidth).

Either way, the claim that forcing developers to move CPU tasks to the GPU is somehow going to save millions in development costs is beyond absurd. Most development costs come from content creation, not from some programmer figuring out how to get something to perform well.
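
For reference, the arithmetic behind those bandwidth figures: bandwidth is just the transfer rate times the bytes moved per transfer. The bus widths below are assumptions based on commonly reported specs (64-bit DDR3-1600 for the Wii U's main memory, 128-bit GDDR3 at 1400 MT/s for the Xbox 360).

```cpp
// The arithmetic behind those figures: bandwidth = transfers per second times
// bytes per transfer. The bus widths are assumptions based on commonly
// reported specs (64-bit DDR3-1600 main memory on Wii U, 128-bit GDDR3 at
// 1400 MT/s on Xbox 360).
#include <cstdio>

double bandwidthGBps(double megatransfersPerSec, int busBits) {
    double bytesPerTransfer = busBits / 8.0;
    return megatransfersPerSec * 1e6 * bytesPerTransfer / 1e9;
}

int main() {
    std::printf("Wii U DDR3-1600,  64-bit bus -> %.1f GB/s\n",
                bandwidthGBps(1600, 64));    // 12.8 GB/s
    std::printf("X360 GDDR3 1400MT/s, 128-bit -> %.1f GB/s\n",
                bandwidthGBps(1400, 128));   // 22.4 GB/s
}
```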


#32 Sword-Demon
Member since 2008 • 7007 Posts
If that's the case, then I'll expect about the same results as the PS3: at first, devs will avoid it due to its structure being different, but as time goes on, they'll warm up to it.

#33 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Slow_Show"]

Need Teuf in here to settle this, but that sure sounds like 100% USDA-Prime unsubstantiated bullsh*t. 

Teufelhuhn



My bullsh*t detector went off as soon as I read the Shakespeare quote. The Wii U is what it is: a console with close-to-current-gen hardware performance and its own unique take on controls and a connected experience. Anyone claiming there's some sort of "hidden power" that's going to emerge later down the line is seriously misguided, and is probably just desperately trying to prove that their console of choice isn't as weak as it is. This one is particularly bad, as the author stumbles through his attempts at throwing in technical specs:

"the console also has 2GB of eDDR4 RAM, about four times as much memory as the Xbox 360 and the PS3 at 512MB. Also the memory transfer speeds in the Wii U are much faster than the competition, as the DDR4 RAM is superior to their DDR3 RAM"

First of all, it's just "DDR4" and not "eDDR4". The Wii U has some embedded DRAM, but the 2GB of main memory is not embedded. Second, the bandwidth of that DDR4 is actually significantly lower than the bandwidth of the memory in the Xbox 360 and PS3. The clock speed of the Wii U's RAM only provides 12.8GB/s, compared to the ~22GB/s provided by the Xbox 360 and PS3 RAM. This is because both of those consoles use GDDR3 memory and not DDR3 memory like the author mistakenly claims (the PS3 also has a pool of XDR memory, which also provides around 20GB/s of bandwidth).

Either way, the claim that forcing developers to move CPU tasks to the GPU is somehow going to save millions in development costs is beyond absurd. Most development costs come from content creation, not from some programmer figuring out how to get something to perform well.

Wii U has DDR"3", not 4, and its 1600 spec, 2gbs of it with half available for games, that's double 360's and DDR3 is more than fast enough for a 720p spec console, with the added bonus of having a much lower latency.  The 32mb's of eDRAM is there to do what the slower DDR3 can't.  There is no issue with bandwidth, Wii U has plenty.  Besides that, the Wii U may have more of that DDR3 unlocked for games as time goes by and the firmware matures.

The Wii U is 2x the Xbox 360; it's hardly close. The GPU is far more modern and the CPU isn't really an issue.


#34 Kingpin0114
Member since 2008 • 2607 Posts

3rd parties are probably avoiding them because the majority of core 3rd party games will not sell well on a Nintendo console. Yep...that's probably it.


#35 fernandmondego_
Member since 2005 • 3170 Posts
They are moving to engines that the Wii U probably won't be able to run. Given the Wii U's sales so far, and more importantly its software sales, there is no reason to support it. Nintendo had better have something up its sleeve, 'cause the GameCube lineup probably isn't going to cut it.

#36 Miroku32
Member since 2006 • 8666 Posts

So I was right about GPU being what governs how pretty a game can look and the CPU is how fast it can go?

LegatoSkyheart
So I gather that the CPU is also the one in charge of handling the FPS? To be honest I only have a minimum of understanding about the GPU and the CPU.

#37 WiiCubeM1
Member since 2009 • 4735 Posts

[QUOTE="nintendoboy16"]Well, if it's to attempt to try and attract Japanese developers, I can kind of see why. Isn't it said that Japanese gaming is failing? Then no wonder.YoshiYogurt
Japanese gaming failing? I don't think so. In terms of quality, Japanese games have always been better than western games IN MY OPINION.

Well, your "opinion" is weird and scary to me.


#38 Cali3350
Member since 2003 • 16134 Posts

The Wii U CPU is just an evolution of the Broadway CPU in the Wii (itself an evolution of the GameCube's Gekko). A 1.25 GHz SMT-enabled PowerPC 750 core should at the very least match the Xbox 360's (probably about 40% faster or so in SIMD; FPU might actually be a bit lower), so this doesn't make sense. The Wii U CPU at least matches what's in the current-gen consoles.


#39 tubbyc
Member since 2005 • 4004 Posts

[QUOTE="LegatoSkyheart"]

So I was right about GPU being what governs how pretty a game can look and the CPU is how fast it can go?

Miroku32

So I gather that the CPU is also the one in charge of handling the FPS? To be honest I only have a minimum of understanding about the GPU and the CPU.

The GPU does help with FPS as well. If you have a PC and just upgrade the GPU, you'll get an increase in FPS, as long as the original GPU wasn't already being bottlenecked by a weak CPU holding back performance; the CPU side is more about handling AI, many characters interacting in the game world at once, and physics. A much better GPU allows for higher resolution, higher AA, and more advanced graphical features at the same FPS compared to an inferior GPU at the same settings.

Hence why there are sites with charts comparing GPU performance in FPS.


#40 deomag4
Member since 2013 • 86 Posts

A couple of reasons, and hardware isn't one of them (I'm pretty sure the Wii U can handle PS3 and 360 games; it's not THAT bad).

a) low install base - there are currently 2.5-3 million total who own the console

b) demographic - of that 2.5-3 million, how many actually bought it for games like GTA V? I'm sure most did, as they are the hardcore system fans, but you have to figure a chunk are Wii owners who bought it for their kids or something like that, which magnifies point a)

c) costs - this kind of goes hand in hand, but when you're looking at a maximum of 2-2.5 million sales (if history is an indicator, not every Wii U owner will buy a particular game), is it worth spending millions to make a port? At best you're looking at maybe a slight profit or breaking even, but not much more than that.

d) Nintendo 3rd party history - really since the SNES, Nintendo has had bad luck with 3rd party support. I think 3rd party devs are a bit more hesitant with Nintendo these days.


#41 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts
I still think 3rd parties see the monetary benefit of releasing one game on two consoles and then further benefiting from DLC.

#42 deomag4
Member since 2013 • 86 Posts

[QUOTE="Slow_Show"]

Need Teuf in here to settle this, but that sure sounds like 100% USDA-Prime unsubstantiated bullsh*t. 

Teufelhuhn



My bullsh*t detector went off as soon as I read the Shakespeare quote. The Wii U is what it is: a console with close-to-current-gen hardware performance and its own unique take on controls and a connected experience. Anyone claiming there's some sort of "hidden power" that's going to emerge later down the line is seriously misguided, and is probably just desperately trying to prove that their console of choice isn't as weak as it is. This one is particularly bad, as the author stumbles through his attempts at throwing in technical specs:

"the console also has 2GB of eDDR4 RAM, about four times as much memory as the Xbox 360 and the PS3 at 512MB. Also the memory transfer speeds in the Wii U are much faster than the competition, as the DDR4 RAM is superior to their DDR3 RAM"

First of all, it's just "DDR4" and not "eDDR4". The Wii U has some embedded DRAM, but the 2GB of main memory is not embedded. Second, the bandwidth of that DDR4 is actually significantly lower than the bandwidth of the memory in the Xbox 360 and PS3. The clock speed of the Wii U's RAM only provides 12.8GB/s, compared to the ~22GB/s provided by the Xbox 360 and PS3 RAM. This is because both of those consoles use GDDR3 memory and not DDR3 memory like the author mistakenly claims (the PS3 also has a pool of XDR memory, which also provides around 20GB/s of bandwidth).

Either way, the claim that forcing developers to move CPU tasks to the GPU is somehow going to save millions in development costs is beyond absurd. Most development costs come from content creation, not from some programmer figuring out how to get something to perform well.

The Wii U has DDR3, and clock speed alone is a very incomplete piece of data for comparing hardware. It's like saying 3B > 2C just because B is being multiplied by 3. It's good to know, but it doesn't tell us much if we don't know the other aspects of the hardware (clock cycles per instruction, what the ISA is like, etc.).

 

I think in about 2 years we'll see how limited the Wii U is. It could be marginally better than the 360/PS3, if at all, or it could end up showing off games that put it on a whole other level (I don't think so; they were probably a bit conservative with the hardware because of the cost of the GamePad).
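
To put that 3B-vs-2C analogy in concrete terms (the numbers here are made up): a chip's rough throughput is clock speed times instructions per clock, so the higher-clocked chip doesn't automatically win.

```cpp
// Made-up numbers to illustrate the "3B vs 2C" point: rough throughput is
// clock speed times instructions retired per clock (IPC), so the chip with
// the higher clock does not automatically win.
#include <cstdio>

double instructionsPerSecond(double clockGHz, double ipc) {
    return clockGHz * 1e9 * ipc;
}

int main() {
    double chipA = instructionsPerSecond(3.2, 0.5);   // high clock, weak per-clock
    double chipB = instructionsPerSecond(1.25, 2.0);  // low clock, strong per-clock
    std::printf("Chip A: %.2f billion instructions/s\n", chipA / 1e9);  // 1.60
    std::printf("Chip B: %.2f billion instructions/s\n", chipB / 1e9);  // 2.50
}
```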


#43 super600  Moderator
Member since 2007 • 33158 Posts

Nonsense. That CPU in the Wii U can do pretty much anything short of Dead Rising. And the real reason they went with such a CPU is BC, and the fact that its architecture was already good enough in comparison to the rest of the system.

Big publishers are just trying to cash in on the Wii U's first year, and then no one will care about multiplats on the console once 3-5x more powerful consoles are ripe for development. Same as the Wii.

Chozofication

Actually, the 360 and PS3 were like 20x more powerful than the Wii. That's what stopped most of the third party support.


#44 Teuf_
Member since 2004 • 30805 Posts

[QUOTE="Teufelhuhn"]

[QUOTE="Slow_Show"]

Need Teuf in here to settle this, but that sure sounds like 100% USDA-Prime unsubstantiated bullsh*t. 

deomag4



My bullsh*t detector went off as soon as I read the Shakespeare quote. The Wii U is what it is: a console with close-to-current-gen hardware performance and its own unique take on controls and a connected experience. Anyone claiming there's some sort of "hidden power" that's going to emerge later down the line is seriously misguided, and is probably just desperately trying to prove that their console of choice isn't as weak as it is. This one is particularly bad, as the author stumbles through his attempts at throwing in technical specs:

"the console also has 2GB of eDDR4 RAM, about four times as much memory as the Xbox 360 and the PS3 at 512MB. Also the memory transfer speeds in the Wii U are much faster than the competition, as the DDR4 RAM is superior to their DDR3 RAM"

First of all, it's just "DDR4" and not "eDDR4". The Wii U has some embedded DRAM, but the 2GB of main memory is not embedded. Second, the bandwidth of that DDR4 is actually significantly lower than the bandwidth of the memory in the Xbox 360 and PS3. The clock speed of the Wii U's RAM only provides 12.8GB/s, compared to the ~22GB/s provided by the Xbox 360 and PS3 RAM. This is because both of those consoles use GDDR3 memory and not DDR3 memory like the author mistakenly claims (the PS3 also has a pool of XDR memory, which also provides around 20GB/s of bandwidth).

Either way, the claim that forcing developers to move CPU tasks to the GPU is somehow going to save millions in development costs is beyond absurd. Most development costs come from content creation, not from some programmer figuring out how to get something to perform well.

The Wii U has DDR3, and clock speed alone is a very incomplete piece of data for comparing hardware. It's like saying 3B > 2C just because B is being multiplied by 3. It's good to know, but it doesn't tell us much if we don't know the other aspects of the hardware (clock cycles per instruction, what the ISA is like, etc.).

 

I think in about 2 years we'll see how limited the wii u is. It could be marginally if not at all better than the 360/ps3, it could end up showing off games that put it at a whole other level (i don't think so, they probably were a bit conservative with the hardware because of the costs of the gamepad)



I was talking about memory bandwidth, and clock speed is totally relevant if you're talking about bandwidth. For memory you take clock speed × bytes per clock and you get the bandwidth. Clock speed on its own isn't particularly useful for comparing memory or processors out of context, but I wasn't doing that. In fact I wasn't talking about processors at all, so I'm not sure why you would bring up ISAs or "clock cycles per instruction".


#45 Rocker6
Member since 2009 • 13358 Posts

[QUOTE="Slow_Show"]

Need Teuf in here to settle this, but that sure sounds like 100% USDA-Prime unsubstantiated bullsh*t. 

Teufelhuhn



My bullsh*t detector went off as soon as I read the Shakespeare quote. The Wii U is what it is: a console with close-to-current-gen hardware performance and its own unique take on controls and a connected experience. Anyone claiming there's some sort of "hidden power" that's going to emerge later down the line is seriously misguided, and is probably just desperately trying to prove that their console of choice isn't as weak as it is. This one is particularly bad, as the author stumbles through his attempts at throwing in technical specs:

"the console also has 2GB of eDDR4 RAM, about four times as much memory as the Xbox 360 and the PS3 at 512MB. Also the memory transfer speeds in the Wii U are much faster than the competition, as the DDR4 RAM is superior to their DDR3 RAM"

First of all, it's just "DDR4" and not "eDDR4". The Wii U has some embedded DRAM, but the 2GB of main memory is not embedded. Second, the bandwidth of that DDR4 is actually significantly lower than the bandwidth of the memory in the Xbox 360 and PS3. The clock speed of the Wii U's RAM only provides 12.8GB/s, compared to the ~22GB/s provided by the Xbox 360 and PS3 RAM. This is because both of those consoles use GDDR3 memory and not DDR3 memory like the author mistakenly claims (the PS3 also has a pool of XDR memory, which also provides around 20GB/s of bandwidth).

Either way, the claim that forcing developers to move CPU tasks to the GPU is somehow going to save millions in development costs is beyond absurd. Most development costs come from content creation, not from some programmer figuring out how to get something to perform well.

You need to post more, that was a good read to start this fine day!


#46 Miroku32
Member since 2006 • 8666 Posts

[QUOTE="Miroku32"][QUOTE="LegatoSkyheart"]

So I was right about GPU being what governs how pretty a game can look and the CPU is how fast it can go?

tubbyc

So I gather that the CPU is also the one in charge of handling the FPS? To be honest I only have a minimum of understanding about the GPU and the CPU.

The GPU does help with FPS as well. If you have a PC and just upgrade the GPU, you'll get an increase in FPS, as long as the original GPU wasn't already being bottlenecked by a weak CPU holding back performance; the CPU side is more about handling AI, many characters interacting in the game world at once, and physics. A much better GPU allows for higher resolution, higher AA, and more advanced graphical features at the same FPS compared to an inferior GPU at the same settings.

Hence why there are sites with charts comparing GPU performance in FPS.

Thanks for explaining. I understand it better now.