@07wintert mate, get the facts. The CPU is based on AMD Jaguar, which is MOBILE technology... overheating? Possibly, but it would be much harder to overheat than a launch Xbox 360, since total power consumption is safely under 150 W.
And yes, I agree. Eight cores is crazy; a powerful CPU with just four cores, like an Intel i7, would have been a much better option, but those are far more expensive in comparison.
I don't concur with the author of this article; I doubt he knows much about programming. For a start, it's pretentious to suggest next-gen consoles won't represent a challenge just because they have x86-based CPUs. I'm sorry, my friend, but that's not true.
Second, most PC graphics engines use only 2 CPU cores, and very rarely 4 or more the way CryEngine does. When we're talking about 8 cores, it's easy to see that, from the CPU side alone, this is no piece of cake. And I can assure you the CPU is powerful enough for next-gen games thanks to its 128-bit FPU, support for most of the latest instruction sets, and much better performance in heavily threaded applications than the Wii U; that will give it tremendous potential in the years to come.

On top of that, there are at least 12 to 18 Radeon GCN compute units (GPU) with enough power to assist the CPU in ways we can't even be sure of yet, simply because Radeon GPGPU technology has been largely ignored by the industry all these years and we don't have much to go on, so developers will need to build new, specific tools, and that will take time. Not to mention that the GPU's graphics processing power by itself is at least 2x-6x that of a PS3. And we only know the basic specs; we don't know much about extra hardware components. In the case of the PS4, for example, Digital Foundry claims it will feature a GPU-like compute module working independently of the CPU and the GPU.
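To give an idea of what "using more cores" even means in code, here's a minimal, hypothetical C++ sketch (my own illustration, not taken from any real engine): it just splits one big per-frame update across however many hardware threads the CPU reports. The hard part on an 8-core Jaguar is that a real engine has to find this much independent work every frame, across every subsystem, without the threads stepping on each other.

```cpp
// Minimal sketch (not any engine's actual code): splitting a per-frame
// workload across all available CPU cores with std::thread.
#include <algorithm>
#include <iostream>
#include <thread>
#include <vector>

int main() {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<float> positions(100000, 1.0f);   // stand-in for per-entity data
    std::vector<std::thread> workers;

    // Each worker updates its own slice of the data; in practice the hard
    // part is finding enough independent work like this to feed 8 cores.
    const std::size_t chunk = positions.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end = (c + 1 == cores) ? positions.size() : begin + chunk;
        workers.emplace_back([&, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                positions[i] += 0.016f;           // pretend "simulation" step
        });
    }
    for (auto& t : workers) t.join();

    std::cout << "Updated " << positions.size() << " entities on "
              << cores << " threads\n";
}
```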
You simply can't compare console hardware to similar PC hardware and get a realistic idea of its real-world performance, because coding for consoles has significant advantages: for example, coding at a very low level and thereby greatly improving overall performance. That privilege simply isn't possible on the PC platform, where there are so many configurations and components on the market that developers just want their games to run fine and don't really bother squeezing every last bit out of a specific CPU/GPU. Also, I'm sure these consoles will be much more powerful than 90% of the PCs owned by average people. I think it's an easy bet that comparing the Wii U to the next-gen consoles is like comparing the N64 to the Dreamcast. First-party launch games will prove what I'm saying; we won't even need to wait 2 years.
The truth is, the video game console industry has always used cheap hardware and always will, but people are also ignoring simple but important facts, like the fact that most games run at 30 frames per second and very few run at 60 (Call of Duty, Gran Turismo 5, etc.). That alone gives developers the chance to roughly double the detail almost automatically, even the laziest programmers, and makes it very easy to enhance the presentation with complex effects. On the other hand, nobody said these games need to run at 1080p. They can easily run at 720p@30 fps with much better upscaling and still look vastly superior to any Wii U game, guaranteed. The RAM available in the next-gen consoles guarantees that texture resolution will be significantly better, anisotropic filtering from the new Radeon architecture will keep those textures sharp at all times, and anti-aliasing will be noticeably improved because the hardware is sophisticated enough to handle both MLAA (morphological anti-aliasing) and normal FSAA at the same time. And that's just the beginning, because additional RAM means additional tricks you can pull. I could go on all day and never finish listing the improvements.
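For anyone who wants the rough numbers behind that 30 vs 60 fps and 720p vs 1080p argument, here's a tiny sketch of the arithmetic (my own back-of-the-envelope figures, just standard frame-time and resolution math, not official specs):

```cpp
// Back-of-the-envelope numbers for the frame-rate and resolution argument.
#include <iostream>

int main() {
    const double budget30 = 1000.0 / 30.0;   // ~33.3 ms per frame at 30 fps
    const double budget60 = 1000.0 / 60.0;   // ~16.7 ms per frame at 60 fps
    const long pixels720  = 1280L * 720L;    //   921,600 pixels per frame
    const long pixels1080 = 1920L * 1080L;   // 2,073,600 pixels per frame

    std::cout << "Frame budget at 30 fps: " << budget30 << " ms\n"
              << "Frame budget at 60 fps: " << budget60 << " ms\n"
              << "30 fps gives ~" << budget30 / budget60
              << "x the per-frame time of 60 fps\n"
              << "720p pushes ~" << 100.0 * pixels720 / pixels1080
              << "% of the pixels of 1080p\n";
}
```

In other words, targeting 30 fps doubles the time available per frame, and rendering at 720p means pushing fewer than half the pixels of 1080p before upscaling; that's where the extra headroom for effects comes from.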