[QUOTE="skrat_01"] Developers learn how to work around the hardware better over time. True, but they DO also learn to better utilise the hardware; not all improvements come strictly from working around the hardware, but from working WITH the hardware.
The hardware does not magically improve through "optimisation".
No, but the use of hardware DOES.
Fanboys love to pull the 'performance card'
Fact is Gears of War 1 is using just as much hardware as Gears of War 2.
Difference is, with Gears of War 2 the developers have learned how to work around the hardware better.
This isn't true. While I can't speak to the specifics of Gears 1 or Gears 2, how full you keep the machine's execution units (in both the CPU and GPU) can vary, and it does improve later in the generation. I guarantee that Gears 1 keeps a smaller percentage of the execution units full than Gears 2 does.
The idea of 'squeezing more out of hardware' is an invention designed to make people think the hardware is still good. It isn't. It's aged and outdated, and developers have to think of better ideas to work with it.
Happens with every freaking console every gen.
No, you DO squeeze more out of the hardware... however, that doesn't make it good. The console's GPU feature set really does get exploited, whereas on the PC the hardware is gone and obsolete long before you really learn its intricacies.
Steppy_76
While your general idea is sound, your reasoning isn't. The hardware may not "get better", but the utilization of it does.

Well, indeed: working around it, working with it, learning how to utilise it better. Problem is, a game like Bomberman Act Zero can be using just as many resources as GTA IV, simply depending on how the devs have utilised it.

Yes, the use of the hardware does improve. The term 'optimisation' is thrown around as if anything can be scaled to the hardware. It has its set limitations, and even if you learn to use it better, those limitations are still going to impact your game design. Hence why it's impossible for a game like Crysis to run (because of its game design), and why designers like Kojima had to scale their original ideas back a notch because of the limitations.
Fair enough
I don't think the term 'squeeze more out' fits at all. It gives the imagery that the console's hardware is producing more than it should be, exceeding its boundaries... when in reality its boundaries are set and predetermined. Even if a developer learns to use the hardware better, they aren't somehow jumping over its limitation hurdles, just learning how to deal with them better. There is way too much focus on the 'hardware' rather than the developers' use of it.
As for PC hardware, I wouldn't say that, to a degree. Developers aren't limited by preset hardware restrictions, and ultimately it comes down to how your software scales for your target audience. Hence why games like the Total War series grace the platform. Though how your software utilises the hardware comes into play, it seems to be more about how well, let's say, the game engine is optimised than about how a given card utilises it.