New consoles don't necessarily have a technical edge over the PCs of their day. Their graphics cards are usually modified versions of a top graphics card that's either already out or coming down the pipeline, their CPUs are modified existing designs (except in Sony's case, but the performance is in the same ballpark), and they have less RAM than a contemporary PC. Simply put, their hardware isn't better than the best contemporary PCs. In fact, new consoles are historically a bit behind upcoming PCs when you take their hardware as a whole - but that slightly weaker hardware is far more optimized and streamlined than PC hardware.
That being the case though, almost every gen I can remember has had the consoles putting out the most technically impressive games of the time within the first year or so of their lifespan. Why is this? Even though the hardware isn't as good as the best PCs, it's extremely optimized, and console developers never have to ask themselves, "OK, if we make a game that uses the best possible hardware 100%, will most of our audience even be able to play it?" Epic could make Gears of War what it was, the best-looking game at the time, and 100% of Xboxes could play it - whereas Crytek knew, when making Crysis, that a huge portion of its potential audience would not be able to run the game.
So summed up? Consoles, even right out of the gate, do not have hardware that trumps PCs - but developers can safely push that hardware as hard as they know how from day one, and every console owner can still enjoy the game... unlike PCs, which are limited not just by upper hardware limits, but by the average user's hardware.