The problem with defining it alongside the console is that the PC is ALWAYS ahead of them. We were playing rich multiplayer games at HD resolutions with "next gen" console-level graphics YEARS before the first next-gen console hit the shelves.
The same thing will happen again. PC gamers are by and large a digital-distribution crowd. We're playing games at full 1080p or higher (instead of sub-720p like most console games), with much more advanced lighting and other rendering technology, in stereoscopic 3D and/or multi-monitor setups.
Guess what the next generation of consoles will look like? You guessed it: like a modern PC. Of course, by the time the next-gen consoles actually hit shelves, PCs will be a generation of video cards and CPUs ahead of them, probably completely digital, and god knows what else (it looks like Kinect-like tech is coming to the PC soon).
I would agree with the poster above who said we should define generations by when most games begin making a significant push in system requirements. That would mean this generation started around the time of Crysis, stalled a bit thanks to consoles with their outdated tech plus multi-platform gaming, and will probably end around the end of next year, when games begin moving toward DX11, multi-core CPUs and modern OSes.
Kinthalis
Well, the most common OS among PC gamers on Steam is 32-bit XP. Only a quarter of Steam users have quad-core CPUs; most have Intel Pentium Ds or Core 2 Duos running at speeds below 2.67 GHz.
The ATI Radeon 4800 series is the most common card, with the 8800, 9800, 9600 and 8600 rounding out the Top 5.
Their most common display resolution is 1280x1024 (!), with less than 18% reporting resolutions of 1080p or higher.
This is according to the most recent Steam hardware survey, from July 2010. [link] Why do we always define PC gaming according to what high-end gaming rigs are capable of, rather than what most PC gamers actually have?