Seven years is quite a looong time for a PC.
This is not a closed and restricted system like a console but an open and ever-evolving one.
Simply put, devs aren't targeting that old hardware anymore.
But I can say that this gen, old PC hardware lasted really long.
I mean, you could play almost anything on an Nvidia 8800 GTX, an Intel Core 2 Quad Q6600, and 2 GB of DDR2 RAM at console-like settings and playable framerates, and that would be an almost seven-year-old PC by now.
In fact, you still could, if some devs hadn't dropped DX10 support (such as Crytek with Crysis 3, which requires a DX11 card to run - ironically, the current consoles aren't even really capable of DX10, except for the Wii U).
Edit: I correct myself - an 8800 GTX with a Q6600 runs games at *better* settings than consoles do (for example, Crysis on high/very high at 720p, while the console version is low/medium at 720p plus some improved lighting).
In fact, I think this old PC would outperform even the Wii U.
Why would you want to purposely limit yourself to the same hardware for 7 years on a platform that's upgradable?
Not to mention that in multi-platform games like BioShock they push the limit far more for the PC version with regard to textures and everything. If they kept it at the console level, a lower-end PC would suffice.
I think this next generation is going to be very different from the previous ones, mainly because the 8th-gen TV game systems are essentially PCs based on x86 CPU architecture.
What this means to me is that the consoles will peak very early; they'll hit maximum potential within a year, two at the most. There are games out that would already melt a PS4 (Crysis 3 and Tomb Raider, to name a couple), and there will be more coming out before the next consoles are on the shelves. I feel quite safe in this prediction, as developers already know how to use the hardware that will be in the Nextbox and PS4. They will get better as they squeeze the hardware, but this will be good for both the consoles and PC, as everyone will benefit from 'optimisation'.
More on topic with your question: we know the power of the PS4 (the Nextbox is, of course, TBC) and what the hardware in it means; there are no unknowns in the cake mix. So a PC with hardware comparable to the PS4 should be able to play multiplat games at the same level for the entire lifespan of the console.
PC exclusives and graphics advancements are a different thing altogether, though. A computer that will run Rome 2 on full graphics will be unlikely to run Medieval 3 (or Total War: Warhammer Fantasy, with a bit of luck ;) ) on full graphics. Of course, why would you want to fix the hardware in a PC? One of the major advantages is the upgradability of the platform.
It's actually not universally true that 7-year-old PCs can't play games at console settings. For some engines this is true; for others, it isn't.
Developers on PC tend to target a mid-range PC, and then scale their games up and down a bit depending on the hardware they're running on.
This generation, for example, you can absolutely play Half-Life 2 engine games on a 7-year-old PC. Crysis 3? Not so much, but then again, Crysis 3 on PC is targeting much better-performing hardware than the consoles. The game looks better than the console version even at low settings.
Overall though, most PC gamers would have upgraded several years ago - not necessarily because they COULDN'T play games at console settings, but because they DIDN'T WANT TO play games at console settings.
Displays evolved, going from the old 780 to 1050, then 1080, etc. People wanted to play games at 60 FPS, and finally at least some developers started taking advantage of modern hardware by including lighting, shadowing, and particle effects the consoles couldn't handle.
So, most PC gamers upgraded.
I have a friend who's been gaming on an 8800 GTX since launch and another with a 9800. Doesn't bother them at all; in fact, they were playing Tomb Raider and felt it looked nice, definitely better than anything on consoles, even though they're very old GPUs... So what is your answer? It's up to the user/owner whether they want to upgrade or not, because PC gamers have that freedom. They do plan on getting 700 series cards, though.
I believe you can. I played Crysis 2 on a laptop with a Radeon 4250; while the card is newer, it isn't nearly as powerful as an 8800 GTX, and it's actually worse than the consoles' GPUs. The CPU the laptop had was better than the 360's CPU, though only by a little. I ran Crysis 2 at the PS3's resolution, 1024x720.
Just to prove this, here is a video of C2 on a Radeon 4250.
I believe you can. I played Crysis 2 on a laptop with a Radeon 4250; while the card is newer, it isn't nearly as powerful as an 8800 GTX, and it's actually worse than the consoles' GPUs. The CPU the laptop had was better than the 360's CPU, though only by a little. I ran Crysis 2 at the PS3's resolution, 1024x720.
Just to prove this, here is a video of C2 on a Radeon 4250.
SentientMind
The 4250 is an integrated GPU - even worse.
Because a ton of stuff is scaled back to work on consoles, and built for them due to the static hardware. And I don't just mean textures and resolutions; even animations, model detail, and model count are often scaled back to function properly.
On PC, games can't be built that way. Low settings are just a downgrade to resolution, texture quality, and other cosmetic things; there can be no changes to the actual gameplay, ever. A dev doesn't know what PC their game is going to run on, so it must be the exact same game whether someone is playing on low or maxed out.
Not every game has to do this, but some do. Some console games literally can't function as just "the PC version on low settings", and plenty of times the devs develop different mechanics to get a game running properly on consoles.
Not to mention that in multi-platform games like BioShock they push the limit far more for the PC version with regard to textures and everything. If they kept it at the console level, a lower-end PC would suffice.
ankor77
BioShock Infinite is playable on the 8800 GTX, which is almost 7 years old. Back in the late '90s and early 2000s, top-of-the-line gaming PCs would be obsolete within 3 years.
If you stick to console-level settings, I think you could make hardware last without upgrading for 7 years. I'm not sure how an 8800 does these days, but it should be able to handle most games at ~720p, 30 fps, and low/medium settings, right?
Ben-Buja
Correct. I still have my 8800 GTX in my closet in case I ever need a backup. Depending on the game, it can do medium/high-ish, and some multiplats like ME3 it will max easily.
I think PC hardware can last pretty well at console settings. My first rig, which has a GeForce 8500 GT, can happily play a lot of modern console games at 720p (or lower, as is the case for a lot of console games) without any AA, and at 30fps like consoles. The 8500 GT is the exact same age as the PS3 in the UK, so I think it has lasted pretty well (for a £60-in-2007 graphics card).
Admittedly, that would mean sticking with DX9 where possible, and possibly slightly lower settings than consoles, but all the same, it's quite impressive for a card that was low-budget even back in 2007.
I'm sure you could make a PC that would last 7 years, but you would have to deal with diminishing returns, while with consoles your experience will remain pretty consistent throughout. Maybe when you build it you can max every game and get 60 FPS no problem, but over time you have to start dropping settings to maintain 60 FPS. Then eventually, at the lowest settings, you can't get 60, so you start to aim for 30, and then eventually you can't even maintain that. There are a few reasons for this.
Consoles enjoy a nice advantage over PCs in that way. They have specialized designs built specifically for gaming and paper-thin OSes/APIs that really allow them to utilize the hardware to its maximum, while on PCs the maximum power will almost never be fully utilized, for a variety of reasons. Also, for a developer, having that one set of hardware to worry about is a huge advantage, because they can make sure their game works on that hardware even if they have to make concessions - maybe making crucial changes to an engine, dropping certain effects, or whatever - but they can make sure that all 360s will play that game, for example. For PC it's a crapshoot, really, due to how many different configurations there are. You also have to take into account that PCs are asked to do more in the same games: things like ambient occlusion, tessellation, and advanced forms of AA, much higher resolutions compared to consoles, and 2 to 8 times the framerate. When you add all of these things up, you should easily realize that consoles and PCs are not on the same level power-wise.
I feel like now, though, it's easier than ever to build a PC that will last a long time, after getting a glimpse of what the PS4 is gonna have. If you have high-end gear in your PC right now, and you're not trying to run, say, a triple-monitor 3D setup, you should be fine for a good while - maybe even the whole upcoming gen - as long as you're willing to sacrifice performance down the road; all the games should at least be playable.
An 8800 GTX is 7 years old, and a PC with this GPU can still play recent games with better graphics and performance than consoles.
Why can't a PC play the newest games for 7 years without upgrading?
Sali217
It can. I just did it. The last PC I built was in 2006, and I just now built a new one.
Not true.
I built my PC in 2006.
It cost me around 550 bucks.
I can still play every game with better graphics and better framerates than my Xbox 360.
So I'm pretty sure my current PC will outlast this generation's consoles, and I hope I can build a gaming PC that's cheaper than the next generation's console and will play games with the same or better graphics/framerate.
It can, if it was a good system when it was new, and if it's playing at low settings/resolution like the console versions. Except for the rare exception like the Crysis 3 PC version, which is one of the first games (maybe the actual first) to drop support for anything older than DX11. So even if you have a high-end DX10 card, which is actually more powerful than a low-end DX11 card, you won't even be able to try to play the game.
But yeah, aside from cases like that, where they just upgrade the API to the point that it's not compatible with older hardware, a PC that was high-end 7 years ago will still run today's games.
I'm sure you could make a PC that would last 7 years, but you would have to deal with diminishing returns, while with consoles your experience will remain pretty consistent throughout.
Gen007
LOL. If playing today's multiplatform games with better graphics and framerates than the console versions (albeit not maxed out) is the price of "diminishing returns", then I'm pretty damn happy with my diminishing returns. And what about the diminishing returns on consoles? What about the late-gen games that have atrocious framerates and are barely playable at times?
Sorry consolites, but even a PC that's as old as your consoles, assuming it was decent when it was new, can get better performance in multiplats 99 times out of 100. PC gamers don't spend extra money upgrading their rigs just so they can play the latest games. They spend extra money upgrading their rigs so they can play the latest games looking leaps and bounds better than consoles and running at triple the framerate, rather than looking just a little better and running at only double the framerate.
Not true.
I built my PC in 2006.
It cost me around 550 bucks.
I can still play every game with better graphics and better framerates than my Xbox 360.
So I'm pretty sure my current PC will outlast this generation's consoles, and I hope I can build a gaming PC that's cheaper than the next generation's console and will play games with the same or better graphics/framerate.
Cranler
Specs please.
I'm sure you could make a PC that would last 7 years, but you would have to deal with diminishing returns, while with consoles your experience will remain pretty consistent throughout.
Gen007
LOL. If playing today's multiplatform games with better graphics and framerates than the console versions (albeit not maxed out) is the price of "diminishing returns", then I'm pretty damn happy with my diminishing returns. And what about the diminishing returns on consoles? What about the late-gen games that have atrocious framerates and are barely playable at times?
Sorry consolites, but even a PC that's as old as your consoles, assuming it was decent when it was new, can get better performance in multiplats 99 times out of 100. PC gamers don't spend extra money upgrading their rigs just so they can play the latest games. They spend extra money upgrading their rigs so they can play the latest games looking leaps and bounds better than consoles and running at triple the framerate, rather than looking just a little better and running at only double the framerate.
the_bi99man
Well, I wasn't really comparing the graphics quality between consoles and PCs when I said that. What I meant is: say you build your 7-year PC in, let's say, '07 or even '08, and it's top of the line then, and you can max everything at a crazy high resolution at 120 frames, and that's what you're used to playing at. Then, slowly, over time, you can't hit that anymore with the latest games. Yeah, that PC can probably start Crysis 3, but I doubt you're maxing it out at 120 FPS anymore, and all the settings are set low just to make it playable; that's annoying when you're used to playing at the opposite end of the spectrum. Of course, the average PC gamer would just upgrade, but in the hypothetical 7-year situation, that's what you would have to deal with.
On consoles, though, you don't have that problem. You don't have to start playing games at half the resolution or half the framerate all of a sudden just because your hardware is aging, or worry about the other stuff. I spend a good bit of time playing on console and PC. IMO, consoles have been in the same spot performance-wise all gen. I haven't noticed the diminishing returns you speak of. This gen's consoles have had spotty framerates since day 1, and I experienced it firsthand; it's not a late-gen thing - at this point people are just used to it. Oblivion could get pretty choppy on the 360 at launch. It's actually impressive what has been done with such old hardware. But once again, I'm not saying that console games look better than PC games, and I'm not saying that the 7-year-old PC still wouldn't play games better than the consoles - just that you would have to put up with that performance fall-off. That is what I meant by consistent.
Simple: PCs progress and bring higher graphical quality, and rising requirements demand newer hardware. However, you're wrong about consoles not having the same problems. For example, the Call of Duty games: COD2 ran at 1280x720, then 1024x600 from COD4 to MW3, then Black Ops at 960x544 and Black Ops 2 at 880x720... all to try to keep the 60 fps while pushing slight graphical improvements with each game. Also, I've noticed that many games locked at 30 fps at the start of this gen have become more unstable toward its end, with framerates struggling to stay at 30.
Specs please.
iPage
I'd love to see those specs too. I call BS in every regard of his post... other than the "building it in 2006" part.
This is something that has always confused me, and as I'm looking at building a consolised computer, I can't really understand it. Basically, why can't a PC run games at console-like settings for as long as a console can? People have been telling me I can't make one that will play all the latest games without upgrading for that long. But I really don't see why not; it's clearly a more powerful platform, so it should be able to last even longer without upgrading.
Sali217
Flawed argument: the Nvidia 8800 GTX came out in 2006 and still crushes the PS3 and Xbox 360.
Myth of console optimization.
Yes, optimization happened, but when games started looking better, PC hardware of the same spec started looking better as well.
PROOF - http://www.youtube.com/watch?v=abGW1bk1nmM&list=FLGqf82gZWsoJ_jHAggd1jPw&index=12
lunar1122
Bad video is bad.