Sorry to state the obvious for some, but TC seems to be legitimately pondering the question. 1080p was reached decades ago. It's a question of balancing framerate against resolution while the technology advances (better-looking graphics, networking, AI, etc.). Technology keeps advancing and thus demands more from the machine. So as we enter "next gen" we can produce better graphics, but FPS might stay stagnant, or even go backwards, because devs feel pressured to produce "next gen" effects (better particle systems, bigger environments, etc.). If they push it too far, their framerate drops as a result. To combat this, they can reduce the resolution of the rendered image. Thus they end up with a playable game that still has a lot of neat effects going on.
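That trade-off basically works like a feedback loop: if the frame time blows past the budget, the renderer gives up pixels rather than frames. Here's a toy sketch of that idea, purely illustrative; the target, step size, and scale bounds are made-up numbers, not from any real engine or console:

```python
# Toy feedback loop: drop or raise the render resolution to hold a frame-time budget.
# Purely illustrative -- not how any particular engine or console actually does it.

TARGET_FRAME_MS = 33.3   # ~30 FPS budget (hypothetical)
scale = 1.0              # 1.0 = native 1080p, ~0.67 ~= 720p

def adjust_scale(last_frame_ms, scale, step=0.05, lo=0.5, hi=1.0):
    """Nudge the resolution scale so heavy scenes trade pixels for framerate."""
    if last_frame_ms > TARGET_FRAME_MS:          # too slow: render fewer pixels
        return max(lo, scale - step)
    if last_frame_ms < TARGET_FRAME_MS * 0.9:    # comfortably fast: claw pixels back
        return min(hi, scale + step)
    return scale

# Example: a run of increasingly expensive frames pushes the render resolution down.
for frame_ms in [30, 32, 36, 38, 40, 37, 34, 31]:
    scale = adjust_scale(frame_ms, scale)
    print(f"frame {frame_ms:.0f} ms -> render at {int(1920 * scale)}x{int(1080 * scale)}")
```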
Realistically, this depends on how close you sit to your screen and how big it is. PC enthusiasts often brag about being able to produce higher-resolution images, but it's a bit ironic, because they actually need those higher resolutions due to how close they normally sit to their monitors.
In my personal setup in the lounge room (given the size of the TV and how far I sit from it), I only need 720p. Anything higher and I won't actually notice the difference. On the other hand, on my PC I need to keep the resolution settings higher to get a good-looking display.
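For anyone curious, you can roughly sanity-check that with the common rule of thumb that 20/20 vision resolves about 1 arcminute of detail. The numbers below (screen sizes and distances) are just example values, not anyone's actual setup:

```python
import math

def max_useful_vertical_resolution(diagonal_in, distance_in, aspect=(16, 9)):
    """Rough estimate of the most vertical pixels the eye can resolve,
    assuming ~1 arcminute of acuity (a common 20/20 rule of thumb)."""
    aw, ah = aspect
    # Screen height from the diagonal and aspect ratio.
    height_in = diagonal_in * ah / math.hypot(aw, ah)
    # Visual angle subtended by the screen height, in arcminutes.
    angle_arcmin = math.degrees(2 * math.atan(height_in / (2 * distance_in))) * 60
    # Roughly one pixel per arcminute is the limit of what you can pick out.
    return int(angle_arcmin)

# A 50" TV viewed from 10 feet (120"): ~700 lines, so 720p is about the ceiling.
print(max_useful_vertical_resolution(50, 120))
# A 24" monitor viewed from 2 feet (24"): ~1650 lines, so 1080p and beyond is worth it.
print(max_useful_vertical_resolution(24, 24))
```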
How much FPS matters depends on the kind of game you're playing. If there's a lot going on (fast action, lots of animations, etc.), 60 FPS is noticeably better. Otherwise 30 is fine.