this is not a simple topic and comes down to the games being made and what they require (yeah duh i know).
there are 2 sides to this really: the CPU and the GPU. RAM is also a factor but, with current systems anyway, i think devs are happy with the RAM.
on the GPU side FC can go on for a long long time now. GPUs are not really changing all that much. assuming that the PS5 uses vega for a second (it'll probably use navi of course), it's not that different from the GPU in the PS4. they both support the same level of DX/openGL/vulkan features (to all intents and purposes. i know sony uses their own low level driver of course, and i think vega has a slightly higher DX12 feature level, but nothing major.). they can both pull off the same types of effects. the PS5 GPU will just be more efficient, more refined and have a crap load more brute force, but it's not a fundamental shift in how GPUs work or anything. to put it another way: any effect a PS4 can do a switch can do too...just not as quickly. strictly speaking for the GPU side: if 3rd parties did support the switch in multiplats it wouldn't hold back the high end for a long time. it wouldn't even hold back the PS5 or X2. compare that to, say, a wii and an xbox 360: the way the GPU worked in the wii was completely different to the 360, and there was stuff the 360 could do that the wii simply couldn't do even if you threw all the wii's resources at the problem.
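to make that concrete, here's a minimal C++ sketch of how that kind of GPU scaling tends to be expressed: the tiers differ only in numbers (resolution, shadow detail, particle counts), never in which effects exist. the tier names and every value in it are made up for illustration, not real console settings.

```cpp
#include <cstdio>
#include <initializer_list>

struct GpuSettings {
    int renderWidth, renderHeight; // output resolution
    int shadowMapSize;             // shadow texture detail
    float particleScale;           // fraction of the max particle count
    // note what's missing: feature flags. every tier runs the same effects.
};

int main() {
    // hypothetical tiers: the switch-class config is just the ps4-class
    // config with the dials turned down, not a different renderer
    const GpuSettings ps4Class    {1920, 1080, 2048, 1.0f};
    const GpuSettings switchClass {1280,  720, 1024, 0.5f};

    for (const GpuSettings& s : {ps4Class, switchClass})
        std::printf("%dx%d, %d shadow map, %.0f%% particles\n",
                    s.renderWidth, s.renderHeight,
                    s.shadowMapSize, s.particleScale * 100);
}
```

the point being: scaling like this is just arithmetic, which is why the switch wouldn't hold anything back on the GPU side.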
on the CPU side things are trickier, as we don't actually know what devs would do with more CPU power at their disposal. AC unity, for example, was built under the assumption that the CPUs in the current consoles would be a bigger leap over their predecessors. they weren't, and the game's performance suffered due to its large crowds and other complicated simulations. these were scaled back for syndicate.
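as a toy example of why the CPU ceiling bites crowds first, here's a back-of-the-envelope sketch. all of the costs are invented numbers, not real profiling data from unity or anywhere else:

```cpp
#include <cstdio>

int main() {
    const double frameBudgetMs = 16.6;  // 60 fps target
    const double fixedCostMs   = 10.0;  // render submission, scripting, etc.

    // hypothetical per-npc simulation cost on two cpu tiers
    const double weakAgentMs   = 0.010; // jaguar-class core time per npc
    const double strongAgentMs = 0.004; // the bigger leap that never came

    std::printf("weak cpu:   ~%d npcs fit in the frame\n",
                (int)((frameBudgetMs - fixedCostMs) / weakAgentMs));
    std::printf("strong cpu: ~%d npcs fit in the frame\n",
                (int)((frameBudgetMs - fixedCostMs) / strongAgentMs));
}
```

design your crowds around the second number and ship on the first, and you get exactly the kind of performance problems unity had.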
but the underlying simulation side of a game can be harder to cut. it's not like reducing a texture size or turning an effect off; tinkering with the simulation can have a profound effect on what the game is. there is some low hanging fruit that can reduce the burden, like reducing the number of characters on the screen, simplifying animations for characters in the distance, reducing the world geometry, or having generally fewer objects on the screen to send to the GPU (remember the CPU is also involved in the graphics process.). but what about slowing down the physics simulation? or maybe a dev has an idea that requires the CPU to keep track of a huge amount of data in the background (shadow of mordor on the 360/PS3, as far as i remember, was cut back mainly due to CPU and RAM limits. the nemesis system was axed.).
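one of those low hanging fruit items, simplifying distant characters, usually amounts to ticking them less often. a minimal sketch of that update-rate trick, with made-up distance bands:

```cpp
#include <cstdio>
#include <vector>

struct Npc { float distance; int ticks; };

int main() {
    std::vector<Npc> npcs { {5.f, 0}, {40.f, 0}, {120.f, 0} };

    for (int frame = 0; frame < 60; ++frame) {         // one second at 60 fps
        for (Npc& n : npcs) {
            // near npcs tick every frame, mid every 2nd, far every 4th
            int interval = n.distance < 20.f ? 1 : n.distance < 80.f ? 2 : 4;
            if (frame % interval == 0) ++n.ticks;      // stand-in for a real update
        }
    }
    for (const Npc& n : npcs)
        std::printf("npc at %3.0fm: updated %d times\n", n.distance, n.ticks);
}
```

this is the cheap kind of cut: the player can barely tell. slowing the whole physics step or throwing out a system like nemesis is the expensive kind, because the game itself changes.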
e.g. last gen the wii couldn't run the underlying logic for climbing buildings in the creed games. the CPU wasn't good enough to do that AND simulate the AI, do its share of the rendering, keep the number of characters on screen needed to make other systems in the game work, and so on. that's a fundamental part of the creed games. they could work around it of course, but then it becomes a different game (probably more like POP). even if the wii had a powerful GPU and loads of RAM: the CPU wasn't good enough. creed was out the window.
there is an argument that one of the issues this gen is that we are essentially playing prettier versions of last gen games. nothing has really been massively enhanced (except the nemesis system in LOTR :P). if you set aside the graphics, there isn't that much being made for a PS4 that couldn't run well on a PS3, for example. however, there were games made for the PS3/360 that just fundamentally wouldn't work on a PS2/GC/xbox. something like oblivion or skyrim, in terms of the underlying simulation, would have to be curtailed a lot. the AI would need to be scaled back. things like the dragon fights would probably be a non-runner. remember morrowind was the best the OG xbox could muster (now you could argue that morrowind is the better game, and i would agree, but the underlying simulation is much simpler.). i mean if you think back you can probably think of tons of games on a 360 that just wouldn't work on an OG xbox on a simulation level. either something would need to be really cut back or it would need to completely change so that the CPU in the OG xbox could deal with it.
now whether the blame for this is the CPU or the cost of development is debatable (i'm leaning more on the cost of development side myself, if we are to assert that the argument is true), but there does come a point where cutting back the underlying simulation of a game designed to run on, say, a 16-core/32-thread ryzen CPU (not that the PS5/X2 will have that) so it'll run on the jaguar cores of the PS4/X1 will have a very profound effect on the game. turning down graphics is far easier than scaling back the underlying logic of a game.
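to put rough numbers on that (purely invented ones), here's what dividing the same simulation workload over those two cpu classes looks like if you assume perfect scaling, which real games never get:

```cpp
#include <cstdio>

int main() {
    const int simTasks = 960;  // hypothetical simulation jobs per tick

    struct Cpu { const char* name; int threads; double taskMs; };
    const Cpu cpus[] = {
        {"16c/32t ryzen-class", 32, 0.05},  // fast cores, made-up cost
        {"jaguar-class",         8, 0.15},  // slow cores, made-up cost
    };

    for (const Cpu& c : cpus) {
        // assumes the work parallelises perfectly across every thread
        double tickMs = simTasks * c.taskMs / c.threads;
        std::printf("%-20s %.1f ms per tick (frame budget: 16.6)\n",
                    c.name, tickMs);
    }
}
```

once the weak cpu blows the budget like that, no graphics slider saves you; the simulation itself has to shrink.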
so where do you draw the line on FC with the CPU side? if devs just want to make prettier versions of the games we are getting today, then the jaguar cores are probably going to be able to get the job done for years to come. but what would they like to do with more CPU power available?