Quite frankly, knowing that 4K resolution was always just around the corner, I wonder why both vendors chose such entry-level GPUs for their consoles?
The PS4's was based on a three-year-old entry-level 1080p GPU, the 7870. The Xbox One's is a three-year-old entry-level 720p-900p (sub-1080p) GPU, the 7790. Naturally, games change and become more demanding, so their ability to hold those resolutions begins to decline, and pretty soon those GPUs have to be scaled down to 720p and 480p just to keep running more complex shading and textures. These GPUs were predominantly designed as light-duty PC gaming GPUs, for MMOs like WoW, GW2, etc., with the 7870 having a bit more headroom to possibly cross into slightly more graphically complex titles, though more than likely at 720p if you're trying to preserve effects fidelity rather than pixels. The only way to avoid scaling down to lower resolutions as games get more demanding is to compress textures, induce horrible pop-in, lower vegetation/tree counts, create smaller world maps, and fake a lot of effects (e.g. bloom instead of HDR, leaving out ambient occlusion, etc.). Either way, you begin to decline in some area of graphics as you require more resources.
The PS4 and the Xbox One should have had, at minimum, an HD 7970... and even that, by today's standards, is pretty weak, but it's much better for 1080p computing and could possibly even help cross beyond 1080p, though not exactly to 4K. In other words, 28 CUs should have been the next-gen minimum. 16 CUs is a joke, but I guess it was a matter of weighing, "do we dump all of our funds into RAM, or the GPU?" The funny thing here is that neither console would ever have needed more than 2GB of RAM for its respective GPU. The 7790 itself only really requires 1GB of RAM, and the 7870 about 2GB. Both consoles have much more RAM than their GPUs can process at a given time lol. It's like having 64GB of RAM in a PC today... what for? 16 is fine... chances are your desktop can't page through 64GB's worth of data without bottlenecking. Even the 7970, which is more powerful than both consoles' GPUs, is only really good for 3GB of VRAM. I'd have drawn a line down the middle and shipped both consoles with 2-3GB of VRAM and a 7970, and probably still saved both companies money. The result would be beautiful 1080p pictures at a locked 60FPS. Sure, there'd be more disk caching, but if you use an SSD, that's fine.
When Microsoft stated that DX12 would boost performance by 50%, they weren't lying. Many were misled into thinking the Xbox One was suddenly going to be cranking out 1080p title after 1080p title. Let me point out the breakdown. This is actually from a more recent article on how exactly it helps the Xbox One.
"Microsoft apparently confirms that DirectX 12's new Direct3D graphics API guarantees an increase in performance for the GPU of about 20%. The overall technology also brings an even bigger improvement to the CPU performance figures, as developers can expect a 50% increase in the output of the main processor."
Therefore, given the above, the CPU has little to do with resolution, and the value of that CPU boost greatly depends on whether a given title is CPU-intensive or GPU-intensive. It depends on the graphics engine, essentially. Like on PC, some games rely more on the CPU, others on the GPU. Now, if leveraged appropriately, what you can do is take that extra 50% CPU performance and reduce the need for CPU-to-GPU offloading, i.e. sending the GPU CPU-bound tasks. This, in turn, frees the GPU to focus on more intense rendering needs, which could equate to better resolution performance... or, for games that support dynamic resolution switching, allow that 1080p mark to be hit more often, though not necessarily 100% of the time. So, games like Halo 5 and Witcher 3 can benefit from the performance, but it won't give you 100% native 1080p.
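To make the dynamic resolution switching idea concrete, here's a minimal sketch of how that kind of system behaves. This is purely illustrative; the thresholds, step sizes, and function names are my own assumptions, not how any actual console engine is implemented:

```python
# Hypothetical dynamic resolution scaler: shed pixels when the GPU misses
# its frame budget, climb back toward native 1080p when there's headroom.
# All numbers here are illustrative assumptions.

TARGET_MS = 33.3                   # frame budget for a 30 FPS target
NATIVE = (1920, 1080)              # native 1080p output
MIN_SCALE, MAX_SCALE = 0.67, 1.0   # roughly a 900p floor up to native

def next_scale(scale, gpu_frame_ms):
    """Nudge the resolution scale based on the last GPU frame time."""
    if gpu_frame_ms > TARGET_MS:            # over budget: drop resolution
        scale -= 0.05
    elif gpu_frame_ms < TARGET_MS * 0.85:   # comfortable headroom: raise it
        scale += 0.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

def render_size(scale):
    """Internal render resolution for a given scale factor."""
    return (int(NATIVE[0] * scale), int(NATIVE[1] * scale))
```

The point of the sketch is the "more often, but not 100% of the time" behavior: freeing CPU time reduces how often the GPU runs over budget, so the scaler sits at 1.0 (native 1080p) more of the time, but heavy scenes still pull it down.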
Wow, the most hyped aspect of Windows 10 (aka NXOE) was this whole DirectX 12 ordeal, which created both praise and flame wars on both sides of the fence, and they've been quite silent about it. I haven't heard a single reference to DirectX 12 since before they started referring to Windows 10 as NXOE.
Aside from that, I'm not sure why this update earns praise as the biggest, most significant console update. It's not as much of a "leaps and bounds" update as NXE was over the original Xbox 360 blade-style OS. I mean, it's great and all, but I don't even detect Windows 10 in it. It's essentially the Windows 8 Metro-style tile system in a rearranged order (Windows 8 - Remixed lol). I actually think I prefer the original Xbox One OS layout, but with the efficiency and speed that this update brought.
When I first imagined Windows 10 launching on Xbox, for some reason my mind automatically jumped to the whole "having a desktop" thing, which was one of the biggest aspects of Windows 10 for PC. I'm still not sure why Microsoft insists on pushing their tiles on us. I mean, these things aren't touch screens. If you can simply flip a switch on the PC version to display in either desktop mode or tiled mode, why can't you on Xbox? Sure, for a phone or a yoga-style PC/tablet, tiles are great... this, however, is Xbox.
I have an issue on both our living room and bedroom Xboxes, where they oftentimes lose pairing with the controller. So, if I try to turn the Xbox on by controller, it just sits there flashing, and then the controller eventually shuts off. I have to turn the console on manually, let it boot, then hold the pairing button in on the controller, and it may find the console, or not. But if I fiddle around with it for about five minutes, I will eventually get it to pair. It happens with both consoles this way.
I don't know. I like 60FPS, but I don't think the hardware is there yet to push for that and consider it a "gold standard," not without having to make too many trade-offs. PCs, sure... but consoles just aren't there yet. Perhaps this gen should just stay at 30FPS and throw some extra polish into post-processing and texture detail. Halo 5's graphics are great, but not as clean as they could be. I would happily game at a solid 30FPS for extra shine. 30FPS doesn't bother me in the least in Destiny, and that game looks cleeeean.
Split screen affects graphics quality; that's why Halo 5 did not use it. I believe I even recall the developer stating that adhering to the 60FPS gold standard meant dropping split-screen. You could still do split-screen and keep 60FPS, but the game would have to be re-designed around split-screen being an option, and after that you're limited in what you can do graphically. It's a lot to process everything that's going on across four screens. If the game had been 30FPS, no problem. You'd have the same graphics we have now, but with four-way split at the cost of fluidity. Perhaps they'll put more cloud-processing elements in for more than AI, and then we could have a smooth four-way split with all the graphical fidelity and frames, but then that defeats the purpose of offline gaming.
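The trade-off above is really just frame-budget arithmetic. Here's a back-of-the-envelope sketch (my own simplifying assumption: with N viewports the scene is traversed and drawn roughly N times, so each view gets about 1/N of the frame budget; real engines share some of that work):

```python
# Rough frame-budget arithmetic for the split-screen trade-off.
# Assumption (illustrative, not a measured figure): each of the N
# viewports costs roughly an equal slice of the total frame budget.

def frame_budget_ms(fps):
    """Total milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

def per_view_budget_ms(fps, views):
    """Approximate budget per viewport under the equal-slice assumption."""
    return frame_budget_ms(fps) / views
```

Under that assumption, a 4-way split at 60FPS leaves each view roughly 4.2ms, while dropping to 30FPS doubles that to about 8.3ms per view, which is the extra headroom the "30FPS, no problem" scenario is counting on.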
I love the aiming. I think the complainers are people who aren't as seasoned at aiming yet. Maybe they're used to having too much aim assist(?) It's funny how people never think it's themselves. Maybe it's their controller, too. Make sure there's no slack/play in your movements; your controller may be worn. I'm already on my second Xbox One controller because my original had too much dead travel before tension was felt. It was affecting my play, and I had to raise the sensitivity, but all that dead travel would accelerate me past my target most of the time, in most games. I'm seriously considering an Elite controller. Now my second controller is starting to show "slight" signs of heading out the door. I like my controllers tight. Once they start to exhibit the least bit of give/dead movement... even if it's just a wee small fraction of deadness in the springs, I get a new controller. My wife is completely fine playing on my old controllers, but I feel like it affects my performance.
Microsoft pisses me off. Not because of outages, either. Everyone who owns both consoles knows they're down far less often than Sony. However, when they are down, you cannot play your favorite game(s) without a connection, at least for digital downloads. And I have precious little time to game. I get home from work, I go for a run, it's time to eat dinner, I spend time with the wife, and I have, probably, between 8PM and 10PM (EST) to play. So, I sign in and there's no Live... "Fine!" I say, "I only need to play Metal Gear Solid, and most of what I'm doing is offline." I launch the game: "Game cannot launch, error: xxxxxxx (I don't remember the hex code)." I remembered having this issue during an outage around spring, with Dragon Age. So, I go into Network settings and switch to Offline Mode. I launch the game: "Your Xbox must be set as your Home Xbox to enable offline play." I have two Xboxes... one in the living room and one in the bedroom. My wife and I used to co-op Halo often enough to warrant another box, but it's the only game she cares to play, and she does use the living room Xbox for shows/Hulu/Blu-rays, etc., in which case I use the bedroom Xbox. And so I can't have one specific Home Xbox, and even if I wanted to temporarily set one as my Home Xbox, I cannot do so without a Live connection, as the My Xbox option goes missing when Xbox Live is down. This means you can't play the one title you've been working on.
So, I am bitter with them and will move back to enjoying Witcher 3 on my PS4. Screw it. I give Sony that victory, for sure. They do not tell you that you cannot play any one of your games, digital or otherwise. They may be down more often, but you can still play as if services were up. I will strongly consider purchasing all digital multi-platform games on PS4 in the future. The only reason I got MGSV on Xbox was that MGSV: Ground Zeroes was a free download on Xbox Live a few weeks prior to The Phantom Pain's arrival. Also, my buddy wanted the opportunity to raid my FOB, which apparently isn't even an option in the game anyway (you cannot raid your friends' bases).
@Tranula There are going to be games on both sides of the fence that you'll want to play. That's why I've always owned both. Who cares about the graphical differences? The true reason you buy a console is for its exclusives, and the exclusives are custom-tailored to that system's hardware, so they look good either way. For multi-platform games, just buy the version that gets the best rating... which, yes, is more likely to be on the PS4... or better yet, buy it for whichever system most of your friends are on.
Kopesettik's comments