@evernessince: Specifically, in that link I meant the line from Robert Hallock about how console developers have seen 30% gains in some cases using it.
But I decided to look for more direct evidence and came across an article that mentions almost all the points I was trying to make, including those about DX12 using multiple cores better to achieve more draw calls, and a few examples of PS4 games using asynchronous shaders: Battlefield, Infamous Second Son, and The Tomorrow Children.
It is possible to devote some of your new generation of graphics IP at the hardware level to backwards compatibility, to varying extents. The 360 had a little bit of this to stay compatible with original Xbox shaders, etc. However, strict binary compatibility is MUCH more interesting, and I wouldn't be surprised if something like that lies within the Polaris architecture the PS4 Pro is running.
This line is interesting, though, for the point you're making:
"The Xbox One wasn’t mentioned in the slides – so it’s possible that the DX11 API inside the system doesn’t allow it, but things could certainly change when it is updated to DX12."
The examples I provided were for PS4 games, but it seems the Xbox One could be doing just as you said.
@evernessince: You raise some valid points in the first paragraph, but given that console CPUs are AMD CPUs, I don't think they should be excluded from the discussion.
I think I got a bit mixed up about DX11 vs. DX12 multithreading. It seems I was conflating that with the multithreaded nature of async shaders (I also meant to say shaders where I accidentally said compute).
I'm not convinced you completely understand what it means for a game to be CPU bound, either. The CPU "not keeping up with the GPU" is roughly right, and matches what I said initially, so I'm not sure why you added the "FYI" as if it were a correction. The CPU is effectively feeding the GPU work. If the GPU gets a bigger graphics workload (something like a higher resolution such as 4K), the CPU workload really won't change much. The GPU will be more stressed, and if the CPU finds itself waiting to feed it - congrats, you're GPU bound. Add a more powerful GPU here and you will get a better framerate.

At a lower resolution, however, the GPU finishes its work very quickly, and the CPU has to work harder to keep feeding it new work. An easy test for being CPU bound is to reduce the graphics workload and see if your framerate improves. If you have an awesome GPU running a game at low settings at 1080p at 80 FPS, then drop to 720p and still get 80 FPS - you're CPU bound. Putting a beefier GPU in the system at that point will not get you past 80 FPS.
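To make that test concrete, here's a trivial sketch of the logic (the FPS numbers and the 5% tolerance are made-up illustrative values, not from any real benchmark):

```cpp
// Minimal sketch of the "drop the resolution and see if FPS moves" test.
// The FPS numbers and the 5% tolerance are arbitrary illustrative values.
#include <iostream>

bool looks_cpu_bound(double fps_high_res, double fps_low_res,
                     double tolerance = 0.05) {
    // If cutting the GPU workload barely changes the framerate,
    // the GPU was never the limiting factor: the CPU is.
    return (fps_low_res - fps_high_res) / fps_high_res < tolerance;
}

int main() {
    std::cout << std::boolalpha
              << looks_cpu_bound(80.0, 81.0)  << "\n"   // true:  CPU bound
              << looks_cpu_bound(80.0, 115.0) << "\n";  // false: GPU bound
}
```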
With respect to async shaders, consoles absolutely use this feature, and I'd go so far as to speculate it's part of why true hardware support for it is built into AMD's architecture and not Nvidia's (originally, a useful piece of IP for consoles). There are articles about console developers using async shaders to see gains: http://www.pcper.com/reviews/Editorial/Hot-Topic-Asynchronous-Shaders
"Console APIs are not exactly PC APIs with hardware optimization." This is kind of vague and I'm not sure what you mean. Are most graphics APIs just taking programmer calls and churning out triangles? I mean, yes, but in distinct ways. Consoles are technically compatible with versions of DX/OGL, but those are really never used in a big game because it's a terrible idea - the developers always use the lower-level, less CPU-heavy option.
For example, on consoles you don't have to write your shaders in HLSL; you can write them in the GPU microcode specific to the console GPU. On the PC, your DX shaders are typically handled by the driver / shader compiler, which compiles them into a specific platform's microcode. If you know the specific GPU you're targeting, though, you can basically just write them in assembly outright.
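As a rough sketch of that PC path (the shader and setup here are just placeholders, not from any particular game or engine): an HLSL shader goes through the D3D compiler into portable bytecode, and only the vendor's driver turns that into the GPU's actual microcode.

```cpp
// Sketch of the PC path: HLSL -> portable DXBC bytecode via the D3D compiler.
// The vendor driver later compiles that bytecode into the GPU's own microcode,
// which is the step console developers can effectively do by hand.
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3dcompiler.lib")

int main() {
    const char* hlsl =
        "float4 main() : SV_Target { return float4(1, 0, 0, 1); }";

    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors   = nullptr;
    HRESULT hr = D3DCompile(hlsl, std::strlen(hlsl), nullptr, nullptr, nullptr,
                            "main", "ps_5_0", 0, 0, &bytecode, &errors);
    if (FAILED(hr)) {
        if (errors) std::printf("%s\n", (const char*)errors->GetBufferPointer());
        return 1;
    }
    // 'bytecode' is hardware-agnostic; what the GPU actually executes is
    // whatever the driver generates from it at pipeline creation time.
    std::printf("DXBC blob: %zu bytes\n", bytecode->GetBufferSize());
    bytecode->Release();
    if (errors) errors->Release();
    return 0;
}
```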
AMD is beginning to offer something like this through its GPUOpen initiative, in the form of shader intrinsic functions. I'm not sure if the gains have been measured yet or how close to a console implementation they truly are, but it is interesting. It allows developers to access hardware instructions more directly rather than going through a high-level API - kind of what consoles do.
Oh wow, the PS4 Pro looked like it would be pretty heavy to me, but the original Xbox One is still heavier, eh?
I think there's an error here:
"Which consoles support 4K? All of the consoles here, with the exception of the original Xbox One, support 4K video playback, but the only consoles that purport to support 4K gaming include Sony’s upcoming PlayStation 4 Pro and Microsoft’s Project Scorpio system."
The PS4 and PS4 Slim don't support 4K video, right? Either this statement is incorrect, or the table above it is incorrect.
@evernessince: If you can get fully GPU bound, then yes I'm inclined to agree that console compute units will not somehow magically run better than PC compute units.
The overhead argument is more about bringing down CPU overhead, which is exactly what DX12 excels at.
It's less about whether the GPU can handle it and more about how much overhead is involved in consistently feeding the GPU. In DX11 that can still be quite a bit, and DX11 doesn't really take advantage of multiple CPU cores as well as it could (something that DX12 does better, and consoles could also do).
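To illustrate what I mean by using multiple cores (just a sketch; it assumes the device and queue already exist, and leaves out the actual draw calls, barriers, and error handling): in D3D12 each thread can record its own command list and everything gets submitted together, instead of funnelling through one immediate context like D3D11.

```cpp
// Rough sketch of D3D12-style multithreaded command recording.
// 'device' and 'queue' are assumed to be created elsewhere; draw calls, PSOs,
// and resource barriers are omitted so this only shows the threading structure.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void record_frame(ID3D12Device* device, ID3D12CommandQueue* queue)
{
    const int kThreads = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    std::vector<std::thread> workers;

    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each thread records its own slice of the frame's draw calls in parallel;
    // there's no global driver lock like the D3D11 immediate context.
    for (int i = 0; i < kThreads; ++i) {
        workers.emplace_back([&, i] {
            // ... SetPipelineState / ResourceBarrier / DrawInstanced here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submit all the recorded lists to the GPU in one go.
    ID3D12CommandList* submit[kThreads];
    for (int i = 0; i < kThreads; ++i) submit[i] = lists[i].Get();
    queue->ExecuteCommandLists(kThreads, submit);
}
```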
EDIT:
I forgot that there are also some GPU hardware features that aren't exposed by DX11. Asynchronous shaders, for example. Console APIs can also take advantage of something like this.
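Roughly what that exposure looks like in D3D12 (sketch only; fences and the actual dispatch work are omitted): a second, compute-only queue next to the graphics queue, which D3D11 simply doesn't have.

```cpp
// Sketch: creating a dedicated compute queue next to the graphics queue,
// which is how D3D12 exposes async compute/shaders. D3D11 has no such queue.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void create_queues(ID3D12Device* device,
                   ComPtr<ID3D12CommandQueue>& graphics_queue,
                   ComPtr<ID3D12CommandQueue>& compute_queue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};

    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphics_queue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only queue
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&compute_queue));

    // Work submitted to the compute queue can overlap with graphics work,
    // filling in GPU idle time -- the "async shaders" gains from the articles above.
}
```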
@evernessince: Just because the chip is the same architecture as PC doesn't mean the overhead is the same. The graphics APIs are still lower-level, with less overhead.
A better comparison could probably be made with DX12/Vulkan on PC as it matures.
@feathersdoe: The character was introduced as bisexual earlier in the game. Her liking women was something she forced herself to do JUST to frustrate her father. She stuck with it a while out of habit, and ultimately regretted it later. Even though she's introduced as bisexual I'd argue she's probably straight and trying to force herself to be something she isn't only to spite her father.
The drug didn't alter her sexual preference. It changed her perception of genders in the interest of her own survival in battle.
The potion wears off, and when she's completely in control of her actions again, she thanks the protagonist for the help in battle and develops feelings.
If what you said had happened then there might be reason to be outraged, but it didn't. There's no "gay conversion" and the drug certainly wasn't something debilitating like a roofie since it helped her survive.
The character in question was bisexual to begin with (as stated earlier in the game). All these misleading posts make it sound like she was drugged and forced to become straight.
She was drugged because she couldn't concentrate with all the pretty ladies on the battlefield. It had nothing to do with trying to make her straight. She only ever started liking women to frustrate her father, sort of kept going with it a while, then regretted it later. It's much more likely she was actually leaning straight the whole time, if anything.
Once the potion wears off, she's completely in control of herself and thanks the main character for helping her (again, with battle - life and death stuff), and develops feelings knowing full well he's a dude.
The only thing that's kind of sketchy is that the main character gave her the potion without telling her, but it was to help her survive, and he thought it probably wouldn't work if she knew about it in advance.
You can ask if the ends justify the means if you want, but there's no "gay conversion" via drugs here. Just some overreactions to a mistranslation by a Tumblr user that people and news sites started inexplicably accepting as fact.
Challenge is still there if you want it. Just don't use the cheat. I think it's great for someone who wants to go through and only experience the story.
Personally I'd like to see more games with "story-only" (or close to it) options. If that's all you want, then have at it. If not, just pick a higher difficulty.