No lems....this is for PC only.
I don't know the deep down and dirty about DX12 and all that....but that's one hell of a difference.
Edit: sorry if already posted.
Exciting times. This will be like the jump from PS2-quality games to Half-Life 2 during gen 6. Hope the PS5 and X2 fully use DX12 and Mantle to their fullest potential next gen (but that's a long way off).
Wonder how much, if any, of a boost my GTX 760 will get. I had contemplated upgrading to a 900-series card, but if the boost is as significant as that presentation shows, I might just hold off and get something else while waiting for Nvidia's next lineup of cards.
That benchmark is just showing the difference in draw calls between APIs, which is something mostly related to CPU performance. That 3DMark test is not something you should use for comparing graphics card performance.
DX11 is just fine; it can still deliver a massive number of unique objects per frame at 60fps with a decent CPU. Sure, DX12 will help a lot with weak CPUs, but that's it.
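To make the draw-call point above concrete, here's a toy C++ model (not real Direct3D code) of an API-overhead style test: it just counts how many simulated draw calls a single thread can issue inside a 16 ms frame when each call carries a fixed CPU-side cost. The per-call costs are invented purely for illustration; the GPU never enters the picture at all.

```cpp
#include <chrono>
#include <cstdio>

// Burn roughly `cost` of CPU time, standing in for the runtime/driver work
// that each draw call triggers before anything reaches the GPU.
static void simulateDrawCallCpuCost(std::chrono::nanoseconds cost) {
    const auto start = std::chrono::steady_clock::now();
    while (std::chrono::steady_clock::now() - start < cost) {
        // spin
    }
}

// Count how many simulated draw calls one thread can issue in a 16 ms frame.
static long drawCallsPerFrame(std::chrono::nanoseconds perCallCost) {
    using namespace std::chrono;
    const auto frameBudget = milliseconds(16);
    const auto frameStart = steady_clock::now();
    long calls = 0;
    while (steady_clock::now() - frameStart < frameBudget) {
        simulateDrawCallCpuCost(perCallCost);
        ++calls;
    }
    return calls;
}

int main() {
    using namespace std::chrono;
    // Invented per-call costs: a "thick" DX11-like path vs a "thin" DX12-like path.
    std::printf("DX11-like path: ~%ld draw calls per 16 ms frame\n",
                drawCallsPerFrame(microseconds(20)));
    std::printf("DX12-like path: ~%ld draw calls per 16 ms frame\n",
                drawCallsPerFrame(microseconds(2)));
    return 0;
}
```

Cutting the per-call CPU cost lets the same CPU push roughly an order of magnitude more draws per frame, which is what the API-overhead benchmark is measuring.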
Cows am scared.
Why only cows?
What about lems and sheep? I guess when you're kicking the shit out of the competition, console wise, you should expect to have a target on your back.
Lems am hope.
PC Master Race?
Herms am praise/cheer?
lol It'll be interesting to see if the XB1's hardware can take advantage of it at all.
Ahhh, that explanation about draw calls makes a lot more sense now. Guess I'll probably go get the 960 or something and see what this does for my i5.
Secret PC sauce is definitely on the way, pretty exciting stuff! This likely means it's going to be a couple of years before it's standard though.
That'll depend on Windows 10 uptake, which is likely to be pretty high with the free, limited-time upgrade. Though I'm expecting this year's cards to be capable of fully using DX12.
The CPUs in both the XBone and PS4 don't clock high enough to see very large gains from multithreaded draw calls. The PS4's main advantage here is that its CPU and GPU can reference the same memory addresses, negating the need to copy items that were drawn back into main memory so the CPU can calculate against them during a real-time interactive sequence (a toy example of the difference is sketched below).
Then again, the memory management tools relating to the ESRAM will be much more mature for the XBone when this suite is released to SDKs. Although there's no patch for the 32MB physical size of the ESRAM, the CPU's ability to make more draw calls likely means a better way to parallelize the command list, bringing the XBone closer to its theoretical peak performance and "combined write" to the framebuffer.
People expecting dramatic results from the XBone need to remember that the software tools won't change the number of pixels and vertices the GPU can rasterize. That will continue to be the Achilles heel of the console. The range and amount of animation being pushed to the screen will improve though. It's impossible to state a percentage, because a lot of the gains will hinge on the type of rendering engine developers use for a particular game.
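A rough illustration of the shared-address-space point above, in plain C++ rather than any console SDK code: with unified memory the CPU can read what a GPU-like pass wrote in place, while a split-memory model needs an extra staging copy each frame before the CPU can calculate against the results. The Particle type and buffer size are made up for the example.

```cpp
#include <cstdio>
#include <cstring>
#include <vector>

struct Particle { float x, y, z; };

// Stand-in for a GPU compute/draw pass that writes results into a buffer.
static void gpuLikePass(std::vector<Particle>& buf) {
    for (auto& p : buf) { p.x += 1.0f; p.y += 1.0f; p.z += 1.0f; }
}

int main() {
    std::vector<Particle> buffer(100000);
    gpuLikePass(buffer);

    // Unified-memory model: the CPU reads the same buffer the pass wrote,
    // in place, with no extra per-frame copy.
    std::printf("unified: first particle x = %.1f (read in place)\n", buffer[0].x);

    // Split-memory model: results have to be duplicated into a CPU-side
    // staging buffer before the CPU can work with them each frame.
    std::vector<Particle> staging(buffer.size());
    std::memcpy(staging.data(), buffer.data(), buffer.size() * sizeof(Particle));
    std::printf("split:   first particle x = %.1f (read from staging copy)\n", staging[0].x);
    return 0;
}
```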
IMO the X1 will take advantage of it, but not as drastically as PC.
In short, how DX12 will impact the X1:
Full use of the two ACE units (these allow more GPU compute throughput for specific tasks like shader workloads; nowhere near as strong as the PS4's setup, but in normal gaming scenarios not much of a difference will be seen between the two consoles).
Better efficiency due to reduced CPU overhead, freeing CPU cycles to go where they're needed. (The X1 is still using a modified DX11, i.e. DX11.X, which carries over many limitations from DX11.)
Introduction of CPU multithreading, allowing better communication between CPU and GPU and enabling parallel processing, both asynchronous and synchronous. This allows maximum GPU usage by having all the available slow CPU cores feed it data (see the sketch below). The PS4's API already allows this, but it depends on whether devs take the time to use it; almost all devs are still using the method of one slow core feeding all data to the GPU.
Better management of ESRAM resources.
With multiplatform gaming, DX12 development will also affect the PS4, because the vast majority of games don't get coded to use all these modern features. It will push devs to code to the same level, maximizing resources on each console.
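Here's a minimal sketch of the "all available slow cores feed the GPU" idea from the post above, using a toy CommandList type instead of the real D3D12 interfaces: each worker thread records its own slice of draw commands in parallel, and the lists are then handed off in a fixed order from a single thread. The thread and object counts are arbitrary.

```cpp
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

// Toy command list: just a vector of recorded commands, standing in for a
// real graphics-API command list.
struct Command { std::string name; };

struct CommandList {
    std::vector<Command> commands;
    void recordDraw(int objectId) {
        commands.push_back({"draw object " + std::to_string(objectId)});
    }
};

int main() {
    const int numThreads = 4;          // e.g. the slow CPU cores a game can use
    const int objectsPerThread = 1000; // each core records its own slice of the scene

    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;

    // Record in parallel: each thread touches only its own list, so no locks
    // are needed while building the frame's commands.
    for (int t = 0; t < numThreads; ++t) {
        workers.emplace_back([t, objectsPerThread, &lists] {
            for (int i = 0; i < objectsPerThread; ++i) {
                lists[t].recordDraw(t * objectsPerThread + i);
            }
        });
    }
    for (auto& w : workers) { w.join(); }

    // Submit in a fixed order on one thread (stand-in for handing the
    // recorded lists to the GPU queue).
    std::size_t total = 0;
    for (const auto& list : lists) { total += list.commands.size(); }
    std::printf("recorded %zu commands on %d threads, submitted in order\n",
                total, numThreads);
    return 0;
}
```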
@04dcarraher:
DX11.X already has a low-overhead API, but it doesn't do multithreaded draw calls due to DX11 software limitations. The gain is mostly limited to multithreaded draw calls on a weaker multi-core CPU.
Yeah, but DX12's overhead is still lower than DX11.X's; it uses even fewer CPU cycles, freeing up some CPU resources. DirectX prior to 12 has so much excess overhead from the checks and sums it does to make sure the software and hardware can communicate. With DX11.X, all MS did was remove those checks and sums; they did not change the core of how everything is handled on the CPU side.
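A toy sketch of the "checks and sums" point: in the older model the runtime re-validates state on every draw, while in a DX12-style model the expensive validation is paid once when a pipeline state is created, so per-draw CPU work shrinks to almost nothing. The types and checks here are invented for illustration and are not the actual Direct3D runtime logic.

```cpp
#include <cstdio>

struct PipelineState {
    bool shadersLinked = true;
    bool formatsCompatible = true;
};

// Stand-in for expensive driver/runtime compatibility checks.
static bool validate(const PipelineState& ps) {
    return ps.shadersLinked && ps.formatsCompatible;
}

// Older-style path: the validation cost is paid on every single draw call.
static int drawLegacy(const PipelineState& ps, int drawCalls) {
    int validations = 0;
    for (int i = 0; i < drawCalls; ++i) {
        if (validate(ps)) ++validations;  // per-draw check
    }
    return validations;
}

// DX12-style path: validate once when the state object is created, then
// every draw just references the pre-validated state.
static int drawPrebaked(const PipelineState& ps, int drawCalls) {
    int validations = validate(ps) ? 1 : 0;  // one up-front check
    for (int i = 0; i < drawCalls; ++i) {
        // submit draw, no re-check needed
    }
    return validations;
}

int main() {
    PipelineState ps;
    std::printf("legacy path:   %d validations for 10000 draws\n", drawLegacy(ps, 10000));
    std::printf("prebaked path: %d validation for 10000 draws\n", drawPrebaked(ps, 10000));
    return 0;
}
```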
@04dcarraher: Don't you get tired of repeating yourself?
As long as it's civil, no I don't.
@mr_huggles_dog: It has already been explained in this thread. If you don't understand and need it broken down into simpler terms, just say that, but o4d put up a pretty good synopsis of what it entails.