OMG you cows are so insecure and delusional. When did I ever say that the PS4 didn't have a better GPU? I've never disagreed; what I did was make a joke about the "PS4 is 50% more powerful" line that cows have been throwing around, when it's not remotely true, and you took it as me being a lem. Why are you hardcore cows so insecure? Seriously man, you are costing me so many good pairs of underwear, because every time I read a post of yours I shit myself laughing at your stupidity.
Actually it's more than 50% in many instances...
30 FPS vs. 60 FPS in Tomb Raider is a 100% difference in frame rate.
720p vs. 1080p is more than a 100% difference in pixels. GPU power differences are misunderstood by people like you: 50% more power in one console over another using the same GPU family doesn't translate into 50% better graphics.
The 7970 has double the FLOPS of the 7850, but that basically doesn't mean it has 100% better graphics; it just translates into faster frame rates at higher resolutions.
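To spell out the arithmetic behind those percentages (a throwaway C++ snippet, nothing console-specific; the numbers are just the standard resolutions and frame rates being argued about):

#include <cstdio>

int main() {
    // Pixel counts for the two resolutions in the argument.
    const int p720  = 1280 * 720;   // 921,600 pixels
    const int p1080 = 1920 * 1080;  // 2,073,600 pixels
    // 1080p pushes 2.25x the pixels of 720p, i.e. 125% more.
    std::printf("1080p vs 720p: %.2fx the pixels\n",
                static_cast<double>(p1080) / p720);
    // 60 FPS vs 30 FPS is a 100% increase in frames per second.
    std::printf("60 vs 30 FPS: %d%% more frames\n", (60 - 30) * 100 / 30);
    return 0;
}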
If DX12 can really double the power of a GPU, then put DX12 on the 360; that way it will be as powerful as the Xbox One, and there would be no need for the Xbox One. Oh wait, you can't do that? OK. Didn't think so.
Quit using your mind. Logic is not applicable here. This is a lemming DX12 thread. Notice how they ignored my last comment in here? lol
Logic? That isn't logic AT ALL... Not every piece of software works on the same hardware. What's next, I can install all PC games on my Mac?
Your comment is the dumbest thing I've read this week; you're lucky it's Monday.
Funny how you can't prove it wrong. It's OK that you don't understand how energy transforms. lol
I'm not sure, but doesn't the Xbox 360 use DirectX 9, with some features from DirectX 10 and 11 added through updates? Now you suddenly think it can run DirectX 12 completely, when it couldn't even run DirectX 10 and 11 completely? If you really want to talk tech, I'm up for it.
You're ignoring my OP.
I don't have the DirectX 12 API; I'm not a developer. So I can't give you facts that will prove the Xbox 360 can't run DirectX 12, but some common sense will tell you that this is the case.
Are there people stupid enough to think nine-year-old hardware can run DirectX 12 with just software updates? If you are ignorant, please at least abstain from talking about what you don't know, and even more so from arguing stupid BS.
It's official: some people cannot understand jokes.
Anyways, I'm trying to talk about how the OP said DX12 will cause the GPU to run hotter. That's impossible without putting more electrical energy through the GPU, or gimping the cooling system.
So once again, 50% more power doesn't mean 50% better graphics. Get a damn clue...
60 FPS vs. 30 FPS is a 100% difference, and is actually bigger than the difference between the 7870 and the 7970 in Tomb Raider under the same settings.
So a cow who doesn't even have a PS4 and clearly has no clue about any of this wants to lecture me, who has them both? Really. Again, cherry-picking numbers to prove a point shows you have no understanding of the word "variables".
With lems on this site you never know what kind of BS they are going to come up with, so I just assume they are talking seriously and that they really are that dumb.
Exactly. Really bugs me.
And it gives a bad name to the Sony fans who aren't idiots. People like tormentos are hopeless.
Sony has no exclusive that is 1080p 60 fps. And what??
How many games actually run at 1080p on XB1 compared to PS4? And what???
How are either of these two statements relevant to the topic?? That was my point. You can point out obscure, irrelevant facts, and so can I.
And as for your statement, I'm sure there are more 1080p games on the PS4, and I'm sure there are more 60 fps games on the PS4. But there are none that are both on the PS4. That's what.
I don't normally reply to amoeba-like mouth-breathers, but Resogun is 1080p @ 60 fps, with full-on real-time voxel physics simulation. Now get banned, boi.
Actually that is not true...
Most GPUs don't run at full speed all the time; if you stress the hardware for a long time, it will get hotter. I have played many games on PS3 which barely make my PS3 warm, but as soon as I pop in Uncharted 3 my PS3 gets very hot.
Remember when Gears of War came out and many Xbox 360s actually died? Yeah, it was because no game had taxed the hardware as badly as Gears did, so the Xbox 360, which runs hot by default, got even hotter. I've got several friends who never experienced RROD until they played Gears for an hour...
Now, I am not saying that the silly theory about doubling performance is true or close, since I don't believe a damn API will double the performance of a console just like that; if it were that easy, MS would not even have needed to cancel the 10% reservation.
But you can make a GPU hotter without putting more electricity through it or overclocking it.
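A crude way to see why sustained utilization alone raises heat output: a toy dynamic-power model (P ≈ activity × C × V² × f) with invented numbers, not real console figures. Only the switching activity changes between a light game and a demanding one; voltage and clock stay fixed.

#include <cstdio>
#include <initializer_list>

int main() {
    // Dynamic power scales with switching activity even at a FIXED
    // voltage and clock - no overclock needed for a hotter chip.
    const double C = 1.0e-7;  // effective switched capacitance (made up)
    const double V = 0.9;     // core voltage in volts (unchanged)
    const double f = 853e6;   // clock in Hz (unchanged)
    for (double activity : {0.4, 0.7, 0.95}) {  // light game vs. Uncharted 3
        std::printf("activity %.2f -> ~%.1f W dissipated\n",
                    activity, activity * C * V * V * f);
    }
    return 0;
}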
Look at the difference between the 7970 and the 7870 in Tomb Raider at the same damn quality settings, blind fanboy.
An average of 18 FPS is the difference between the 7870 and the 7970.
The PS4 has a 20 FPS advantage over the Xbox One version of that game on average, and the PS4 version at times commands a 30 FPS advantage. Oh, and should I add that while the 7870 and 7970 run the same quality in everything, when you compare the PS4 version vs. the Xbox One version that is not the case either?
For the most part the main graphical bells and whistles are lavished equally across both consoles, although intriguingly there are a few areas that do see Xbox One cutbacks. As demonstrated in our head-to-head video below (and in our vast Tomb Raider comparison gallery), alpha-based effects in certain areas give the appearance of rendering at half resolution - though other examples do look much cleaner. We also see a lower-quality depth of field in cut-scenes, and reduced levels of anisotropic filtering on artwork during gameplay. Curiously, there are also a few lower-resolution textures in places on Xbox One,
Overclocking is more electricity. DX12 is just software streamlining.
Saw elsewhere... Agreed mostly. Modified formatting for Reddit (which I suck at)
An extremely heavy tech article, overall very positive for the X1 - immediately followed up with downplay and fanboy nonsense unchanged from the last six months. Great :\
In case anyone can actually admit they aren't a programmer but still wants to know what this is all about, I'll do my best to explain, briefly... (I'm a low-level hardware programmer, I don't program for DirectX; all below is just my actually-educated take on it.)
DX12 is introducing a new way to program for GPUs. As code has become more complex, the old way of programming is becoming less efficient. MS is estimating that in most games, frame to frame, about 90% of the code is the same. This makes perfect sense; engines do this naturally, of course, and even between same-engine games the majority of the rendering will look familiar. The Last of Us code knows how to make a Last of Us frame, and that doesn't change for the most part. DX12 is introducing "bundles" that highly optimize this process.
Bundling code wouldn't do much on its own though, besides making the code easier to read and change for the dev (so, like OpenGL's version of bundles, there is a benefit in usability). The cool part is that MS/DX12 has devised a way to use the CPU to auto-optimize the time, the order, the priority, where the results are stored, and how to queue up the next instruction. This is CPU-heavy. It's using the CPU to allow the GPU to run at higher efficiency.
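For the curious, this is roughly what a bundle looks like in code, going by the D3D12 API Microsoft has shown so far. It's a minimal sketch, not a working app: the function name and all its parameters (device, root signature, pipeline state, vertex buffer view, direct command list) are invented for illustration, and error checks are omitted.

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Record the "90% that doesn't change" into a bundle, then replay it
// with one call. In real code you'd record once at load time and only
// call ExecuteBundle each frame.
void RecordAndReplayBundle(ID3D12Device* device,
                           ID3D12RootSignature* rootSignature,
                           ID3D12PipelineState* pipelineState,
                           const D3D12_VERTEX_BUFFER_VIEW& vbView,
                           ID3D12GraphicsCommandList* directList) {
    ComPtr<ID3D12CommandAllocator> bundleAlloc;
    ComPtr<ID3D12GraphicsCommandList> bundle;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_BUNDLE,
                                   IID_PPV_ARGS(&bundleAlloc));
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_BUNDLE,
                              bundleAlloc.Get(), pipelineState,
                              IID_PPV_ARGS(&bundle));
    bundle->SetGraphicsRootSignature(rootSignature);  // the static state...
    bundle->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    bundle->IASetVertexBuffers(0, 1, &vbView);
    bundle->DrawInstanced(3, 1, 0, 0);                // ...and the draws
    bundle->Close();

    // Per frame: the whole recorded chunk replays as one cheap call.
    directList->ExecuteBundle(bundle.Get());
}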
OK, but the CPU is a finite resource too! You can't just fall back on the CPU to help keep the GPU fed and churning along; it's not like the CPU is just sitting idle. So... another aspect of DX12 is that it's designed not only to reduce the overhead on the CPU (see Mantle), but it's also being implemented as a way to improve the parallelism across all the cores (see the DX12 launch slides, where one CPU core is always loaded far higher than the rest, and the load is much more equal after DX12). By reducing the dependence on the core assigned to the "main game loop", which runs at 1.75/1.6 GHz on X1/PS4 respectively, they are breaking the load down across all available game cores. This increased ability to push parallel tasks means a huge increase in effective CPU throughput. This is why we even have multiple CPU cores, of course. DX12 allows us to benefit from them.
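A sketch of what "breaking the load down across all available cores" can mean in practice. Plain std::thread and a made-up RecordDrawCalls() stand in here for whatever a real engine's job system does; only the ExecuteCommandLists call is actual D3D12 API.

#include <windows.h>
#include <d3d12.h>
#include <thread>
#include <vector>

void RecordDrawCalls(ID3D12GraphicsCommandList* list);  // hypothetical engine work

// Each core records its own command list in parallel; the GPU then gets
// everything in one batch, instead of one hot "core 0" feeding it alone.
void BuildFrame(ID3D12CommandQueue* queue,
                std::vector<ID3D12GraphicsCommandList*>& lists) {
    std::vector<std::thread> workers;
    for (ID3D12GraphicsCommandList* list : lists)
        workers.emplace_back([list] {
            RecordDrawCalls(list);  // runs concurrently with the others
            list->Close();
        });
    for (std::thread& w : workers) w.join();
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}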
So now DX12 is helping load the GPU at the cost of CPU while taking that new CPU load and spreading out across cores (see article and slides above). But... DX12 is also adding new rendering techniques like the previous versions of DirectX. So, the new effects that came with DX9, 10, 11... There are new versions for 12 as well. These have not yet been detailed. What remains to be seen is if they are just updated versions of 11.2/existing hardware functions (hardware tiling, etc) or if MS included the new DX12 hardware in the X1 SoC. They were developing the two concurrently for the past 5 years... So it's very likely to me they would have included them together. No guarantee.
DX12 is also doing something cool with RAM management. By loading the CPU heavily with GPU-efficiency tasks, the CPU is now keeping track of both the CPU's and the GPU's memory requirements, so it's the ideal place to schedule/execute tasks and move memory around. Something that often gets REALLY misconstrued is the "ONLY" 32 MB of ESRAM the X1 has. DX12 is being shown to have the ability to move data dynamically on a predictive basis, based on... bundles. What they are saying is that the system and the dev will be able to make sure the right memory is where it needs to be when it needs to be there. For this, the X1 would utilize the "Move Engines" / DMA controllers, so that while the CPU and GPU are both busy doing tasks, the DMA is loading and unloading from system RAM and ESRAM all the time. It doesn't matter that you have a 32 MB plate and want to eat an elephant; you do it the same way every time - one bite (byte, heh) at a time.
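The "elephant on a 32 MB plate" idea, reduced to the bare pattern in plain C++. The Move Engine / DMA interfaces aren't public, so memcpy stands in for the hardware copies, and both function names are invented; the point is only the chunked streaming loop.

#include <algorithm>
#include <cstddef>
#include <cstring>

constexpr std::size_t kPlateSize = 32 * 1024 * 1024;  // the 32 MB "plate"

void ProcessChunk(unsigned char* chunk, std::size_t n);  // placeholder GPU work

// Stream an arbitrarily large resource through the small fast pool, one
// bite at a time; in the real scheme the DMA engines do these copies
// while the CPU and GPU stay busy elsewhere.
void EatTheElephant(unsigned char* bigResource, std::size_t totalBytes,
                    unsigned char* plate /* fast 32 MB scratch */) {
    for (std::size_t off = 0; off < totalBytes; off += kPlateSize) {
        const std::size_t n = std::min(kPlateSize, totalBytes - off);
        std::memcpy(plate, bigResource + off, n);   // "move in"
        ProcessChunk(plate, n);                     // work on the hot data
        std::memcpy(bigResource + off, plate, n);   // "move out" the results
    }
}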
In reply to the author's notes, and like-minded to another poster here... Remember all that CPU loading DX12 is now doing? Remember that the X1 has a slightly faster processor than other consoles? What else helps with CPU utilization? Cloud offloading (see Titanfall's AI). If you have a system that you know is going to load the CPU very heavily in order to massively improve efficiency on the GPU... you try to offload things like map generation, AI, NPCs, physics, and whatever else you can to the network/cloud. Besides pulling load from the CPU, cloud compute could be a huge benefit for realism later on (see the latest cloud-compute building-destruction physics example), and right now it's just awesome for reduced lag. This is an interesting advantage for MS, as they are the only console player currently with access to a network that can make this possible.
So... that should be about it. There is a WHOLE LOT of wait-and-see here. But what is for sure is that this is a MAJOR shift in technology (at the DX12 reveal, AMD said "it's like getting four generations of hardware ahead today"). So while it MAY or MAY NOT "double performance" on X1, I think it's clear it's going to be a big deal. With next-wave games like Wolfenstein and Sniper 3 already 1080p on X1 - whether from more time over launch games, better MS tools, or the removed Kinect/app/OS reservation (we really don't know which) - it seems to me that DX12 will firmly cement 1080p for future X1 games (see Phil Spencer's comments implying we're in the 360/Perfect Dark Zero phase right now, not the 360/Halo 4 stage). Now... before the fanboys jump in with some propaganda that doesn't work on anyone but the extreme base: does it really hurt you if X1 games are 1080p? No, it helps you. I fully expect PS4 games to remain 1080p as well. It's not a competition. If you'd like, you can comfort yourselves with obscure hardware specifications, I guess.
TL;DR: it doesn't matter what DX12 is doing if you aren't a game programmer. The short story is that MS does seem to have a plan; while I'll admit it's clear both consoles were launched early, it does seem that MS has a plan in place for the hardware decisions they've made. But... if you see a fanboy immediately coming here to downplay DX12... well, that just confirms it's a pretty big deal.
This all might not happen the way MS is planning, so wait and see. Sorry, I can't write briefly to save my life!
4) Constantly tickle Sony's taint with your tongue
5) Constantly too dense, so I won't waste my time explaining why you are wrong
6) Constantly a form of entertainment, such as when a dog chases its tail
7) Constantly living a sad life
8) Constantly having to copy and paste stuff, as you have no mental capacity of your own
9) Constantly the butt of everyone's jokes
10) Constantly so stupid, even other Sony fanboys laugh at you
At least you are consistent ;)
I am constantly wrong because you say so, but you don't put forward any arguments to counter mine, just your silly damage control.
Why am I too poor? Do you even know what I work in and how much I make an hour? So because I don't feel like I need to jump into the next gen, I am poor... lol
How do you even know I live a sad life? Do you live here? Nope?
So basically you pull up all this sh**, yet you can't put together a decent reply to counter my argument? Hahahaha
You want to discuss hardware, but you accuse people of being wrong without proving them wrong. So instead of doing what I did - showing you what the difference between both consoles is, and what the difference between hardware on PC is - all you do is make stupid comments without substance or anything?
Please, dude, next time you want to argue with me, bring an argument rather than a rhetorical attack. Hahahaha
@ttboy said:
Interesting stuff from Noodles...
Source
Never read so much garbage in one post...
Anything the Xbox One can do via API, the PS4 can do as well. They should drop the charade already and stop taking lemmings for suckers with false hopes.
So wait a sec: is MS in other words telling PC owners not to buy next-gen GPUs, because DX12 will give current GPUs double the performance? rofl rofl rofl
Nice interpretation of what's going on. Makes sense.
Hard to argue against really. Actual impact is yet to be seen but I'd say the future is getting brighter for Xbox One and PC.
@tormentos: that's the most garbage you've ever heard? You must read your own posts lol
No, I usually read yours, which make no sense whatsoever, and you even mix up CUs with CPU cores... lol
In reply to the author's notes and like minded to another poster here... Remember all that CPU loading DX12 is now doing? Remember that the X1 has a slightly faster processor than other consoles? What else helps with CPU utilization? Cloud offloading (see Titanfall's AI).
You just need to read it to know this is nothing but garbage...
Funny enough, the PS4's CPU is proving to be faster than the Xbox One's even while clocked 100 MHz slower.
Worse, he calls in the cloud... lol. As seen in Titanfall? Yeah, have you seen Titanfall? It has by far the dumbest bots ever put in a damn video game; they could very well replace them with mules and it would have the same effect, because they can't hit you even if you are 2 feet in front of them, even while you outnumber them.
So yeah, Titanfall is the biggest testament that the cloud is sh**: it looks like garbage, it's not even 900p, it has frame drops into the 30s, and it has AI as dumb as rocks.
Last time you were in a graphics thread it didn't end well, and you accused people of panicking. You claimed ownage based on an old article about COD: Ghosts being 1080p, you fought tooth and nail calling out cows as being owned and damage controlling, you wanted to mix two different articles which had nothing to do with one another, and in the end you ended up owned... lol
Lost rib, Lundy, Delta, and the list goes on. Oh, even Ronvalencia, who did in that thread what he always does: hide behind an old article and ignore everything else...
Hahahhaaaaaaaaaaaaaaaaaa...
So don't worry, I'll wait while MS catches up... lol
Just let me pull out this chair to sit on while I wait... lol
Oh wait, you went on record on that too and claimed that by 2015 everything would change... mmmm, let's wait... lol
@FastRobby said:
You are such a child. Please debunk everything the guy stated - you can't, because you aren't a tech guy. You are a gamer that hopes the PS4 can do the same, although it has different hardware. Well, it can't: the CPU is slower, they don't have DX12, and they don't have such a big server infrastructure. Maybe Sony can develop tools to do the same as DX12 in 3 years or so, but by then Microsoft will already be a step ahead, again.
The Last Of Us code knows how to make a Last of Us frame, and that doesn't change for the most part. DX12 is introducing "bundles" that highly optimize this process.
I will as soon as you explain to me what this ^^ crap even means.
DX12 is introducing a new way to program for GPUs. As code has become more complex the old way of programming is becoming less efficient. MS is estimating that in most games, frame to frame, about 90% of the code is the same.
This basically refers to the copying and pasting done on PC when there is no HSA, something the PS4 already has, since it is true HSA - unlike the Xbox One, which is said to have something similar but not quite HSA. It is also a benefit of having the CPU and GPU on the same die, which the PS4 also has.
Excess copying and pasting is not needed any more, since the CPU and GPU can see the same data all the time and there is no need to flush the GPU. But on Xbox One, how can this be? While the data is in DDR3 the CPU can see it; when it is in ESRAM it can't. How do you have HSA on Xbox One when the CPU can't see the data while it is in ESRAM and only the GPU can? Enter hUMA as well, which is the ability of the CPU and GPU to read from each other's memory addresses. On the Xbox One there are two memory setups, and hUMA needs one; with the CPU unable to see the data in ESRAM, there is no hUMA either.
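What's being described here, boiled down to toy C++ - no real console APIs, just the copy-vs-share distinction; the struct and function names are invented:

#include <cstring>

struct Particles { float pos[4096][4]; };

// Split memory model: the CPU's result has to be duplicated into a
// separate GPU-visible allocation before the GPU can read it.
void HandoffByCopy(const Particles* cpuSide, Particles* gpuSide) {
    std::memcpy(gpuSide, cpuSide, sizeof(Particles));  // the extra copy
}

// Unified (HSA/hUMA-style) model: one allocation that both processors
// see, so the handoff is just passing a pointer - no copy, no flush.
Particles* HandoffShared(Particles* shared) { return shared; }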
Bundling code wouldn't do much on it's own though, besides making the code look easier to read and change for the dev (so like OpenGL's version of bundles, there is a benefit in usability).
So bundles are in OpenGL already, and MS is late to the party like I claimed?
So DX12 is nothing more than MS's late entry into what AMD, Sony, and OpenGL already have.
So the cool part is that MS/DX12 has devised a way to use the CPU to auto-optimize the time, the order, the priority, where the results are stored, and how to que up the next instruction. This is CPU heavy. It's using the CPU to allow the GPU to run at a higher efficiency.
For people like you who know sh** about it: this is what Mantle already does, and so does libGNM from Sony. It is a way to make the GPU not depend so much on the CPU and do the work on its own, which is why you see some gains when you compare Mantle vs. DX11.
Ok, but the CPU is a finite resource too! You can't just fall back on the CPU to help keep the GPU fed and churning along, it's not like the CPU is just sitting idle. So... Another aspect of DX12 is that it's designed to not only reduce the overhead on the CPU (see Mantle), but it's being implemented as way to improve the parallelism of all the cores (see the DX12 launch slides where one CPU core is always loaded far higher than the rest, much more equally loaded after DX12).
See....lol
By reducing the dependance on the core assigned to the "main game loop" which runs at 1.75/1.6Ghz X1/PS4 respectively,
By this point you can already tell this was written by a misterxmedia type of moron; people making an actual technical observation would not fall into this kind of crap.
they are breaking the load down across all available game cores. This increased ability to push parallel tasks means a huge increase in effective CPU throughput.
The Xbox 360 does this, the PS3 does this, the PS4 does this, and it's obvious that the Xbox One also does this; hell, PC does this too.
In reply to the author's notes and like minded to another poster here... Remember all that CPU loading DX12 is now doing? Remember that the X1 has a slightly faster processor than other consoles?
By this part it is reconfirmed that he is a misterxmedia type of moron...
Also, this is a Reddit post, nothing more to say; it could be made up by the poster who posted it here... lol
Who is also the creator of this wonderful thread...
Actually, you don't even know what he's talking about. Basically he is describing Mantle and the HSA process, two things MS is way late to and will only do later on; hell, DX12 doesn't drop until 2015 on PC, when Mantle dropped in 2013...
DX12 is a direct response from MS to Mantle, which is the reason it is said to be released in holiday 2015...
So yeah, MS had nothing and were caught by AMD with their pants down. APIs like Mantle have been on consoles for a long time; in fact, Mantle is a direct response to the PS APIs, which allowed to-the-metal access. DX12 is a response to AMD's Mantle, and as always MS is late... lol
@tormentos: good ol' trolly torm, copying and pasting the same misinformation I see.
Have you ever run the Star Swarm demo on your PC?
With and without Mantle?
What was the benchmark?
Or just the latest Nvidia driver for DX11?
Mantle is just spreading the workloads across each thread evenly, and look at the performance jump; DX12 is that plus more.
You are a hater of all things X1, but Intel said this is the biggest jump in technology in a long time, skipping 4 generations ahead with DX12, and AMD and Nvidia say it's real performance, not theoretical. Who to believe, these guys or copy-and-paste torm... Oh, that's a hard choice lol.
I said from the start MS had a plan: the Xbox was too big, and the fan and cooling system were there for a reason, and now we see why - DX12.
"Biggest jump in technology" - FOR PC; consoles already spread workloads between their cores.
"Skipping 4 generations ahead with DX12" - only a lembot would say that; it is the most stupid thing I've seen in this forum, and that's saying a lot.
Keep dreaming that the Xbone will magically stop being bad.
I agree. Keep reaching though, tighaman.
You're right that the fans were in there for a reason though: to keep the Xbox quiet for use as a DVR and to avoid RROD mark 2. People keep looking for reasons to think that MS engineered the Xbox for power; they didn't. The first concern was always cost: they chose DDR3, as they can shrink the die more easily, and paired that with a weak GPU. They gambled on what they thought Sony would do and lost.
If MS were even remotely interested in power, they'd have put a stronger GPU in the system and wouldn't have put up with the ESRAM, simple as that.
@Krelian-co: It's being designed and prototyped on the X1. "4 generations ahead" is not my words or a lembot's; it was from a dev. I know it's hard to grasp because it's not your console of choice, but it's going to happen. And how are you going to respond to my post and not answer any of the questions that were asked? If you do respond, make sure it's with your mouth, because your ass smells.
@hoosier7: Keep reaching? I'm not reaching at all; it's called forward thinking. This has been in place since '09, and the news is just coming out. Embedded RAM on the GPU is going to be common in the next few years, and again MS is ahead of schedule with Sony left behind again. And the workload on DX11 is not spread evenly across the threads; the first thread is doing the most work. That's the problem with Sony and their fanboys: always looking to the past for answers when you should be looking ahead.
@Tighaman: Lol, seems you need to look into the past once in a while though, since they're already doing that on the consoles, so how is the X1 going to benefit from that? Planetside 2 on PC has benefited from this because of the work they've been doing with the PS4, for example. The only real benefit is on the dev end, as they should save time on this.
If MS is ahead of schedule, then how come their launch tools have been such a shambles? Lol, they're playing catch-up.
Also, what is there to stop the PS4 from doing exactly the same? They already use the same feature set as 11.2 plus extra features, and have dedicated teams optimising and improving on it. Sony just don't have to shout it from the rooftops to try to sell OSes and to patch up the PR disaster they've got with the performance disparity between the X1 and PS4.
Keep denying it, but the drivers aren't going to be enough to even close the gap, never mind overtake the PS4, especially when Sony can do the same thing. Any gap in driver performance is temporary and can be worked on; the hardware difference is permanent.
Oh, don't worry, I have my powerful gaming PC to help me understand the fact that lems are making up BS hoping the Xbone will magically stop being garbage; don't worry about me, I'll manage. I stand by my point: only a fuking moron would say DirectX 12 will bring it 4 generations ahead, but again, what can you expect from someone with brain damage? It is so sad that lems have to make up BS because their console is so bad. PS4 is a LOT stronger than Xbone #DEALWITHIT
Lemmings are idiots... and this thread proves it... Can't wait for all the damage control at E3 when this performance upgrade turns out to be the bullshit normal people can see it is.
@hoosier7: I'm not saying Sony couldn't do it with their API - the ICE team is talented - but this was about DX12 helping the X1. Planetside 2 must be having problems running on the PS4, because that was supposed to have been a launch title and we haven't seen it since. Yes, the PS4 has SOME kind of DX11.2 framework, only so that it can run DX11 games, but it's not FULLY DX11.2 and will never be DX11.2 (tier 2).
You mention hardware differences, but tell me more. Tell me about the Infamous devs saying that the CPU was the biggest bottleneck and the virtual memory was slow as hell. Or how the 32 ROPs will never be fully used because there isn't enough bandwidth to spread out across all 32.
Or how you have to use the CUs to take the load off the CPU, or use the GPU to help with game sounds because it only has one DSP. We can go on and on about flaws and holes, but it's really a waste; the games will talk.
@Krelian-co: So give me your Star Swarm benchmarks with and without Mantle, Mr. Powerful PC Man. Give me your Nvidia benchmarks before and after the driver install. Did your performance change? Did you see any benefits?
Planetside 2 has other issues than just proper porting of the engine. They are reworking the entire UI. I talk to the SOE dev who is doing that on a regular basis through a Planetside 2 IRC channel, and the person is still in the middle of the UI rework. It was a pretty expansive code rewrite, as PS2's original UI code was very inefficient. It was actually the cause of a lot of performance issues on the PC early on.
From what I can tell the engine runs fine on the PS4 now that they have reworked the multithreading and have ported the engine over to OpenGL. It's now more of a stability, bugfixing, and UI upgrade thing they are working on.
They never said that PS2 was going to be a launch title for the PS4, that was just an assumption.
Oh lord, I can see already this is going to keep the fanboys busy for the next TWO YEARS. :(
Wasdie said it best, and he said it several days ago; I'd advise everyone to take a second look. My thoughts, in brief:
No, we aren't all getting ~50% utilization out of our DX11 cards now, and consoles are more efficient at keeping their meager GPUs full, which is a large part of the reason why consoles can push as much performance out of low-spec hardware as they can.
Wardell's claim that this doubling will also come with double the heat should've ended the fanboy party right there. Remember how much thought MS put into a ~6% GPU clock increase pre-launch last year? Why? The first thing they talked about was realizing they had a bit more room in their "thermal envelope" and were below acoustic targets (they didn't want it nearly as loud as last gen). Now you think they're going to jump in with 100% more heat? Puhlease. The hardware can't take it; I don't care how big you think the fan is. Even IF it could, it would destroy their power-consumption and noise targets.
Finally, as driver solutions evolve, they spread. Chip makers seem excited by the potential of certain DX12 features, so it's inevitable those techniques will be integrated into OpenGL in very short order. It's to the benefit of hardware and software makers to keep the feature sets largely on par. So whatever gains are forthcoming, they'll be to the benefit of both consoles. Drivers come and go, but a hardware advantage is forever.
@Wasdie: That's what I said - they were having problems; I never said what the problem was. The PS4 doesn't run OpenGL, so I don't really know how that helps the PS4 - can you help me understand? It sounds like we won't see that game on the PS4 any time soon from what you're saying. MS never said they had weaker hardware; that's what everyone assumed.
You guys do realize all the Xbox One games out right now are heavy on a single CPU thread: Thread 0.
DirectX 12 will utilize all CPU threads evenly, freeing up major GPU resources.
No they are fucking not... they use all available cores, you idiot...
And using more CPU doesn't free up the GPU either, you numpty...
Prove it. I know for a fact you're wrong. Throw around name-calling all day; you're still wrong. Link me. Truth. Facts.
How can the developers utilize all the CPU cores evenly when the SDK isn't even optimized for it?
You're going to sit there and tell me that offloading instructions to the CPU will not free up GPU resources?
Really man?
You fail hard... You obviously have no understanding of how things like this work, so do yourself a favor and stop talking as if you have a clue, because you don't...
Quote from one of the Titanshite developers.. Link
"A lot of the work was making the engine multi-threaded. The original Source engine had a main thread that built up a command buffer for the render thread and there was a frame of delay. We kind of changed that so the render thread buffered commands as soon as they were issued - in the best case, it gets rid of a frame of latency there. And then almost every system has been made multi-threaded at this point. The networking is multi-threaded, the particle simulation is multi-threaded. The animation is multi-threaded - I could go on forever."
That was just a very quick Google search... DX11.x supports multi-threading... Heck, DX10 supports it, and you think they're only running the games on one of those pathetically weak CPU cores? LOL @ you
And no, increasing the CPU load does not free up the GPU at all; GPUs in consoles are always run as close to maximum performance as possible, regardless of how much CPU power is being used.
And the CPU in the Xbone is so pathetic it doesn't have the power to do any GPU offloading anyway.
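For reference, the threading change the Respawn quote above describes looks something like this stripped to a skeleton - all names are made up, and std::function stands in for real render commands. The render thread drains commands as they are issued instead of waiting for a finished per-frame buffer, which is where the saved frame of latency comes from.

#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>

class CommandStream {
    std::queue<std::function<void()>> q_;
    std::mutex m_;
    std::condition_variable cv_;
public:
    // Game thread: commands become visible the moment they are issued,
    // not after the whole frame's buffer is handed over.
    void Push(std::function<void()> cmd) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(std::move(cmd)); }
        cv_.notify_one();
    }
    // Render thread: wake as soon as any command exists and submit it.
    void RunOne() {
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [this] { return !q_.empty(); });
        std::function<void()> cmd = std::move(q_.front());
        q_.pop();
        lk.unlock();
        cmd();  // issue to the GPU driver
    }
};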
The PS4 doesn't have a DX11.2 API though. It uses OpenGL and is DX11-like, which means it codes similarly to DX11. It does run OpenGL though; it's a customized version for the PS4, but it is still OpenGL.