@tormentos, you can't even answer all my posts, guy - you have to cherry-pick parts of some of them to try and make a point, and you even go as far as to counter claims I didn't even make. Weak. It's just so much fun proving that you are full of crap that I can't seem to bring myself to ignore you like so many other people do.
It's really easy. Read his post. Laugh, don't reply. It's what I do.
Cloud is a joke - Tell that to Windows Azure, Amazon and Google. Just a few years ago server-based services were running locally in businesses. Now massive amounts of data are being held in the cloud, and it's only increasing. Give it a couple more years and broadband will be as much a utility as electricity and gas in the major countries Microsoft are interested in. Cloud will differentiate the Xbox, and Sony have reacted with their purchase of Gaikai. The 24-hour online requirement was for DRM; the cloud can be used without it.
In fact the PS4 API LibGNM is ahead of DX12 and is already available to PS4 coders, so when DX12 arrives in holiday 2015 the PS4 will be welcoming DX12 to 2012... lol - This explains why the difference between current games is what it is: PS4 has Mantle/DirectX 12-style features and Xbox One doesn't. If it did, the release of DirectX 12 wouldn't have an impact, and Microsoft would be hiding the fact, not shouting about it. So when DirectX 12 is released the difference will be much smaller. All good news for Microsoft.
Always be a difference - Of course there will be, but it won't be what it is now. The PS4 doesn't have the power to go above 1080p, but the Xbox One has the power to get to 900p (1080p in certain scenarios). 900p versus 1080p is close enough for nobody to care.
Oh dude, STFU. You are the dumbest IT guy I have seen in my entire life. If you can't fu**ing get the difference between GPUs on PC, and you can't even see how big the gap is when it is presented to you with facts, then you are either dumb or a blind, biased fanboy.
I have benchmarks on my side proving all my points. What do you have?
Oh, your years in IT and your argument about .NET, lol.
The difference in games is all I care about, and what has been shown so far is not promising for the Xbox One. In March it was outdone again by MGS5, which is 720p on Xbox One and 1080p with more effects on PS4. If you consider that small, have it your way; I know what it takes to double another card's resolution while keeping frame parity.
@darkangel115 said:
@Wasdie said:
@darkangel115: The PS3 didn't have a huge RAM advantage, it had a huge RAM disadvantage. 256 MB split pools of two separate types of memory were horrible, especially considering devs had to use the Cell to compensate for the lacking GPU. The faster RAM was given to the Cell while the slower RAM was given to the GPU. It was a terrible design choice.
I know that, but the link provided was an article from 2010, so 3+ years into the gen, and IGN said the PS3 had better RAM, which is beyond a joke and goes to show how little these gaming sites really know about hardware. They do it for clicks, because clicks = money, and people will keep quoting their articles to try and prove they are correct to other people who wouldn't change their minds regardless.
And what is your point, that IGN sucks? Because there is no debate here about that... lol
Games will still be 720p, but not dropping below 30fps like they are now.
DX12 will allow less CPU overhead, meaning better/steadier fps; that's what a low-level API is.
Bad news is that the PS4 has already been using this technique since before the system launched, lol.
No matter what MS tries to do, they will always be behind, and once we see a PS4 game from Naughty Dog or Santa Monica it will be the final nail in the coffin.
Those cloud services are not the same thing MS was talking about. You can use the cloud for many things, but claiming it increases the Xbox One's power from 10 Xbox 360s to 40 Xbox 360s is a joke.
The cloud is not fast enough to deliver graphics over the damn internet, period. So on one side you fools are debating how ESRAM at 140 to 150 GB/s is enough to keep the Xbox One GPU from starving, and on the other side you completely ignore what that means for graphics. Nothing complex can be delivered online: networks lack the speed, latency will kill anything, graphics need results within the same frame, and the data a GPU requires every second is vastly bigger than what you can transmit over your internet connection.
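To put rough numbers on that point, here's a back-of-the-envelope sketch; the 100 Mbps connection and 60 fps figures are just assumptions for illustration, not anyone's benchmark:

```python
# Rough per-frame data budgets: local memory vs. a fast home connection.
# All figures are illustrative assumptions, not measurements.

FPS = 60
FRAME_TIME_S = 1 / FPS                      # ~16.7 ms per frame

ESRAM_BANDWIDTH_BPS = 140e9                 # ~140 GB/s (low end of the quoted range)
NET_BANDWIDTH_BPS = 100e6 / 8               # 100 Mbps connection = 12.5 MB/s

esram_per_frame = ESRAM_BANDWIDTH_BPS * FRAME_TIME_S   # bytes available locally per frame
net_per_frame = NET_BANDWIDTH_BPS * FRAME_TIME_S       # bytes deliverable online per frame

print(f"ESRAM per frame:   {esram_per_frame / 1e9:.2f} GB")   # ~2.33 GB
print(f"Network per frame: {net_per_frame / 1e3:.0f} KB")     # ~208 KB
print(f"Ratio: ~{esram_per_frame / net_per_frame:,.0f}x")     # ~11,000x

# Latency makes it worse: a 30-60 ms round trip spans 2-4 whole frames,
# so data needed "in the same frame" simply can't come from a server.
```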
Oh, and the Xbox One API is not DX11.2, it is DX11.X; it is already modified. That is why I have been telling you not to expect huge changes with DX12: the Xbox One API is already more efficient than DX11.2 on PC, just like the 360 version was.
Aside from some improvements, don't expect much. Nevertheless the PS4 will always be ahead; it has more power.
Actually the PS4 does have the power to go beyond 1080p, even a 7770 can, but what TVs support those resolutions? Most TVs out there are 1080p, not higher.
These features will be available on XB1 later. Also "Descriptor Heaps & Tables", which is a sort of bindless rendering (page 19, under "CPU Overhead: Redundant Resource Binding"), would be possible only on GPUs that are fully DX11.2 capable (tier 2) and beyond. Considering that both DX11.2 and DX12 were announced for XB1 and the DX team is prototyping DX12 on XB1 hardware right now, it's likely that Descriptor Heaps & Tables will be available on XB1, too.
From that it's safe to say that the Xbox One is in for a good bump. We'll find out at E3!! This means the modified GPU is at least a tier 2 DirectX 11 card.
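For anyone wondering what Descriptor Heaps & Tables actually buy you, here's a toy sketch of the idea; this is illustrative Python, not the real D3D12 API, and all the counts are made up. The point is that instead of the CPU re-binding each resource before every draw, the shader indexes into one big pre-built table:

```python
# Toy model of per-draw binding vs. a descriptor table (bindless-style).
# Illustrative only -- real D3D12 descriptor heaps are GPU-side structures.

draws = [{"mesh": i, "texture": i % 50} for i in range(1000)]

# DX11-style: the CPU re-binds state before every single draw call.
bind_calls = 0
for d in draws:
    bind_calls += 1                       # a CPU binding call per draw
print("per-draw CPU binds:", bind_calls)  # 1000

# DX12-style: register every texture once, up front, in one table.
descriptor_table = list(range(50))        # 50 one-time writes
for d in draws:
    _ = descriptor_table[d["texture"]]    # shader reads the table; no CPU rebind
print("one-time table writes:", len(descriptor_table))  # 50
```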
Watch as the Xbox 1 is still 720p-ish after DX12 releases. Come on, people. DirectX isn't the reason the Xbone sucks; it's the number of ROPs and the terrible RAM configuration. Nothing they can do will help or solve that situation until new hardware is put out there.
I love it when people who obviously don't know d!ck about hardware and software try to talk like they do, using buzzwords...
He's correct though, 16 ROPs is really piss poor for a next-gen console aiming for 1080p.
Look, I have been writing software for... well, more years than I care to admit to. And I will tell you first hand that optimized APIs can significantly improve an application's performance, just like poorly written APIs can cripple it. So without any benchmark comparisons between DX11's API performance and DX12's, making a blanket comment like "it's all because of hardware piece X" is nothing short of pure ignorance.
The difference between the PS4 and XB1 is, in the grand scheme of things, relatively minor. If you apply Moore's Law, the difference doesn't even amount to one generation (2 years). So, is a 1-2 year old high-end PC that much slower than a brand new high-end PC?
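You can sanity-check that Moore's Law point with the commonly quoted peak figures (1.84 TF for PS4, about 1.31 TF for Xbox One); treating one "generation" as a doubling every ~2 years is the usual rough assumption:

```python
import math

ps4_tf = 1.84        # commonly quoted PS4 GPU peak, TFLOPS
xb1_tf = 1.31        # commonly quoted Xbox One GPU peak, TFLOPS

ratio = ps4_tf / xb1_tf            # ~1.40x raw compute
generations = math.log2(ratio)     # doublings needed to close the gap
years = generations * 2            # at one doubling per ~2 years

print(f"ratio: {ratio:.2f}x")                                     # ~1.40x
print(f"gap: {generations:.2f} generations, ~{years:.1f} years")  # ~0.49 gen, ~1 year
```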
I'm a developer as well, and yes, optimized APIs will improve performance, but as you said: how poorly written were the previous APIs? Most likely they were in fact fairly well written (so: minimal performance gains).
Secondly, I didn't do any research into this, but how compatible is the 7750-class card in the Xbone with DX12?
Thirdly, a 2-year-old high-end PC will run Crysis 3 on high whereas a new high-end PC will run Crysis 3 on ultra at the same fps. That is a fair difference, especially when the new high-end PC is $100 less.
Finally, unless you are a video game developer, and not just an application dev (like I am), you don't really know what you're talking about... both of us are just making very, very educated guesses.
Also, I don't know why lems keep forgetting Sony can improve their APIs as well.
If this is true and DX12 is this big a leap forward, then excellent for PC gaming. Just to add: if these gains are made in software only, then Sony will not sit back and allow MS to dominate; they will find a way to make the same gains on the PS4 CPU with OpenGL software updates of their own.
How the hell can an API double a GPU's speed? It doesn't even make sense...
Realistically, DX12 will probably be a 5-10% performance boost on X1, and maybe a bit more on PC for SOME setups... if that.
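The sane way to read the "doubling" talk: an API can't make the GPU compute faster, it can only stop the CPU from being the bottleneck. A crude model makes this obvious; all the timings below are invented purely for illustration:

```python
# Crude frame-time model: the frame is gated by whichever side is slower.
# All millisecond timings are invented for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

gpu_ms = 16.0                              # GPU needs 16 ms of work per frame

# CPU-bound case: driver/draw-call overhead dominates, so halving it "doubles" fps.
print(fps(cpu_ms=33.0, gpu_ms=gpu_ms))     # ~30 fps
print(fps(cpu_ms=16.5, gpu_ms=gpu_ms))     # ~60 fps

# GPU-bound case: the same API improvement changes almost nothing.
print(fps(cpu_ms=10.0, gpu_ms=gpu_ms))     # ~62 fps
print(fps(cpu_ms=5.0,  gpu_ms=gpu_ms))     # still ~62 fps
```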
The f*ck is wrong with lemmings?
They need secret sauce to hold on to! What I like to do is picture a merry-go-round labelled 'Secret Sauce' with all the lemmings holding onto the rails. The faster it gets, the more people let go, but no matter how hard the G-force hits, there are always going to be those diehards who can take the spin and hold on for eternity.
See what I mean? Did you just skim over the part where I said the PS4 has a more powerful GPU?
Just bow out of this thread gracefully. You know you lost this one; why keep shaming yourself?
In interviews with the devs prior to launch, they said the APIs sucked and that they didn't even have final builds until a week or two before launch. @tormentos even went as far as to bash MS for it, something I am sure he wishes I'd forgotten about.
Apparently, the version of DX12 the XB1 is getting is optimized for the XB1.
And yes, these are educated guesses, but considering a lot of devs complained about the XB1's APIs and no one complained about Sony's, I would speculate that Sony's APIs were in better shape a while ago; there is a finite amount of optimization you can do on code.
These features will be available on XB1 later. Also "Descriptor Heaps & Tables", which is a sort of bindless rendering (page 19, under "CPU Overhead: Redundant Resource Binding"), would be possible only on GPUs that are fully DX11.2 capable (tier 2) and beyond. Considering that both DX11.2 and DX12 were announced for XB1 and the DX team is prototyping DX12 on XB1 hardware right now, it's likely that Descriptor Heaps & Tables will be available on XB1, too.
Another one for Tormentos... hahaha. How many times in this thread have I told you people that some of these features are already on the Xbox One? That the so-called gains you see on PC will not translate to console, because consoles are more efficient than PCs from the get-go?
@ttboy said:
From that it's safe to say that the Xbox One is in for a good bump. We'll find out at E3!! This means the modified GPU is at least a tier 2 DirectX 11 card.
DX12 will work on all GCN cards; you know what that means, right? Any feature it gets via the API, the PS4 also gets in LibGNM.
Hell, DX12 will work on Nvidia GPUs from the GTX 400 series and up; no special hardware needed.
And as I already said, the jump in performance will not be much, since the Xbox One has already had some of the features of DX12 since before launch... lol
Not sure if serious.... but that is a load of crap.
Hahahahaaaaaaaaaaaaaaaaaa.... Agreeeeeeee.....
@FastRobby said:
You are one of the most ignorant people I have ever met. On the one hand you know a bit about your tech, but then you claim it's the cloud's fault that the AI is dumb; apparently it's not up to the developers anymore... If you're going to say shit like that, you'd better say nothing at all.
Let me say it again: Titanfall runs on the cloud and uses the cloud for AI, and the AI is as smart as a chair. Angry Joe's review already showed it: he stood in front of 3 soldiers, like 4 feet from them, and all he did was strafe from left to right and the AI could not hit him... Hahahaaaa.
Then he stopped and killed all 3... lol
So yeah, the cloud is sh**. It doesn't improve graphics like MS claimed, and the AI done on the cloud has so far proven to be bad. The game runs a little higher than 720p with drops into the 30s, on a game where the developer claimed FPS was king. They wanted to prove CBOAT wrong so badly that they moved the game from 720p to a little higher, and the game suffers for it. COD Ghosts doesn't have the drops Titanfall has at 720p; Titanfall should have stayed that way.
@FastRobby said:
The Xbox One API was really bad in the beginning; a lot of developers complained about it. But they got an updated one in February/March, and that one is much better. A lot of developers have said they have no worries about the Xbox One anymore, and that the gap with Sony is getting much smaller.
Developers didn't really complain like they did when the PS3 came out. No matter how bad the Xbox One API could be, I am sure it was better than Cell by a truckload. What developers complained about was the performance they got running those APIs and the reservation for Kinect, which Activision confirmed they asked MS about dropping. If the problem is the hardware being too weak, there is no fix for that.
Also, the Xbox One at launch effectively had less power than a 7770: 1.18TF, because of the 10% reservation, while the 7770 is 1.28TF. Now, without the reservation, it is about 1.28TF, just like the 7770. You can't really expect performance like that to top a much stronger 1.84TF GPU in the PS4.
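Those TF numbers aren't magic; they're just shaders × 2 ops (FMA) × clock, so anyone can check them. The flat 10% haircut below is a simplification of how the reservation actually worked, used only to show where the 1.18TF figure comes from:

```python
def tflops(shaders, clock_mhz):
    # peak = shaders * 2 ops per cycle (fused multiply-add) * clock
    return shaders * 2 * clock_mhz * 1e6 / 1e12

hd7770 = tflops(640, 1000)   # 1.28 TF
xb1    = tflops(768, 853)    # ~1.31 TF
ps4    = tflops(1152, 800)   # ~1.84 TF

print(f"7770: {hd7770:.2f}  XB1: {xb1:.2f}  PS4: {ps4:.2f}")
print(f"XB1 with 10% reserved: {xb1 * 0.9:.2f}")   # ~1.18 TF, the figure quoted above
```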
Intel has also come out recently saying that DX12 is the biggest leap in a long, long time. It's looking like DX12 is the real deal, and it will make GPUs do things and add performance gains never before seen until now.
I think if you factor in the latest cloud demo from MS, showing real-time demos of a game using the cloud, and you factor in DX12 and Tiled Resources, it's starting to look obvious that the X1 is going to be a powerhouse in a few years, will do 1080p/60fps easily, and will most likely have the graphics-king games. Also, AMD has hinted that the GPU in the X1 is not as close to a 7790 as people have claimed; it's more of an exotic, heavily modified design, and it's starting to look like this GPU was specifically built with forward thinking and DX12 in mind.
MS isn't dumb; they aren't the biggest computer company in the world for nothing. Let's give MS some praise and be happy that the X1 has so much room to stretch its legs in the coming years. Let's stop bickering: the PS4 is a great console with powerful hardware right out of the gate, but let's start giving MS the props they deserve for what more and more seems to be a system designed with heavy forward thinking, one that can and will come into its own graphically in the coming years and open up a lot more power through superb software and API design by some of the best engineers in the world.
Um, no. Both Apple and Google are considerably larger than Microsoft in market capitalization.
In interviews with the devs prior to launch, they said the APIs sucked and that they didn't even have final builds until a week or two before launch. @tormentos even went as far as to bash MS for it, something I am sure he wishes I'd forgotten about.
Apparently, the version of DX12 the XB1 is getting is optimized for the XB1.
And yes, these are educated guesses, but considering a lot of devs complained about the XB1's APIs and no one complained about Sony's, I would speculate that Sony's APIs were in better shape a while ago; there is a finite amount of optimization you can do on code.
No, I don't wish that, but please do talk about what developers were really complaining about. Hard to code? No, it wasn't that; it was performance and ESRAM complexity.
That bold part is wrong: the Xbox One already has DX12 features, has had them for a while, and they were used in Forza.
This is what I have been telling you: the Xbox One API has been more streamlined than the PC version from the get-go, and DX11.X on Xbox One already has some of these features, so don't expect the same gains you saw on PC; the Xbox One was already enjoying them.
Two features already in, and two more to come. Don't expect much.
The fact that Sony's APIs were in a better spot than MS's doesn't mean Sony's will not improve, for the 100th time...
Finally wrote that ASM I was looking forward to. Early results: PS4 surface tiling/detiling on the CPU is ~10-100x faster now. SIMDlicious!
The PS4 surface tiling is now, from early tests, 10 to 100 times faster on the CPU. 10 to 100 times, man. As you can see, just because no one complained about the PS4 doesn't mean there aren't things that can improve greatly; I just proved that with a link from a Sony ICE team member.
Mostly an sign of how wretched the original code was, but still. Thanks for reminding me to write full VRAM cache lines per iter!
Just some tweets lower, look at how he says how wretched the original code was. Hell, I think most developers complained about the Xbox One because of the gap they were getting with the PS4 more than anything. No matter what, DX is easy to use on Xbox One; the problem comes when trying to bring performance up, because it has some pitfalls, like ESRAM, the 10% reservation, and a not completely cooked API.
As you can see, there were suboptimal things on the PS4 side as well, which are getting improved. No launch game took advantage of this new code, and neither did Infamous, which looks incredible even so.
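For anyone who doesn't know what surface tiling/detiling even is: GPUs store surfaces in small blocks instead of scanlines, and converting between the two layouts on the CPU is exactly the kind of loop that SIMD and full-cache-line writes accelerate. A toy detile, assuming a simple 4×4 block layout (the real PS4 tiling modes are more involved):

```python
# Toy "detiling": convert a block-tiled surface back to linear scanlines.
# Assumes a simple 4x4-pixel tile layout; real GPU tiling modes are fancier.

TILE = 4

def detile(tiled, width, height):
    linear = [0] * (width * height)
    tiles_per_row = width // TILE
    for i, pixel in enumerate(tiled):
        tile_idx, offset = divmod(i, TILE * TILE)  # which tile, position inside it
        ty, tx = divmod(tile_idx, tiles_per_row)   # tile's row/column on the surface
        oy, ox = divmod(offset, TILE)              # pixel's row/column inside the tile
        x = tx * TILE + ox
        y = ty * TILE + oy
        linear[y * width + x] = pixel
    return linear

# 8x8 surface stored as four 4x4 tiles in tile order:
w = h = 8
tiled = list(range(w * h))
linear = detile(tiled, w, h)
print(linear[:8])   # first scanline: pixels 0-3 of tile 0, then pixels 0-3 of tile 1
```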
Nope, that's wrong; they complained about the tools also. Also, the PS4 never reaches 1.84TF because of the CPU, but keep saying it does, lol.
Really...
They didn't say the tools were hard to use or the console hard to code; they complained that they were getting low performance vs the PS4, and ESRAM added complexity as well. And to think DX on Xbox One already had some DX12 features; imagine how much sadder it would have been without them.
Really, please link me to where it says that the PS4 can't reach 1.84TF because of the CPU... lol
Again, I never said that Sony's GPU wasn't more powerful, or that Sony's APIs wouldn't improve, or that improving APIs on MS's side would make the consoles "even" with each other in hardware performance. I said optimized APIs would shorten the gap, and that it appears MS had far more room for improvement than Sony.
MS was shipping new versions of their APIs within weeks of the console's launch. Devs (and your mecca NeoGAF) have said that the "resolution-gate" issue would disappear over time. You seem to keep omitting that in your replies... it must negate your "moon is made of cheese" mindset.
I'm searching for the link; it's a slide from a presentation given by Sony.
The developers said that the tools sucked. That's why, when they got an updated version, they said it was waaaaaaay better and that they don't have any worries anymore.
I believe you mean the Sniper Elite comment by Rebellion. Source
Just because the Xbox One may have 2 features of DirectX 12, it does not mean they were optimized implementations. They could have been a hack of DirectX 11 in order to get them in (hence the DirectX 11.X reference).
The likely situation is that DirectX 12 was not ready to include in the Xbox release, so they extended the existing DirectX 11 to cover the 2 features mentioned above. They may have been completely re-written since, as you don't optimize until later in the cycle. I would not be so quick to say that it will not give the Xbox a big jump. There is one dev who is ecstatic about it, and that's telling. E3 is going to be World War III, and I am excited to see how much crow will be eaten. More and more, that messed-up architecture is making sense.
Also, I believe the last 2 features of DirectX 12 require a tier 2 card, so not all features will go to all cards.
Yes, and look: Watch Dogs is unconfirmed to be 1080p on Xbox One. What happened, did the tiling fail, or the tricks?
Dude, this is MS; stop inventing crap. DX11.X for the Xbox One was said to be a custom version from the get-go. Also, DX12 is a response to Mantle, and Mantle is a response to CONSOLE APIs. Yeah, even the Xbox 360 API, which is DX9-based, is more streamlined than DX11 on PC, because consoles target one single hardware configuration. So yeah, the Xbox One will already have had DX12 features when the PC ones arrive in holiday 2015.
They didn't hack anything; MS makes custom versions of its APIs all the time, dude. Stop inventing things.
Ignore the PS4 dev, seems like a pointless comment. More quotes at the source.
PS4 ICE Team programmer Cort Stratton added that, “New SDKs can significantly improve performance on the same hardware, yes. Dunno about DX12/X1 specifically, of course; not my dept.”
He also said that people have a right to be skeptical about performance gains. “Good; always be suspicious of ANY perf. improvement claims. e.g. what *exactly* got 50-100% faster? Faster than what? Details!”
Treyarch software engineer Dan Olson had a less amused take. “Here’s an article… no idea why people go on record for stuff like this.”
Programmer Dean Ashton found it downright hilarious. Either that or life-threatening, judging by his response. “2x perf on Xbox One when using DX12? That article nearly made me choke on my cup of tea.”
Read more at http://gamingbolt.com/devs-react-to-...AoDlY3FJqQ5.99
Seems it's not just the fanboys that are skeptical. Discuss.
Are these developers making XB1 games? I think I would listen to these guys more than any developer when it comes to DX12. So all these guys are lying, I presume?
AMD's Raja Koduri: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."
Intel's VP of Engineering, Eric Mentzer: "This is absolutely, I think, the most significant jump in technology in a long, long time."
Nvidia's VP of Content and Technology, Tony Tamasi: existing cards will see orders-of-magnitude improvements from DirectX 12's release, "going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."
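For context, Tamasi's draw-call numbers fall straight out of per-draw CPU cost; the microsecond figures below are illustrative guesses, not benchmarks from anyone:

```python
# Draws per second = CPU budget / per-draw submission cost. The costs are
# invented round numbers, purely to show how the orders of magnitude fall out.

cpu_budget_s = 1.0                    # one second of one core, for simplicity

for api, per_draw_us in [("DX11-style", 10.0), ("DX12-style", 0.5)]:
    draws = cpu_budget_s / (per_draw_us * 1e-6)
    print(f"{api}: ~{draws:,.0f} draws/sec")
# DX11-style: ~100,000 draws/sec
# DX12-style: ~2,000,000 draws/sec
```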
Yeah, AMD also claimed that all Xbox 360 games would be 720p minimum with 4xAA, which was infinitely more achievable than making hardware jump 4 generations ahead because of a damn API that helps with CPU overhead, and even that failed... lol
The crow you will be eating next year will make you open 10 troll accounts and never post under blackace again... hahaha
Why kid yourself? If TVBox720 becoming a reality didn't chase the astroturfers away, why would the DX12 sauce flopping do it?
We'll have to see at E3 2016 for confirmation, as I'm surprised you haven't said yet ;). At the minute it's fairy dust & hope.
I doubt it. lol!! I was on the money with PS4 demand; I'm pretty sure I'll be right about this as well. We'll see how many 720p games we see on XB1 after March 2015. LOL!! I would believe AMD, Intel and Nvidia over anyone on these boards, that's for sure.
Never said anything about E3 2016. Stop pulling more BS out of your ass. Your credibility on here is continuing to sink into the abyss of a black hole. I said "this time next year". Let's see how many 720p/900p games are being announced at that time.
I also see you're not discrediting anything AMD, Intel & Nvidia are saying. lol!! Covering your ass, I'm sure.
@misterpmedia: I doubt it will double shit. But I'm not a software developer, so I don't know. It comes down to: 1) would it even be possible to double that GPU's power AT ALL? As in, if you were to overclock it right up to the brink of meltdown and then make the software as perfect for that GPU as it can possibly be, would it be double the overall power? Something tells me no, from what I've seen of PC overclocking.
2) How much more can developers and things like DX12 get out of hardware? Programming in its current form has been around for about 30 years and has been coming along in leaps and bounds, but to say that Microsoft have suddenly found this holy grail that will double the capacity of something like a console GPU using software alone sounds very unlikely to me.
I am going to be getting an Xbox One. I haven't liked PlayStation since the 2, and the PS4 hasn't changed that so far; it just feels uninspiring. Whether the GPU in the Xbox One gets doubled in power or not, I'll still be buying it. But you know, every little helps.
Sure you were: the Xbox One was $449 at Walmart yesterday, and yet it's the PS4 that is sold out... hahaha
Yeah, the demand is surely dying... lol
You were wrong, and the PS4 continues to have high demand.
Oh please, you will believe any moron who posts pro-Xbox One... lol
lmfao, I almost spat out my tea. But yes, you're right, I'm getting too ahead of the game there for when you'll eventually push it back to E3 2016. That comment was meant for your faux-insider comments dotted around old topics about it all 'happening' this year, then suddenly it's all down to next year's E3. Interdasting. Also, addressing credibility: I'm sorry, but none of us have credibility on here. What kind of special club do you think this is, blackace? We're forum posters. I never had any credibility to 'sink into the abyss of a black hole' (bit vicious, damn :P!) in the first place. Sounds like I'm rustling your jimmies a bit. Also, side note, I'm flattered that you thought I had credibility in the first place. Danke.
I didn't discredit them, but I will remain skeptical like the other devs, especially for the Xbone; however, it sounds like it's going to be quite beneficial... for PC. The "doubling of GPU" sounds like PR hyperbole and a desperate attempt to assure Xbone diehards that their console won't stay gimped for the rest of the gen. Once I see a FULL DX12 game working on actual Xbone hardware, I will reassess the situation.
Didn't they already upclock the GPU anyway? I share similar views with you. Nice post, and for the record I've never said it wouldn't improve anything; I don't think anyone has said that, lol.
Anyone calling the cloud rubbish because the AI in Titanfall is bad is just an idiot.
It wouldn't be that hard to program the AI to be harder if they wanted to. Even I could program them to have more hit points or armour. They could easily make them move around better, or shoot you more accurately or faster.
They make them the way they do by choice. Titanfall's servers are pretty rock solid, and 100,000+ virtual servers spinning up as required is pretty impressive.