And here we go again with the Ronvalencia school of hypocrisy.
The only hypocrisy comes from you.
So a 1.6GHz CPU can drive a 7850 OC to 30FPS, but somehow a 1.75GHz one can't drive 30FPS on a 7770?
The Metro dev already gave some reasons for XBO's higher CPU usage than PS4's, e.g. dependency tracking.
From http://www.eurogamer.net/articles/digitalfoundry-2014-alien-isolation-face-off
In comparison, both consoles target 30fps as the target frame-rate, although neither manages to solidly stick to it under load. Performance is mostly stable on PS4 with frame-rates incurring a light drop in combat, but the action remains v-synced at all times, so we never encounter any tearing. The main distraction comes in the form of brief freezes throughout the game, which temporarily sees frame-times languishing in the 200-460ms area, depending on the length of the pause.
Neither console sustained 30fps as its minimum frame rate. You created an argument from a fictional basis.
That quote you pulled from the Metro developer is debunked by the developer himself in that same article, you fool. He stated that it applied to the period of the crappy MONO driver, and he himself said not only that MS had tools to work around the problem, but also that they already had a DX12-like API which the Metro developers didn't have time to use. What fu**ed up the game, you idiot, was the fact that it was a game made for 5 platforms, period.
He did NOT debunk himself, you fool. You missed that the question includes "low overhead" calls, you stupid cow.
Digital Foundry: To what extent will DX12 prove useful on Xbox One? Isn't there already a >>>>>> LOW CPU overhead <<<<<<<<< there in addressing the GPU?
Oles Shishkovstov: No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.
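For context, the "multi-threaded command buffer chunks generation" Shishkovstov describes can be sketched with a toy simulation (Python, with hypothetical names; real D3D12 code would record `ID3D12GraphicsCommandList` objects instead): each worker thread records its own command chunk without touching shared state, and only the final submission to the queue is serial. In the DX11 model, recording itself was effectively serialised through one context; here only the last step is.

```python
# Toy model (not real D3D12) of multi-threaded command buffer recording:
# each thread records its own chunk with no shared lock, and only the
# final submission to the GPU queue is serial.
from concurrent.futures import ThreadPoolExecutor

def record_chunk(thread_id, draw_calls):
    # Recording is independent per thread, so it scales across cores.
    return [f"t{thread_id}:draw{i}" for i in range(draw_calls)]

def build_frame(num_threads=4, draws_per_thread=3):
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        chunks = list(pool.map(record_chunk, range(num_threads),
                               [draws_per_thread] * num_threads))
    # The only serial step: pushing the recorded chunks to the queue.
    return [cmd for chunk in chunks for cmd in chunk]

commands = build_frame()
```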
That is funny, because the Project Cars developer claims to have split rendering over several cores, something you people claim is not happening... hahaha
You attempted to use Killzone Shadow Fall's workload screenshot, which shows single-threaded CPU draw submission, you clown.
The Project Cars developer's claims were about the PS4, nothing about the XBO.
The current XBO has one async compute context and one immediate rendering context.
DX12 expands XBO's async contexts and removes the bottleneck where deferred-context multithreading is serialised into the immediate rendering context.
Again, XBO's DX11.X is beyond PC's DX11.2 but falls short of DX12, i.e. XBO's DX11.X is a half-baked API.
I am waiting for you to explain to me why that gap is 97% when the PS4's GPU isn't 97% stronger than the Xbox One's... hahahaaa
Again,
1. You created an argument from a fictional basis.
2. DX12 expands XBO's async contexts (from a single context) and removes the bottleneck where deferred-context multithreading is serialised into the immediate rendering context.
3. XBO's DX11.X is beyond PC's DX11.2 but falls short of DX12, i.e. XBO's DX11.X is a half-baked API.
4. Ubisoft shows the XBO is being gimped by its current API.
NO, Metro didn't give any reason, you blind biased fanboy.
I want a quote from the Metro developer stating that AI is under 30FPS on Xbox One because of the reason they stated; that is something you are assuming.
Even worse, you blind buffoon, I posted several times how that scenario applied only to the MONO driver; he was talking about the mono driver.
But not only that, the Metro developer also stated that MS not only had tools to mitigate the issues but also had a DX12-like API which they could not use.
That is a fact and the reality; the rest is speculation on your part.
No, stop your butthurt denial. AI is not CPU intensive, Watch Dogs is; AI runs on a damn crappy Celeron at 90 frames per second, which Watch Dogs doesn't do under the same parameters.
Fact is, everything DX12 has to offer is basically already inside the Xbox One, the reason why Fable has a 20% gain on PC and shit on the XBO... hahaha
Yay, another example of not accepting the facts when they are staring you in the face.
I suppose I set my alarm clock for the early hours so I can get up and post on these forums as TTBoy? Some of us have better things to do, unlike you.
And I bet even now you won't admit you are wrong; the only person talking shit here is you, but that's nothing new.
Oh please, STFU. I post here many times against Delta and Sts106mats, who are in the UK while I am in Puerto Rico; it means total shit, and it just sounds like an even more stupid-ass excuse.
Now that confirms to me even more that you and him are the same account... lol
But but but, he is in the UK, so he can't post in the same time frame, which proves it's not me... lol
So you have 2 accounts, 1 for nights and one for days? hahaha
Have you any idea how stupid that sounds? I have an account I use in the day and one I use at night? Get real, you're just looking more stupid by the minute. If you bother to look, I post from this account at all times of my waking day.
Yes, there is a time of day when someone in the UK and someone in the USA can be online at the same time. But when I log in at 10.00am and it says a post by TTBoy was posted 5 hours ago, it would mean I got up at 5.00am to post on a public forum, and I'm not that sad.
Really, just take a minute, breathe, read what has been said, realise you are wrong and stop being an idiot.
Why do some of you console gamers have a chip on your shoulder, where every few weeks you have to post screens of games that everyone forgot about, to let us know: "Look guys, we have great graphics too!"? Good for you. You do have good graphics. So? That's not going to make anyone without a PS4 want to go out and buy one...
@nyadc: If all those games look good to you on PC, then Infamous SS along with other top-tier console games should look better. The vast majority of the titles you posted were last-gen ports with extra effects. That doesn't put them above anything on consoles. That's my point!
Only that 4-5 of us in this thread posted direct-feed gameplay shots compared with each other, and the PC games look better, by far. To the point where anyone who casually enters this thread sees our in-game comparisons and either agrees with us, or is a PS4 fanboy and whines about us not using photomode PS4 shots.
@Jankarcop: Do you just ignore everyone's posts? Over half the thread disagreed with you, and the only ones supporting you are raging hermits. None of those pics you posted looked better than Infamous SS; more than half the people in the thread agreed it didn't look as good, and you just keep rambling on. We can agree to disagree and move on.
BullSHIT. I just re-read the thread. More people agreed that SS is not the open-world gfx king; those people posted shots of several games that look better. There were 4 people who agreed with you, and two of them were known anti-PC trolls who posted ugly screenshots (lol GPUKING) or photomode shots (lol Heirrien). Make a poll asking if SS is the open-world gfx king and you will lose. Even you stated at one point that AC: Unity looks much better, conceding to me.
You have to be a pretty insecure person to pretend Infamous looks bad; you can argue other games run at higher resolutions and higher frame rates, but the level of fidelity in Infamous is quite great.
I didn't say SS looks bad, I said GTA V maxed on PC looks better, which is true.
Over half the thread disagreed with you, and the only ones supporting you are raging console gamers.
You have to be a pretty insecure person to pretend Infamous looks bad; you can argue other games run at higher resolutions and higher frame rates, but the level of fidelity in Infamous is quite great.
I didn't say SS looks bad, I said GTA V maxed on PC looks better, which is true.
Looks are subjective though. I tend to agree but someone could easily make an argument for Infamous Second Son from a pure visual standpoint. What cannot be argued is the resolution and framerate advantage GTAV PC has.
@RyviusARC said:
BB uses too much chromatic aberration and gives me a headache.
CA seems to be the new bloom for devs to overuse.
Yeah, that effect is pretty annoying, honestly; it does nothing to enhance the image and probably takes up resources. But I still stand by my statement that Bloodborne can look pretty fantastic at times.
@Jankarcop: Two of those games are not out, so stfu about those. WD, Skyrim and GTA V don't look better; that's laughable. AC Unity maxed out, I will give you that.
And no, more people didn't agree with you that GTA V looked better than Infamous SS.
@MK-Professor: Half the thread, my ass, and you are a raging hermit. Everyone that disagreed with me was a hermit troll, and they have the post history to back it up.
Are you kidding me? GPUKing, Herrien, and you are vocal anti-PC trolls..... Other than those three goons only 2 others agreed with you. The rest of the thread is people laughing at Vaseline gfx.
@Jankarcop: The rest of the thread? Cloud imperium_, Krilean_co,odchar, ryuvrc, mk_ professor,Nyadc, Delta? What do they all have in common? RAGING HERMS!
Not one kid that posted in your favor was a neutral poster.
We all provided evidence. All you had was 2 fakeboys circle-jerking. Still, more people agree.
If you disagree, make a poll: SS vs. GTA V at 4K Ultra on PC.
@MK-Professor: Half the thread, my ass, and you are a raging hermit. Everyone that disagreed with me was a hermit troll, and they have the post history to back it up.
I can say the same:
Half the thread, my ass, and you are a raging console gamer. Everyone that disagreed with me was a console-gamer troll, and they have the post history to back it up.
No, the Metro dev gave additional issues related to the XBO in response to a question that itself mentions low CPU overhead, you blind biased fanboy.
Again,
Digital Foundry: To what extent will DX12 prove useful on Xbox One? Isn't there already a >>>>>> LOW CPU overhead <<<<<<<<< there in addressing the GPU?
Oles Shishkovstov: No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.
XBO's MONO driver is based on the DirectX 11 driver model, you blind biased fanboy.
From Major Nelson
Since E3, an example is that we've dropped in what we internally call our mono driver. It's our graphics driver that really is 100 percent optimised for the Xbox One hardware. You start with the base [DirectX] driver, and then you take out all parts that don't look like Xbox One and you add in everything that really optimises that experience. Almost all of our content partners have really picked it up now, and I think it's made a really nice improvement
DirectX 12's bundle improvements reduce the CPU usage to 1/3 on XBO, you blind biased fanboy.
The current XBO DX11.X API and MONO driver don't have DirectX 12's resource binding model, you blind biased fanboy.
That damn Celeron has an Intel Haswell core, you blind biased fanboy, i.e. the same CPU core design as any other Intel Haswell SKU.
Features for the Intel Celeron Haswell dual core at 2.7GHz:
A total of four 256-bit AVX2 SIMD hardware units at 2.7GHz, and each unit handles both ADD and MUL operations, i.e. better flexibility.
The combined SIMD width is 1024-bit (four 256-bit SIMD units) for ADD and MUL operations at 2.7GHz. Intel Haswell's AVX2 hardware has 3-operand format capability.
8 integer units at 2.7GHz.
Intel Haswell can feed its multiple execution units properly with larger instruction issue ports. Two Intel Haswell CPU cores can decode 8 instructions per cycle at 2.7GHz, i.e. 21.6 giga-operations per second.
For the AMD Jaguar at 1.6GHz: a total of 8 ADD 128-bit SIMD units and 8 MUL 128-bit SIMD units at 1.6GHz. AMD Jaguar's 128-bit units are fixed to either an ADD or a MUL operation, i.e. they are not flexible. AMD Jaguar's 128-bit SIMD hardware only handles a two-operand format, like the old Athlon 64s; 3- and 4-operand format capability for SIMD hardware is exclusive to AMD FX CPUs. Only 6 ADD 128-bit SIMD units and 6 MUL 128-bit SIMD units are available for games.
The combined SIMD width is 768-bit ADD and 768-bit MUL at 1.6GHz. The Intel Celeron "Haswell" dual core wins this area.
16 integer units at 1.6GHz, with 12 integer units at 1.6GHz available for games. The Intel Celeron "Haswell" dual core wins this area.
AMD Jaguar can't feed its multiple execution units properly with its narrow instruction issue ports, i.e. it is gimped with two decoded instructions per cycle per CPU core. 6 AMD Jaguar CPU cores can decode 12 instructions per cycle at 1.6GHz, i.e. 19.2 giga-operations per second. The Intel Celeron "Haswell" dual core wins this area.
The requirement for wider multithreading to properly feed the GPU is a must for the PS4. The Intel Celeron "Haswell" dual core wins this area.
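As a sanity check, the decode-throughput and SIMD-width figures quoted above can be reproduced with simple arithmetic (a deliberate simplification: peak decode rate times clock, which says nothing about real-world IPC):

```python
# Reproducing the quoted figures: peak decode rate x clock, a napkin
# metric, not a measured benchmark.
def giga_ops(cores, decode_per_core, ghz):
    return cores * decode_per_core * ghz

celeron_haswell = giga_ops(cores=2, decode_per_core=4, ghz=2.7)  # ~21.6
jaguar_six_core = giga_ops(cores=6, decode_per_core=2, ghz=1.6)  # ~19.2

# Combined SIMD width in bits, per operation type:
haswell_simd_bits = 2 * 2 * 256  # 2 cores x 2 flexible AVX2 units x 256-bit
jaguar_simd_bits = 6 * 1 * 128   # 6 game-available cores x one 128-bit pipe
```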
From http://www.pcper.com/reviews/Processors/Jaguar-GCN-Compute-Architecture-Temash-and-Kabini
AMD Jaguar's L2 cache runs at half the advertised clock speed, you blind biased fanboy: an 800MHz L2 cache on a 1.6GHz CPU, LOL. Are we back to the Pentium II-era L2 cache model? The Xbox 360's IBM PPE also has a half-speed L2 cache.
Intel Atom Bay Trail's L2 cache runs at full clock speed in a 2-watt SoC. AMD is just fucking clowning around with low-power CPUs.
There's a reason why Intel is smashing AMD in every PC CPU segment, i.e. from tablets to HPC.
AMD Athlon 5350 (an AMD Jaguar quad core at 2GHz) gets smashed by the Intel Celeron G1820 dual core at 2.7GHz.
@ronvalencia: So all said and done, will DX12 do shit for the Xbox One in terms of performance? And is DX12 more streamlined than PS4's programming interface?
I remember, before this gen was released, that people here in SW were saying that the PS3/X360 already had good enough graphics, and that a new gen wasn't even needed. lol. LOL, I said.
You can fu**ing quote 30 irrelevant chartz; the fact is, you moron, that AI is fu**ed on consoles because the damn developer screwed up. Since you don't have a single quote from that developer stating that 30FPS was the max they could get from the Xbox One because of CPU overhead, that is a total joke.
DX11 on Xbox One is not DX11 on PC; it's vanilla with the horrible overhead gone. The mono driver was particularly bad and developers complained about it, so yeah, the Metro developer worked on a fu**ed-up API. Worse, your constant ignoring of other parts of the article is a joke: MS did have a DX12-like API, and they had tools to mitigate those CPU problems.
Call of Duty: Ghosts runs at 1080p resolution on Xbox One and PlayStation 4, Infinity Ward has said.
@tormentos
That link is older than you... hahahaa, and it in no way invalidates the leak about 720p, which came a few days ago, not 4 months ago... hahaha
@ronvalencia said:
IW already stated 1080p for both consoles. Why should IW continually counter some CBOAT statement? This CBOAT person needs to be fired, i.e. the likes of CBOAT make a mockery of signed NDAs.
@ronvalencia said:
The first bold part is what this news is based on.
Again, you missed my post's context. I was replying to the following claim by danbo:
"As revealed by Eurogamer in June, the game runs at 60 frames per second in 1080p, upscaled from 720p on the Xbox One. "We actually have a 4K TV at work and got the game running on that," Rubin told the site at the time. "It looks phenomenal. The 4K TVs have a max hertz of 30, so we're maxing it out. It looks amazing!""
The statement above DOES NOT exist in the original http://www.oxm.co.uk/64778/call-of-duty-ghosts-has-fantastic-graphics-on-xbox-one-reiterates-infinity-ward link.
The original text is as follows:
As revealed by Eurogamer in June, the game runs at 60 frames per second in 1080p. "We actually have a 4K TV at work and got the game running on that," Rubin told the site at the time. "It looks phenomenal. The 4K TVs have a max hertz of 30, so we're maxing it out. It looks amazing!"
Can any of you tell me how Panello doesn't know and OXM does? Hahahahaa. All games that release on Xbox pass through a certification process and quality control; this is not new, it happens on all consoles.
IW has confirmed sh**; Panello is running all-time-high damage control. Activision is showing the PS4 version, a move without precedent since Activision and MS inked their deal, while the Xbox One version is being hidden like almost every Xbox One version of a multiplatform game.
I don't care about innuendo BS. Your innuendo BS doesn't prove sh**.
From http://www.eurogamer.net/articles/2013-06-11-call-of-duty-ghosts-runs-at-1080p-and-60fps-on-xbox-one-and-ps4
IW already stated 1080p for both next-gen consoles.
PS: It would be a LOL episode if Wii U's CoDG runs at 1280x720p.
@tormentos
Wow, lemmings are hugely butthurt by this, such denial; the game is 720p until confirmed by DF. Deal with it.
Wow, cows are hugely butthurt by this, such denial; the game is 1080p** until confirmed by DF. Deal with it.
**based on IW's statement from http://www.eurogamer.net/articles/2013-06-11-call-of-duty-ghosts-runs-at-1080p-and-60fps-on-xbox-one-and-ps4
This is you in a thread where you fought me and several others because you insisted that a June link from Infinity Ward was more valid than a new rumor about the Xbox One version being 720p. Several people told you that the link you were using was old, and you insisted on giving it validity because it served you best. What happened, tell me? What happened to COD Ghosts?
720p on Xbox One; all your crap and arguments are on the floor, because you are a hard-headed lemming with a love for PC specs and chartz.
I don't care what you say and what you want to use selectively, buffoon; all I care about is that, theory by theory, all of your arguments have fallen, including the Sniper Elite 3 ones too... lol
@Jankarcop: What evidence did you provide, other than screenshots that looked worse? Evidence would be: this game has higher levels of geometry in the environment, more polys in each character, a more complex particle system, superior animations. That is an example of evidence, and all of those are advantages for Infamous SS. All you managed to post were shitty screenshots with a bullshit opinion.
@Jankarcop: It doesn't have to be the open-world gfx king. It does look better than GTA V, WD and some of the other titles you listed, which was laughable.
I already said AC Unity maxed out would take the throne. That's because I use logic, as opposed to you, who automatically trashes every game on PS4.
Well, I'm glad I changed your mind about calling SS the open-world gfx king. But GTA V still looks better, according to the facts.
@Jankarcop: The rest of the thread? Cloud imperium_, Krilean_co,odchar, ryuvrc, mk_ professor,Nyadc, Delta? What do they all have in common? RAGING HERMS!
Not one kid that posted in your favor was a neutral poster.
How am I a "raging hermit"? I love my consoles, all 16 of them... But I at least have the ability to look at something for what it is and say exactly what it is unlike you people...
Infamous SS is a good looking game but it's not better looking than the other games I listed or nearly as impressive especially when you consider the difference in scale.
It uses a lot of masking techniques to hide the limitations of the PlayStation 4 much like Driveclub and The Order. They look good at face value, at a glance, but once you stop and really look around the illusion is destroyed.
Hahahaha, now you repeat whatever shit some alter spews in this forum...
So what techniques are used to hide limitations? Hahaha
@nyadc: Having over 4x the polys in the environment and character models isn't an illusion, it's a fact. GPU-accelerated particle effects and higher levels of geometry are not illusions, just technical facts that are more impressive than anything in the games you posted.
As for you loving your consoles, your posting history states otherwise. You love your X1, but you constantly trash the PS4.
@ronvalencia: So all said and done, will DX12 do shit for the Xbox One in terms of performance? And is DX12 more streamlined than PS4's programming interface?
Yes, it will do shit for the X1, lol.
In short, DX12 will impact the X1 as follows:
Full use of the two ACE units, i.e. better compute abilities.
Better efficiency due to reduced CPU overhead, allowing more CPU cycles to go where they're needed.
Introduction of CPU multithreading, allowing better communication between CPU and GPU and implementing parallel processing, both asynchronous and synchronous.
Better management of ESRAM.
The PS4 is able to do what DX12 is introducing, but it's harder and more time consuming, which is the reason why, even on the PS4, the vast majority of games still use the old method: using one main core to feed the GPU, or using most of a main core and offloading non-priority tasks onto other cores. Even the X1's CPU holds back its weak GPU to some degree.
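The async-compute point above can be illustrated with a toy frame-time model (the millisecond figures are invented for illustration): with a single immediate context, compute work queues behind rendering; with an async compute queue, the two overlap and the frame costs only the longer of the two jobs.

```python
# Toy frame-time model of async compute (numbers invented): serially,
# compute work queues behind rendering; with async queues, rendering
# and compute drain concurrently.
def frame_time_serial(render_ms, compute_ms):
    return render_ms + compute_ms          # compute waits behind graphics

def frame_time_async(render_ms, compute_ms):
    return max(render_ms, compute_ms)      # queues overlap

serial_ms = frame_time_serial(12.0, 5.0)   # 17.0 ms: misses a 16.6 ms budget
overlap_ms = frame_time_async(12.0, 5.0)   # 12.0 ms: 60fps budget holds
```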
So you mean unless it's an exclusive, developers won't use multithreading to feed the GPU?
Not necessarily an exclusive, just a publisher and dev team that have the time and money to do so. BF4 on PS4 used async shaders, while X1 and PC didn't, because the current APIs they use, i.e. DX11, don't support it.
If you look at Killzone Shadow Fall's workload allocation, the top thread has CPU draw calls while the other threads have deferred contexts. Tormentos attempted to use Killzone Shadow Fall's workload allocation as proof of concurrent multithreaded draw calls, when the other threads have deferred contexts.
The diagram below shows the Xbox 360's and DX11's multithreading model. DirectX 12 removes the bottleneck of multithreading being serialised into the primary thread.
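That DX11/Xbox 360 model can be sketched as follows (a Python stand-in with illustrative names, not real D3D11 code): workers record deferred command lists in parallel, but every list still has to be replayed through the single immediate context on the primary thread, which is the serial bottleneck DX12 removes.

```python
# Sketch of the DX11 deferred-context model: workers record command
# lists in parallel, but each list must be replayed through the single
# immediate context on the primary thread.
from concurrent.futures import ThreadPoolExecutor

def record_deferred(worker_id, num_draws):
    # Parallel part: each worker fills its own deferred command list.
    return [(worker_id, i) for i in range(num_draws)]

def immediate_replay(deferred_lists):
    # Serial part: one immediate context replays every deferred list,
    # the primary-thread bottleneck DX12 is meant to remove.
    submitted = []
    for command_list in deferred_lists:
        submitted.extend(command_list)
    return submitted

with ThreadPoolExecutor(max_workers=3) as pool:
    deferred = list(pool.map(record_deferred, range(3), [4] * 3))
submitted = immediate_replay(deferred)
```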
From http://gamingbolt.com/project-cars-dev-ps4-single-core-speed-slower-than-high-end-pc-splitting-renderer-across-threads-challenging
On being asked if there was a challenge in development due to different CPU threads and GPU compute units in the PS4, Tudor stated that, “It’s been challenging splitting the renderer further across threads in an even more fine-grained manner – even splitting already-small tasks into 2-3ms chunks. The single-core speed is quite slow compared to a high-end PC though so splitting across cores is essential.
“The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel.
Project Cars on PS4 runs at 1920x1080 at up to 60 fps.
The game's frame-rate is also a sticking point. Project Cars targets an ambitious 60fps on each platform, but the sheer breadth of options gives players the power to determine whether it hits this mark, or drops closer to 30fps. For example, our first race is on the Dubai Autodrome International circuit, a manic 35-car race with light clouds overhead, camera set to interior cockpit view and no damage physics enabled. Even with this number of AI racers, the game sticks to a 60fps line throughout, and only drops for one stretch on the circuit (to 50fps on Xbox One, and 55fps on PS4).
Even with multiple AI racers (a 35-car race), Project Cars on PS4 sticks to around 55 to 60 fps. Concurrently multithreading the GPU feed keeps the minimum frame rate high.
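The approach Tudor describes can be sketched as follows: split a frame's render work into small chunks (standing in for the "2-3ms" tasks he mentions), build the command lists on up to four cores in parallel, and merge the results in a deterministic order. This is a hypothetical illustration of the structure, not Project Cars' actual renderer code.

```python
from concurrent.futures import ThreadPoolExecutor

def record_command_list(chunk):
    # Each worker records one command list for its slice of the draw calls.
    return [("draw", d) for d in chunk]

draws = list(range(1000))            # one frame's worth of draw calls
CHUNK = 125                          # stand-in for a 2-3ms slice of work
chunks = [draws[i:i + CHUNK] for i in range(0, len(draws), CHUNK)]

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves chunk order, so the merged frame is deterministic
    # even though the recording happens concurrently.
    command_lists = list(pool.map(record_command_list, chunks))

frame = [cmd for lst in command_lists for cmd in lst]
print(len(chunks), len(frame))  # 8 1000
```

The fine-grained chunking is what makes the load balance well across a slow-but-wide console CPU: no single task is long enough to leave a core idle near the end of the frame.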
DirectX 12-like features are available on PS4, if the developer chooses to use them.
You can fu**ing quote 30 irrelevant chartz; fact is, you moron, AI is fu**ed on consoles because the damn developers screwed up. Since you don't have a single proof from that developer stating that 30FPS was the max they could get from the Xbox One because of CPU overhead, that is a total joke.
DX11 on Xbox One is not DX11 on PC; it's vanilla DX11 with the horrible overhead gone. The mono driver was particularly bad and developers complained about it, so yeah, the Metro developer worked on a fu**ed up API. Worse, your constant ignorance of the other parts of the article is a joke: MS did have a DX12-like API, and they have tools to mitigate those CPU problems.
Call of Duty: Ghosts runs at 1080p resolution on Xbox One and PlayStation 4, Infinity Ward has said.
@tormentos
That link is older than you... hahaha, and it in no way invalidates the leak about 720p, which came a few days ago, not 4 months ago... hahaha
@ronvalencia said:
IW already stated 1080p for both consoles. Why should IW continually counter some CBOAT statement? This CBOAT person needs to be fired, i.e. the likes of CBOAT make a mockery of signed NDAs.
@ronvalencia said:
First bold part is what this news is based on.
Again, you missed my post's context. I was replying to the following claim by danbo:
"As revealed by Eurogamer in June, the game runs at 60 frames per second in 1080p, upscaled from 720p on the Xbox One. "We actually have a 4K TV at work and got the game running on that," Rubin told the site at the time. "It looks phenomenal. The 4K TVs have a max hertz of 30, so we're maxing it out. It looks amazing!""
The statement above DOES NOT exist in the original http://www.oxm.co.uk/64778/call-of-duty-ghosts-has-fantastic-graphics-on-xbox-one-reiterates-infinity-ward link.
The original text is as follows:
As revealed by Eurogamer in June, the game runs at 60 frames per second in 1080p. "We actually have a 4K TV at work and got the game running on that," Rubin told the site at the time. "It looks phenomenal. The 4K TVs have a max hertz of 30, so we're maxing it out. It looks amazing!"
Can any of you tell me how Panello doesn't know and OXM does? Hahahahaa, all games that release on Xbox pass through a certification process and quality control; this is not new, it's on all consoles.
IW has confirmed sh**. Panello is running all-time-high damage control. Activision is showing the PS4 version, a move without precedent since Activision and MS inked a deal, while the Xbox One version is being hidden like almost all Xbox versions of multiplatform games.
I don't care about innuendo BS. Your innuendo BS doesn't prove sh**.
From http://www.eurogamer.net/articles/2013-06-11-call-of-duty-ghosts-runs-at-1080p-and-60fps-on-xbox-one-and-ps4
IW already stated 1080p for both next-gen consoles.
PS: It would be a LOL episode if Wii U's CoDG has 1280x720p.
@tormentos
Wow, lemmings are hugely butthurt by this, such denial. The game is 720p until confirmed by DF. Deal with it.
Wow, cows are hugely butthurt by this, such denial. The game is 1080p** until confirmed by DF. Deal with it.
**based on IW's statement from http://www.eurogamer.net/articles/2013-06-11-call-of-duty-ghosts-runs-at-1080p-and-60fps-on-xbox-one-and-ps4
This is you in a thread where you fought me and several others because you insisted that a June link from Infinity Ward was more valid than a new rumor about the Xbox One version being 720p. Several people told you the link you were using was old, and you insisted on giving it validity because it served you best. What happened? Tell me. What happened to COD Ghosts?
720p on Xbox One. All your crap and arguments are on the floor, because you are a hard-headed lemming with a love for PC specs and chartz.
I don't care what you say or what you want to use selectively, buffoon. All I care about is that, theory by theory, all of your arguments have fallen, including the Sniper Elite 3 ones too.. lol
Benefits of Direct3D 12 will extend to Xbox One. LOL.
Unlike your slide, my AMD slide is platform specific i.e. Xbox One.
EA DICE statements are platform specific.
For Mantle and PS4, EA DICE statements are platform specific.
My "console" platform-specific slides beat your non-specific "consoles" slide.
My comment on CoD Ghosts was based on IW's initial claim of 1920x1080p; development plans/results can change mid-way.
Being "low level" doesn't automatically result in a DirectX 12-like model for XBO, you fool.
Digital Foundry: To what extent will DX12 prove useful on Xbox One? Isn't there already a >>>>>> LOW CPU overhead <<<<<<<<< there in addressing the GPU?
Oles Shishkovstov: No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.
DirectX 12's bundle improvements reduce CPU usage to 1/3 on XBO, you blind biased fanboy.
The current XBO DX11.X API and MONO driver don't have DirectX 12's resource binding model, you blind biased fanboy.
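The bundle argument comes down to amortization, which a tiny unit-cost sketch can show: a bundle is recorded (and validated) once, then replayed for a near-constant cost each frame, instead of re-issuing every state and draw command per frame. These costs are hypothetical illustrations; the "1/3 CPU usage" figure quoted above is a claimed outcome, not something derived here.

```python
class Bundle:
    """Toy model of a DX12 bundle: pay the recording cost once,
    then replay for a near-constant per-frame cost."""

    def __init__(self, commands):
        self.commands = list(commands)
        self.record_cost = len(self.commands)   # paid once, at record time

    def replay_cost(self):
        return 1                                # near-constant per replay

COMMANDS_PER_OBJECT = 30
FRAMES = 100

# Without bundles: re-issue every command every frame.
reissue_cost = COMMANDS_PER_OBJECT * FRAMES

# With a bundle: record once, then replay each frame.
bundle = Bundle(range(COMMANDS_PER_OBJECT))
bundle_cost = bundle.record_cost + FRAMES * bundle.replay_cost()

print(reissue_cost, bundle_cost)  # 3000 130
```

Over 100 frames, re-issuing 30 commands per frame costs 3000 units versus 130 with the recorded bundle, which is the kind of per-draw CPU saving the bundle model is after.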
My specific, feature-by-feature matching arguments are superior to your non-specific feature arguments.