DICE Dev: Xbox One Isn’t As Powerful As PS4, DX12 Won’t Help Reduce Difference


#251 tormentos
Member since 2003 • 33793 Posts

@kingtito said:

Wrong, cow. With Live you got a lot more than just online play: seamless online, party chat, a fully integrated system. A LOT more than what PSN gave you last gen. Now you have gained NOTHING else with PSN, and in fact now HAVE to pay to play online. So you get the same shitty online, only now you have to pay. As I said, cow, good job pointing that out.

So you claim Live got you a lot more than just online play, but all you name is online play and party chat?

Could you download and use mods made for PC on the Xbox 360?

Did the Xbox 360 allow co-op games with PC?

Did you have a virtual place where you could round up with friends, play free games, and launch your games from?

Why did Live have fewer games on dedicated servers than PSN? Is there a game with 60 players online at once and 8-player co-op on Xbox 360?

Having to pay for online is pathetic on both, but the fact is that for 11 years MS charged for nothing. In fact, on the OG Xbox there was no party chat, just a friends list, and basically all games were P2P, while SOCOM, which was free to play online, ran on dedicated servers back in 2002.

PS+ gives you 6 games monthly for 3 different platforms with 1 account, dude. The value is there now, and it was there when online play was also free.

@04dcarraher said:

Wrong. The vast majority of multiplat devs copy & paste the same usage methods, i.e. single-threaded or deferred, and do not use the PS4's GNM API to the degree that first parties do, or that DICE did with Frostbite using async compute. Once DX12 becomes the norm, the PS4 will see performance gains in multiplats too.

Good to hear, because with Project CARS you tried to imply the versions are coded differently, which is a joke; both games are on an equal field, and even the DX12 gains would carry over to the PS4, as stated by the developer itself. Which means that yes, in a CPU-heavy game, due to draw calls, the PS4 CPU is able to beat the Xbox One with its 7 cores. So much for the Xbox One having an advantage there.

@commander said:

it took 6 months to run The Witcher slightly better on the PS4,

what a joke, it was a slideshow before that.

It could have taken 12 years; the FACT still stands that it wasn't the CPU like you wanted to pretend, it was bad coding...lol

Now the Xbox One version is the slideshow.lol

@WilliamRLBaker said:

As much as we all know that the One isn't as powerful, I tend to take anything said by a DICE dev with a grain of salt...their games aren't known for optimization.

I mean, shit, Battlefield 4 on the PS4 still had issues over the PS3 version at launch.

Yeah, because having shitty netcode has nothing to do with graphics. DICE had an async-shader game on consoles before anyone else, including Sony's studios; it was the first on PC too, and was knee-deep in Mantle, which is basically DX12, dude.

DICE has more experience working with DX12-like code than most developers have this gen; they have 3 async-shader games while 99% of developers don't even have one.

The Xbox One version also had issues..lol

@commander said:

it's a 10 percent overclock. When there is a CPU shortage that's a lot; why do you think the PS4 struggles with framerates sometimes? It's because the CPU gets taxed. Of course you won't see it in Star Wars Battlefront, it's mostly GPU bound.

That extra 100 MHz on all 8 cores has been noticeable; look at Mad Max, Unity, GTA V, Call of Duty and The Witcher 3. The PS4 has a much better GPU than the Xbox One; that extra power on the Xbox doesn't come out of thin air.

Besides, MS has freed up an extra core for games as well, which adds roughly an extra 14 percent, something else Sony can't do; they're just not as good at making operating systems. It's probably the reason why Mad Max is better on the Xbox One.

It's not 100 MHz, it's 150 MHz.

And Mad Max is basically locked at 30 FPS on both. In fact, a video showed the Xbox One dropping to something like 10 FPS; the game slowed down terribly, and it was recorded with the Xbox One's internal recorder. Whether it was a bug or something else, it happened.

Unity = a screwed-up game.

GTA5 drops on Xbox One in shootouts and while driving too, and has less foliage.

COD AW runs online at 60 FPS on PS4 just like on Xbox One, and it's 1080p, unlike the Xbox One's 1360x1080.

And The Witcher 3 is faster on PS4.hahahaahahaaaaaa

You mean the extra core that Project CARS uses, while the Xbox One version is still as much as 14 FPS behind?

Man, 10% more CPU is nothing, especially on a crappy Jaguar.

This should shut you up: look at the difference, 2 frames per second from a 2 GHz drop, from 4.5 GHz to 2.5 GHz..lol

2 freaking GHz of difference and the only gap it causes is a 2-frame drop in The Witcher 3. Those 150 MHz on the Xbox One would do nothing, not even a frame; you know how many times 150 fits into 2000, right?
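For scale, the clock math being argued here works out like this. A quick sketch; the frame-rate figure is the one quoted above, not a new measurement:

```python
# Rough arithmetic behind the clock argument above. Illustrative only.

PS4_CLOCK_MHZ = 1600.0   # PS4 Jaguar cores
XB1_CLOCK_MHZ = 1750.0   # Xbox One Jaguar cores after the upclock

delta_mhz = XB1_CLOCK_MHZ - PS4_CLOCK_MHZ
uplift = delta_mhz / PS4_CLOCK_MHZ
print(f"Xbox One clock advantage: {delta_mhz:.0f} MHz ({uplift:.1%})")

# The PC Witcher 3 datapoint cited above: cutting a fast CPU from
# 4.5 GHz to 2.5 GHz reportedly cost only ~2 fps, i.e. the game was
# mostly GPU-bound in that scene.
pc_cut = (4500.0 - 2500.0) / 4500.0
print(f"PC test clock cut: {pc_cut:.0%} for a ~2 fps loss")
```

So the Xbox One's advantage is a roughly 9% clock bump, an order of magnitude smaller (in relative terms) than the 44% cut that barely moved the PC numbers.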

@StormyJoe said:

@tormentos: I didn't even read your reply. Not one line. I don't care what the biggest cow fanboy has to say.

How pathetic, pretending not to read my post when in fact you did.lol

Your link kills your own argument; you owned yourself: the PS4 API is slimmer than DX12.

And I proved how the PS4 API also had its problems, and how it improved tiling from 10% to 100% with new code in just 4 months. So yeah, your theory has been destroyed, and the PS4 will continue to improve, which means the gap is here to stay.

@EG101 said:

Sorry but this is false.

This gen the difference between multiplats is far smaller than last gen or the gen before that.

Last gen, games like Bayonetta and Skyrim were literally unplayable on the PS3 while they ran fine on the 360 at 720p. This gen the games are mostly identical and play the same; there are no game-breaking differences, and the differences that exist are minor.

Bullshit, and stop with the whole "unplayable" lie about Bayonetta and Skyrim. Just because a developer screwed up a game on PS3 doesn't mean it was the PS3; after all, the PS3 was harder to code for and still had superior games as well.

Whoever claims the difference was bigger last gen is a blind buffoon who doesn't know anything about performance. There is no way the gap from 720p to 640p is even close to the gap between 720p vs 900p, 900p vs 1080p, or 720p vs 1080p. Not even close.

Or to having up to a 30 FPS advantage like Tomb Raider and Sniper Elite.


#252 RR360DD
Member since 2011 • 14099 Posts

Consoles have been out two years and we're still talking about this crap

LOL


#253  Edited By tormentos
Member since 2003 • 33793 Posts
@commander said:

The CPU in both consoles is so weak that with CPU-intensive games you get CPU bottlenecks, so 10 percent extra performance will translate into real-world numbers. The reason the PS4 gets better with patches is GPGPU tools and general optimization.

and yes, some games run with dynamic resolution because either the ESRAM isn't used properly and/or the game is very GPU intensive

Maybe if you were talking about an i5 or i7 running at 4 GHz, since 10% of that is a lot, but 9% of 1.6 GHz is nothing and will do nothing; hell, the 7th core did nothing to resolve the frame issues in Project CARS.

- We stated that high numbers of AI could cause CPU bound scenario's on XB1 and we used the 7th core to eliminate these cases. This is somewhat acknowledged in your updated article.

How is it that SMS, the developer of Project CARS, stated the Xbox One version was getting CPU-bound scenarios while the PS4 didn't?

Project CARS is a draw-call-heavy game, and draw calls are a CPU issue, so yeah, the Xbox One had a problem with them and the PS4 didn't. So much for the 10% plus the extra core doing anything.

The Xbox One CPU is not a 4.0 GHz i5; it's a 1.75 GHz Jaguar, and 10% of that is totally nothing.

@StormyJoe said:

@AzatiS: the reason the PS3 was ridiculed last gen was that while Sony was boasting how powerful it was, it did not have superior multi plats.

The Xbox 360’s CPU has more general purpose processing power because it has three general purpose cores, and Cell has just one.

Xbox 360 has 278.4 GB/s of memory system bandwidth. The PS3 has less than one-fifth of Xbox 360’s (48 GB/s) of total memory system bandwidth.

http://majornelson.com/2005/05/20/xbox-360-vs-ps3-part-1-of-4/

Yeah, let's pretend this ^ didn't happen. That is not some blind lunatic fanboy, that is Major Nelson on May 20, 2005, implying the Xbox 360 was more powerful than the PS3, making shitty-ass comparisons like "the Xbox 360 CPU has more processing power than Cell because it has 3 cores vs Sony's 1 core." Yeah, let's totally ignore how he sidestepped the SPEs, which produce most of Cell's performance and were the reason the PS3 beat the Xbox 360 graphics-wise.

That funky math, which they also used this gen, added bandwidth figures together as if they were oranges, which was totally stupid. The PS3 with 48 GB/s total owned the Xbox 360 with its fake 278 GB/s; if we are to believe MS's numbers, the Xbox 360 had more bandwidth than the Xbox One.
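For reference, the arithmetic behind that 278.4 GB/s figure is easy to reproduce from the commonly cited 2005 specs (the 256 GB/s number was internal to the eDRAM daughter die; the GPU's actual link to it was 32 GB/s):

```python
# Where the 278.4 GB/s "total system bandwidth" figure comes from,
# using the commonly cited Xbox 360 / PS3 specs.

XB360_GDDR3 = 22.4            # GB/s, Xbox 360 main memory
XB360_EDRAM_INTERNAL = 256.0  # GB/s, *inside* the eDRAM daughter die
XB360_EDRAM_LINK = 32.0       # GB/s, the actual GPU <-> eDRAM link

marketing_total = XB360_GDDR3 + XB360_EDRAM_INTERNAL
print(f"360 'total': {marketing_total:.1f} GB/s")  # the 278.4 figure

PS3_XDR, PS3_GDDR3 = 25.6, 22.4
ps3_total = PS3_XDR + PS3_GDDR3
print(f"PS3 'total': {ps3_total:.1f} GB/s")  # the 48 figure

# The sum is apples and oranges: the 256 GB/s was internal to the
# daughter die, while the GPU itself only saw the 32 GB/s link.
```

Summed that way, a bus nothing else in the system could reach gets counted as "system bandwidth", which is the sleight of hand being complained about here.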

It was MS who started that shit, and the only reason most multiplatforms were better on 360 was that the 360 was the lead platform and was easier to code for. If the PS3 had been as easy as the PS4, it would have had far more superior exclusives, because it had the power to do so; the Xbox One doesn't.

@blackace said:

This is pretty funny coming from a developer who can't seem to get a FPS to run at 1080P on the PS4 even though other developers have done it with no problems. DX12 will make some differences. Someone should have asked DICE if they will ever be able to make a console game that runs smoothly at 1080P 60fps.

You are a buffoon. What games have the GPU load that BF4 or Star Wars Battlefront have and run at 1080p 60 FPS on PS4?

DICE has more experience with DX12-like code than any developer in the industry, you sorry excuse for a fanboy. If you don't know what the hell you are talking about, please dude, don't comment; your comments are nothing but pure damage control.

How many 1080p 60 FPS games are there this gen without problems?

@StormyJoe said:

@blackace: especially since the Developer of Caffeine said DX12 improves XB1's performance, but some people just can't accept it

http://www.gamezone.com/news/dx12-adding-20-performance-increase-on-pc-even-more-on-xbox-one-3423420

"Oh it should help a lot. I've been in the DirectX 12 developer program for a while."

PR bullshit, like always.

The 20% increase he mentioned was also only for PC, with the Xbox One seeing a higher percentage.

That is where the lie comes out: it's the same shit Brad Wardell claimed, which many developers have proven wrong. The biggest beneficiary of DX12 is the PC, not the Xbox One, and the reason is obvious: most of DX12 was already in the Xbox One.

That, people, is doing PR.

@blackace said:

One dev shuts the door, while other devs who are saying the complete opposite are ignored? LOL!! Ok. Every PC developers will be using Win10/DX12 in the very near future. It won't matter, until it matters. We'll see what happens in 2016.

Other devs seem to have no problems, except for Konami, but at least they can get 1080P on the PS4. The Frostbite and Fox engines are crap anyways. HUMA info?? Don't remember that.

Oh, this is too good..hahahahaaaaa

Nothing has gone wrong. All the games that were currently released are using the older dev kits and old firmware. Let's see how the games turn out after the E3 and in 2015. Then ask that same question again.

Blackace in February 2014...lol

Posted by blackace (21916 posts) - 1 year, 11 months ago

@Shewgenja said:

RIP XBone. We never really got to know you. LOL

You will when they start using the new devkits.

Blackace in 2013.

Edited by blackace (21916 posts) - 1 year, 11 months ago

I could care less what resolution it is, as I'm not buying the game. Both CoD & BF4 have large patches for the XBox One version. Why. Probably to change the resolution from 720P/900P to 1080P. I told you fools the devs didn't get the final XB1 DevKit until the end of Aug. Devs are pissed and probably don't have enough time to update all the games. The 2014 E3 is going to be so much fun. I can't wait.

Blackace in 2013...lol Not only did his prediction of a patch fail, but look at his waiting game.

Now it's "wait for 2016"...hahahahahaaa


#254 Zero_epyon
Member since 2004 • 20502 Posts
@tormentos said:
*snip*

Gonna need a senzu for that.


#255  Edited By commander
Member since 2010 • 16217 Posts

@daious said:

So your argument is that you show a benchmark, pick the weakest CPU (with a game that isn't properly optimized for CPU utilization), and state the obvious, that the weakest (30 buck) CPU isn't doing much in a build with over a thousand dollars' worth of GPU, without showing what difference the weakest CPU makes with a slight overclock, to give anything even close to a comparison relevant to your topic? If you are going to make a comparison, show the difference between different clocks. You are picking CPUs with different IPC and pretending to have an argument about the difference the MHz make. All this using a game that wasn't properly coded for CPU utilization, giving it an artificial increase in the CPU bottleneck. A game so horribly coded that they got attacked by gamers everywhere. A game so badly optimized that the developers and Ubisoft started placing the blame on CPU/GPU hardware manufacturers.

Is that a joke?

Beyond the major issues with that game, couldn't someone cherry-pick random benchmarks, pair a 1000-dollar CPU (an octo-core i7) with a ~50-dollar GPU, and say the same thing about the GPU when trying to push a game at a certain setting? Anyone can make any point doing something like this unless the comparisons are actually relevant.

Find relevant comparisons with the exact same CPU at slightly different overclocks while running a CPU-intensive game that is properly coded for it. Even then it wouldn't be completely relevant, seeing that PS4 games and Xbox One games run at different settings; the PS4 runs most games at higher visual settings than the Xbox One, thereby making the Xbox One version less demanding visually. We don't know how the PS4 would run a game at the same settings as the Xbox One version.

On a side note: you made a huge deal about The Witcher 3 with multiple patch threads about performance, and you seem to be completely disregarding it now. What I am curious about is your opinion: doesn't the optimization mean that the PS4 is capable of providing better framerates (circumventing the slightly weaker CPU issue) while offering a massive increase in visual fidelity? Similar framerates and higher visual fidelity, which includes higher resolution? What is your metric for better hardware?

The fact is that these are relevant comparisons. The FX-4100 actually has more CPU resources available than the PS4's CPU, not to mention that 1/4 (2 out of 8 cores) are reserved for the OS. To compare, you would take an FX-4100, disable one core, and then combine it with an HD 7850, but it still wouldn't give a correct image of the console version.

Simply because consoles are not the same as PCs, even these ones. Consoles use custom detail settings and have less overhead, but it's more than that.

The architecture is still custom, and we all know that PC games mostly prefer faster cores over more cores. So even the FX-4100 with one core disabled would be a bad comparison. You would have to downclock an FX-8150 to 1.6 GHz, disable one core, kind of guess that one core would be enough for the OS and that the rest would be comparable to the PS4's CPU power, and even that wouldn't be enough. We don't even know if the PC game is as optimized as the console game to run on 6 cores, not to mention the quest for the right detail settings and the right numbers for the overhead.

The fact is that framerate drops have been found in CPU-intensive areas in games like GTA V and AC Unity. It's crystal clear, and also confirmed by different review sites, that this is because the CPU is bottlenecking. That's why I used that bench. I know you're going to say "the GTX 980 SLI and the FX-4100...", but like I said before, the bottleneck shows with higher-tier CPUs as well, like the i5-2500 and i5-4670, and those are multiple times faster than the PS4's CPU. It's not the same difference as between the GTX 980 SLI and the PS4's GPU, true, but the game is running at much higher detail settings than the PS4 version as well.

Of course I know this isn't a perfect comparison when you look at hardware and detail settings, but that isn't the point here. The point is that within a CPU bottleneck, a CPU performance increase will translate directly into framerates if there isn't a GPU bottleneck, and that's exactly the case with the PS4 in certain games. The CPU is so weak that with CPU-intensive tasks a bottleneck manifests itself.
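That last point can be put as a toy model (all frame times here are invented for illustration): a frame takes as long as the slower of the two units, so a CPU uplift only registers when the CPU is the long pole.

```python
# Toy bottleneck model: the frame takes as long as the slower of the
# two units. All millisecond figures are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound scene (the GTA V / AC Unity style case):
base         = fps(40.0, 25.0)         # 25.0 fps, CPU is the long pole
cpu_10pct_up = fps(40.0 / 1.10, 25.0)  # ~27.5 fps: the uplift shows directly

# GPU-bound scene (the Battlefront style case):
gpu_bound      = fps(20.0, 33.0)
gpu_bound_fast = fps(20.0 / 1.10, 33.0)  # unchanged: the uplift is invisible

print(base, cpu_10pct_up, gpu_bound, gpu_bound_fast)
```

Both sides of the thread's argument fall out of this one `max()`: a clock bump helps exactly in the CPU-bound scenes and does nothing in the GPU-bound ones.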


#256 commander
Member since 2010 • 16217 Posts

@daious:

As for your comment on The Witcher 3: yes, the PS4 is able to use GPGPU tools to counter the CPU bottleneck; that is what this whole gen was about. But not every CPU task can be done through GPGPU, which is quite clear in AC Unity. GPGPU work asks for GPU power as well, so the PS4's resolution advantage could easily be nullified, which is also clear in AC Unity. It's also apparently a lot of work, otherwise it wouldn't take this long for games to get patched.

If you had asked me a year ago which system had better hardware, I would have said PS4, hands down, but now it's really hard to make up my mind. I mean, the PS4 crushes the X1 in a lot of games in terms of resolution and/or framerate, but in some games, and these are mostly the most demanding games, the X1 is, especially at release, simply better in framerates.

We can talk all day, but that's not how I picture "the most powerful console".
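The GPGPU trade-off described above can be sketched the same way. All numbers here, including the 1.5x offload cost factor, are assumptions for illustration (GPUs don't run branchy CPU work at parity):

```python
# Sketch of the GPGPU trade-off: offloading CPU work relieves the CPU
# bottleneck but spends GPU time that could have gone to resolution.
# Every number, including the 1.5x cost factor, is an assumption.

def frame_ms(cpu_ms: float, gpu_ms: float, offload_ms: float,
             gpu_cost: float = 1.5) -> float:
    """Each offloaded CPU millisecond costs `gpu_cost` ms of GPU time."""
    return max(cpu_ms - offload_ms, gpu_ms + offload_ms * gpu_cost)

before = frame_ms(40.0, 25.0, 0.0)  # CPU-bound at 40 ms (25 fps)
after  = frame_ms(40.0, 25.0, 6.0)  # both units meet at 34 ms (~29 fps)

print(before, after)  # faster frame, but the spare GPU headroom is spent
```

Which is the point being made: the patch helps the framerate, but the GPU budget that bought the resolution advantage is what pays for it.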


#257 Chutebox
Member since 2007 • 51605 Posts

Damn, fucking laughing at clown blackace getting rolled hard haha


#258 FoxbatAlpha
Member since 2009 • 10669 Posts

Lots of developers have spoken about the benefits of DX12 on the Xbox One. It seems fitting that cows run with this story and make it gospel. Proof that cows are scared of Teh 12 and more proof they only believe what they want.


#259 scatteh316
Member since 2004 • 10273 Posts

@FoxbatAlpha said:

Lots of developers have spoken about the benefits of DX12 on the Xbox One. It seems fitting that cows run with this story and make it gospel. Proof that cows are scared of Teh 12 and more proof they only believe what they want.

And loads have said it will make sod all difference..... One of which is Phil Spencer.


#260 FoxbatAlpha
Member since 2009 • 10669 Posts

@scatteh316 said:
@FoxbatAlpha said:

Lots of developers have spoken about the benefits of DX12 on the Xbox One. It seems fitting that cows run with this story and make it gospel. Proof that cows are scared of Teh 12 and more proof they only believe what they want.

And loads have said it will make sod all difference..... One of which is Phil Spencer.

Phil said it would make a difference. How much is yet to be seen. He did say it wouldn't be huge or anything. That is vastly different from this DICE troll who says NO difference at all.


#261 LJS9502_basic
Member since 2003 • 180203 Posts

ITT the clueless try talking the talk about tech for their machine. If only they had the knowledge to do so. LOL


#262 magmadragoonx4
Member since 2015 • 697 Posts

I honestly dismissed the Witcher devs saying DX12 won't make a difference, but DICE is pretty much as good as it gets, soooooo...


#263  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

@commander said:

it took 6 months to run The Witcher slightly better on the PS4,

what a joke, it was a slideshow before that.

It could have taken 12 years; the FACT still stands that it wasn't the CPU like you wanted to pretend, it was bad coding...lol

Now the Xbox One version is the slideshow.lol

Ok, have fun playing Fallout 4 at playable framerates in 2027 lmao


#264  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@tormentos:

listen, you rabid cow, Project CARS is not the normal multiplat, you fool; they took the time to bypass DX11's limitations: "It's been challenging splitting the renderer further across threads in an even more fine-grained..." and "The bottlenecks are mainly in command list building – we now have this split up across up to four cores in parallel". In stock DX11 you can't run multiple cores in parallel splitting the renderer to the GPU.
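What SMS describes, splitting command-list building across cores, can be sketched with a back-of-the-envelope model; the per-call cost and merge overhead below are invented numbers, not Project CARS measurements:

```python
# Toy model of parallel command-list building: serial recording
# (DX11-style immediate context) vs recording split across cores and
# stitched together at submit time. All costs are invented.

def build_ms(draw_calls: int, us_per_call: float,
             cores: int = 1, merge_ms: float = 0.3) -> float:
    serial_ms = draw_calls * us_per_call / 1000.0
    if cores == 1:
        return serial_ms                 # one thread records everything
    return serial_ms / cores + merge_ms  # split recording + stitch cost

one_core  = build_ms(10_000, 2.0)           # 20.0 ms: blows a 16.6 ms budget
four_core = build_ms(10_000, 2.0, cores=4)  # 5.3 ms: fits comfortably

print(one_core, four_core)
```

That is the whole draw-call argument in miniature: the same workload that is CPU-bound on one recording thread fits a 60 fps frame budget once it is split four ways.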


#265  Edited By StormyJoe
Member since 2011 • 7806 Posts

@tormentos: Again, didn't read a single line. I don't care what the biggest cow fanboy in SW says...


#266 Flyincloud1116
Member since 2014 • 6418 Posts

@tormentos: Xbox1 delegates were spreading lies all over the place.


#267 Suppaman100
Member since 2013 • 5250 Posts

@Zero_epyon said:
*snip*

Gonna need a senzu for that.

Fcking gold man, LMFAO!


#268 Daious
Member since 2013 • 2315 Posts

@commander said:

*snip*

I stopped reading when I saw you are still talking about ACU


#269  Edited By commander
Member since 2010 • 16217 Posts

@daious said:
*snip*

I stopped reading when I saw you are still talking about ACU

it's in other games as well. The Witcher 3 may have been able to surpass the X1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you wanted to play the game at release


#270 ToScA-
Member since 2006 • 5783 Posts

@tormentos said:
@commander said:

The cpu in both consoles is so weak that with cpu intensive games you get cpu bottlenecks, so 10 extra performance will translate in real world numbers. The reason why the ps4 gets better with patches is because of gpgpu tools and general optimization.

and yes some games run with dynamic resolution because either the esram isn't used properly and/or the game is very gpu intensive

Probably if you were talking about a i5 or i7 running at 4ghz so 10% of that is allot but 9% of 1.6Ghz is nothing and will do nothing hell the 7core did nothing to resolve the frame issues on Project Cars.

- We stated that high numbers of AI could cause CPU bound scenario's on XB1 and we used the 7th core to eliminate these cases. This is somewhat acknowledged in your updated article.

How is that SMS developer of Project Cars state the xbox one version was getting CPU bound scenarios while the PS4 didn't.?

PC is a draw call heavy game draw calls are CPU issue so yeah the xbox one had a problem with them the PS4 didn't,so much for 10% + the extra core doing anything.

The xbox one CPU is not a 4.0ghz i5,is a 1.75ghz Jaguar 10% is totally nothing.

@StormyJoe said:

@AzatiS: the reason the PS3 was ridiculed last gen was that while Sony was boasting how powerful it was, it did not have superior multi plats.

The Xbox 360’s CPU has more general purpose processing power because it has three general purpose cores, and Cell has just one.

Xbox 360 has 278.4 GB/s of memory system bandwidth. The PS3 has less than one-fifth of Xbox 360’s (48 GB/s) of total memory system bandwidth.

http://majornelson.com/2005/05/20/xbox-360-vs-ps3-part-1-of-4/

Yeah lest pretend this ^ didn't happen,that is not some bling lunatic fanboy that is Major Nelson on may 20 2005 implaying thexbox 360 was more powerful than the PS3,making shitty ass comparison like the xbox 360 CPU has more processing power than Cel because it had 3 cores vs sony 1 core yeah lets totaly ignore how he side stepped SPE which produce most of the performance and which were the reason for the PS3 beating the xbox 360 graphics wise.

That funky math which they also use this gen,adding bandwidth as if it was oranges which was totally stupid the PS3 with 48GB/s total onwed the xbox 360 with its fake 278gb/s if we are to believe MS lies the xbox 360 had more bandwidth than the xbox one.

It was MS whon started that shit and the only reason why most multiplatforms were better on 360 was because the 360 was the lead platform and was easier,if the PS3 would have being as easy as the PS4 the PS3 would have much more superior exclusives because it had the power to do so the xbox one doesn't.

@blackace said:

This is pretty funny coming from a developer who can't seem to get a FPS to run at 1080P on the PS4 even though other developers have done it with no problems. DX12 will make some differences. Someone should have asked DICE if they will ever be able to make a console game that runs smoothly at 1080P 60fps.

You are buffon,what games have the GPU load BF4 or Star wars BF3 have and are 1080p 60FPS on PS4.?

Dice has more experience with DX12 like code that any developer in the industry you sorry excuse for a fanboy,if you don't know what the hell you are talking about please dude don't comment,your comments are nothing but pure damage control.

How many 1080p 60FPS games are on this gen without problems.?

@StormyJoe said:

@blackace: especially since the Developer of Caffeine said DX12 improves XB1's performance, but some people just can't accept it

http://www.gamezone.com/news/dx12-adding-20-performance-increase-on-pc-even-more-on-xbox-one-3423420

"Oh it should help a lot. I've been in the DirectX 12 developer program for a while.

PR bullshit like always.

The 20% increase he mentioned was also only for PC, with the Xbox One seeing a higher percentage.

Here is where the lie comes out: the same shit Brad Wardell claimed, which many developers have proven wrong. The biggest beneficiary of DX12 is the PC, not the Xbox One, and the reason is obvious: most of DX12 was already on Xbox One.

That, people, is doing PR.

@blackace said:

One dev shuts the door, while other devs who are saying the complete opposite are ignored? LOL!! Ok. Every PC developers will be using Win10/DX12 in the very near future. It won't matter, until it matters. We'll see what happens in 2016.

Other devs seem to have no problems, except for Konami, but at least they can get 1080P on the PS4. The Frostbite and Fox engines are crap anyways. HUMA info?? Don't remember that.

Oh this is too good..hahahahaaaaa

Nothing has gone wrong. All the games that were currently released are using the older dev kits and old firmware. Let's see how the games turn out after the E3 and in 2015. Then ask that same question again.

Blackace in February 2014...lol

Posted by blackace (21916 posts) - 1 year, 11 months ago

@Shewgenja said:

RIP XBone. We never really got to know you. LOL

You will when they start using the new devkits.

Blackace in 2013.

Edited by blackace (21916 posts) - 1 year, 11 months ago

I could care less what resolution it is, as I'm not buying the game. Both CoD & BF4 have large patches for the XBox One version. Why. Probably to change the resolution from 720P/900P to 1080P. I told you fools the devs didn't get the final XB1 DevKit until the end of Aug. Devs are pissed and probably don't have enough time to update all the games. The 2014 E3 is going to be so much fun. I can't wait.

Blackace in 2013...lol Not only did his predicted patch never come, but look at his waiting game.

Now it's wait for 2016...hahahahahaaa

LOL. Blackace's career as a system warrior came to an end with this post.

Avatar image for Sagemode87
Sagemode87

3438

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#271 Sagemode87
Member since 2013 • 3438 Posts

No surprise here; only diehard lems seem to think there will be a difference.

Avatar image for imt558
imt558

976

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#272  Edited By imt558
Member since 2004 • 976 Posts

@commander said:

it's in other games as well. The witcher 3 may have been able to surpass the x1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release

PCars is a demanding game as well (very CPU heavy). The framerate still isn't fixed on Xbone after 5 and a half months. Kinda shitty!

Avatar image for the_last_ride
The_Last_Ride

76371

Forum Posts

0

Wiki Points

0

Followers

Reviews: 122

User Lists: 2

#273 The_Last_Ride
Member since 2004 • 76371 Posts

@imt558: so much salt

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#274 commander
Member since 2010 • 16217 Posts

@imt558 said:
@commander said:

it's in other games as well. The witcher 3 may have been able to surpass the x1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release

PCars is also a demanding game as well ( very CPU heavy ). Framerate still isn't fixed on Xbone after 5 and a half months. Kinda shitty!

PCars still runs well above 30 fps lmao. That wasn't the case with The Witcher.

Avatar image for imt558
imt558

976

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#275  Edited By imt558
Member since 2004 • 976 Posts

@The_Last_Ride said:

@imt558: so much salt

I'm salty because PCARS isn't fixed yet on Xbone? What a dumbass. LOL!

Avatar image for imt558
imt558

976

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#276  Edited By imt558
Member since 2004 • 976 Posts
@commander said:

cars stil runs well above 30 fps lmao. That wasn't the case with the witcher

So, the Xbone has a few frame drops from 30fps in Witcher 3 and that's OK, but PCARS drops more than 20 frames from 60 in some cases. But hey, it's OK when games run above 30. LOL

What a dumb excuse. No surprise from an Xboner.

Avatar image for Zero_epyon
Zero_epyon

20502

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#277 Zero_epyon
Member since 2004 • 20502 Posts

@Suppaman100: Gotta love TeamFourStar

Avatar image for prawephet
Prawephet

385

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#278 Prawephet
Member since 2014 • 385 Posts

@1080pOnly: Actually last gen the PS3 had the best looking exclusives. Sony fans just gave up on multiplats because the 360 was the lead platform and the ports always sucked.

The PS3 also ended up with more sales after coming back from a 10 million deficit (maybe it's behind again now; it was so close at the end that the lead bounced back and forth).

Now the ps4 has the better sales and the better looking games as well as being the lead dev platform. This is why sales do matter.

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#279  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:
@daious said:
@commander said:

@daious:

As for your comment on the witcher 3. Yes the ps4 is capable to use gpgpu tools to counter the cpu bottleneck, that was what this whole gen was about. But not every cpu task can be done by gpgpu tools, which is quite clear in ac unity. Gpgpu tools do ask for gpu power as well, so the ps4 resolution advantage could easily be nullified, which is also clear in ac unity. It's also a lot of work apparently otherwise it wouldn't take that long to get games patches.

If you asked me the question what system had better hardware 1 year ago, i would have said ps4, hands down but now it's really hard to make up my mind. I mean the ps4 crushes the x1 in a lot of games in terms of resolution and/or framerate but in some games, and these are mostly the most demanding games, the x1 is, especially at release, simply better in framerates.

We can talk all day but that's not how I picture 'the most powerfull console'.

I stopped reading when I saw you are still talking about ACU

it's in other games as well. The witcher 3 may have been able to surpass the x1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release

You made it clear with your tons of patch threads that that would never happen with The Witcher 3.

Now the PS4 has a 40% resolution increase and a frame rate advantage, and all you do is downplay. Your hypocrisy is ridiculous. The fact that a properly coded game can match framerates while running at overall higher visual settings says a lot.

You make a huge deal about a few frames and always neglect any other advantage the PS4 has, because you feel like cherry-picking.

All your arguments ever amount to is cherry-picking. All you have to rely on is one of the worst developers when it comes to optimization (Ubisoft).

You also downplay DICE while hyping up Ubisoft developers/games? Is that a joke? You can't just ignore evidence that doesn't support your claim and cherry-pick horrible examples that do.

All I am asking is for you to formulate a coherent argument and stop with your ridiculous damage control. How many people here are taking you seriously? I mean, I get it, you're a fanboy, but come on.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#280 commander
Member since 2010 • 16217 Posts

@daious said:
@commander said:
@daious said:
@commander said:

@daious:

As for your comment on the witcher 3. Yes the ps4 is capable to use gpgpu tools to counter the cpu bottleneck, that was what this whole gen was about. But not every cpu task can be done by gpgpu tools, which is quite clear in ac unity. Gpgpu tools do ask for gpu power as well, so the ps4 resolution advantage could easily be nullified, which is also clear in ac unity. It's also a lot of work apparently otherwise it wouldn't take that long to get games patches.

If you asked me the question what system had better hardware 1 year ago, i would have said ps4, hands down but now it's really hard to make up my mind. I mean the ps4 crushes the x1 in a lot of games in terms of resolution and/or framerate but in some games, and these are mostly the most demanding games, the x1 is, especially at release, simply better in framerates.

We can talk all day but that's not how I picture 'the most powerfull console'.

I stopped reading when I saw you are still talking about ACU

it's in other games as well. The witcher 3 may have been able to surpass the x1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release

You made it clear with your tons of patch threads that would never happen with Witcher 3.

Now PS4 has a 40% resolution increase and a frame rate advantage. Now, all you do is downplay. Your hypocrisy is ridiculous. The fact that a properly coded game can match framerates while overall higher visual settings says a lot.

You make a huge deal about a few frames and always neglect any other advantage the ps4 has because you feel like cherrypicking.

All your argument ever are are cherry picking. All you have to rely is one of the worst developers when it comes to optimization (ubisoft).

You also downplay Dice while hyping up Ubisoft developers/games? Is that a joke? You can't just ignore evidence that doesn't support your claim and cherry pick horrible examples that do.

All I am asking is for you to actually formulate a coherent argument and stop with your ridiculous damage control. How many people here are taking you serious? I mean I get it your a fanboy but come on.

It was never such a big problem on the X1; that's why the dev didn't optimize the X1 like the PS4. If they had, it would have increased the framerates as well. Besides, it's not like The Witcher 3 runs that much better than on the X1; the difference is much smaller than it was when the X1 was on top.

As for the resolution, OK, we all know the PS4 has a much better graphics card and there was still room for 1080p in The Witcher 3; after all, The Witcher 3 isn't as CPU intensive as AC Unity.

I know you think AC Unity is an exception, and it is, but not because of coding. It's because it's so CPU intensive with all the NPCs. It's going to become very clear whether Unity was that badly coded when Syndicate releases, and that is very soon, so you may want to be careful what you say.

Of course I'm downplaying DICE; those games aren't CPU intensive at all, and they still use an engine that was used on the 360 and Xbox One. This is not damage control; 6 months of patches don't need damage control. You better pray AC Syndicate isn't going to be a slideshow again on the PS4.

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#281  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:
@daious said:
@commander said:
@daious said:

I stopped reading when I saw you are still talking about ACU

it's in other games as well. The witcher 3 may have been able to surpass the x1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release

You made it clear with your tons of patch threads that would never happen with Witcher 3.

Now PS4 has a 40% resolution increase and a frame rate advantage. Now, all you do is downplay. Your hypocrisy is ridiculous. The fact that a properly coded game can match framerates while overall higher visual settings says a lot.

You make a huge deal about a few frames and always neglect any other advantage the ps4 has because you feel like cherrypicking.

All your argument ever are are cherry picking. All you have to rely is one of the worst developers when it comes to optimization (ubisoft).

You also downplay Dice while hyping up Ubisoft developers/games? Is that a joke? You can't just ignore evidence that doesn't support your claim and cherry pick horrible examples that do.

All I am asking is for you to actually formulate a coherent argument and stop with your ridiculous damage control. How many people here are taking you serious? I mean I get it your a fanboy but come on.

It was never such a big problem with the x1, that's why the dev didn't optimize the x1 like the ps4, if they had it would have increased the framerates as well. Besides it's not like the witcher 3 runs that much better than the x1, the difference is much smaller than it was when the x1 was on top.

As for the resolution, ok, we all know the ps4 has a much better graphics card and there was still room for 1080p on the witcher 3, after all the witcher 3 isn't that cpu intensive like ac unity.

I know you think ac unity is an exception, and it is, but not because of coding. It's because it's so cpu intensive with all the npc's. It's going to become very clear wether unity was that badly coded when syndicate releases and that is very soon, so you may want to be carefull what you say.

Of course I'm downplaying dice, those games aren't cpu intensive at all and they still use an engine that was used on teh 360 and xboxone. This is not damage control. 6 months of patches don't need damage control. You better pray ac syndicate isn't going to be a slideshow again on the ps4.

Every time the Xbox One has a slight frame rate advantage while running a game at lower settings, you go full-on rabid fanboy mode. When a PS4 game does it, you just load on the excuses.

Just pure damage control and hypocrisy.

DICE games not CPU intensive? Lol. Their multiplayer games like BF4 conquest large are some of the most CPU intensive games outside of RTS.

I remember the last time I said that, you posted benchmarks of an empty BF4 map because you can't read.

Lol worthy that you are still hyping up Ubisoft games, one of the worst developers out there for optimization all around. Thanks for the lols. Gotta love fanboys.

Avatar image for scatteh316
scatteh316

10273

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#282 scatteh316
Member since 2004 • 10273 Posts

@commander said:
@daious said:
@commander said:
@daious said:
@commander said:

@daious:

As for your comment on the witcher 3. Yes the ps4 is capable to use gpgpu tools to counter the cpu bottleneck, that was what this whole gen was about. But not every cpu task can be done by gpgpu tools, which is quite clear in ac unity. Gpgpu tools do ask for gpu power as well, so the ps4 resolution advantage could easily be nullified, which is also clear in ac unity. It's also a lot of work apparently otherwise it wouldn't take that long to get games patches.

If you asked me the question what system had better hardware 1 year ago, i would have said ps4, hands down but now it's really hard to make up my mind. I mean the ps4 crushes the x1 in a lot of games in terms of resolution and/or framerate but in some games, and these are mostly the most demanding games, the x1 is, especially at release, simply better in framerates.

We can talk all day but that's not how I picture 'the most powerfull console'.

I stopped reading when I saw you are still talking about ACU

it's in other games as well. The witcher 3 may have been able to surpass the x1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release

You made it clear with your tons of patch threads that would never happen with Witcher 3.

Now PS4 has a 40% resolution increase and a frame rate advantage. Now, all you do is downplay. Your hypocrisy is ridiculous. The fact that a properly coded game can match framerates while overall higher visual settings says a lot.

You make a huge deal about a few frames and always neglect any other advantage the ps4 has because you feel like cherrypicking.

All your argument ever are are cherry picking. All you have to rely is one of the worst developers when it comes to optimization (ubisoft).

You also downplay Dice while hyping up Ubisoft developers/games? Is that a joke? You can't just ignore evidence that doesn't support your claim and cherry pick horrible examples that do.

All I am asking is for you to actually formulate a coherent argument and stop with your ridiculous damage control. How many people here are taking you serious? I mean I get it your a fanboy but come on.

It was never such a big problem with the x1, that's why the dev didn't optimize the x1 like the ps4, if they had it would have increased the framerates as well. Besides it's not like the witcher 3 runs that much better than the x1, the difference is much smaller than it was when the x1 was on top.

As for the resolution, ok, we all know the ps4 has a much better graphics card and there was still room for 1080p on the witcher 3, after all the witcher 3 isn't that cpu intensive like ac unity.

I know you think ac unity is an exception, and it is, but not because of coding. It's because it's so cpu intensive with all the npc's. It's going to become very clear wether unity was that badly coded when syndicate releases and that is very soon, so you may want to be carefull what you say.

Of course I'm downplaying dice, those games aren't cpu intensive at all and they still use an engine that was used on teh 360 and xboxone. This is not damage control. 6 months of patches don't need damage control. You better pray ac syndicate isn't going to be a slideshow again on the ps4.

DICE games aren't CPU intensive?? Have you looked at the benchmarks on PC???

Check out this guy....lol

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#283 tormentos
Member since 2003 • 33793 Posts

@04dcarraher said:

@tormentos:

listen you rabid cow, project cars is not the normal multiplat you fool they took the time to bypass DX's limitations.... "“It’s been challenging splitting the renderer further across threads in an even more fine-grained" . "The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel" In DX you cant run multiple core in parallel splitting the renderer to the gpu.

“It’s been challenging splitting the renderer further across threads in an even more fine-grained manner – even splitting already-small tasks into 2-3ms chunks. The single-core speed is quite slow compared to a high-end PC though so splitting across cores is essential.

“The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel. There are still some bottlenecks to work out with memory flushing to garlic, even after changing to LCUE, the memory copying is still significant.”

http://www.cinemablend.com/games/Project-CARS-Hits-Some-Hardware-Limits-Xbox-One-PS4-61215.html

You are a fool. The interview that quote was taken from is about both the PS4 and the Xbox One, not just the PS4, you silly lemming.

But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table.

They added tons of features just to work around limitations of the DX11 API model.

They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.

http://www.eurogamer.net/articles/digitalfoundry-2014-metro-redux-what-its-really-like-to-make-a-multi-platform-game

The problem with you is simple: you don't know how to read, and you are too slow to understand that DX11.X on Xbox One is not fu**ing DX11 on PC, not even CLOSE. Not only did MS already have tools to work around LIMITATIONS of DX11 on Xbox One, they even fu**ing have a GNM-style API for do-it-yourself code like the PS4's, which the developer of Metro could not use because of time.

That interview is more than a year old; in fact it is older than the Xbox One holiday update which unlocked the 7th core that Project CARS uses. So I am sure that GNM-style API has been used on Xbox One, because it was there, just like they used that core to deliver the best possible version they could on Xbox One.

So yeah, you are the rabid fanboy, and the one who refuses to see that DX11 on Xbox One has never been like DX11 on PC, not even close.
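For what the "GNM-style do-it-yourself API" vs. DX11 contrast means in practice: on a thin API a draw call is little more than appending a few words to a command buffer, while a thick runtime does bookkeeping on every single call. A crude caricature in plain Python (all function and field names here are hypothetical, not any real API):

```python
def thin_draw(cmd_buf, mesh):
    # Console-style thin API: a draw is just a few words appended to the
    # command buffer; the developer takes on the bookkeeping themselves.
    cmd_buf.append(("draw", mesh))

def thick_draw(cmd_buf, mesh, state):
    # DX11-style thick runtime (caricature): hazard tracking, reference
    # counting and validation run on every call before a single word
    # reaches the command buffer.
    state["refs"][mesh] = state["refs"].get(mesh, 0) + 1
    if state["refs"][mesh] > state.get("max_refs", 1_000_000):
        raise RuntimeError("resource leak")
    state["validated"] += 1
    cmd_buf.append(("draw", mesh))
```

Both paths end up appending the same command; the thick one just burns CPU on every call, which is why draw-call-heavy games hit CPU limits first.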

@StormyJoe said:

@tormentos: Again, didn't read a single line. I don't care what the biggest cow fanboy in SW says...

Hahahahaaa keep the denial alive bro you were owned hard..lol and by your own link..hahahaaa

@commander said:

it's in other games as well. The witcher 3 may have been able to surpass the x1 framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release

It could take 12; the game was optimized past the Xbox One version, and now you must play in slideshow mode on your Xbox One...lol

@commander said:

cars stil runs well above 30 fps lmao. That wasn't the case with the witcher

Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaa

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#284 commander
Member since 2010 • 16217 Posts

@daious said:

Every time xboxone has a slight frame rate advantage while running a game on lower settings you go full on rabid fanboymode.. When a ps4 game does it, you just load on the excuses.

Just pure damage control and hypocrisy

Dice games not CPU extensive? Lol. Their multiplayer games like BF4 conquest large are some of the most CPU extensive games outside of RTS.

I remember last time I said that you posted benchmarks of an empty bf4 map because you can't read.

Lol worthy that you are still hyping up Ubisoft games. One of the worst developers out there for optimization all around. Thanks for the lols. Gotta love fanboys.

@scatteh316 said:

DICE games aren't CPU intensive?? Have you looked at the benchmarks on PC???

Check out this guy....lol

sure battlefield 4 is a demanding game lmao

Avatar image for 04dcarraher
04dcarraher

23858

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#285  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@tormentos:

Why are you ignoring the key points?

"In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints."

Again, you're ignoring the true multi-threading model that DX11.X lacks and that DX12 is introducing.

Digital Foundry: To what extent will DX12 prove useful on Xbox One? Isn't there already a low CPU overhead there in addressing the GPU?

Oles Shishkovstov: No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.

"“The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel. There are still some bottlenecks to work out with memory flushing to garlic, even after changing to LCUE, the memory copying is still significant.”"

enough said
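The Project CARS quote ("command list building – we now have this split-up of up to four cores in parallel") describes a standard pattern: encode command lists on worker cores, then stitch and submit in order on one thread. A toy sketch of the shape in plain Python rather than engine code (`encode` and the chunking are made up for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

WORKERS = 4  # "up to four cores in parallel", per the quote

def encode(draw_calls):
    # Each worker encodes its slice into a private command list;
    # nothing is shared while encoding, so no locks are needed.
    return [("draw", mesh) for mesh in draw_calls]

def build_frame(draw_calls):
    # Carve the frame's draw calls into contiguous per-core chunks...
    n = len(draw_calls)
    step = max(1, -(-n // WORKERS))  # ceiling division
    chunks = [draw_calls[i:i + step] for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        lists = pool.map(encode, chunks)
    # ...then stitch the finished lists back together in order and submit
    # serially: ordering stays deterministic, only the encoding ran in
    # parallel.
    frame = []
    for cl in lists:
        frame.extend(cl)
    return frame
```

The win is that the expensive encoding scales across cores; the quoted bottleneck is that the stitch-and-submit step, plus the memory copies to GPU-visible ("garlic") memory, stays serial.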

Avatar image for Zero_epyon
Zero_epyon

20502

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#286  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@commander said:
@daious said:

Every time xboxone has a slight frame rate advantage while running a game on lower settings you go full on rabid fanboymode.. When a ps4 game does it, you just load on the excuses.

Just pure damage control and hypocrisy

Dice games not CPU extensive? Lol. Their multiplayer games like BF4 conquest large are some of the most CPU extensive games outside of RTS.

I remember last time I said that you posted benchmarks of an empty bf4 map because you can't read.

Lol worthy that you are still hyping up Ubisoft games. One of the worst developers out there for optimization all around. Thanks for the lols. Gotta love fanboys.

@scatteh316 said:

DICE games aren't CPU intensive?? Have you looked at the benchmarks on PC???

Check out this guy....lol

sure battlefield 4 is a demanding game lmao

A game that keeps track of 64 players over a network isn't cpu intensive? Next you're going to tell me planetside 2 is a gpu intensive game?
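For what it's worth, the reason player count hits the CPU so hard is that the simulation does work per player per tick, while replication is roughly pairwise (every client needs a snapshot covering every other player). A back-of-the-envelope model, with made-up constants:

```python
def tick_cost(players, sim_cost=1.0, replicate_cost=0.1):
    # Per tick: simulate every player once (O(n)), then send every
    # player a snapshot that includes every other player (O(n^2)).
    return players * sim_cost + players * (players - 1) * replicate_cost
```

Under this toy model, going from 16 to 64 players is 4x the simulation work but roughly 16x the replication work, which is why 64-player Conquest stresses a CPU far more than a single-player campaign does.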

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#287  Edited By Daious
Member since 2013 • 2315 Posts
@Zero_epyon said:
@commander said:
@daious said:

Every time xboxone has a slight frame rate advantage while running a game on lower settings you go full on rabid fanboymode.. When a ps4 game does it, you just load on the excuses.

Just pure damage control and hypocrisy

Dice games not CPU extensive? Lol. Their multiplayer games like BF4 conquest large are some of the most CPU extensive games outside of RTS.

I remember last time I said that you posted benchmarks of an empty bf4 map because you can't read.

Lol worthy that you are still hyping up Ubisoft games. One of the worst developers out there for optimization all around. Thanks for the lols. Gotta love fanboys.

@scatteh316 said:

DICE games aren't CPU intensive?? Have you looked at the benchmarks on PC???

Check out this guy....lol

sure battlefield 4 is a demanding game lmao

A game that keeps track of 64 players over a network isn't cpu intensive? Next you're going to tell me planetside 2 is a gpu intensive game?

Commander might be the most unintelligible poster currently on system wars. It is actually unbelievable how horrible his arguments are. It is good for the lols but the fact that he consistently throws out the same erroneous argument makes me think something might be off.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#288  Edited By commander
Member since 2010 • 16217 Posts

@Zero_epyon said:

A game that keeps track of 64 players over a network isn't cpu intensive? Next you're going to tell me planetside 2 is a gpu intensive game?

No it isn't, not on the consoles. GPGPU tools were used to transfer this to the GPU, and since Battlefield 4 isn't a very demanding game, that was possible. I mean, Battlefield 4 is a cross-gen game and it runs at 900p.

@daious said:

Commander might be the most unintelligible poster currently on system wars. It is actually unbelievable how horrible his arguments are. It is good for the lols but the fact that he consistently throws out the same erroneous argument makes me think something might be off.

Stop calling me names; you don't even have a clue what you're talking about. The only thing you did was find a little mistake in an argument that didn't even matter, and it was already pointed out by someone else. For the rest you whine the whole time about The Witcher 3. It took 6 months to fix it, get over it.

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#290  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:
@Zero_epyon said:

A game that keeps track of 64 players over a network isn't cpu intensive? Next you're going to tell me planetside 2 is a gpu intensive game?

No it isn't , not on the consoles, gpgpu tools were used to transfer this to the gpu and since battlefield 4 isn't a very demanding game, that was possible. I mean battlefield 4 is a cross gen game and it runs at 900p

@daious said:

Commander might be the most unintelligible poster currently on system wars. It is actually unbelievable how horrible his arguments are. It is good for the lols but the fact that he consistently throws out the same erroneous argument makes me think something might be off.

stop calling me names, you don't even have a clue what you're talking about. The only thing you did was find a little mistake in an argument that didn't even matter and it was already pointed out my someone else. For the rest you whine the whole time about the witcher 3. It took 6 months to fix it , get over it.

You have proven multiple times that you lack technical understanding of the majority of topics you post about, and are capable of repeating the same flawed argument over and over while magically expecting different results. I completely stand by what I said. It is amazing that you don't see the hypocrisy of your posting. It is hilarious. Gotta love fanboys. Keep on damage controlling.

Avatar image for 04dcarraher
04dcarraher

23858

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#291 04dcarraher
Member since 2004 • 23858 Posts

@commander said:
@Zero_epyon said:
@commander said:
@daious said:

Every time xboxone has a slight frame rate advantage while running a game on lower settings you go full on rabid fanboymode.. When a ps4 game does it, you just load on the excuses.

Just pure damage control and hypocrisy

Dice games not CPU extensive? Lol. Their multiplayer games like BF4 conquest large are some of the most CPU extensive games outside of RTS.

I remember last time I said that you posted benchmarks of an empty bf4 map because you can't read.

Lol worthy that you are still hyping up Ubisoft games. One of the worst developers out there for optimization all around. Thanks for the lols. Gotta love fanboys.

@scatteh316 said:

DICE games aren't CPU intensive?? Have you looked at the benchmarks on PC???

Check out this guy....lol

sure battlefield 4 is a demanding game lmao

A game that keeps track of 64 players over a network isn't cpu intensive? Next you're going to tell me planetside 2 is a gpu intensive game?

No it isn't , not on the consoles, gpgpu tools were used to transfer this to the gpu and since battlefield 4 isn't a very demanding game, that was possible. I mean battlefield 4 is a cross gen game and it runs at 900p

BF4 64-player MP is demanding, especially when a lot is going on, on both the CPU and the GPU. Single player it's not. Both consoles see sizable fps drops in 64-player MP.

The PS4 version saw async shading used, which sped up CPU-to-GPU communication, removing the GPU sitting idle waiting on the CPU for data, by introducing true multithreading and parallel execution. The X1 didn't get this feature since its API didn't support it. This is why the X1 saw less performance at a lower resolution.

Battlefront being 40-player MP is probably the result of these consoles' CPUs preventing a solid 60 FPS experience at a higher player count.
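The async-shading point can be modeled as a pipeline: without overlap, the GPU idles while the CPU prepares each batch; with it, preparation of batch i+1 overlaps drawing of batch i. A toy timeline (durations are arbitrary units, not real measurements):

```python
def serial_time(cpu_prep, gpu_draw):
    # No overlap: the GPU sits idle during every CPU prep step,
    # so the frame time is just the sum of everything.
    return sum(cpu_prep) + sum(gpu_draw)

def overlapped_time(cpu_prep, gpu_draw):
    # Overlap: the GPU starts a batch as soon as it is free AND the
    # CPU has finished preparing that batch.
    cpu_done = 0.0
    gpu_done = 0.0
    for prep, draw in zip(cpu_prep, gpu_draw):
        cpu_done += prep                          # CPU finishes this batch
        gpu_done = max(gpu_done, cpu_done) + draw  # GPU waits only if starved
    return gpu_done
```

With three batches of 1 unit of CPU prep and 2 units of GPU work each, the serial timeline takes 9 units while the overlapped one takes 7: the GPU idle time is what the overlap recovers.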

Avatar image for spitfire-six
Spitfire-Six

1378

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#292 Spitfire-Six
Member since 2014 • 1378 Posts

@04dcarraher: You know what would be smart? If they already implemented the changes for D3D12 under a feature flag, then when the OS goes live they could patch the game. Having to release a game this close to an API transition would be pretty annoying.
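That feature-flag idea, sketched out (all names hypothetical — this is just the pattern, not any shipped engine):

```python
def pick_renderer(os_supports_d3d12, flags):
    # Ship both backends in the binary. The flag stays off until the OS
    # update lands; then a tiny patch (or a server-side config flip)
    # switches the game over without re-releasing the renderer.
    if os_supports_d3d12 and flags.get("use_d3d12", False):
        return "d3d12"
    return "d3d11"
```

The key property is that the new path can be tested internally long before launch, and enabling it later is a one-line change rather than a renderer rewrite.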

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#293  Edited By commander
Member since 2010 • 16217 Posts

@daious said:
@commander said:
@Zero_epyon said:

A game that keeps track of 64 players over a network isn't cpu intensive? Next you're going to tell me planetside 2 is a gpu intensive game?

No it isn't , not on the consoles, gpgpu tools were used to transfer this to the gpu and since battlefield 4 isn't a very demanding game, that was possible. I mean battlefield 4 is a cross gen game and it runs at 900p

@daious said:

Commander might be the most unintelligible poster currently on system wars. It is actually unbelievable how horrible his arguments are. It is good for the lols but the fact that he consistently throws out the same erroneous argument makes me think something might be off.

Stop calling me names, you don't even have a clue what you're talking about. The only thing you did was find a little mistake in an argument that didn't even matter, and it was already pointed out by someone else. For the rest you whine the whole time about The Witcher 3. It took 6 months to fix it, get over it.

You have proven multiple times that you lack technical understanding of the majority of topics you post about, and are capable of repeating the same flawed argument over and over while magically expecting different results. I completely stand by what I said. It is amazing that you don't see the hypocrisy of your posting. It is hilarious. Gotta love fanboys. Keep on damage controlling.

Oh please, stop embarrassing yourself.

#295  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:

Oh please, stop embarrassing yourself.

It seems to be the exact opposite, judging by everyone's responses to your posts here. It doesn't help that you are consistently wrong, or that your arguments are completely faulty.

You'd think that someone who spends so much time damage controlling would actually be good at damage controlling.

What is even more ridiculous is that once you are called out for being wrong in one thread, you keep saying the same thing. Lol.

Continue cherry picking results and ignoring everything that states otherwise. Your damage control threads are classics. Gotta love fanboys.

#296 commander
Member since 2010 • 16217 Posts

@daious said:

It seems to be the exact opposite, judging by everyone's responses to your posts here. It doesn't help that you are consistently wrong, or that your arguments are completely faulty.

You'd think that someone who spends so much time damage controlling would actually be good at damage controlling.

What is even more ridiculous is that once you are called out for being wrong in one thread, you keep saying the same thing. Lol.

Continue cherry picking results and ignoring everything that states otherwise. Your damage control threads are classics. Gotta love fanboys.

Are you now blind as well? Not everyone disagrees with me. The only fanboy here is you.

#297 mems_1224
Member since 2004 • 56919 Posts

For fuck's sake, you idiots need to learn how to quote.

#298 napo_sp
Member since 2006 • 649 Posts

@commander said:

Are you now blind as well? Not everyone disagrees with me. The only fanboy here is you.

Only ignorant fools can believe that either DX12, or 150 MHz of extra CPU speed, or both, can somehow make a difference for the Xbone vs the PS4.

#299  Edited By SecretPolice
Member since 2007 • 45675 Posts

By next year the X1 will just be hitting its stride with Win 10, SecretSauce DX12 and StormCloud..... PS4 is doooomed. :P

#300 commander
Member since 2010 • 16217 Posts

@napo_sp said:

Only ignorant fools can believe that either DX12, or 150 MHz of extra CPU speed, or both, can somehow make a difference for the Xbone vs the PS4.

Correction: the only people who disagree with me are Sony fanboys.