The PS4 shows its weakness again.

#251  Edited By commander
Member since 2010 • 16217 Posts

@nyadc said:


@tormentos said:


@nyadc said:

@ribstaylor1 said:

one has a GPU roughly twice as powerful.

You seem to be pretty bad at math; the PlayStation 4 GPU has a performance peak 40% higher than that of the Xbox One. That's 1.4 times as powerful, not twice...

You're probably one of those brain-dead people who see "50%" everywhere and equate that to twice the power; no, that would be 100%...

Stay in school...

Look at the screenshot in this same post and tell me how much that GPU gap amounts to.

I am sure it's like 97% in compute or more.

The fact that you're even using that picture as some type of citation just ended our conversation; this is why I can't take console gamers seriously, you people are discussing things outside your realm of understanding. It's a synthetic and isolated benchmark that has absolutely zero bearing on practical gaming applications...

That graph is probably pretty correct; the only thing is that the ESRAM isn't shown here. It's probably very hard to put that thing in a graph, but we all know what it can do. If you look at the multiplats that were released around launch, when the ESRAM tools weren't ready, we see a difference between 1080p and 720p, which is actually more than double the pixels (2074k vs 922k). Of course MS was stupid enough in the beginning to reserve 10 percent of the GPU for Kinect, and this without the ESRAM tools (talk about a bad launch...). Of course these games don't have any CPU bottleneck either.

But since the ESRAM tools came into use and all the GPU power was freed up in the X1, the difference is more like 900p vs 720p (1440k pixels vs 922k, roughly 56 percent more). Some games even run with dynamic resolution, matching the PS4's resolution in a lot of cases (in that specific game). And in games where the CPU is bottlenecking, we see a whole different story than at the beginning of this gen.

We can't minimize the PS4's GPU advantage, but we can't minimize the X1's CPU advantage either, especially when the X1 is a lot cheaper.
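
For reference, here is the raw pixel arithmetic behind those percentages (plain Python, nothing console-specific; the resolution figures are the standard ones):

# Pixel counts for the resolutions under discussion
res_1080p = 1920 * 1080   # 2,073,600 (~2074k)
res_900p  = 1600 * 900    # 1,440,000 (~1440k)
res_720p  = 1280 * 720    #   921,600 (~922k)

print(res_1080p / res_720p)   # 2.25  -> "more than double" the pixels
print(res_900p / res_720p)    # 1.5625 -> roughly 56% more pixels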

#252 tormentos
Member since 2003 • 33793 Posts

@nyadc said:

CPUs still handle the bulk of physics workloads; they're substantially better at crunching that kind of data than a GPU.

Sony cannot overclock; their temperatures are essentially the same as the Xbox One's, yet the Xbox One has an overclocked CPU and GPU. This accounts for the nearly negligible temperature difference; it would be considerably lower if it were running stock frequencies and voltages. If Sony attempted to overclock their system, the temperature would spike dramatically and exceed the thermal safe temps for the hardware, which is in the 55+ range. That is why the Xbox One can be overclocked: because it is so well ventilated and stays in a safe temperature range.

Sony doesn't need to, as their GPU is more powerful; however, it also comes down to the fact that they are incapable. The only way they could OC that system and actually have it operate without hardware failure and overheating would be to spin up the fan dramatically, which would create an insane amount of noise; that's never going to happen.

You're greatly overestimating the PlayStation 4's capabilities and acting as if its limitations have not already been hit; they have been. The Xbox One and it are capable of outputting games at the exact same graphical fidelity; the only difference will ever be the resolution or, independent of that, the framerate. That's it. This is all just computer hardware now; Microsoft can run games at the same exact graphical settings, they just have to dial back the resolution.

If anything, Sony should be worried: they've hit a CPU bottleneck that the Xbox One has not, due to its OC, which in turn keeps the GPU from operating at full usage; this has been showing up in some games lately. If developers actually take advantage of that extra core which Microsoft has put on the table, it would bring the nearly 10% faster Xbox One CPU up to roughly 25%. If your CPU is a bottleneck in your system, it's going to drag the GPU down; this can only get worse, not better.

The fact that you're even using that picture as some type of citation just ended our conversation; this is why I can't take console gamers seriously, you people are discussing things outside your realm of understanding. It's a synthetic and isolated benchmark that has absolutely zero bearing on practical gaming applications...

No, the bulk is passed to the GPU, that is the point. Why the hell would you run physics on the GPU if most of the load stayed on the CPU? That makes no sense whatsoever.

(embedded video)

This is a nice example of how physics drives frames to the floor when you use a CPU and how using a GPU can speed things up considerably.

The frames stay over 30 all the time on the GPU, while on the CPU they drop as low as 5 FPS; head to head, that is six times slower.
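
The reason a GPU wins at this kind of workload is data parallelism. A toy sketch of the difference (Python with NumPy standing in for wide parallel hardware; the particle count and time step are made up for illustration):

import numpy as np

n = 200_000                 # made-up particle count
pos = np.zeros((n, 3))
vel = np.random.randn(n, 3)
dt = 1.0 / 60.0             # one frame at 60fps

# CPU-style: one particle at a time, a long serial loop
def step_serial(pos, vel, dt):
    for i in range(len(pos)):
        pos[i] += vel[i] * dt
    return pos

# GPU-style: the same update expressed as one wide, data-parallel operation
def step_parallel(pos, vel, dt):
    return pos + vel * dt

Both forms compute identical results; the second one is what maps onto hundreds of shader cores at once, which is why simple particle physics scales so much better on a GPU.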

There is something you should know about the Xbox One: it is in no way overclocked; if anything it is underclocked. I know you will say it was overclocked to 1.75GHz on the CPU and by 53MHz on the GPU, but the fact is it wasn't; they just raised the clocks a little. The Xbox One's PC GPU equivalent, the 7790, runs at 1GHz or more, not 800 or 853MHz, and the same goes for the PS4's GPU, whose PC counterpart runs at 1.0GHz. So you see, neither is really overclocked relative to what that specific part runs at on PC; quite the contrary.

The same goes for the CPU, which I think can go up to 2.0GHz on that Jaguar.

No, it would not; you are assuming the temperature would dramatically increase based on nothing. That clock raise MS did probably raised the temp by 5 degrees or less, and they aren't running very hot either.

If the Xbox One were so well ventilated, it would be cooler than the PS4 when running games. It draws 5 fewer watts yet produces more heat than the PS4. Not only that, the PS4 has an internal PSU and a smaller case; it is obvious which one is doing more yet running cooler. The PS4 has the superior cooling solution; a bigger fan means nothing if the heat isn't being dissipated well enough.

No, they are not; they could very well overclock, they just don't need to, and no one has proof of that not being the case. It's just baseless assumption because the Xbox One has a big fan. I took a course on PC repair, and one of the first things they teach you is that it doesn't matter how many fans your PC has: if it is not dissipating heat well, you have a problem.

No, it hasn't. That is something I have learned from all Sony systems: first-year games are nothing compared to those that come later on. Compare Resistance vs Killzone 2, or Uncharted 1 vs TLOU.

The fact that you say 'same exact graphical fidelity' but then say 'lower resolution' basically kills your own argument.

Now that last bold part shows that you know shit about this; you are just another alt running the same shit spewed here and proven wrong already. A CPU stronger than the ones inside the Xbox One yields only 1 or 2 extra frames from a 200MHz difference; the Xbox One has 150MHz on a weak POS CPU, so the whole CPU difference will amount to shit, 1 or 2 frames, while on the GPU side the PS4 has up to a 30FPS lead in some games. Yeah, that CPU just gets the Xbox One to 32 frames on TR instead of a 30 lock, while the PS4 keeps hitting 50 and 60.
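
For scale, here is the best case for that clock difference, assuming a framerate that is entirely CPU-limited and scales perfectly with clock (real games scale worse):

ps4_cpu = 1.6   # GHz
x1_cpu  = 1.75  # GHz, after the clock bump

fps_ps4 = 30.0  # assumed CPU-limited baseline
fps_x1  = fps_ps4 * (x1_cpu / ps4_cpu)
print(round(fps_x1, 1))  # 32.8 -> 2-3 frames at the absolute best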

The only reasons why some games are faster on Xbox One are these:

1- A screwed-up job: ACU, RE.

2- The PS4 punching above its weight, so it hits 1080p but can only hit the 50s with an open framerate: COD Ghosts, Advanced Warfare.

The difference in CPU is nothing, and that 150MHz will hardly give the Xbox One a frame, or nothing at all, especially if you use compute on the PS4 to counter, since the Xbox One has no spare GPU resources to do the same.

You are a joke, and pretending to be a hermit is a joke after that shit about the Xbox One CPU, lemming. That chart you are downplaying shows the difference in compute between the Xbox One and PS4. The CPU gap is obvious, see it? The GPU one is huge, see it as well? Yeah, that chart wasn't done by me; it came from a Ubisoft test. Time to admit it, lemming: the XBO will always be behind.

#253  Edited By 001011000101101
Member since 2008 • 4395 Posts

Aaaaand Rockstar fixed the GTA issues. Such weakness, huh.

#254  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

[...] The frames stay over 30 all the time on the GPU, while on the CPU they drop as low as 5 FPS; head to head, that is six times slower.

Physics six times slower on the CPU?

PhysX = physics?

#255 commander
Member since 2010 • 16217 Posts

@tormentos said:

[...]

@nyadc Are you going to finish him off or should I do it?

#256 tormentos
Member since 2003 • 33793 Posts

@commander said:

That graph is probably pretty correct; the only thing is that the ESRAM isn't shown here. [...]

That bold part is one of the dumbest things I ever read. ESRAM is not on that chart because ESRAM isn't a GPU or a CPU; it has no compute performance whatsoever. ESRAM is memory; it does none of the things a GPU or CPU does.

ESRAM tools were ready at launch; all games on Xbox One have used ESRAM since day 1. Not using ESRAM would result in the Xbox One running its games lower than 720p at next-gen quality, since DDR3 bandwidth is not enough for both the CPU and GPU.

The Xbox One keeps getting 720p games just like at launch; it just got one last month, Battlefield Hardline, and MGS5 will probably be 720p as well.

Dynamic resolution is implemented in games that can't hold a constant resolution, so frames are not affected, or are affected minimally. COD Advanced Warfare was 1360x1080 dynamic; the problem was that it could never hold 1920x1080 unless nothing was happening. As soon as the shooting starts, the Xbox One version drops.
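
Dynamic resolution is essentially a feedback loop on frame time. A minimal sketch of the idea (the function name, thresholds, and scaling range are made up; 1360-1920 mirrors the Advanced Warfare case above):

TARGET_MS = 16.7            # frame budget for 60fps
MIN_W, MAX_W = 1360, 1920   # assumed horizontal scaling range

def adjust_width(width, last_frame_ms):
    # Over budget: drop horizontal resolution; comfortably under: creep back up
    if last_frame_ms > TARGET_MS:
        width = max(MIN_W, int(width * 0.95))
    elif last_frame_ms < TARGET_MS * 0.9:
        width = min(MAX_W, int(width * 1.05))
    return width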

The same happens with Wolfenstein, which also drops a lot, while the PS4 version barely drops.

The CPU advantage of the Xbox One amounts to about 1 frame per second; 150MHz will not give the Xbox One 10 more frames, not in a million years. You people are overhyping a miserable 150MHz because you lack any knowledge of PC hardware and what such a gap amounts to.

See this chart? See the i3 4360 and the i3 4330? They are the same chip, but one is 3.7GHz (the 4360) and the other is 3.5GHz (the 4330). Do you see the difference in frames?

It's almost none. That is the reality of the Xbox One: its 150MHz advantage will yield close to nothing. Oh, and that is an i3, which is stronger than a Jaguar clock for clock.

@commander said:

@nyadc Well are you going to finish him off or should I do it?

Neither of the two is qualified for that job.

#257  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

[...]

Sorry, but you pretty much lost all credibility when you tried to prove that physics is harder on the CPU with a PhysX CPU benchmark. Hilarious.

The ESRAM isn't put into that Ubisoft graph because there's not really a way of measuring it, but it can heavily alleviate the workload for the GPU with specific tools like texture tiling, one of the main features of DX12. There's no way to achieve that kind of resolution and detail settings with the X1's GPU alone in the games that were released last year and this year.

ESRAM tools weren't ready at launch, hence the massive difference in resolution in a game like Call of Duty Ghosts. Why would the difference be smaller in games that were released later? Did the PS4 suddenly have a GPU handicap? I don't think so. Those games weren't CPU-bottlenecked either. Of course, at release 10 percent of the GPU was reserved for Kinect as well, which made the difference even bigger. But 10 percent isn't the difference between 900p and 1080p.

The PS4 does stay significantly stronger in the GPU department, of course, we all know that, but this thread is about the CPU difference.

That graph you show about Tomb Raider is ridiculous; that game isn't CPU-bound in any way.

AC Unity, on the other hand, is CPU-bound, and look at the difference between the FX 4300 and FX 4100: only about a 5 percent clock difference, yet the FX 4300 is 15 percent faster. Yes, that's what CPU bottlenecks do, and the 150MHz extra on the X1 CPU is an overclock of roughly 10 percent.

Besides, that 10 percent overclock on the X1 isn't the only thing; the 7th core would bring the performance difference to roughly 25 percent, making the PS4 a much weaker system in CPU-bound games.
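
For what it's worth, here is where a figure like that could come from, assuming perfect scaling across both the clock bump and the extra core, which no real game achieves:

clock_factor = 1.75 / 1.6   # X1 vs PS4 CPU clock (~1.09)
core_factor  = 7 / 6        # seven usable cores vs six (~1.17)
print(round((clock_factor * core_factor - 1) * 100))  # ~28% theoretical upper bound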

#258 tormentos
Member since 2003 • 33793 Posts

@commander said:

The ESRAM isn't put into that Ubisoft graph because there's not really a way of measuring it, but it can heavily alleviate the workload for the GPU with specific tools like texture tiling, one of the main features of DX12.

The difference is that the i3 4360 and the 4330 support exactly the same instruction sets; they are downright the same chip.

The FX 4100 doesn't support all the instruction sets the FX 4300 does.

Not to mention that you are using one of the most broken-ass games ever to come out of a developer.

You are an idiot. Physics is physics; just because the name is PhysX doesn't mean it is something different or exclusive to Nvidia.

(embedded video)

Here: CPU vs GPU for dummies.

Now, I erased the rest of your post just to concentrate on that shit you say there.

100% proven you know absolutely shit about what you're talking about.

ESRAM doesn't alleviate anything on the Xbox One GPU. Tiling is a process that works with or without ESRAM. ESRAM is memory; it doesn't offload anything from the GPU at all. It is there because MS fu**ed up by using DDR3.

#259  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

[...]

The graph you used, with Tomb Raider, isn't CPU-bound; AC Unity is. Simple as that. The example with the i3-4330 vs the i3-4360 isn't correct either, since it's only a difference of 200MHz on 3.5GHz, about 5 percent. If you think the difference in performance between the FX 4100 and FX 4300 is down to the instruction sets, then look at the FX 6xxx series; they show the same thing.

And physics isn't the same as PhysX. PhysX is written for an Nvidia GPU; it's only normal that when a CPU tries to emulate it, it runs a lot slower. Not to mention it's one of the physics engines with the fewest features, hence it was never widely adopted. It works well for particles, but that's about it.

That Mythbusters video doesn't apply to everything; besides, it's not even a correct comparison. It is indeed CPU vs GPU for dummies, but there are still a lot of things the GPU simply can't do and/or a lot of things the CPU is simply better at. A CPU runs at much higher speeds than a single GPU processor, and for tasks that can't be split up the CPU is simply a lot better. Not to mention the CPU has dozens of features that a GPU doesn't have.

ESRAM does alleviate work for the Xbox One GPU; it's specifically made for that. According to you, it just sits there and does nothing?

#260  Edited By tormentos
Member since 2003 • 33793 Posts

@commander said:

[...]

ACU is a fu**ed-up game even on PC, that's no secret, and in fact Ubisoft stated that CPU overhead wasn't the problem: they lowered the number of NPCs on the PS4 version to test, and the problem was still there. They made a patch and performance increased.

PhysX is the name of the cards made by Ageia before Nvidia bought them out. What PhysX used to do was run physics on a PPU; those days are long gone. Now it is basically a library integrated into Nvidia GPUs, nothing more. Even the PS4 supports PhysX, dude.

It does apply to physics. Yes, it is correct if you can actually understand it; parallelism is the point of the demo, running multiple processors to do multiple jobs.

Yes, there are some things the CPU is better at; physics is not one of them.

While a Jaguar has 8 cores, the Xbox One GPU has 768 processors. That is the difference: the sheer number working in parallel is what boosts performance beyond what a CPU can do. It is the reason CPUs suck at Folding@home; even the PS3 spanked CPUs like the i7 at it, because Cell was much like a PPU from Ageia.

No, ESRAM is there to mitigate bandwidth issues inside the Xbox One because DDR3 is too slow. That is the real reason it is there: so that the GPU doesn't starve bandwidth-wise, and even for that it is not enough, as it is too small for certain tasks.

It is not there to offload anything from the GPU; that is completely moronic, and it shows you people talk without even knowing shit.
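
The peak bandwidth figures behind this whole ESRAM argument, computed from the public specs (peak numbers, not sustained real-world rates):

# peak bandwidth = transfer rate * bus width in bytes
x1_ddr3   = 2133e6 * (256 / 8) / 1e9   # ~68 GB/s, shared by CPU and GPU
ps4_gddr5 = 5500e6 * (256 / 8) / 1e9   # ~176 GB/s unified
x1_esram  = 109                        # GB/s quoted minimum; ~204 GB/s peak read+write, but only 32 MB

print(round(x1_ddr3), round(ps4_gddr5))  # 68 176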

#261  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

[...]

It doesn't matter what the reason is why AC Unity is CPU-bound; the results are the same.

GPU-accelerated PhysX doesn't run on the PS4; the Borderlands games are a good example of it. It only runs on Nvidia cards but can be emulated on a CPU. That will never happen on a PS4; the CPU isn't strong enough for it. That doesn't mean PhysX is a very demanding process; it's just that it's written for a GPU, not a CPU.

Well, that Mythbusters video can be good for explaining certain calculations on a GPU, but not everything. I never said that the GPU couldn't do any CPU tasks (or that it would be bad at calculating physics). I'm saying that the GPU cannot do every CPU task; it can only do so much. It's a fact that certain physics can be done better on a GPU, but not all kinds of physics; some, like gameplay physics, are sometimes better done on a CPU, since it has lower latency.

You're also forgetting that the Xbox One has an APU as well, which could also do GPGPU tasks like physics. Physics isn't very demanding on the hardware either; the widely adopted Havok engine already ran on old Pentium IIIs, and the PhysX you're talking about runs on old video cards. Physics is only a small part of what a computer/console needs to process when running a game. Of course, the demand for resources has become higher with newer physics engines since Half-Life 2, but CPU and GPU power has increased as well.

So the PS4 may have a lot of GPU grunt to spare, but that doesn't mean it will be able to counter a CPU bottleneck the way the X1 does, since the X1 simply has more CPU power available. When there's a shortage of GPU power on the X1, they will simply dial back the resolution and/or lower detail settings. That won't happen anytime soon, though, with DX12 coming up. The PS4 will mostly have the better resolution, but it could be that the framerate takes huge hits under CPU bottlenecks. We've already seen it in a number of games, and newer games will only push the CPU further.

About the ESRAM: like I said, it makes the GPU in the X1 faster, or as you like to say it, less slow. The end result is the same.

#262  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:

@daious said:

@commander said:

Words, words, hollow words; everything you said, you haven't backed up once with numbers, sources, nothing.

On top of that, you put words in my mouth, you don't understand context, you forget everything I've proven and then deny it again, and/or you simply deny plain facts. All this post after post. I might as well try to explain quantum physics to a 5-year-old.

It really must be hard to be a single-console owner and a fanboy.

Thank you for ignoring those points and continuing to favor one console, ignoring its faults based solely on the console you own/don't own.

How about you wait until, you know, there are games that actually support your statement that the difference between CPUs will negate the massive difference between GPUs? Wouldn't that be the logical thing to do? If a game were 30fps locked on PS4 and 40+fps locked on the Xbox One, you would have an argument. Instead of guessing, why don't you wait until there is actual definitive proof?

Battlefield multiplayer is hugely dependent on CPU performance, yet the PS4 and Xbox One perform equally well. "So far, neither console shows a conclusive advantage over the other in frame-rate (exact like-for-like comparisons of render tech will have to wait for the single-player analysis)." The PS4 and Xbox One match each other in framerate while the PS4 has a higher resolution. That is the last big AAA title to come out. But of course you will disregard this because the game is on last-gen platforms as well.

You declaring me a fanboy while you are one yourself is funny.

Battlefield is a cross-gen game and not nearly as CPU-bound as a game like Unity. I think these graphs explain it quite nicely.

I know what you will say: it's horribly optimized, blah blah blah, but Battlefield doesn't have 500 NPCs with AI on screen at the same time.

There's no doubt that the PS4 has a much stronger GPU, but there's also no doubt the X1 has a lot more CPU power available. It just hasn't shown yet, because that 7th core was only unlocked for devs in January. Unity was just a taste of what 10 percent more CPU performance can do.

I'm not saying this will become a trend (that the X1 outperforms the PS4), but it surely is a nice treat for a much cheaper system.

However, it has been proven in the past that devs innovate so they can make a profit.

Innovating in the video game industry goes hand in hand with higher system requirements. A lack of GPU power can be solved by lowering the resolution, and GPUs have a lot of optimization room as well: DX12, for instance, dynamic resolution, texture tiling. But a lack of CPU power is a whole different matter.

So fanboys like yourself better pray that devs don't push the envelope like Ubisoft did, because that CPU difference could play a very big role, lol (yes, I know it's speculating, but it's educated speculating!).

I am calling you a fanboy because you clearly are.

You used Call of Duty for your argument (a cross-gen game), but I can't use Battlefield Hardline because you arbitrarily say so. I mean, is that a joke?

I am a fanboy for telling you to wait and actually get proof instead of just speculating? Like I said for the ten-thousandth time, let games show the difference instead of making these premature statements.

Isn't your argument, that Sony likely isn't doing software optimization for the CPU because they haven't announced it, faulty, given that Microsoft weren't the ones who announced their CPU optimization? It was leaked via a hacker group.

Also, LOL at not checking your source and reading about the Battlefield Hardline BETA CPU benchmark. Thanks for the laugh on that. It was testing empty multiplayer maps, not actual 64-player multiplayer. Battlefield 64-player multiplayer is completely CPU-dependent, yet the Xbox One and PS4 perform the same. I always enjoy your terrible arguments. Keep at them.

#263  Edited By commander
Member since 2010 • 16217 Posts

@daious said:

[...]

Well, COD is an exception here; most cross-gen games don't really push the envelope. GTA is an exception here too, but GTA V released a year later on next-gen.

Battlefield multiplayer may be more CPU-intensive on 64-player maps, but it's still not really a game that pushes the CPU to its limits, because it runs on an old engine.

Again, if Sony had been able to increase CPU speed, they would have done it; they could really use it. Bloodborne's one-minute loading times are ridiculous.

And the CPU differences have already been shown in numerous games; the PS4 had to be patched to make it more playable, and it still can't match the X1's framerate, all this with a GPU that's almost twice as performant.

#264  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:

[...]

I never said anything about Sony increasing CPU clock speeds. I already stated that Sony picked adding a second CPU over overclocking like MS did. Sony and MS are constantly making updates to their OS and other software; if one is capable of it, the other is too. Your prior argument, that Sony would have announced it ahead of time, is faulty.

BF 64-player multiplayer is very CPU-heavy. You can't spin that. Heck, according to your benchmarks, empty BF Hardline maps versus Assassin's Creed Unity are pretty close in CPU performance. Part of the reason is that Unity was so broken at launch; AMD had way too many problems on PC. There were tons of bugs, etc.

Most reviewers don't benchmark 64-player matches because they are so variable from match to match that they would have to average hours of gameplay across some 20 different test setups to get a decent result. Instead they just roam around for 5 minutes in near-empty maps.

You can't just accept and ignore evidence as you choose. You can't throw out cross-gen game XYZ and allow cross-gen game 123. Wait until there are actual games that prove your point. BF Hardline 64-player multiplayer is a perfectly good example of CPU performance.

A 10% overclock doesn't necessarily mean 10% more performance; for instance, see below. There are plenty of other examples.

So for the love of god, wait until the Xbox One has games that show CPU domination instead of jumping the gun with weak evidence and speculation.

Plus, the GPU in the PS4 is in no way twice as powerful as the Xbox One's. That benchmark was a stress test, not real-world performance; no developer is going to stress systems to their max.
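
One way to see why an overclock rarely buys its full percentage in frames: only the CPU-bound share of each frame speeds up. A back-of-the-envelope Amdahl-style estimate (the fractions are made up for illustration):

def fps_gain(cpu_fraction, clock_speedup=1.10):
    # Overall speedup when only the CPU-bound share of the frame scales with clock
    return 1 / ((1 - cpu_fraction) + cpu_fraction / clock_speedup) - 1

for f in (0.25, 0.5, 0.9):
    print(f, round(fps_gain(f) * 100, 1), "%")  # 2.3%, 4.8%, 8.9%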

#265 tormentos
Member since 2003 • 33793 Posts

@commander said:

[...]

It wasn't CPU-bound; it just had terrible code, which is different. Lowering the NPC count did nothing, and that is exactly what would have raised performance if a CPU bottleneck were the problem. Worse, the company admitted to having held back the PS4 version to avoid arguments.

Nvidia may not be powering Sony's next-gen console hardware, but its physics software, PhysX and APEX, will be present in PlayStation 4 games. Today the company announced its support for the PlayStation 4 with the introduction of new PhysX and APEX SDKs for developers. PhysX and APEX provide dynamic physics to environments, particles, and objects, delivering more life-like destruction, character models, and effects that react in real-time to player interaction and in-game action. PhysX can be found in many current-gen titles, including Hawken, Borderlands 2, and Batman: Arkham City.

http://www.ign.com/articles/2013/03/07/nvidia-announces-physx-support-for-playstation-4

But, but, but the PS4 doesn't support PhysX; I guess Nvidia are liars. Dude, like I already told you, PhysX is basically a library, not hardware anymore, and I am sure it would run on AMD GPUs if Nvidia allowed it, just like TressFX runs on Nvidia GPUs.

Physics runs faster on the GPU because of the GPU's high parallelism; you just don't want to understand that.

Yes, the Xbox One can do GPGPU; it has been doing it since day 1 with Kinect. The problem lies in the modifications the PS4 has for it and the extra power the PS4 has over the Xbox One. The PS4 was heavily modified to take advantage of compute, more so than the Xbox One, which is why you see that huge gap between the GPUs in the screenshot you posted.

Dude, just offloading physics to the GPU is enough to free resources that can be used for other things, while the Xbox One, if it uses its GPU for the same, will become GPU-bound quite fast. Like I already told you, the CPU speed difference is minimal, and the PS4 has a straightforward memory system and is true HSA.

#266 Shewgenja
Member since 2009 • 21456 Posts

Oh, lol, this is still going. Based on assertions that have been proven incorrect a number of times. This thread needs (unconfirmed) at the end of the title, I think.

#267  Edited By commander
Member since 2010 • 16217 Posts

@daious said:

[...]

You always say I'm speculating that Sony wouldn't be able to increase CPU speeds; well, again, they won't, not with this PS4. I also already said that that second processor in the PS4 is for OS tasks like background downloading only. It has no grunt to do anything else.

BF 64-player multiplayer may be more CPU-heavy than single player or matches with fewer players, but it's still no comparison with games like AC Unity in terms of CPU load. The framerate drops to about 50fps with 64 players on both systems; in one level it can go to 40, but that's also because of the graphics intensity of that level. Unity drops way lower than that. Besides, if there really were such a CPU impact as you're saying, it would do a lot more than just drop 8 percent on the weak console CPUs.

That bench you posted doesn't mean anything; the highest FX 9xxx series has trouble keeping up with low-end Sandy/Ivy Bridge chips from Intel, and it has double the cores.

It's you who cannot accept evidence; you keep coming back with arguments that have already been debunked. The extra CPU power has already been seen on the X1. I never said that it would dominate the PS4, but it will surely be more performant when it comes to framerates in CPU-heavy games. The fact that the PS4 was patched afterwards and still couldn't match the X1 only makes it more obvious.

#268  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:

@daious said:

@commander said:

Well cod of is an execption here, most cross gen game don't really push the enveloppe, gta is an exception here too but gta V released a year later on next gen.

Battlefield multiplayer may be more cpu intensive on 64 player multiplayer maps, it's stil not really a game that pushes the cpu to it's limits because it runs on an old engine.

Again, if sony would have been able to increase cpu speed they would have done it, they can really use it. Bloodborne 1 minute loading times are ridiculous

And the cpu-differences have already been shown in numerous games, the ps4 had to be patched to make it more playable and it still can't match the x1 framerate, all this with a gpu that's almost double as performant.

I never said anything about Sony increasing CPU clock speeds. I already stated that Sony chose to add a second CPU instead of overclocking like MS did. Sony and MS are constantly making updates to their OS and other software; if one is capable of it, the other is too. Your prior argument that Sony would have announced it ahead of time is faulty.

BF 64-player multiplayer is very CPU-heavy. You can't spin that. Heck, according to your benchmarks, empty BF Hardline maps and Assassin's Creed Unity are pretty close in CPU load. Part of the reason is that Unity was so broken at launch; AMD had way too many problems on PC, there were tons of bugs, etc.

Most reviewers don't review 64-player matches because they are so variable from match to match that they would have to average hours of gameplay across some 20 different test setups to get a decent review. Instead they just roam around for 5 minutes in near-empty maps.

You can't accept and ignore evidence as you choose. You can't throw out cross-gen game XYZ and allow cross-gen game 123. Wait until there are actual games that actually prove your point. BF Hardline 64-player multiplayer is a perfectly good example of CPU performance.

A 10% overclock doesn't necessarily mean 10% more performance. For instance, see below. There are plenty of other examples.

So for the love of god, wait until the Xbox One has games that show CPU domination instead of jumping the gun with weak evidence and speculation.

Plus the GPU in the PS4 is in no way twice as powerful as the Xbox One's. That benchmark was a stress test, not real-world performance. No developer is going to stress systems to their max.

You always say I speculate that Sony wouldn't be able to increase CPU speeds; well, again, they won't, not with this PS4. I also already said that the second CPU in the PS4 is only for OS tasks like background downloading. It has no grunt to do anything else.

BF 64-player multiplayer may be more CPU-heavy than single player or smaller matches, but it's still no comparison with games like AC Unity in terms of CPU load. The framerate drops to about 50fps with 64 players on both systems. On one level it can go to 40, but that's also because of the graphical intensity of that level. Unity drops way lower than that. Besides, if the CPU impact were really as big as you say, it would do a lot more than drop 8 percent on these weak console CPUs.

That bench you posted doesn't mean anything; the highest FX 9xxx series has trouble keeping up with low-end Sandy Bridge/Ivy Bridge chips from Intel, and it has double the cores.

It's you who can't accept evidence; you keep coming back with arguments that have already been debunked. The extra CPU power has already shown up on the X1. I never said it would dominate the PS4, but it will surely deliver better framerates in CPU-heavy games. The fact that the PS4 version was patched afterwards and still couldn't match the X1 only makes it more obvious.

It is the same with Intel.

I picked AMD because Hardline uses the majority of the CPU cores and AMD is in both consoles. It shows that twice the clock speed does not always equal twice the performance: a 20% increase in clock speed doesn't necessarily mean +20% performance, yet you claim it definitively does. Sony's second CPU is for background operations which would otherwise land on the primary CPU; MS has a single CPU with all background operations on it.
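To make that concrete, here is a toy Amdahl-style model (an editor's sketch in Python with illustrative numbers; the cpu_bound_fraction values are assumptions, not measurements). If only part of the frame time scales with CPU clock, a 10% clock bump buys well under 10% more fps:

    def fps_after_clock_bump(base_fps, cpu_bound_fraction, clock_gain):
        # Normalize one frame to 1.0 units of time. Only the CPU-bound part
        # shrinks when the clock rises; the GPU/memory-bound part does not.
        cpu_time = cpu_bound_fraction / (1.0 + clock_gain)
        other_time = 1.0 - cpu_bound_fraction
        return base_fps / (cpu_time + other_time)

    print(fps_after_clock_bump(30.0, 1.0, 0.10))  # fully CPU-bound: 33.0 fps (+10%)
    print(fps_after_clock_bump(30.0, 0.4, 0.10))  # 40% CPU-bound: ~31.1 fps (~+4%)

Only a workload that is almost entirely CPU-bound sees the full clock gain, which is the point being argued here.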

"BF 64 multiplayer maybe more cpu heavy than you would play single player or with less players , it's still no comparison with games like ac unity in terms of cpu power."

Again... 64 player multiplayer is one of the most taxing things on CPUs period. You do realize EA delayed hardline by 6 months to optimize/improve the game on all systems. BF hardline wasn't completely broken at launch after the huge failure of the BF4 launch. However, Unity was broken at launch. I mean look at Dying Light on PC. That game was more taxing on PC (at launch) than so many other games because the game was so poorly optimized not because it was so much more CPU extensive than other games.

its about optimization. CPU, ram, and gpu usage is constantly being optimized without changes in clock speeds. I am not talking about clock speeds. Sony/MS will continue doing it all generations. Developers have to optimize as well.

You can't just say but but but but Unity. You are going to have to wait until there are games that definitive prove your statement and not rely on a single game that had such major optimization issues.

I mean I love how you used a comparison between BFhardline and Unity as a way of saying it's more CPU extensive. Then once you figured out that it said the opposite, you know disregard it.

I am sorry that it is too much for me to ask you to actual prove it when actual evidence becomes apparent. I suggest making a thread later in the year when there are actual games to support it. I never denied that xboxone has a more powerful CPU. I said the difference between the CPU currently does not make the xboxone the more powerful console. A few frames in one game isn't changing that now.

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#269  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:

@tormentos said:

That bold part is one of the dumbest things I've ever read. ESRAM is not on that chart because ESRAM isn't a GPU or a CPU; it has no compute performance whatsoever. ESRAM is memory: it does none of the things a GPU or CPU does.

ESRAM tools have been ready since launch; all games on Xbox One have used ESRAM since day 1. Not using ESRAM would result in the Xbox One running its games lower than 720p at next-gen quality, since DDR3 bandwidth is not enough for both CPU and GPU.

The Xbox One keeps getting 720p games just like at launch; it got one just last month with Battlefield Hardline, and MGS5 will probably be 720p as well.

Dynamic resolution is implemented in games that can't hold a constant resolution, so that framerates are not affected, or are affected minimally. COD Advanced Warfare was 1360x1080 dynamic; the problem was that it could never hold 1920x1080 unless nothing was happening. As soon as the shooting starts, the Xbox One version drops.
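For reference, the pixel arithmetic behind these resolution figures (plain math, a quick sketch):

    def megapixels(width, height):
        # Pixels per frame, in millions.
        return width * height / 1e6

    print(megapixels(1920, 1080))  # 2.07 MP: full 1080p
    print(megapixels(1360, 1080))  # ~1.47 MP: the dynamic floor, ~71% of 1080p
    print(megapixels(1280, 720))   # ~0.92 MP: 720p, ~44% of 1080p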

The same happened with Wolfenstein, which drops a lot as well, while the PS4 version barely drops.

The CPU advantage of the Xbox One amounts to 1 frame per second. 150MHz will not give the Xbox One 10 more frames, not in a million years. You people are overhyping a miserable 150MHz because you lack any PC knowledge of what such a gap amounts to.

See this ^^ chart. See the i3 4360 and the i3 4330? They are the same chip, but one runs at 3.7GHz (the 4360) and the other at 3.5GHz (the 4330). Do you see the difference in frames?

It's almost none. That is the reality of the Xbox One: its 150MHz advantage will yield close to nothing. And that is an i3, which is stronger than a Jaguar clock for clock.


AC Unity, on the other hand, is CPU-bound; look at the difference between the FX 4300 and FX 4100 there. That's only a clock difference of about 5 percent, yet the FX 4300 is 15 percent faster. That's what CPU bottlenecks do, and the 150MHz extra on the X1 CPU is an overclock of roughly 10 percent.

You do know that the 4300 and 4100 are different chips... Just because they are both 4-core doesn't make them the same chip...

The AMD FX-4100 has worse IPC than the 4300: the 4100 uses the older Zambezi cores, while the 4300 uses the newer Vishera cores. The 4100 is not a downclocked 4300.

You should really do more research before you post.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#270  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

1. It wasn't CPU-bound; it just had terrible code, which is different. Lowering the NPC count did nothing, which is what would have raised performance if a CPU bottleneck were the problem. Worse, the company admitted to holding back the PS4 version to avoid arguments.

Nvidia may not be powering Sony's next-gen console hardware, but its physics software, PhysX and APEX, will be present in PlayStation 4 games. Today the company announced its support for the PlayStation 4 with the introduction of new PhysX and APEX SDKs for developers. PhysX and APEX provide dynamic physics to environments, particles, and objects, delivering more life-like destruction, character models, and effects that react in real-time to player interaction and in-game action. PhysX can be found in many current-gen titles, including Hawken, Borderlands 2, and Batman: Arkham City.

http://www.ign.com/articles/2013/03/07/nvidia-announces-physx-support-for-playstation-4

2. But but but the PS4 doesn't support PhysX... I guess Nvidia are liars. Dude, like I already told you, PhysX is basically a library; it's not hardware any more, and I am sure it would run on AMD GPUs if Nvidia allowed it, just like TressFX runs on Nvidia GPUs.

3. Physics runs faster on the GPU because of the GPU's high parallelism; you just don't want to understand that.

4. Yes, the Xbox One can do GPGPU; it has been doing it since day 1 with Kinect. The difference lies in the modifications the PS4 has for it, and in the extra power the PS4 has over the Xbox One. The PS4 was heavily modified to take advantage of compute, more so than the Xbox One, which is why you see that huge gap between the GPUs in the screen you posted.

5. Dude, just offloading physics to the GPU is enough to free resources which can be used for other things, while the Xbox One, if it used its GPU for the same, would become GPU-bound quite fast. Like I already told you, the CPU speed difference is minimal, and the PS4 has a straightforward memory system and is true HSA.

1. What does it matter whether it's badly coded or CPU-bound? The end result is the same: both require extra CPU performance.

2. Well, I don't know how much of that article is true, but Borderlands 2 doesn't support any of the PhysX effects you can find on the PC. That article is from 2013 as well; maybe it's completely out of date.

The benchmark you posted before is about PhysX on the PC. If Nvidia has a library for other physics computations, that may be true, but it has nothing to do with what you posted. The effects in Nvidia PhysX don't take up that many resources; they only eat a lot of CPU resources because they're coded for GPUs.

3. I never said physics wouldn't run faster on the GPU, at least not the bigger part. Some calculations are still better done on the CPU because of its lower latency; if you don't believe me, look it up on the wiki.

4. The huge gap on that graph is because the PS4's GPU has 50 percent more stream processors and uses GDDR5, not because it's heavily modified for GPGPU. Any current-gen GPU can do GPGPU.

5. And again, the physics we see today don't take up many resources; they can easily be done on the Xbox One APU. Of course, if you wanted hair or clothing physics on 500 NPCs, then GPGPU is greatly advised. That is what they tried in Unity, but apparently GPGPU wasn't enough, because the graph you mentioned comes from an article about clothing physics in Unity.

I know what you're going to say: it's badly coded. Yeah, they said that about Crysis as well, and they were right, but the optimized Crysis Warhead still murdered systems. Well, Ubisoft did find the culprit with Unity, didn't they: an 'overloaded instruction set'. They patched it with a fix, and it still couldn't match the Xbox One framerate.

Don't talk to me about parity either; they took their words back. They didn't want to look like fools after all the PS4 praise around Watch Dogs and their GPGPU statements. If they had tried 1080p on the PS4, it would simply have gone up in smoke.

Consoles are just not strong enough to pull off that kind of computational power. Just offloading all physics to GPGPU doesn't work when you don't have a strong CPU as well; the more physics you use, even if the bigger part is done by the GPU, the more CPU power you still need. You need those low-latency, high-clock cycles for on-the-fly calculations; optimization can do a lot, but it only goes so far.

And that is Sony's handicap. They put so much faith in that GPU and the GPGPU tools that they completely forgot about the CPU. Xbox made the same mistake, even worse since they used a much weaker GPU. Still, the Xbox got kind of lucky, because a 25 percent difference in CPU performance isn't minimal (10 percent from the overclock, 15 percent from the 7th core). That overclock brought them back into the game, and the 7th-core unlock makes the X1 very interesting at its cheaper price tag.
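For what it's worth, the 25 percent figure roughly follows from multiplying the two claimed advantages together, under the optimistic assumption of perfect scaling across cores (a back-of-the-envelope sketch; real games scale worse):

    xbo_clock, ps4_clock = 1.75, 1.60  # GHz, as reported
    xbo_cores, ps4_cores = 7, 6        # cores claimed available to games
    advantage = (xbo_clock / ps4_clock) * (xbo_cores / ps4_cores) - 1
    print(f"{advantage:.1%}")          # ~27.6% peak; considerably less in practice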

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#271  Edited By ronvalencia
Member since 2008 • 29612 Posts

@commander:

@commander said:

@ronvalencia said:

@commander:

Sony has already shown their PS4 CPU effective bandwidth usage.

GDDR5 module capacities don't match DDR3's, hence GDDR5 not being used as PC system memory.

GDDR5 memory module latencies are comparable to their DDR3 counterparts.

Maybe you should cite some sources to support what you're saying.

Let me help you

Xbox One:

DDR3 to GPU is 68GB/s

ESRAM to GPU is 206GB/s

DDR3 to CPU is 30GB/s

ESRAM to CPU is 30GB/s

PS4:

GDDR5 to GPU is 176GB/s

GDDR5 to CPU is 10GB/s

Looks like that 'effective bandwidth' for the CPU is a bit shabby on the PS4.

You are not comparing apples to apples.

XBO's effective bandwidth between GPU and DDR3 is around 55 GB/s NOT 68 GB/s.

DRAM refresh rate destroys any theoretical bandwidth numbers. Like GDDR5, HBM is still DRAM.

DRAM's latencies destroy any theoretical bandwidth numbers. HBM attempts to address this.

AMD memory controller destroys any theoretical bandwidth numbers. HBM attempts to address this.

External trace lines on PCB destroys any theoretical bandwidth numbers. HBM attempts to address this.

I'm already aware of XBO's 30 GB/s theoretical bandwidth to the DDR3 memory.
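For reference, the theoretical peaks being thrown around in this exchange fall out of bus width times data rate (a sketch using the commonly cited figures; as noted above, effective bandwidth lands below these peaks):

    def peak_bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
        # Bytes moved per transfer across the bus, times transfers per second.
        return bus_width_bits / 8 * data_rate_gt_s

    print(peak_bandwidth_gb_s(256, 5.5))    # PS4 GDDR5-5500: 176.0 GB/s peak
    print(peak_bandwidth_gb_s(256, 2.133))  # XBO DDR3-2133: ~68.3 GB/s peak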

----------------------

Intel's Havok Physics running on PS4's GCN. There's very little need for NVIDIA's PhysX.

[embedded video]

http://www.pcworld.com/article/2465211/intel-microsoft-promise-directx-12-could-halve-pc-graphics-power-draw.html

To test the power, Intel “frame locked” the demonstration app, then let the demo run. The power consumed dropped in half. But Intel then unlocked the frame rate and let the app run at full power—and the frame rate improved by more than 50 percent, too. In part, that’s because DirectX 12 enables the GPU to take over more of the traditional computing load, including physics and collisions.

DirectX 12 is a double-edged sword for the XBO: it enables more traditional computing workloads on the GPU, but the XBO is limited to 12 CUs (~1.31 TFLOPS).
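The TFLOPS figures quoted in this thread come from the standard GCN peak formula: CUs x 64 lanes x 2 ops per cycle (fused multiply-add) x clock. A quick sketch:

    def gcn_peak_tflops(cus, clock_mhz):
        # 64 shader lanes per CU, 2 floating-point ops per lane per cycle (FMA).
        return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

    print(gcn_peak_tflops(18, 800))  # PS4: ~1.84 TFLOPS
    print(gcn_peak_tflops(12, 853))  # XBO: ~1.31 TFLOPS
    # Ratio is ~1.4x: roughly 40% more peak compute, not double.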

Note that another limitation is back-write (BW) memory bandwidth, i.e. transferring the GPU's calculated results back to the CPU.

AMD plans to build a 300-watt monster APU for servers which is not limited to the PC's PCI-E 16x ver 3.0 bandwidth (combined 32 GB/s peak, 16 GB/s peak each way).

By 2017 AMD plans to introduce what it described as a High Performance Computing APU or HPC for short. This APU will carry a sizable TDP between 200 and 300 watts. This sort of APU, AMD expects, will excel in HPC applications. Similarly powerful APUs were not attempted up to this point because they were simply not viable due to the amount of memory bandwidth required to keep such a powerful APU fed. Thankfully however stacked HBM ( High Bandwidth Memory ) will make such designs not only possible but extremely effective as well. As the second generation of HBM is 9 times faster than GDDR5 memory and a whopping 128 times faster than DDR3 memory

Read more: http://wccftech.com/amd-gpu-apu-roadmaps-2015-2020-emerge/#ixzz3WQXjXRjO

The consoles after the current generation would be built on the above roadmap, i.e. they should have enough compute and memory bandwidth for 4K games.

Effective BW (back write) for the XBO would be DDR3's ~55 GB/s + ESRAM's ~104 GB/s, e.g. with split frame-buffer rendering. This rivals the PS4's effective BW. Remember, we already have a 12 CU part at 828MHz (FirePro W5000, 1.23 TFLOPS) with 153 GB/s theoretical bandwidth against a 16 CU part at 860MHz (7850, 1.76 TFLOPS) with 153 GB/s theoretical bandwidth, and the 7850 wins every time (with the same CPU and software ecosystem).
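The split-frame-buffer arithmetic above, spelled out (a sketch of the addition only; the ~55 and ~104 GB/s effective figures are the post's own estimates, not measurements):

    effective_ddr3_gb_s = 55.0    # XBO DDR3, effective (estimated above)
    effective_esram_gb_s = 104.0  # XBO ESRAM, effective (estimated above)
    # If render targets are split across the two pools, the bandwidths add:
    print(effective_ddr3_gb_s + effective_esram_gb_s)  # ~159 GB/s combined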

An example

FirePro W5000 = 33 fps.

7850 = 45 fps.

The results are obtained with a large Intel CPU; results would differ on AMD CPUs (less consistent performance, highly dependent on the programmer's use of concurrent multithreaded draw calls, and a lower single-core draw-call budget).

The end game on the GPU front:

An R7-265 with Mantle would be similar to the PS4 > a W5000 with Mantle would be similar to the XBO = PS4 wins.

The end game on the CPU front:

The XBO has the advantage on the CPU side, i.e. higher clock speed and I/O bandwidth.

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#272  Edited By tormentos
Member since 2003 • 33793 Posts

@commander said:

1. What does it matter whether it's badly coded or CPU-bound? The end result is the same: both require extra CPU performance.

2. Well, I don't know how much of that article is true, but Borderlands 2 doesn't support any of the PhysX effects you can find on the PC. That article is from 2013 as well; maybe it's completely out of date.

The benchmark you posted before is about PhysX on the PC. If Nvidia has a library for other physics computations, that may be true, but it has nothing to do with what you posted. The effects in Nvidia PhysX don't take up that many resources; they only eat a lot of CPU resources because they're coded for GPUs.

3. I never said physics wouldn't run faster on the GPU, at least not the bigger part. Some calculations are still better done on the CPU because of its lower latency; if you don't believe me, look it up on the wiki.

4. The huge gap on that graph is because the PS4's GPU has 50 percent more stream processors and uses GDDR5, not because it's heavily modified for GPGPU. Any current-gen GPU can do GPGPU.

5. And again, the physics we see today don't take up many resources; they can easily be done on the Xbox One APU. Of course, if you wanted hair or clothing physics on 500 NPCs, then GPGPU is greatly advised. That is what they tried in Unity, but apparently GPGPU wasn't enough, because the graph you mentioned comes from an article about clothing physics in Unity.

I know what you're going to say: it's badly coded. Yeah, they said that about Crysis as well, and they were right, but the optimized Crysis Warhead still murdered systems. Well, Ubisoft did find the culprit with Unity, didn't they: an 'overloaded instruction set'. They patched it with a fix, and it still couldn't match the Xbox One framerate.

Don't talk to me about parity either; they took their words back. They didn't want to look like fools after all the PS4 praise around Watch Dogs and their GPGPU statements. If they had tried 1080p on the PS4, it would simply have gone up in smoke.

Consoles are just not strong enough to pull off that kind of computational power. Just offloading all physics to GPGPU doesn't work when you don't have a strong CPU as well; the more physics you use, even if the bigger part is done by the GPU, the more CPU power you still need. You need those low-latency, high-clock cycles for on-the-fly calculations; optimization can do a lot, but it only goes so far.

And that is Sony's handicap. They put so much faith in that GPU and the GPGPU tools that they completely forgot about the CPU. Xbox made the same mistake, even worse since they used a much weaker GPU. Still, the Xbox got kind of lucky, because a 25 percent difference in CPU performance isn't minimal (10 percent from the overclock, 15 percent from the 7th core). That overclock brought them back into the game, and the 7th-core unlock makes the X1 very interesting at its cheaper price tag.

1 - Bad code will run badly on every platform, which is why it ran badly on everything, even PC. That is not a problem of the hardware; it's the developer's.

2 - By this point we've confirmed how full of shit you are. My link is undeniable; you are a joke who knows nothing about the things you discuss, just another lemming repeating the same crap that has already been proven wrong.

I don't really need to counter anything more; it just seems you lack any knowledge on this subject.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#273  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

@commander said:

1. What does it matter whether it's badly coded or CPU-bound? The end result is the same: both require extra CPU performance.

2. Well, I don't know how much of that article is true, but Borderlands 2 doesn't support any of the PhysX effects you can find on the PC. That article is from 2013 as well; maybe it's completely out of date.

The benchmark you posted before is about PhysX on the PC. If Nvidia has a library for other physics computations, that may be true, but it has nothing to do with what you posted. The effects in Nvidia PhysX don't take up that many resources; they only eat a lot of CPU resources because they're coded for GPUs.

3. I never said physics wouldn't run faster on the GPU, at least not the bigger part. Some calculations are still better done on the CPU because of its lower latency; if you don't believe me, look it up on the wiki.

4. The huge gap on that graph is because the PS4's GPU has 50 percent more stream processors and uses GDDR5, not because it's heavily modified for GPGPU. Any current-gen GPU can do GPGPU.

5. And again, the physics we see today don't take up many resources; they can easily be done on the Xbox One APU. Of course, if you wanted hair or clothing physics on 500 NPCs, then GPGPU is greatly advised. That is what they tried in Unity, but apparently GPGPU wasn't enough, because the graph you mentioned comes from an article about clothing physics in Unity.

I know what you're going to say: it's badly coded. Yeah, they said that about Crysis as well, and they were right, but the optimized Crysis Warhead still murdered systems. Well, Ubisoft did find the culprit with Unity, didn't they: an 'overloaded instruction set'. They patched it with a fix, and it still couldn't match the Xbox One framerate.

Don't talk to me about parity either; they took their words back. They didn't want to look like fools after all the PS4 praise around Watch Dogs and their GPGPU statements. If they had tried 1080p on the PS4, it would simply have gone up in smoke.

Consoles are just not strong enough to pull off that kind of computational power. Just offloading all physics to GPGPU doesn't work when you don't have a strong CPU as well; the more physics you use, even if the bigger part is done by the GPU, the more CPU power you still need. You need those low-latency, high-clock cycles for on-the-fly calculations; optimization can do a lot, but it only goes so far.

And that is Sony's handicap. They put so much faith in that GPU and the GPGPU tools that they completely forgot about the CPU. Xbox made the same mistake, even worse since they used a much weaker GPU. Still, the Xbox got kind of lucky, because a 25 percent difference in CPU performance isn't minimal (10 percent from the overclock, 15 percent from the 7th core). That overclock brought them back into the game, and the 7th-core unlock makes the X1 very interesting at its cheaper price tag.

1 - Bad code will run badly on every platform, which is why it ran badly on everything, even PC. That is not a problem of the hardware; it's the developer's.

2 - By this point we've confirmed how full of shit you are. My link is undeniable; you are a joke who knows nothing about the things you discuss, just another lemming repeating the same crap that has already been proven wrong.

I don't really need to counter anything more; it just seems you lack any knowledge on this subject.

1. That's true, but it's also the only game with hundreds of NPCs simultaneously on screen; it's pretty much one of the only truly next-gen games. Even if it was that badly coded, some other games seem to dig the extra CPU power on the X1 as well. We'll see what happens with upcoming next-gen multiplats. I remember you saying back in 2013 that ESRAM wasn't going to improve anything either.

2. From Eurogamer: 'taking a closer look at the game reveals that both console versions of the Handsome Collection are a match for the PC game running on the highest settings, minus the use of the Nvidia-powered PhysX effect'

http://www.eurogamer.net/articles/digitalfoundry-2015-borderlands-the-handsome-collection-face-off

The PS4 has no Nvidia PhysX, end of story.

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#274 tormentos
Member since 2003 • 33793 Posts

@commander said:

1. That's true, but it's also the only game with hundreds of NPCs simultaneously on screen; it's pretty much one of the only truly next-gen games. Even if it was that badly coded, some other games seem to dig the extra CPU power on the X1 as well. We'll see what happens with upcoming next-gen multiplats. I remember you saying back in 2013 that ESRAM wasn't going to improve anything either.

2. From Eurogamer: 'taking a closer look at the game reveals that both console versions of the Handsome Collection are a match for the PC game running on the highest settings, minus the use of the Nvidia-powered PhysX effect'

http://www.eurogamer.net/articles/digitalfoundry-2015-borderlands-the-handsome-collection-face-off

The PS4 has no Nvidia PhysX, end of story.

You are an idiot.

Borderlands didn't use Nvidia PhysX; that doesn't mean it can't, you moron. I quoted Nvidia stating that PhysX works on PS4. You are in denial.

Nvidia says so, and PhysX even worked on the Xbox 360 and PS3, buffoon.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#275 commander
Member since 2010 • 16217 Posts

@tormentos said:

@commander said:

1. That's true, but it's also the only game with hundreds of NPCs simultaneously on screen; it's pretty much one of the only truly next-gen games. Even if it was that badly coded, some other games seem to dig the extra CPU power on the X1 as well. We'll see what happens with upcoming next-gen multiplats. I remember you saying back in 2013 that ESRAM wasn't going to improve anything either.

2. From Eurogamer: 'taking a closer look at the game reveals that both console versions of the Handsome Collection are a match for the PC game running on the highest settings, minus the use of the Nvidia-powered PhysX effect'

http://www.eurogamer.net/articles/digitalfoundry-2015-borderlands-the-handsome-collection-face-off

The PS4 has no Nvidia PhysX, end of story.

You are an idiot.

Borderlands didn't use Nvidia PhysX; that doesn't mean it can't, you moron. I quoted Nvidia stating that PhysX works on PS4. You are in denial.

Nvidia says so, and PhysX even worked on the Xbox 360 and PS3, buffoon.

OK, hotshot.

Show me a game that uses PhysX, then.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#276  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

Sony's 1st-party games use Havok.

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#277  Edited By tormentos
Member since 2003 • 33793 Posts

@ronvalencia said:

@tormentos:

Sony's 1st party games uses Havok.

Yep, I know, but the dude arguing with you and me claims Nvidia PhysX is something different from physics, and he believes it can't run on PS4 when PhysX in its original state even ran on the 360.

@commander said:

OK, hotshot.

Show me a game that uses PhysX, then.

http://www.tomsguide.com/us/PlayStation-3-PhysX-Nvidia,news-3638.html

It even ran on PS3, and that was when Nvidia signed a deal with Sony; before that, when PhysX belonged to Ageia, it worked too.

The only thing keeping PhysX from running on AMD cards is the license, which belongs to Nvidia. As of now PhysX is just a library that runs on multiple CPUs and even GPUs; Nvidia simply doesn't want it on AMD cards so it can use it as a weapon against AMD.

PhysX is the name of Nvidia's physics SDK, much like Havok is just a name.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#278  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:

@ronvalencia said:

@tormentos:

Sony's 1st-party games use Havok.

Yep, I know, but the dude arguing with you and me claims Nvidia PhysX is something different from physics, and he believes it can't run on PS4 when PhysX in its original state even ran on the 360.

@commander said:

OK, hotshot.

Show me a game that uses PhysX, then.

http://www.tomsguide.com/us/PlayStation-3-PhysX-Nvidia,news-3638.html

It even ran on PS3, and that was when Nvidia signed a deal with Sony; before that, when PhysX belonged to Ageia, it worked too.

The only thing keeping PhysX from running on AMD cards is the license, which belongs to Nvidia. As of now PhysX is just a library that runs on multiple CPUs and even GPUs; Nvidia simply doesn't want it on AMD cards so it can use it as a weapon against AMD.

PhysX is the name of Nvidia's physics SDK, much like Havok is just a name.

Sorry, but Havok isn't part of the Nvidia PhysX library. Havok was actually the physics middleware most used in video games last gen.

And that article is from 2009. You still haven't shown me a game that uses PhysX.

Avatar image for emil_fontz
Emil_Fontz

799

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#279 Emil_Fontz
Member since 2014 • 799 Posts

The PS4 has no weakness; it's a beautifully built console-powerhouse that will serve as the home for a multitude of exclusive AAA titles. The XBONE pales in comparison.

Avatar image for lostrib
lostrib

49999

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#280 lostrib
Member since 2009 • 49999 Posts

@emil_fontz said:

The PS4 has no weakness;

you can't be serious

Avatar image for robert_mueller
Robert_Mueller

164

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#281 Robert_Mueller
Member since 2015 • 164 Posts
@nyadc said:

@clyde46 said:

Hang on, the CPU in both consoles is the same bar a 100MHz difference in clock speed. If you think that software can suddenly make the X1 more potent then you are delusional.

It's 150MHz; also, the Xbox One has the ability to devote an entire CPU core more than the PlayStation 4 to games if a developer sees fit to make use of it.

That's not actually true. The 7th core has *not* been made available for gaming applications *completely*.

Avatar image for nyadc
NyaDC

8006

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 5

#282 NyaDC
Member since 2014 • 8006 Posts

@tormentos said:

@nyadc said:

CPU's still handle the bulk of physics workloads, they're substantially better at crunching that kind of data than a GPU.

Sony cannot overclock, their temperatures are essentially the same as the Xbox One, however the Xbox One has an overclocked CPU and GPU. This accounts for the nearly negligible temperature difference, it would be considerably lower if it were running stock frequencies and voltages. If Sony attempted to overclock their system the temperature would spike dramatically and exceed the thermal safe temps for the hardware which is in the 55+ range, that is why the Xbox One can be overclocked, because it is so well ventilated and stays in a safe temperature range.

Sony doesn't need to as their GPU is more powerful, however it also comes down to they are incapable. The only way they could OC that system and actually have it operate without hardware failure and overheating would be to spin up the fan dramatically which would create an insane amount of noise, that's never going to happen.

You're greatly overestimating the PlayStation 4's capabilities and acting as if its limitations have not already been hit, they have been. The Xbox One and it are capable of outputting games at the exact same graphical fidelity, the only difference will ever be the resolution or independent of that the framerate, that's it. This is all just computer hardware now, Microsoft can run games at the same exact graphical settings they just have to dial back the resolution.

If anything, Sony should be worried: they've hit a CPU bottleneck that the Xbox One has not, thanks to its OC, and that bottleneck keeps the GPU from operating at full usage; this has been showing up in some games lately. If developers actually take advantage of that extra core Microsoft has put on the table, it would bring the nearly 10% faster Xbox One CPU up to roughly a 25% advantage. If your CPU is a bottleneck, it's going to drag the GPU down, and this can only get worse, not better.

The fact that you're even using that picture as some type of citing example just ended our conversation, this is why I can't take console gamers seriously, you people are discussing things outside of your realm of understanding. It's a synthetic and isolated benchmark that has absolutely zero bearing on practical gaming applications...

NO, the bulk is passed to the GPU; that is the point. Why the hell would you run physics on the GPU if most of the bulk stayed on the CPU? That makes no sense whatsoever.

[embedded video]

This is a nice example of how physics drives frames to the floor when run on a CPU, and how using a GPU can speed things up considerably.

On the GPU the framerate stays over 30 the whole time, while on the CPU it drops as low as 5FPS; head to head, that is 6 times slower.

There is something you should know about the Xbox One: it is in no way overclocked; if anything it is underclocked. I know you will say it was overclocked to 1.75GHz on the CPU and by 53MHz on the GPU, but the fact is they just raised the clocks a little. The Xbox One's PC GPU equivalent, the 7790, runs at 1027MHz, not 800 or 853MHz, and the PS4's GPU equivalent also runs at 1.0GHz on PC. So you see, neither is really overclocked relative to what that specific part does in a PC; quite the contrary.

The same goes for the CPU, which I think can go up to 2.0GHz on Jaguar.

No, it would not; you are assuming the temperature would dramatically increase based on nothing. That clock rise MS did probably raised the temperature 5 degrees or less, and neither console runs very hot.

If the Xbox One were so well ventilated it would run colder than the PS4 in games. It draws 5 fewer watts yet produces more heat than the PS4. Not only that, the PS4 has an internal PSU and a smaller case; it is obvious which is doing more yet running colder. The PS4 has the superior cooling solution; a bigger fan means nothing if the heat isn't dissipated well enough.

No, they are not; they could very well overclock, they just don't need to, and no one has proof otherwise. It's just a baseless assumption because the Xbox One has a big fan. I took a course in PC repair, and one of the first things they teach you is that it doesn't matter how many fans your PC has: if it is not dissipating heat well, you have a problem.

No, it hasn't; that is something I have learned from all Sony systems: first-year games are nothing compared to those that come later. Compare Resistance vs Killzone 2, or Uncharted 1 vs TLOU.

The fact that you say "same exact graphical fidelity" but then say "lower resolution" basically kills your own argument.

Now that last bold part shows you know nothing about this; you are just another alt running the same stuff spewed here and proven wrong already. A CPU stronger than the ones inside the Xbox One yields 1 or 2 more frames from a 200MHz difference, and the Xbox One has 150MHz on a weak POS CPU; the whole CPU difference will amount to nothing, 1 or 2 frames, while on the GPU side the PS4 has up to a 30FPS lead in some games. Yeah, that CPU just gets the Xbox One 32 frames in TR instead of a locked 30, while the PS4 keeps hitting 50 and 60.

The only reasons why some games are faster on Xbox One are these:

1 - A screwed-up job: ACU, RE.

2 - The PS4 punching above its weight, so it hits 1080p but can only manage framerates in the 50s or an unlocked framerate: COD Ghosts, Advanced Warfare.

The difference in CPU is nothing, and that 150MHz will hardly give the Xbox One a frame, especially if you use compute on the PS4 to counter, since the Xbox One has no spare GPU resources to do the same.

You are a joke, and pretending to be a hermit is a joke after that nonsense about the Xbox One CPU, lemming. That chart you are downplaying shows the difference in compute between the Xbox One and PS4: the CPU gap is small, see it? The GPU gap is huge, see it as well? Yeah, that chart wasn't made by me; it comes from Ubisoft's tests. Time to admit it, lemming: the XBO will always be behind.

@commander said:

Physics 6 times slower on the cpu?

Physx=physics?

Yeah, I got a pretty good laugh out of that as well; console gamers strike again... He actually thinks that the PhysX feature set, which appears in maybe 40 games across its entire existence, has any relevance to how CPUs and GPUs handle normal physics calculations. It's hilarious. I don't even need to say anything, really; he just keeps making an ass of himself with absolutely no help from us... This is what happens when you speak outside your realm of knowledge and understanding, folks...

Tormentos, stop talking, dude. The PlayStation 4 has a weaker CPU and it's a bottleneck. Stop trying to skirt the issue of ACU having poor development; the PS4 can't handle the AI calculations being thrown at the CPU as well as the Xbox One CPU can, as the latter is more powerful. Sure, both of them perform like crap, but the PlayStation 4 version performs the way it does specifically because of its CPU. Having a 40% more powerful GPU won't do anything for you if your CPU is bottlenecked by AI calculations.

P.S. Graphical fidelity has to do with the visual capability and settings within a game engine; it has nothing to do with the resolution.

P.P.S. Using a synthetic benchmark to gauge real-world game performance is something an uninformed, ignorant person would do. You think citing that Ubisoft benchmark helps your argument, lol? On the contrary, it damages what you're trying to say considerably. I'm going to keep saying it as many times as necessary: you console gamers talk so far beyond your capacity of knowledge and information, it's incredible. Seriously, shut the hell up....

Avatar image for nyadc
NyaDC

8006

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 5

#283  Edited By NyaDC
Member since 2014 • 8006 Posts

@robert_mueller said:
@nyadc said:

@clyde46 said:

Hang on, the CPU in both consoles is the same bar a 100MHz difference in clock speed. If you think that software can suddenly make the X1 more potent then you are delusional.

It's 150MHz; also, the Xbox One has the ability to devote an entire CPU core more than the PlayStation 4 to games if a developer sees fit to make use of it.

That's not actually true. The 7th core has *not* been made available for gaming applications *completely*.

It's been made entirely available; however, they are running into conflicts. This has all been gone over, and they may need to dial back the usage to avoid it if another solution is not reached.

Avatar image for robert_mueller
Robert_Mueller

164

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#284  Edited By Robert_Mueller
Member since 2015 • 164 Posts
@nyadc said:

Tormentos, stop talking, dude. The PlayStation 4 has a weaker CPU and it's a bottleneck.

Yes, the PS4 is unbalanced on purpose. It's not something they overlooked. It was designed that way because Sony does not want to build consoles that can be squeezed to the max at the beginning of their lifetime.

Stop trying to skirt the issue of ACU having poor development; the PS4 can't handle the AI calculations being thrown at the CPU as well as the Xbox One CPU can, as the latter is more powerful. Sure, both of them perform like crap, but the PlayStation 4 version performs the way it does specifically because of its CPU. Having a 40% more powerful GPU won't do anything for you if your CPU is bottlenecked by AI calculations.

While it might be true that some AI calculations are not well suited to GPGPU, there are other calculations that are traditionally performed on the CPU even though they could be moved to the GPU. If every calculation that *can* be moved to the GPU *is* actually moved to the GPU, this frees CPU resources that can then be used for the calculations that are not well suited to GPGPU.
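A toy frame-budget model of that point (an illustration with invented millisecond figures, not profiling data): offloading the GPU-friendly work is what creates CPU headroom for the work that genuinely needs low-latency serial execution.

    cpu_budget_ms = 16.7  # one frame at 60 fps
    ai_ms, physics_ms, other_ms = 6.0, 5.0, 7.0  # hypothetical per-frame CPU costs

    # Everything on the CPU: 18.0 ms, over budget, so the game is CPU-bound.
    print(ai_ms + physics_ms + other_ms > cpu_budget_ms)   # True

    # Physics moved to GPGPU: the remaining 13.0 ms of CPU work now fits.
    print(ai_ms + other_ms <= cpu_budget_ms)               # True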

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#285 tormentos
Member since 2003 • 33793 Posts

@commander said:

Sorry, but Havok isn't part of the Nvidia PhysX library. Havok was actually the physics middleware most used in video games last gen.

And that article is from 2009. You still haven't shown me a game that uses PhysX.

Who said Havok was part of PhysX? Havok is physics middleware just like PhysX; the only difference is that Havok is licensed to everyone, unlike PhysX, which Nvidia doesn't want on AMD hardware.

No console out now needs PhysX to run physics; Havok and PhysX both worked on last-gen consoles and still work on the new ones. You are just too ignorant to get it.

Batman uses PhysX for Batman's cape and other cloth simulation; The Order uses cloth simulation too, it looks incredibly impressive, and it doesn't use PhysX.

So not only does PhysX work on PS4, it's also not needed. Havok is there, as well as others; it's up to developers to decide which middleware they choose for the job, just like choosing Unreal versus another engine.

Butbubut....lol

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#286  Edited By commander
Member since 2010 • 16217 Posts

@robert_mueller said:
@nyadc said:

Tormentos, stop talking, dude. The PlayStation 4 has a weaker CPU and it's a bottleneck.

Yes, the PS4 is unbalanced on purpose. It's not something they overlooked. It was designed that way because Sony does not want to build consoles that can be squeezed to the max at the beginning of their lifetime.

You did not know that?

Avatar image for nyadc
NyaDC

8006

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 5

#287  Edited By NyaDC
Member since 2014 • 8006 Posts

@robert_mueller said:
@nyadc said:

Tormentos, stop talking, dude. The PlayStation 4 has a weaker CPU and it's a bottleneck.

Yes, the PS4 is unbalanced on purpose. It's not something they overlooked. It was designed that way because Sony does not want to build consoles that can be squeezed to the max at the beginning of their lifetime.

That's not how this works lol...

Avatar image for robert_mueller
Robert_Mueller

164

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#288 Robert_Mueller
Member since 2015 • 164 Posts
@commander said:

@robert_mueller said:

Yes, the PS4 is unbalanced on purpose. It's not something they overlooked. It was designed that way because Sony does not want to build consoles that can be squeezed to the max at the beginning of their lifetime.

You did not know that?

I thought it was obvious. Look at the PS2, look at the PS3. Read Kutaragi's comments. Look at the PS4. Read Cerny's comments.

There is no doubt that the imbalance between the CPU and GPU of the PS4 is an intentional design decision. The reason is that Sony still wants to challenge developers: maybe not as much as with the PS3, but still enough to keep them from exploiting the hardware too early.

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#289 tormentos
Member since 2003 • 33793 Posts

@nyadc said:

Yeah, I got a pretty good laugh out of that as well; console gamers strike again... He actually thinks that the PhysX feature set, which appears in maybe 40 games across its entire existence, has any relevance to how CPUs and GPUs handle normal physics calculations. It's hilarious. I don't even need to say anything, really; he just keeps making an ass of himself with absolutely no help from us... This is what happens when you speak outside your realm of knowledge and understanding, folks...

Tormentos, stop talking, dude. The PlayStation 4 has a weaker CPU and it's a bottleneck. Stop trying to skirt the issue of ACU having poor development; the PS4 can't handle the AI calculations being thrown at the CPU as well as the Xbox One CPU can, as the latter is more powerful. Sure, both of them perform like crap, but the PlayStation 4 version performs the way it does specifically because of its CPU. Having a 40% more powerful GPU won't do anything for you if your CPU is bottlenecked by AI calculations.

P.S. Graphical fidelity has to do with the visual capability and settings within a game engine; it has nothing to do with the resolution.

P.P.S. Using a synthetic benchmark to gauge real-world game performance is something an uninformed, ignorant person would do. You think citing that Ubisoft benchmark helps your argument, lol? On the contrary, it damages what you're trying to say considerably. I'm going to keep saying it as many times as necessary: you console gamers talk so far beyond your capacity of knowledge and information, it's incredible. Seriously, shut the hell up....

Probably desperate laughter, right? Because PhysX is physics; that is the name of Nvidia's physics library, you fools... hahaha, not even that you know... lol

You don't say anything because you CAN'T; what I stated is a fact.

The CPU difference between the two is nothing; the GPU difference is quite a bit bigger, and fanboys like you have downplayed it for more than a year. So 50% more CUs will do nothing, but a 9% CPU boost will change everything? Hahaha.

You lemmings are really butthurt this gen; it must be killing you that even a broke-ass Sony made a more powerful console than MS... hahaha.

All versions performed like crap, even the PC version, and it was patched; by the way, Ubisoft admitted the CPU wasn't the problem... hahaha.

Wait, so citing the Ubisoft benchmark serves me nothing, even though it shows the real compute gap between both units? But somehow your argument serves for something, right?

Man, stop. You are a butthurt lemming, and trying to act like a hermit serves you nothing; we have already seen that act here 100 times. And after the nonsense you pulled there, you keep killing your own arguments: now the Ubisoft benchmark is invalid? Oh sure it is, because it proves you wrong... hahaha.

So here it is again, so you can admire the huge gap in compute between the GPUs on both platforms...

While we are at it: it is going to be a long gen for you lemmings; better have something to wipe those tears.

@nyadc said:

It's been made entirely available; however, they are running into conflicts. This has all been gone over, and they may need to dial back the usage to avoid it if another solution is not reached.

NO, learn to read: it's only 80%, and of that you can only use a certain amount, because Kinect still has access to system calls. If you use the 7th core you can use the small part that was destined for in-game voice commands, but system-level commands like "Xbox, record that" are still live and will instantly consume 50% of the core, so you can't count on it; only about 30% can really be used, and there is no way to know how much is free.
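The arithmetic behind that claim, as the post describes it (a sketch of the post's own numbers; none of this is confirmed platform documentation):

    core_offered = 0.80    # fraction of the 7th core said to be exposed to games
    system_reserve = 0.50  # fraction system voice commands can reclaim at any time
    print(core_offered - system_reserve)  # ~0.3 of a core reliably usable, on these numbers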

You people should educate yourselves instead of repeating crap like monkeys.

Avatar image for nyadc
NyaDC

8006

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 5

#290 NyaDC
Member since 2014 • 8006 Posts

I can't deal with this idiocy any longer. Commander, you can have fun, dude; you can't change stupid....

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#291 commander
Member since 2010 • 16217 Posts

@robert_mueller said:
@commander said:

@robert_mueller said:

Yes, the PS4 is unbalanced on purpose. It's not something they overlooked. It was designed that way because Sony does not want to build consoles that can be squeezed to the max at the beginning of their lifetime.

You did not know that?

I thought it was obvious. Look at the PS2, look at the PS3. Read Kutaragi's comments. Look at the PS4. Read Cerny's comments.

There is no doubt that the imbalance between the CPU and GPU of the PS4 is an intentional design decision. The reason is that Sony still wants to challenge developers: maybe not as much as with the PS3, but still enough to keep them from exploiting the hardware too early.

That's not on purpose; Sony is just not very good at making consoles.

Avatar image for robert_mueller
Robert_Mueller

164

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#292 Robert_Mueller
Member since 2015 • 164 Posts
@commander said:

That's not on purpose; Sony is just not very good at making consoles.

You are wrong. There is evidence that it was done on purpose:

  1. The additional Onion+ bus.
  2. The volatile bit that was added by AMD at Sony's explicit request.
  3. The number of ACE units, which was increased by AMD at Sony's explicit request.

It is debatable how effective these measures are, but they clearly prove that Sony went for GPGPU deliberately. And when designing a platform around GPGPU, you must shift the CPU/GPU ratio towards the GPU.

So please stop denying it and admit that Sony did this on purpose because they wanted to focus on GPGPU. Feel free to doubt the effectiveness of this design, but do not deny that the PS4 was designed this way on purpose; otherwise you would be lying consciously.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#293  Edited By commander
Member since 2010 • 16217 Posts

@robert_mueller said:
@commander said:

That's not on purpose; Sony is just not very good at making consoles.

You are wrong. There is evidence that it was done on purpose:

  1. The additional Onion+ bus.
  2. The volatile bit that was added by AMD at Sony's explicit request.
  3. The number of ACE units, which was increased by AMD at Sony's explicit request.

It is debatable how effective these measures are, but they clearly prove that Sony went for GPGPU deliberately. And when designing a platform around GPGPU, you must shift the CPU/GPU ratio towards the GPU.

So please stop denying it and admit that Sony did this on purpose because they wanted to focus on GPGPU. Feel free to doubt the effectiveness of this design, but do not deny that the PS4 was designed this way on purpose; otherwise you would be lying consciously.

Yeah, of course they went with GPGPU deliberately: it was the cheapest option.

The PS3 was another story: there they wanted to make the strongest system possible to surpass the X360. They hadn't forgotten how Microsoft barged into the console business with the Xbox, whose main selling point was that it was a lot stronger than the PS2.

The only problem was that the X360 was already a very powerful system for its time. Back then you couldn't really make a console stronger than the PS3 without going to SLI cards, which no one ever did, and besides, the power envelope would have been crazy.

The PS2 wasn't very strong because they didn't really need it to be, given the huge success of the PS1, and it released before all the other systems (besides the inferior Dreamcast).

The PS1 was only good because of two things: it gambled on going 3D, and it used the CD for data storage. The combination of those two made the system so successful; it's actually the only good system Sony ever made. But had Nintendo made use of the CD (which was actually the initial plan; Sony and Nintendo were going to work on a console together), the N64 would have been a very powerful system.

It would have been way better than the PlayStation, but because it was so nerfed by that cartridge, Sony conquered the whole console market (probably even with the 3D plans that were made at Nintendo).

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#294 tormentos
Member since 2003 • 33793 Posts

@nyadc said:

I can't deal with this idiocy any longer. Commander, you can have fun, dude; you can't change stupid....

Oh, don't be so hard on yourself...

@commander said:

That's not on purpose; Sony is just not very good at making consoles.

It's not Sony who has a console that looks like an '80s VHS deck while being weaker and running hotter. lol

@nyadc said:

That's not how this works lol...

Actually, the PS4 is a very balanced system, contrary to what he thinks. What is unbalanced is the Xbox One: a weak GPU, a big pool of slow memory, a small pool of fast memory, and system reservations.

Avatar image for robert_mueller
Robert_Mueller

164

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#295  Edited By Robert_Mueller
Member since 2015 • 164 Posts
@commander said:

Yeah, of course they went with GPGPU deliberately: it was the cheapest option.

That is obviously *not* correct, because Microsoft's approach with ESRAM and DDR3 was even cheaper.

The PS3 was another story: there they wanted to make the strongest system possible to surpass the X360. They hadn't forgotten how Microsoft barged into the console business with the Xbox, whose main selling point was that it was a lot stronger than the PS2.

The only problem was that the X360 was already a very powerful system for its time. Back then you couldn't really make a console stronger than the PS3 without going to SLI cards, which no one ever did, and besides, the power envelope would have been crazy.

The main issue with the PS3 was that it came out one year behind schedule. The original design did not include a full GPU, only a pure rasterizer chip as in the PS2; the rest of the graphics calculations were supposed to be performed on the Cell's SPUs. At some point they concluded that this approach (inherited from the PS2) would no longer be competitive, so they did a major redesign, resulting in a one-year delay.

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#296 Daious
Member since 2013 • 2315 Posts

@robert_mueller said:
@commander said:

@robert_mueller said:

Yes, the PS4 is unbalanced on purpose. It's not something they overlooked. It was designed that way because Sony does not want to build consoles that can be squeezed to the max at the beginning of their lifetime.

You did not know that?

I thought it was obvious. Look at the PS2, look at the PS3. Read Kutaragi's comments. Look at the PS4. Read Cerny's comments.

There is no doubt that the imbalance between the CPU and GPU of the PS4 is an intentional design decision. The reason is that Sony still wants to challenge developers: maybe not as much as with the PS3, but still enough to keep them from exploiting the hardware too early.

I wonder how many people this will bait into responding.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#297  Edited By ronvalencia
Member since 2008 • 29612 Posts

@robert_mueller:

@robert_mueller said:
@commander said:

That's not on purpose; Sony is just not very good at making consoles.

You are wrong. There is evidence that it was done on purpose:

  1. The additional Onion+ bus.
  2. The volatile bit that was added by AMD at Sony's explicit request.
  3. The number of ACE units, which was increased by AMD at Sony's explicit request.

It is debatable how effective these measures are, but they clearly prove that Sony went for GPGPU deliberately. And when designing a platform around GPGPU, you must shift the CPU/GPU ratio towards the GPU.

So please stop denying it and admit that Sony did this on purpose because they wanted to focus on GPGPU. Feel free to doubt the effectiveness of this design, but do not deny that the PS4 was designed this way on purpose; otherwise you would be lying consciously.

3. Hawaii GCN was released ahead of the PS4.

The Temash APU has 4 ACE units and was also released ahead of the PS4.

Hints of future ACE unit growth were already there from the 7970's release (Dec 2011), i.e. the nth count.

@commander said:

Sorry, but Havok isn't part of the Nvidia PhysX library. Havok was actually the physics middleware most used in video games last gen.

And that article is from 2009. You still haven't shown me a game that uses PhysX.

Read http://en.wikipedia.org/wiki/List_of_games_using_Havok and sort by year.

Crytek's CryEngine 3 has its own physics engine.

EA's Frostbite 3 has its own physics engine.

Oxide's Nitrous has its own physics engine: a native DirectX 12 and Mantle 3D-plus-physics engine.

Square Enix/Crystal Dynamics has its own physics engine and leverages AMD's TressFX.

AMD Gaming Evolved games usually avoid the NVIDIA PhysX library.

AMD's TressFX v2 runs on GCN consoles and DX11 PCs. There's very little need for NVIDIA HairWorks.

Oxide's DX12/Mantle engine is an RTS engine with large-scale destruction, hence it doesn't need NVIDIA's PhysX destruction.

Avatar image for Gue1
Gue1

12171

Forum Posts

0

Wiki Points

0

Followers

Reviews: 7

User Lists: 0

#299 Gue1
Member since 2004 • 12171 Posts

@robert_mueller said:

You are wrong. There is evidence that it was done on purpose:

  1. The additional Onion+ bus.
  2. The volatile bit that was added by AMD at Sony's explicit request.
  3. The number of ACE units, which was increased by AMD at Sony's explicit request.

It is debatable how effective these measures are, but they clearly prove that Sony went for GPGPU deliberately. And when designing a platform around GPGPU, you must shift the CPU/GPU ratio towards the GPU.

So please stop denying it and admit that Sony did this on purpose because they wanted to focus on GPGPU. Feel free to doubt the effectiveness of this design, but do not deny that the PS4 was designed this way on purpose; otherwise you would be lying consciously.

Kutaragi just liked to go exotic with hardware. The guy was a visionary, always on a mission to create something revolutionary and new. And while it's true that the PS4 was made unbalanced on purpose, I think it was to save money.

Avatar image for SonySoldier-_-
SonySoldier-_-

1186

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#300 SonySoldier-_-
Member since 2012 • 1186 Posts

The TC has convinced me. I'm getting an Xbone to enjoy multiplats at lower resolutions because of the weaker GPU.

I'm looking forward to Dragon Age in 900p, COD Ghosts in 720p, Advanced Warfare in 900p, BF4 in 720p, BF Hardline in 720p, etc. The list goes on and on, and the Xbone never stops showing its weakness.