PS4's weak spot quite obvious in TW3's 1.08 patch


This topic is locked from further discussion.

#51  Edited By commander
Member since 2010 • 16217 Posts

@04dcarraher said:
@commander said:
@nyadc said:

Post-processing is entirely handled by the GPU, it's post-rendering... While it's obvious there is a CPU bottleneck taking place on the PlayStation 4, don't make shit up....

Not necessarily. Motion blur has been done on the gpu since gpgpu tools came along, but it was mostly done on the cpu in the past.

Since the xboxone doesn't get any framerate hit because of it, while it has a weaker gpu and a stronger cpu, it's obvious it's done on the cpu.

Or the esram could be handling it. Either way, even with it enabled on the xboxone and disabled on the ps4, the xboxone still has better framerates.

Post processing effects, ie graphical effects, are handled by the gpu, not the cpu...... fact is that the X1 has a bit faster cpu with a weaker gpu, which requires a lower rate of data to be fed (less cpu work). That allows the system not to be bottlenecked as much as the PS4 when more cpu-intensive items come up. It has nothing to do with esram or the cpu handling the post processing....

@nyadc said:

This isn't a debate; all post processing is a GPU-bound task. It takes the rendering output and reprocesses the already rendered image, adding visual effects.

Motion blur, anti-aliasing, anisotropic filtering etc are all post processes, and all post processing is done by the GPU...

Not in the past it wasn't (click on the picture if you can't read it)

link

It seems that they did the motion blur on the cpu with the witcher 3, which explains why the xboxone doesn't get a framerate hit and the ps4 does.

Why didn't they use gpgpu directcompute? Well, probably because they didn't have any gpu resources left, since they run the game at 1080p.

Don't forget that even if they had used gpgpu directcompute, the game would still run worse on the ps4, and it would run at the same resolution as the xboxone (since the x1 still has better framerates than the ps4 when motion blur is enabled on the x1 and disabled on the ps4).
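For context, "gpgpu directcompute" here means running the blur as a compute shader instead of on the cpu. Below is a minimal host-side sketch of what dispatching such a pass looks like in Direct3D 11; every name in it (blurCS, sceneSRV, velocitySRV, outputUAV) is a hypothetical placeholder, not anything from CDPR's actual engine.

#include <d3d11.h>

// Dispatch a full-screen motion-blur pass as a compute shader.
// 'sceneSRV' is the rendered frame, 'velocitySRV' holds per-pixel
// motion vectors, 'outputUAV' receives the blurred result.
// Assumes the (hypothetical) shader declares 8x8 thread groups.
void DispatchMotionBlur(ID3D11DeviceContext* ctx,
                        ID3D11ComputeShader* blurCS,
                        ID3D11ShaderResourceView* sceneSRV,
                        ID3D11ShaderResourceView* velocitySRV,
                        ID3D11UnorderedAccessView* outputUAV,
                        UINT width, UINT height)
{
    ID3D11ShaderResourceView* srvs[2] = { sceneSRV, velocitySRV };
    ctx->CSSetShader(blurCS, nullptr, 0);
    ctx->CSSetShaderResources(0, 2, srvs);
    ctx->CSSetUnorderedAccessViews(0, 1, &outputUAV, nullptr);

    // One thread per pixel, rounded up to whole 8x8 groups.
    ctx->Dispatch((width + 7) / 8, (height + 7) / 8, 1);

    // Unbind the output so later passes can read it as an input.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);
}

The point of the sketch: done this way the blur costs gpu time (the Dispatch), with the cpu only issuing a handful of calls, which is why the posts below argue over where the real cost lands.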

#53 04dcarraher
Member since 2004 • 23858 Posts

@commander said:

Not in the past it wasn't (click on the picture if you can't read it)

link

It seems that they did the motion blur on the cpu with the witcher 3, which explains why the xboxone doesn't get a framerate hit and the ps4 does.

Why didn't they use gpgpu directcompute? Well, probably because they didn't have any gpu resources left, since they run the game at 1080p.

Don't forget that even if they had used gpgpu directcompute, the game would still run worse on the ps4, and it would run at the same resolution as the xboxone (since the x1 still has better framerates than the ps4 when motion blur is enabled on the x1 and disabled on the ps4).

The use of the cpu for motion blur dates from the pre-unified-shader gpu era; it has no bearing on The Witcher 3. Around 2007, motion blur killed the performance of gpus that were not unified-shader based....

So no, they did not use the cpu for motion blur on the X1.

#54 commander
Member since 2010 • 16217 Posts

@04dcarraher said:
@commander said:

Not in the past it wasn't (click on the picture if you can't read it)

link

It seems that they did the motion blur on the cpu with the witcher 3, which explains why the xboxone doesn't get a framerate hit and the ps4 does.

Why didn't they use gpgpu directcompute? Well, probably because they didn't have any gpu resources left, since they run the game at 1080p.

Don't forget that even if they had used gpgpu directcompute, the game would still run worse on the ps4, and it would run at the same resolution as the xboxone (since the x1 still has better framerates than the ps4 when motion blur is enabled on the x1 and disabled on the ps4).

The use of the cpu for motion blur dates from the pre-unified-shader gpu era; it has no bearing on The Witcher 3. Around 2007, motion blur killed the performance of gpus that were not unified-shader based....

So no, they did not use the cpu for motion blur on the X1.

how do you explain the framerate hit on the ps4 then, and not on the xboxone?

The ps4 has a much better graphics card, so that doesn't make any sense.

#55 slimdogmilionar
Member since 2014 • 1345 Posts

@DragonfireXZ95 said:

Does it matter? Playing The Witcher 3 at 30 fps or less is a terrible injustice to gaming, anyway. If you're playing on console, you're already playing it in an inferior way.

Unless it's 30 fps with ultra settings and hairworks turned on.

#56  Edited By 04dcarraher
Member since 2004 • 23858 Posts
@commander said:
@04dcarraher said:

The use of the cpu for motion blur dates from the pre-unified-shader gpu era; it has no bearing on The Witcher 3. Around 2007, motion blur killed the performance of gpus that were not unified-shader based....

So no, they did not use the cpu for motion blur on the X1.

how do you explain the framerate hit on the ps4 then, and not on the xboxone?

The ps4 has a much better graphics card, so that doesn't make any sense.

PS4's cpu can't feed its stronger gpu with the data it needs. The Witcher 3 uses deferred multithreading on all platforms, which means that a single core is the funnel for the other cores'/threads' second-priority gpu workloads. The main core still handles the vast majority of the draw calls and has to feed the gpu all the data it needs. So they have to rely on a single core that is slower than a 2009-era AMD pc cpu, running at only 1.6 ghz.

So in simple terms, a 1.75 ghz core can feed a weaker gpu better than a 1.6 ghz core can feed a stronger gpu. The gpu has to wait on the cpu for the data, hence the fps drops. When you have cpu-intensive areas like towns, villages etc, more cpu resources have to be put there, taking away from the gpu. This is why the DX12 standard is needed: it allows all cores/threads to talk to the gpu directly, removing the single-core or deferred methods. Both the X1 and PS4 gain from using those standards in multiplats.

PS4's low-level coding can do the same thing, but it takes time and effort to code. This is why, like with BF4, the PS4 reigns supreme against the X1 with higher resolution and fps, since DICE coded for async on PS4.
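To make the "single core is the funnel" idea concrete, here is a rough sketch of the D3D11-era deferred pattern described above. It is a hypothetical illustration, not CDPR's engine code: worker threads only record commands, and every recorded list still has to pass through the one immediate context on a single core.

#include <d3d11.h>

// Worker threads record draw calls into deferred contexts; recording
// does no gpu work by itself.
void WorkerRecord(ID3D11Device* device, ID3D11CommandList** outList)
{
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... record this thread's draw calls into 'deferred' here ...

    deferred->FinishCommandList(FALSE, outList); // capture, don't execute
    deferred->Release();
}

// Only the main thread owns the immediate context, so only it can
// actually submit to the gpu. This is the single-core funnel: if this
// core is busy, the gpu starves no matter how strong it is.
void MainThreadSubmit(ID3D11DeviceContext* immediate,
                      ID3D11CommandList** lists, int count)
{
    for (int i = 0; i < count; ++i)
        immediate->ExecuteCommandList(lists[i], FALSE);
}

Under D3D12 or a console's low-level API, threads record command lists against much thinner driver overhead, which is the gain the post is pointing at.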

#57 commander
Member since 2010 • 16217 Posts

@04dcarraher said:
@commander said:
@04dcarraher said:

The use of the cpu for motion blur dates from the pre-unified-shader gpu era; it has no bearing on The Witcher 3. Around 2007, motion blur killed the performance of gpus that were not unified-shader based....

So no, they did not use the cpu for motion blur on the X1.

how do you explain the framerate hit on the ps4 then, and not on the xboxone?

The ps4 has a much better graphics card, so that doesn't make any sense.

PS4's cpu can't feed its stronger gpu with the data it needs. The Witcher 3 uses deferred multithreading on all platforms, which means that a single core is the funnel for the other cores'/threads' second-priority gpu workloads. The main core still handles the vast majority of the draw calls and has to feed the gpu all the data it needs. So they have to rely on a single core that is slower than a 2008-era pc cpu, at only 1.6 ghz. So in simple terms, a 1.75 ghz core can feed a weaker gpu better than a 1.6 ghz core can feed a stronger gpu. The gpu has to wait on the cpu for the data, hence the fps drops.

So it is 'also' cpu bound instead of just gpu bound?

#58  Edited By 04dcarraher
Member since 2004 • 23858 Posts
@commander said:

So it is 'also' cpu bound instead of just gpu bound?

So in simple terms, a 1.75 ghz core can feed a weaker gpu better than a 1.6 ghz core can feed a stronger gpu. The gpu has to wait on the cpu for the data, hence the fps drops. When you have cpu-intensive areas like towns, villages etc, more cpu resources have to be put there, taking away from the gpu. This is why the DX12 standard is needed: it allows all cores/threads to talk to the gpu directly, removing the single-core or deferred methods. Both the X1 and PS4 gain from using those standards in multiplats.

PS4's low-level coding can do the same thing, but it takes time and effort to code. This is why only a select few games use it, like BF4. The PS4 reigns supreme against the X1 with higher resolution and fps, since DICE coded for async on PS4, allowing PS4's cpu and gpu to shine.

#59 deactivated-5d68555a05c4b
Member since 2015 • 1024 Posts

Did sony piss in your wheaties or something?

#60 commander
Member since 2010 • 16217 Posts

@04dcarraher said:
@commander said:

So it is 'also' cpu bound instead of just gpu bound?

So in simple terms, a 1.75 ghz core can feed a weaker gpu better than a 1.6 ghz core can feed a stronger gpu. The gpu has to wait on the cpu for the data, hence the fps drops. When you have cpu-intensive areas like towns, villages etc, more cpu resources have to be put there, taking away from the gpu. This is why the DX12 standard is needed: it allows all cores/threads to talk to the gpu directly, removing the single-core or deferred methods. Both the X1 and PS4 gain from using those standards in multiplats.

PS4's low-level coding can do the same thing, but it takes time and effort to code. This is why only a select few games use it, like BF4. The PS4 reigns supreme against the X1 with higher resolution and fps, since DICE coded for async on PS4, allowing PS4's cpu and gpu to shine.

so it's also cpu bound, like i said. I should have mentioned 'also' (I've added the word 'also' to my first post), because a gpu is of course always involved with manipulating textures and shaders. But either way it's always going to eat up cpu power (until dx12, maybe).

But i don't agree about battlefield 4. BF4 was made when the xboxone didn't have the esram tools yet, and when they made bf4, microsoft still reserved 10 percent of gpu resources for the kinect.

#61  Edited By commander
Member since 2010 • 16217 Posts

@meathead373 said:

Did sony piss in your wheaties or something?

They killed my sega


#62 Chutebox
Member since 2007 • 51608 Posts

@commander said:
@meathead373 said:

Did sony piss in your wheaties or something?

They killed my sega

Saturn killed Sega

#63 Ant_17
Member since 2005 • 13634 Posts

@commander said:
@meathead373 said:

Did sony piss in your wheaties or something?

They killed my sega

It was a suicide case.

Witnesses, pictures and video showed Sega killing themselves.

#64 AM-Gamer
Member since 2012 • 8116 Posts

@04dcarraher: there's about a 5 fps difference or less. If you were talking about 900p 60fps vs 1080p sub-30, then that would be another story. Regardless, both consoles scored a 10, and they did a fairly good job considering it was a straight PC port.

#67 ReadingRainbow4
Member since 2012 • 18733 Posts

@commander said:
@meathead373 said:

Did sony piss in your wheaties or something?

They killed my sega


Sega killed Sega, and Sega's fanbase certainly didn't help with how poorly the dreamcast did.

#68 A-new-Guardian
Member since 2015 • 2458 Posts

It's funny how the PS4 CPU is actually weaker than the CELL on the PS3 lol.

Also I read that you can disable motion blur and effects on the PS4 version in the game's options to get the Xbone's performance. I'd like for developers to give us the option to run games even at 720p and get 60 fps. Basically three options, 1080p, 900p, 720p, and just let us choose. But i'm guessing it's too much work.

Even Bioshock 1 a game from 2007 had the option to lower the graphical detail for higher frame rate on consoles.

#69 RyviusARC
Member since 2011 • 5708 Posts

Must suck having to play The Witcher 3 on the weak, obsolete hardware inside the PS4 and Xbox One.

I am enjoying it with settings beyond the game's normal max, and while I usually play at 1440p (which is almost twice the resolution of the PS4), I can still get a better frame rate than the consoles if I choose to play at 4k res.

#70 RyviusARC
Member since 2011 • 5708 Posts

@a-new-guardian said:

It's funny how the PS4 CPU is actually weaker than the CELL on the PS3 lol.

Also I read that you can disable motion blur and effects on the PS4 version in the game's options to get the Xbone's performance. I'd like for developers to give us the option to run games even at 720p and get 60 fps. Basically three options, 1080p, 900p, 720p, and just let us choose. But i'm guessing it's too much work.

Even Bioshock 1 a game from 2007 had the option to lower the graphical detail for higher frame rate on consoles.

Actually the PS4 CPU is stronger than the CELL in the PS3.

#71  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@commander said:

so it's also cpu bound, like i said. I should have mentioned 'also' (I've added the word 'also' to my first post), because a gpu is of course always involved with manipulating textures and shaders. But either way it's always going to eat up cpu power (until dx12, maybe).

But i don't agree about battlefield 4. BF4 was made when the xboxone didn't have the esram tools yet, and when they made bf4, microsoft still reserved 10 percent of gpu resources for the kinect.

With BF4, the PS4 had a 10 fps higher average in MP while rendering 30% more pixels at higher settings; the immature esram and the 10% reserve wouldn't make up for that big of a lead, because of them using async on PS4.
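"Async" here is async compute: putting compute work on a separate gpu queue so it overlaps rendering. BF4 on PS4 used Sony's proprietary GNM API, so purely as a rough analogue under that assumption, this is how the same idea is expressed with D3D12's queue types (a sketch, not DICE's code):

#include <d3d12.h>

// Create a compute-only queue alongside the usual graphics (direct)
// queue. Work submitted to 'computeQueue' can run concurrently with
// rendering on 'graphicsQueue', which is the essence of async compute.
void CreateQueues(ID3D12Device* device,
                  ID3D12CommandQueue** graphicsQueue,
                  ID3D12CommandQueue** computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfx = {};
    gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    device->CreateCommandQueue(&gfx, IID_PPV_ARGS(graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC comp = {};
    comp.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
    device->CreateCommandQueue(&comp, IID_PPV_ARGS(computeQueue));
}

The payoff is filling gpu idle gaps (shadow passes, z-prepass) with compute work that would otherwise wait its turn in the single graphics queue.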

#72  Edited By commander
Member since 2010 • 16217 Posts

@04dcarraher said:
@commander said:

so it's also cpu bound, like i said. I should have mentioned 'also' (I've added the word 'also' to my first post), because a gpu is of course always involved with manipulating textures and shaders. But either way it's always going to eat up cpu power (until dx12, maybe).

But i don't agree about battlefield 4. BF4 was made when the xboxone didn't have the esram tools yet, and when they made bf4, microsoft still reserved 10 percent of gpu resources for the kinect.

With BF4, the PS4 had a 10 fps higher average in MP while rendering 30% more pixels at higher settings; the immature esram and the 10% reserve wouldn't make up for that big of a lead, because of them using async on PS4.

you're forgetting the esram performance. i'm not saying you're not right about the async, but a lot of games at the xboxone's launch were running at a lot lower settings because of this.

You saw differences between 720p and 1080p; now you don't see that anymore.

#73  Edited By RTUUMM
Member since 2008 • 4859 Posts

@FoxbatAlpha said:

10/10 thread. Give this guy a fucking metal.

you go ahead and give that metal, but he doesn't deserve a medal

#74 Frank_Castle
Member since 2015 • 1982 Posts

Few things are funnier than console gamers getting into pissing matches about graphics and performance.

#75 Ten_Pints
Member since 2014 • 4072 Posts

I think it's a case of the game being badly optimised for consoles.

The game plays better on weaker PCs, which makes absolutely no sense.

#76  Edited By NyaDC
Member since 2014 • 8006 Posts

@commander said:
@04dcarraher said:
@commander said:

Not in the past it wasn't (click on the picture if you can't read it)

link

It seems that they did the motion blur on the cpu with the witcher 3, which explains why the xboxone doesn't get a framerate hit and the ps4 does.

Why didn't they use gpgpu directcompute? Well, probably because they didn't have any gpu resources left, since they run the game at 1080p.

Don't forget that even if they had used gpgpu directcompute, the game would still run worse on the ps4, and it would run at the same resolution as the xboxone (since the x1 still has better framerates than the ps4 when motion blur is enabled on the x1 and disabled on the ps4).

The use of the cpu for motion blur dates from the pre-unified-shader gpu era; it has no bearing on The Witcher 3. Around 2007, motion blur killed the performance of gpus that were not unified-shader based....

So no, they did not use the cpu for motion blur on the X1.

how do you explain the framerate hit on the ps4 then, and not on the xboxone?

The ps4 has a much better graphics card, so that doesn't make any sense.

Because the CPU is bottlenecked on the PlayStation 4. When post is added, the game tries to allocate resources from the CPU to feed the GPU; since it can't do this, it pulls from existing resources already in use, and thus the frame rate decreases.
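A toy model of that bottleneck, with made-up numbers purely for illustration: a frame can't finish faster than the slower of the two processors, so extra cpu cost lowers fps even while the gpu still has headroom.

#include <algorithm>
#include <cstdio>

int main()
{
    // Hypothetical per-frame costs in milliseconds.
    double gpu_ms = 25.0;  // gpu render + post-processing
    double cpu_ms = 33.0;  // game logic + feeding the gpu (draw calls)

    // The frame is gated by whichever side finishes last.
    std::printf("fps: %.1f\n", 1000.0 / std::max(cpu_ms, gpu_ms)); // ~30, cpu-bound

    // Add cpu cost (e.g. setting up an extra post pass): fps drops
    // even though the gpu was never the limit.
    cpu_ms += 4.0;
    std::printf("fps: %.1f\n", 1000.0 / std::max(cpu_ms, gpu_ms)); // ~27
}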

#77 commander
Member since 2010 • 16217 Posts

@nyadc said:
@commander said:
@04dcarraher said:
@commander said:

Not in the past it wasn't (click on the picture if you can't read it)

link

It seems that they did the motion blur on the cpu with the witcher 3, which explains why the xboxone doesn't get a framerate hit and the ps4 does.

Why didn't they use gpgpu directcompute? Well, probably because they didn't have any gpu resources left, since they run the game at 1080p.

Don't forget that even if they had used gpgpu directcompute, the game would still run worse on the ps4, and it would run at the same resolution as the xboxone (since the x1 still has better framerates than the ps4 when motion blur is enabled on the x1 and disabled on the ps4).

The use of the cpu for motion blur dates from the pre-unified-shader gpu era; it has no bearing on The Witcher 3. Around 2007, motion blur killed the performance of gpus that were not unified-shader based....

So no, they did not use the cpu for motion blur on the X1.

how do you explain the framerate hit on the ps4 then, and not on the xboxone?

The ps4 has a much better graphics card, so that doesn't make any sense.

Because the CPU is bottlenecked on the PlayStation 4. When post is added, the game tries to allocate resources from the CPU to feed the GPU; since it can't do this, it pulls from existing resources already in use, and thus the frame rate decreases.

Yes, indeed, and the result is the same: the ps4 struggles due to a lack of cpu resources.

#78  Edited By Wickerman777
Member since 2013 • 2164 Posts

@Heil68 said:

X1 is even worse off, since PS4 is 50% more powerful as the world's most powerful video game console in the history of video games.

It just happens to be the current gen-8 console leader and the current NPD winner 2 months straight, covering E3 and after.

That's an exaggeration. Its GPU is around 40% more powerful, but it's a little behind the X1 when it comes to the CPU, despite them being physically the same thing. That's because the X1's CPU is clocked 10% faster and is using an extra core for games. The latter is because MS is simply better at coding than Sony is, able to do more with less because of it. That's why they're able to do bc via emulation and Sony can't, and it's also the same reason the X360 used less memory for its OS than the PS3 did last gen. Altogether, the PS4 is probably 25%-30% more powerful than the X1, certainly not 50%.
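For reference, those percentages can be sanity-checked against the publicly reported specs (1152 shaders at 800 MHz on PS4 vs 768 shaders at 853 MHz on X1, and 1.6 vs 1.75 GHz Jaguar cores); a quick back-of-the-envelope check:

#include <cstdio>

int main()
{
    // Peak gpu throughput: shader count x 2 ops/clock x clock.
    double ps4_tflops = 1152 * 2 * 0.800e9 / 1e12; // ~1.84 TFLOPS
    double x1_tflops  =  768 * 2 * 0.853e9 / 1e12; // ~1.31 TFLOPS
    std::printf("gpu advantage (PS4): %.0f%%\n",
                (ps4_tflops / x1_tflops - 1.0) * 100.0); // ~41%

    // cpu: same 8-core Jaguar part, different clocks.
    std::printf("cpu clock advantage (X1): %.0f%%\n",
                (1.75 / 1.6 - 1.0) * 100.0); // ~9%
}

Peak TFLOPS is of course only a rough proxy for real performance, but it is where the commonly quoted 40-50% figures come from.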

#79 NyaDC
Member since 2014 • 8006 Posts

@Wickerman777 said:
@Heil68 said:

X1 is even worse off, since PS4 is 50% more powerful as the world's most powerful video game console in the history of video games.

It just happens to be the current gen-8 console leader and the current NPD winner 2 months straight, covering E3 and after.

That's an exaggeration. Its GPU is around 40% more powerful, but it's a little behind the X1 when it comes to the CPU, despite them being physically the same thing. That's because the X1's CPU is clocked 10% faster and is using an extra core for games. The latter is because MS is simply better at coding than Sony is, the same reason they're able to do bc via emulation and Sony can't, and also the same reason the X360 used less memory for its OS than the PS3 did last gen. Altogether, the PS4 is probably 25%-30% more powerful than the X1, certainly not 50%.

This is correct.

#80  Edited By GrenadeLauncher
Member since 2004 • 6843 Posts

Lemmings still mad because their extra half a core still can't get them better frame rates in most games.

Big surprise that @commander is a Segadrone. You'd think that 15 years on they'd have come to terms with the fact that Sega killed themselves.

#81 legendofsense
Member since 2013 • 320 Posts

Commander, can you please make the NPD threads from now on? I think you'd be a great replacement for tiger.

#82  Edited By Ten_Pints
Member since 2014 • 4072 Posts

Like I said, it's badly optimised for consoles.

You have to design the game around the hardware.

#83 ermacness
Member since 2005 • 10956 Posts

@Heil68:

You just had to take the bait, huh?

#84 ronvalencia
Member since 2008 • 29612 Posts

@a-new-guardian said:

It's funny how the PS4 CPU is actually weaker than the CELL on the PS3 lol.

Also I read that you can disable motion blur and effects on the PS4 version in the game's options to get the Xbone's performance. I'd like for developers to give us the option to run games even at 720p and get 60 fps. Basically three options, 1080p, 900p, 720p, and just let us choose. But i'm guessing it's too much work.

Even Bioshock 1 a game from 2007 had the option to lower the graphical detail for higher frame rate on consoles.

PS4's CPU has far less patching of the GPU's feature set to do.

From forum.beyond3d.com/showthread.php?t=57736&page=5

PS3's SPEs "patching" the aging GPU's feature set:

------------------------

"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"

1) Two ppu/vmx units

There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.

2) Vertex culling

You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.

3) Vertex texture sampling

You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.

4) Shader patching

Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.

5) Branching

You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.

6) Shader inputs

You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.

7) MSAA alternatives

Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.

8) Post processing

360 is a unified architecture, meaning post-process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post processing to spu as possible.

9) Load balancing

360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.

10) Half floats

You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.

11) Shader array indexing

You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.

PS3's SPEs are being used for GPU work.
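To make item 2 concrete: "vertex culling" on spu meant testing triangles on the cpu side and only sending the survivors to the gpu. A generic, simplified sketch of the idea follows; real PS3 code would be hand-written SPU vector intrinsics over DMA'd batches, not this.

#include <vector>

struct Vec3 { float x, y, z; };
struct Triangle { Vec3 a, b, c; };

// Signed-area test in screen space: back-facing triangles can be
// dropped on the cpu so the gpu never has to process their vertices.
static bool IsFrontFacing(const Triangle& t)
{
    float area = (t.b.x - t.a.x) * (t.c.y - t.a.y)
               - (t.c.x - t.a.x) * (t.b.y - t.a.y);
    return area > 0.0f;
}

// Filter the triangle list before submission, trading cpu/spu time
// for a lighter vertex load on the weaker gpu.
std::vector<Triangle> CullBackfaces(const std::vector<Triangle>& in)
{
    std::vector<Triangle> out;
    out.reserve(in.size());
    for (const Triangle& t : in)
        if (IsFrontFacing(t))
            out.push_back(t);
    return out;
}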

#85  Edited By ellos
Member since 2015 • 2532 Posts

It's a very tall order to argue that the ps4 is not the most powerful console overall. Even in mainstream terms, let's face it, a lot more games run at higher fps while also being at higher resolution on ps4. We have seen bigger fps advantages for ps4 than this. Overall, games tend to be gpu bound, and ps4 has the bigger gap there. I also don't believe it's a matter of can't for Sony to do the same and allocate 80% of an extra core. It's probably a matter of: don't care, doesn't really matter right now, and don't want to spend the money.

It is rather ironic, though, that blur and motion blur, when turned off, improve ps4's performance, albeit only in that area. As stated, it doesn't seem to affect the image quality much on ps4. Why not have it off in that area as a default? Why not play on the fact that it's a higher-res image anyway? Especially when it does nothing in towns and other areas.

#86 hrt_rulz01
Member since 2006 • 22688 Posts

@GrenadeLauncher said:

Can we patch out your propensity for making shit threads?

Lol... can't help but giggle.

#87 Running-Target
Member since 2014 • 413 Posts

lol

#88 Heil68
Member since 2004 • 60833 Posts

Since PS4 is the world's most powerful video game console, do you think Xbox will drop Kinect altogether and focus on actual hardware???

#89 santoron
Member since 2006 • 8584 Posts

@commander said:

...Not in the past it wasn't ...

Hilarious. You're trying to fanboy with concepts of computing a decade out of date.

#90  Edited By commander
Member since 2010 • 16217 Posts

@santoron said:
@commander said:

...Not in the past it wasn't ...

Hilarious. You're trying to fanboy with concepts of computing a decade out of date.

That article was from 2010...

and that post wasn't that important; the argument still stands: not enough cpu resources on the ps4....

Try again

#91 Heil68
Member since 2004 • 60833 Posts

@Heil68 said:

Since PS4 is the world's most powerful video game console, do you think Xbox will drop Kinect altogether and focus on actual hardware???

I haven't found any research yet, but I bet that's what MS does. They'll drop kinect to try and keep pace with Sony, makers of the world's most powerful video game console.

#92 Mr_Huggles_dog
Member since 2014 • 7805 Posts

More PS4 hate from the peanut gallery.

Scared.....all scared.

#93 Darkhorse-Gamer
Member since 2012 • 277 Posts

@commander said:
@meathead373 said:

Did sony piss in your wheaties or something?

They killed my sega


♪♫♪♫ SEGA ♪♫♪♫

#94 l34052
Member since 2005 • 3906 Posts

@Shewgenja said:

Sometimes, I forget the XBone has Blast Processing due to that 0.10 clock difference.

Blast processing, wow that takes ages!!