Are PS4 Pro level graphics good enough for this gen?


#151 CrashNBurn281
Member since 2014 • 1574 Posts

The only issue with the Pro I am worried about is the lack of a substantial CPU upgrade. It still seems to me that it will cause bottleneck issues.

I am sure that the system will create fantastic games, but it may hold back advancements.

It will be interesting to see what kind of CPU is in the Scorpio. Of course then it will not matter for games also designed for the standard Xbone.


#152  Edited By ronvalencia
Member since 2008 • 29612 Posts

@waahahah said:
@ronvalencia said:

Mobile gaming brought back the importance of half floats. What is described by Cerny here happened with the custom Apple GPU (derived from PowerVR mobile GPUs) where they, too, went the "half floats can share a float register" method, and it was a major component of why iPhones were so much more performant in 3D applications than comparable phones.

Here's an article http://www.realworldtech.com/apple-custom-gpu/

On non-mobile GPU parts, Vega 10, PS4 Pro and GP100 (the only real new Pascal SM design) support double rate FP16 shaders.

The method of packing multiple FP16 data elements into a register is a typical SIMD method. It is similar to the 8-data-element FP16 extensions for the x86 SSE 128-bit registers.

You're an idiot. I understand how it works. You're still not getting my point about terminology and how 12 TFLOPS isn't a useful metric and is misleading. FP16 will increase efficiency *where possible* and increase FLOPS; it will never double FLOPS.

Fack you. There's a difference between hardware capability and usage. You want a flame war, I'll give it to you. I'll take you on.

VooFoo Dev Was Initially Doubtful About PS4 PRO GPU’s Capabilities But Was Pleasantly Surprised Later

“I was actually very pleasantly surprised. Not initially – the specs on paper don’t sound great, as you are trying to fill four times as many pixels on screen with a GPU that is only just over twice as powerful, and without a particularly big increase in memory bandwidth,” he explained, echoing the sentiment that a lot of us seem to have, before adding, “But when you drill down into the detail, the PS4 Pro GPU has a lot of new features packed into it too, which means you can do far more per cycle than you can with the original GPU (twice as much in fact, in some cases). You’ve still got to work very hard to utilise the extra potential power, but we were very keen to make this happen in Mantis Burn Racing.

“In Mantis Burn Racing, much of the graphical complexity is in the pixel detail, which means most of our cycles are spent doing pixel shader work. Much of that is work that can be done at 16-bit rather than 32-bit precision, without any perceivable difference in the end result – and PS4 Pro can do 16-bit floating point operations twice as fast as the 32-bit equivalent.”

The Mantis Burn Racing PS4 Pro version reached 4K/60 fps, which is 4X the pixel throughput of the original PS4 version's 1080p/60 fps (3840×2160 = 4 × 1920×1080).
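To make the "twice as fast" mechanism concrete, here is a minimal CUDA sketch of packed FP16 (a hypothetical standalone demo, not code from any game quoted here). Two 16-bit floats share one 32-bit register (`__half2`) and a single instruction operates on both; that is what "double rate FP16" means at the hardware level. It assumes a GPU with native half arithmetic (compute capability 5.3+).

```
#include <cuda_fp16.h>
#include <cstdio>

// Two FP16 values packed into one 32-bit register (__half2); __hmul2 then
// multiplies both pairs in a single SIMD instruction. Same register file,
// twice the operations per cycle -- the "double rate FP16" idea.
__global__ void packedMul(float2 a, float2 b, float2 *out)
{
    __half2 ha = __floats2half2_rn(a.x, a.y);  // pack (a.x, a.y) as two halves
    __half2 hb = __floats2half2_rn(b.x, b.y);  // pack (b.x, b.y)
    __half2 hc = __hmul2(ha, hb);              // one instruction, two products
    *out = __half22float2(hc);                 // unpack to FP32 for printing
}

int main()
{
    float2 h, *d;
    cudaMalloc(&d, sizeof(float2));
    packedMul<<<1, 1>>>(make_float2(1.5f, 2.0f), make_float2(4.0f, 0.5f), d);
    cudaMemcpy(&h, d, sizeof(float2), cudaMemcpyDeviceToHost);
    printf("%.1f %.1f\n", h.x, h.y);           // expect: 6.0 1.0
    cudaFree(d);
    return 0;
}
```

The catch, as the rest of this thread argues, is that the doubling only applies to work that actually tolerates 16-bit precision.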

FP16 optimisation is similar to the GeForce FX era's "The Way It's Meant to be Played" optimisation paths.

On the original PS4, developers were already using FP16 for raster hardware, e.g. Killzone Shadow Fall.

Sebbbi, a developer on Beyond3D, had this to say about FP16 (originally posted two years ago):

Sometimes it requires more work to get lower precision calculations to work (with zero image quality degradation), but so far I haven't encountered big problems in fitting my pixel shader code to FP16 (including lighting code). Console developers have a lot of FP16 pixel shader experience because of PS3. Basically all PS3 pixel shader code was running on FP16.

It is still very important to pack the data in memory as tightly as possible, as there is never enough bandwidth to lose. For example, 16-bit (model space) vertex coordinates are still commonly used, the material textures are still DXT compressed (barely 8-bit quality) and the new HDR texture formats (BC6H) commonly used in cube maps have significantly less precision than a 16-bit float. All of these can be processed by 16-bit ALUs in the pixel shader with no major issues. The end result will still eventually be stored to an 8-bit-per-channel back buffer and displayed.

Could you give us some examples of operations done in pixel shaders that require higher than 16 bit float processing?

EDIT: One example where 16 bit float processing is not enough: Exponential variance shadow mapping (EVSM) needs both 32 bit storage (32 bit float textures + 32 bit float filtering) and 32 bit float ALU processing.

However EVSM is not yet universally possible on mobile platforms right now, as there's no standard support for 32 bit float filtering in mobile devices (OpenGL ES 3.0 just recently added support for 16 bit float filtering, 32 bit float filtering is not yet present). Obviously GPU manufacturers can have OpenGL ES extensions to add FP32 filtering support if their GPU supports it (as most GPUs should as this has been a required feature in DirectX since 10.0).

— sebbbi, Beyond3D, post #33, Oct 18, 2014 (last edited by a moderator: Oct 18, 2014)
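Sebbbi's EVSM caveat is easy to demonstrate numerically. FP16's largest finite value is 65504, and EVSM's exponential warp exp(c × depth) blows past that over most of the depth range for typical exponent constants. A minimal host-side sketch (the constant c = 40 is an assumed typical value, not taken from the quote):

```
#include <cmath>
#include <cstdio>

// Why EVSM needs FP32: the warp exp(c * depth) overflows half precision
// (max finite value 65504) over most of the depth range for common c.
int main()
{
    const float FP16_MAX = 65504.0f;  // largest finite FP16 value
    const float c = 40.0f;            // assumed typical EVSM exponent constant
    for (float depth = 0.0f; depth <= 1.0f; depth += 0.25f) {
        float warped = std::exp(c * depth);
        printf("depth %.2f -> %g %s\n", depth, warped,
               warped > FP16_MAX ? "(overflows FP16)" : "(fits in FP16)");
    }
    return 0;
}
```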

MS, AMD, Sony and Intel think it will work for graphics.

Try again, clown. The only idiot is you.


#153 HalcyonScarlet
Member since 2011 • 13838 Posts

The PS4 Pro meets the standard for professional gamers. They'll tell you how important 30fps is.

So no one needs to worry.


#154  Edited By Dark_man123
Member since 2005 • 4012 Posts

Nothing is good enough this gen; everyone wants photorealistic graphics.


#155 djoffer
Member since 2007 • 1856 Posts

These days I would prefer the devs spent more time actually making good games instead of counting pixels. The graphics we currently have are more than good enough for me. The most fun I had this year was with Tyranny, Hard West, XCOM 2, Overwatch and the TW3 expansion, and only TW3 had cutting-edge graphics.


#156 Phreek300
Member since 2007 • 672 Posts

I think we have reached a point where graphics are great. As I said in the Switch thread (and was told I was wrong about), graphics do not make the whole experience. I thought the graphics were fine at the start of this gen. Were they making my eyes melt with their fidelity? No. But they were pretty great nonetheless. I feel that we rush past tech and don't let it mature with software. We let it mature with the PS3 and Xbox 360; those machines did more than they had any right to do with their hardware.

With that being said, I am not a fan of the refreshes, PS4 Pro and Scorpio. I realize why they are needed: 4K TVs are a thing. I just don't like it. EG101 was right that they should have just beefed up the specs a bit, let this gen go 5-6 years, and done the same again at the end of that time frame.


#157 lrdfancypants
Member since 2014 • 3850 Posts

@SolidGame_basic:

Does Titanfall 2 look substantially better than 1? Part 1 looked a generation behind.


#158  Edited By waahahah
Member since 2014 • 2462 Posts

@ronvalencia said:
@waahahah said:
@ronvalencia said:

Mobile gaming brought back the importance of half floats. What is described by Cerny here happened with the custom Apple GPU (derived from PowerVR mobile GPUs) where they, too, went the "half floats can share a float register" method, and it was a major component of why iPhones were so much more performant in 3D applications than comparable phones.

Here's an article http://www.realworldtech.com/apple-custom-gpu/

On non-mobile GPU parts, Vega 10, PS4 Pro and GP100 (the only real new Pascal SM design) support double rate FP16 shaders.

The method of packing multiple FP16 data elements into a register is a typical SIMD method. It is similar to the 8-data-element FP16 extensions for the x86 SSE 128-bit registers.

You're an idiot. I understand how it works. You're still not getting my point about terminology and how 12 TFLOPS isn't a useful metric and is misleading. FP16 will increase efficiency *where possible* and increase FLOPS; it will never double FLOPS.

Fack you. There's a difference between hardware capability and usage. You want a flame war, I'll give it to you. I'll take you on.

VooFoo Dev Was Initially Doubtful About PS4 PRO GPU’s Capabilities But Was Pleasantly Surprised Later

You haven't said anything new. Again, you haven't said why calling it a 12 TFLOPS card isn't misleading. FLOPS are generally measured in single precision. FP16 is still a feature to increase efficiency where possible.


#159 PutASpongeOn
Member since 2014 • 4897 Posts

You can't build a gaming PC that is as good for gaming as the PS4 and PS4 Pro for what they cost. You can maybe (probably not) get hardware that is as powerful, but it wouldn't be only for gaming (the PS4 hardware is built around games and is thus more efficient).

So no, it's not too weak, especially considering consoles, and most of all the PS4, have the market share. You shouldn't really compare $1000 PCs that can dominate PS4s to $250-300 consoles.


#160 PutASpongeOn
Member since 2014 • 4897 Posts

@kvally said:

@Pray_to_me: nobody except PS fans gives a shit about the PS Pro, so what is your point?

Isn't that kind of a stupid statement? No one but the consumer who buys that product cares.

You can literally say that about anything and the same is true about pc gamers, nintendo gamers, and xbox fans. It's just a redundant and irrelevant statement.


#161 blackace
Member since 2002 • 23576 Posts

@SolidGame_basic: They are good enough for me at this point. Not really a graphic whore like some people are. It's more about the gameplay and fun for me. I'd rather see games running at 60fps with perfect controls.


#162 blackace
Member since 2002 • 23576 Posts

@Pray_to_me said:

Blah blah blah. PS4 Pro is the most powerful console in the world and Lemmings will just have to deal with it. As for "teh Scorpio" literally nobody gives a **** about it except for Lemmings. Devs aren't going to make games exclusively for it and Microsoft has no first party. So all it will get will be PS4 ports for 3X the price. It's just gonna get dominated like how the original oh so powerful Xbox got dominated by PS2. Face it Lemmings, the Xbox brand is a joke and so are you.

Joke of the day confirmed. lol!! The low-power Wii dominated the PS3 last gen. Doh!?!? Forgot about that, right? Scorpio will get ALL Win 10 PC ports. No games my @ss. Developers are already working on them now. Games like Neverwinter Online, Gigantic, Smite, Divinity: Original Sin, Diablo 3, World of Tanks, Minecraft, Warframe, DC Online, etc. coming to XB1 were all warmup for future Win 10 games coming to Scorpio. I think the only joke is you. And before you call me a lemming, I own all the systems; even the handhelds. lol!!


#163 Pray_to_me
Member since 2011 • 4041 Posts

@blackace:

Lol you been a Lemming bum. "Weak ass Wii dominated PS3" Yup and it dominated 360 too, which actually supports the argument that teh "Scorpio's power advantage" won't mean shyt lol!

Now get back in your corner bum. Your arms too short to box with God.


#164 BlackShirt20
Member since 2005 • 2631 Posts

@SolidGame_basic: To be honest, only time will tell. If the Scorpio delivers on its promises, it just might. I'm mighty disappointed in Sony over the PS4 Pro. They short-changed us on a lot of features that should have been implemented, not to mention skimping on GPU power knowing damn well 4 TF wasn't going to cut it.

Then there's the excuse of "with HDR you won't tell the difference between upscaled 4K and true 4K". Reminds me of Microsoft trying to sell everyone on "the cloud".


#165 stereointegrity
Member since 2007 • 12151 Posts

@BlackShirt20 said:

@SolidGame_basic: To be honest, only time will tell. If the Scorpio delivers on its promises, it just might. I'm mighty disappointed in Sony over the PS4 Pro. They short-changed us on a lot of features that should have been implemented, not to mention skimping on GPU power knowing damn well 4 TF wasn't going to cut it.

Then there's the excuse of "with HDR you won't tell the difference between upscaled 4K and true 4K". Reminds me of Microsoft trying to sell everyone on "the cloud".

When did they say that with HDR you can't tell the difference?

And as long as MS doesn't skimp on the CPU, the Scorpio will be a beast.


#166 kvally
Member since 2014 • 8445 Posts

@putaspongeon: agreed. So what is your point?


#167 Commiesdie
Member since 2006 • 372 Posts

Bump


#168  Edited By ronvalencia
Member since 2008 • 29612 Posts

@waahahah said:
@ronvalencia said:
@waahahah said:
@ronvalencia said:

Mobile gaming brought back the importance of half floats. What is described by Cerny here happened with the custom Apple GPU (derived from PowerVR mobile GPUs) where they, too, went the "half floats can share a float register" method, and it was a major component of why iPhones were so much more performant in 3D applications than comparable phones.

Here's an article http://www.realworldtech.com/apple-custom-gpu/

On non-mobile GPU parts, Vega 10, PS4 Pro and GP100 (the only real new Pascal SM design) support double rate FP16 shaders.

The method of packing multiple FP16 data elements into a register is a typical SIMD method. It is similar to the 8-data-element FP16 extensions for the x86 SSE 128-bit registers.

You're an idiot. I understand how it works. You're still not getting my point about terminology and how 12 TFLOPS isn't a useful metric and is misleading. FP16 will increase efficiency *where possible* and increase FLOPS; it will never double FLOPS.

Fack you. There's a difference between hardware capability and usage. You want a flame war, I'll give it to you. I'll take you on.

VooFoo Dev Was Initially Doubtful About PS4 PRO GPU’s Capabilities But Was Pleasantly Surprised Later

You haven't said anything new. Again, you haven't said why calling it a 12 TFLOPS card isn't misleading. FLOPS are generally measured in single precision. FP16 is still a feature to increase efficiency where possible.

You don't know shit. There's a reason the PC GPU vendors are restoring native FP16 support in addition to the double-rate feature. DX10's pure FP32 was the wrong direction.

I divide FLOPS into separate FP16 and FP32 categories. I have posted three developers supporting the FP16 argument. It's the restoration of the GeForce FX/6/7/RSX FP16 shader optimisation path.

PS4 Pro's GPU is the first Radeon-type GPU to support native FP16 shaders with the double-rate feature, as the GeForce FX did; ATI's first DX9 part, the Radeon 9700, only supported native FP24.

VooFoo Dev Was Initially Doubtful About PS4 PRO GPU’s Capabilities But Was Pleasantly Surprised Later

“I was actually very pleasantly surprised. Not initially – the specs on paper don’t sound great, as you are trying to fill four times as many pixels on screen with a GPU that is only just over twice as powerful, and without a particularly big increase in memory bandwidth,” he explained, echoing the sentiment that a lot of us seem to have, before adding, “But when you drill down into the detail, the PS4 Pro GPU has a lot of new features packed into it too, which means you can do far more per cycle than you can with the original GPU (twice as much in fact, in some cases). You’ve still got to work very hard to utilise the extra potential power, but we were very keen to make this happen in Mantis Burn Racing.

“In Mantis Burn Racing, much of the graphical complexity is in the pixel detail, which means most of our cycles are spent doing pixel shader work. Much of that is work that can be done at 16-bit rather than 32-bit precision, without any perceivable difference in the end result – and PS4 Pro can do 16-bit floating point operations twice as fast as the 32-bit equivalent.”

The VooFoo dev had some concerns with PS4 Pro's GPU, but there are new features for 4K.

Note why MS claimed "highest quality pixels", i.e. FP32 shader handling of 4K like the current R9-390X OC.

Phil Spencer has stated that PS4 Pro-level hardware would have been their year-2016 Xbox solution.

Vega 10 has 12.5 TFLOPS FP32 (1.54 GHz, 64 CUs), and a half version would be 6.25 TFLOPS FP32.


#169  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@ronvalencia: Good grief, you're not getting the point, are you? Stating that a GPU goes from 6 TFLOPS at 32-bit to 12 TFLOPS at 16-bit is not a realistic measure of the GPU's output in use, which is the point waah was making. You keep rambling on about it while ignoring the incorrect statement you're posting about the GPU's realistic output.


#170  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:

@ronvalencia: Good grief, you're not getting the point, are you? Stating that a GPU goes from 6 TFLOPS at 32-bit to 12 TFLOPS at 16-bit is not a realistic measure of the GPU's output in use, which is the point waah was making. You keep rambling on about it while ignoring the incorrect statement you're posting about the GPU's realistic output.

No, both you and waah are wrong. I specifically split FLOPS into FP16 and FP32 hardware capability.

The primary Xbox generation-jump example is Xbox 360 to Xbox One. Using the original Xbox to Xbox 360 as the generation-jump example is flawed, since the original Xbox's pixel shaders are 16-bit integer with floating point vertex shaders. Xbox 360's GPU supports FP10, FP16 and FP32, but it has texture filtering problems with FP16 buffers, hence FP10 is used, e.g. the double-pass FP10 render for HDR in Halo 3. Have you forgotten this workaround?

I have three game developers (Guerrilla Games for Killzone Shadow Fall (raster hardware), RedLynx (Sebbbi), VooFoo) arguing the FP16 case. Who are you again?

Notice how FP16 can be faster than FP32, even though the G70 has all of the transistors and computational power necessary for FP32 precision. 32-bit values still eat up additional register space, cache room, and (if it comes to it) memory bandwidth.

Shader Model 6 restores native support for 16 bit FP shaders.

For RGBA16F, a TMU workaround overcomes being ROPS (raster hardware) bound. The native FP16 shader feature matches raster RGBA16F!

That's four developers arguing for FP16.

DirectX 12's Shader Model 6 reverses DirectX 10 Shader Model 4's pure 32-bit FP shader bias. NVIDIA fanboys can't handle it when MS and AMD fack up NVIDIA's current GPUs. Shader Model 6 is another MS/AMD joint gimping of NVIDIA hardware, like DX12 async compute. This is the reason I'll wait for NVIDIA Volta and AMD Vega 10 (with double-rate FP16 mode). AMD GCN 1.2 to 1.3 has native FP16 shader support, i.e. AMD knew about MS's SM6 plans.


#171  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@ronvalencia: You're not understanding the point, you fool, so before you start calling people idiots you need to re-read what he or I posted... We aren't denying that FP16 usage can free up GPU resources for certain tasks that don't need FP32 precision. What I'm saying is that you cannot take a GPU and double its FP16 FLOPS on paper, because you can't allocate the GPU to pure FP16; there is quite a bit that needs 32-bit. So it's not a real performance number the GPU will ever see. Taking a GPU's FP32 FLOPS number and doubling it for FP16 is not correct when games will be using mixed precision.


#172  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:

@ronvalencia: You're not understanding the point, you fool, so before you start calling people idiots you need to re-read what he or I posted... We aren't denying that FP16 usage can free up GPU resources for certain tasks that don't need FP32 precision. What I'm saying is that you cannot take a GPU and double its FP16 FLOPS on paper, because you can't allocate the GPU to pure FP16; there is quite a bit that needs 32-bit. So it's not a real performance number the GPU will ever see. Taking a GPU's FP32 FLOPS number and doubling it for FP16 is not correct when games will be using mixed precision.

Irrelevant bullshit. I already divided FLOPS compute power between FP16 and FP32.

The trigger post:

@mjebb said:

It is not.

To start a new generation of consoles, it needs to be at least 10 times more powerful, as per PS3 to PS4

PS3's FLOPS power

RSX: ~400 GFLOPS = (24 pixel pipes × 27 FLOPs × 550 MHz) + (8 vertex pipes × 10 FLOPs × 550 MHz) = 356.4 + 44 GFLOPS. But the 27 FLOPs per pixel pipe are not presented as a flat structure as in generalised unified-shader GPUs.

Pixel Shader pipeline for NVIDIA G7X/RSX

The bottleneck is the input fragment data, i.e. FP32 Shader Unit 2 is dependent on FP32 Shader Unit 1. RSX's pixel shader FP units are NOT generalised like AMD's GCN CU stream processors, i.e. not an apples-to-apples comparison.

RSX's pixel shader 356 GFLOPS can be reduced to 178 GFLOPS when generalised, i.e. when FPU unit 1 and unit 2 are treated as a single FPU unit.

Overall, RSX is like a 222 GFLOPS GPU: 44 GFLOPS of vertex shaders + 178 GFLOPS of pixel shaders.

Xbox 360's GPU with 240 GFLOPS are generalised and unified.

Vertex Shader pipeline for NVIDIA G7X/RSX

The bottleneck is at input vertex data. These FP units are NOT generalised as per AMD's GCN CU stream processors.

CELL: 204 GFLOPS from the PPE + 7 SPEs. About 153.6 GFLOPS from 6 SPEs can be used as FP32 shaders.

PS4's GPU has 1840 GFLOPS FP32 shaders without factoring scalar processors for each CU.

PS4's GPU shader FLOPS, counted like PS3's RSX:

1152 (stream processors) × 0.800 (frequency in GHz) × 2 = 1843.2 GFLOPS

18 (scalar processors) × 0.800 (frequency in GHz) × 2 = 28.8 GFLOPS

Total GFLOPS, counted like RSX: 1872 GFLOPS.

The raw shader FLOPS jump from PS3's RSX + 6 SPEs at ~554 GFLOPS FP32 to PS4's GPU at 1872 GFLOPS is about 3.38X.

When RSX's pixel shaders are treated in generalised form, the FLOPS jump from PS3's RSX + 6 SPEs at 375.6 GFLOPS FP32 to PS4's GPU at 1872 GFLOPS FP32 is about 4.98X.

PS3's raw GFLOPS did not deliver the same GFLOPS worth of work as GeForce 8800 GTX's 345.6 GFLOPS.
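For anyone checking the bookkeeping, here is a small host-side sketch of the peak-FLOPS arithmetic used above (ALU lanes × clock in GHz × 2 ops per multiply-add per cycle), with the counts from this post:

```
#include <cstdio>

// Peak FLOPS = lanes x clock (GHz) x 2 (one fused multiply-add per cycle).
static double gflops(double lanes, double ghz) { return lanes * ghz * 2.0; }

int main()
{
    double ps4 = gflops(1152, 0.800) + gflops(18, 0.800); // 1843.2 + 28.8 = 1872
    double ps3Raw = 400.0 + 153.6;   // RSX raw + 6 SPEs (~554 GFLOPS)
    double ps3Gen = 222.0 + 153.6;   // RSX generalised + 6 SPEs (375.6 GFLOPS)
    printf("PS4 total:               %.1f GFLOPS\n", ps4);
    printf("Jump vs raw PS3:         %.2fX\n", ps4 / ps3Raw); // ~3.38X
    printf("Jump vs generalised PS3: %.2fX\n", ps4 / ps3Gen); // ~4.98X
    return 0;
}
```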

In terms of usage, RSX's optimised pixel shaders are mostly in FP16.

From https://forum.beyond3d.com/posts/1460125/

------------------------

"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"

1) Two ppu/vmx units

There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.

2) Vertex culling

You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.

3) Vertex texture sampling

You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.

4) Shader patching

Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.

5) Branching

You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.

6) Shader inputs

You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.

7) MSAA alternatives

Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.

8) Post processing

360 is unified architecture meaning post process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post process to spu as possible.

9) Load balancing

360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.

10) Half floats

You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.

11) Shader array indexing

You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.

Etc, etc, etc...

Xbox 360's GPU has problems with texture filtering (raster hardware) on FP16 framebuffers, hence the infamous double-pass FP10 for HDR in Halo 3.

Mjebb wanted a 10X game console generation jump and I answered in the same manner, i.e. with Scorpio's GPU used like NVIDIA's RSX (mostly FP16), which is the apples-to-apples usage comparison.

Both you and waah didn't normalise for the old console generation's FP16 usage factor.


#173 jg4xchamp
Member since 2006 • 64057 Posts

I tend to go back and play a lot of old games, so regular ass PS4 graphics were more than fine.

I'd like more creative risks and experiences I've never had before out of this gen.


#174  Edited By BlackShirt20
Member since 2005 • 2631 Posts

Again, time will tell. Not a single game within this gen's cycle can run natively at 4K on the PS4. Sony dropped the ball on a lot of features that needed to be included, especially when you own Blu-ray and decide not to include a UHD drive while claiming everything is digital. Yeah Sony. Sure. Plus, streaming movies in 4K or even 1080p don't look as good as a movie from a physical disc.

As for Microsoft: if they don't include a UHD drive, I'm out. Unless this new system can run native 4K games at 60fps, I'm out. If they try to sell it for more than $499, I'm out.


#175 ConanTheStoner
Member since 2011 • 23838 Posts

I'd be fine with vanilla PS4 graphics, or even WiiU graphics for that matter.

My only beef with console gaming on the technical end is performance. And this is something that won't be fixed with better hardware. A lot of console games perform like ass because developers are constantly trying to push these machines beyond their limits in an attempt to appease gamers with pretty pictures. Priorities are fucked.

Hoping stuff like Nioh with its multiple graphics/performance options catches on.


#176  Edited By KungfuKitten
Member since 2006 • 27389 Posts

Sorry my post turned more into a PC vs console thing. That's because PC is my standard for gaming.

Hmm. You're right when it comes to 1080p gaming. The Pro and Scorpio will have nearly as good performance and visuals as a mid, maybe mid-high-end PC. 1080p is kinda hitting its limits now.

But I've seen 1440p ultrawide and it kinda bewitched me. I guess it's subjective, and I didn't spend enough time to 'get used to it', but I think it actually changes the way games play a little. It's not just better looking. I'm always hesitant with upgrades that just make a game look better because I get used to it pretty quickly. The additional coverage of the game makes it feel more immersive, more real, and some games look so deliciously smooth (Doom specifically when they had it on display) it makes you want to reach through the panel to feel it running through your fingers. One con is that I had trouble focusing on one place of the screen like with 16:9 and not move my head around all the time. I wonder if that's something I'd get used to or if that kind of destroys ultrawide for me. Anyway when I look at the way consoles are progressing since the 360 it doesn't sound like they'll get to that level of fidelity or immersion this generation and maybe even the next. I feel we've been stuck at 1080p for a while now and we are about to take the next meaningful step en masse. Be it VR, be it 2k at higher fps.

At this moment 1440p, even at 16:9, is very expensive to run smoothly. It's not even slightly fair to compare that to consoles. I'm very excited about this year's CES, which starts tomorrow (3-8 January), because we'll see new screens and TVs that will come out this year. And I heard some very cool things about what they'll show; not price-wise but tech-wise. Maybe with some more competition, 1440p high-fps gaming will become affordable this year. I really hope so.


#177  Edited By BassMan
Member since 2002 • 18741 Posts
@ConanTheStoner said:

I'd be fine with vanilla PS4 graphics, or even WiiU graphics for that matter.

My only beef with console gaming on the technical end is performance. And this is something that won't be fixed with better hardware. A lot of console games perform like ass because developers are constantly trying to push these machines beyond their limits in an attempt to appease gamers with pretty pictures. Priorities are fucked.

Hoping stuff like Nioh with its multiple graphics/performance options catches on.

Priorities are fucked indeed. These consoles are trying to wow people with graphics that they can't deliver at acceptable performance. Console gamers need to accept the fact that these games will not be as impressive as on PC. So, instead of asking for fancy graphics, they should be asking for better performance and more innovative design.

There is one positive for developers pushing the consoles beyond what they are capable of.... That means they usually don't have to increase the budget much to make a quality PC version as the game already has high fidelity graphics. With the PC version, we get to enjoy the fancy graphics and actually run the game properly. If developers optimized all games for console to run at 60fps, they would have to increase the budget to take advantage of PC and that is less likely to happen for multi-platform titles. So, a console performance target of 30fps sucks for console gamers, but benefits PC gamers. :)


#178  Edited By ronvalencia
Member since 2008 • 29612 Posts

@BassMan said:
@ConanTheStoner said:

I'd be fine with vanilla PS4 graphics, or even WiiU graphics for that matter.

My only beef with console gaming on the technical end is performance. And this is something that won't be fixed with better hardware. A lot of console games perform like ass because developers are constantly trying to push these machines beyond their limits in an attempt to appease gamers with pretty pictures. Priorities are fucked.

Hoping stuff like Nioh with its multiple graphics/performance options catches on.

Priorities are fucked indeed. These consoles are trying to wow people with graphics that they can't deliver at acceptable performance. Console gamers need to accept the fact that these games will not be as impressive as on PC. So, instead of asking for fancy graphics, they should be asking for better performance and more innovative design.

There is one positive for developers pushing the consoles beyond what they are capable of.... That means they usually don't have to increase the budget much to make a quality PC version as the game already has high fidelity graphics. With the PC version, we get to enjoy the fancy graphics and actually run the game properly. If developers optimized all games for console to run at 60fps, they would have to increase the budget to take advantage of PC and that is less likely to happen for multi-platform titles. So, a console performance target of 30fps sucks for console gamers, but benefits PC gamers. :)

That was before the RX-480's release, which reshaped the mid-range gaming GPU price segments, e.g. RX-470, RX-480 and GTX 1060.

For the majority of end users, priorities don't match your POV.

Again, Scorpio's estimate comes from R9-390X results.

The graphics detail settings in those benchmarks are higher than XBO's graphics detail settings.

GPUs such as Titan X Maxwell, 980 Ti, GTX 1070 and Fury X are a few frames per second faster than the R9-390X.

CES 2017 may show Vega 10 and the 1080 Ti (another cut-down GP102).


#179 Juub1990
Member since 2013 • 12622 Posts

@ConanTheStoner said:

multiple graphics/performance options

Mmmh...reminds me of a platform. Can't remember which one lol.


#180 deactivated-587acdd100f19
Member since 2008 • 908 Posts

@BassMan: I don't really think these companies designed these mid-gen system upgrades with consumers in mind. They did what they did at the behest of the developers.


#181 ConanTheStoner
Member since 2011 • 23838 Posts

@Juub1990 said:
@ConanTheStoner said:

multiple graphics/performance options

Mmmh...reminds me of a platform. Can't remember which one lol.

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.


#182  Edited By ronvalencia
Member since 2008 • 29612 Posts

@BlackShirt20 said:

Again, time will tell. Not a single game within this gen's cycle can run natively at 4K on the PS4. Sony dropped the ball on a lot of features that needed to be included, especially when you own Blu-ray and decide not to include a UHD drive while claiming everything is digital. Yeah Sony. Sure. Plus, streaming movies in 4K or even 1080p don't look as good as a movie from a physical disc.

As for Microsoft: if they don't include a UHD drive, I'm out. Unless this new system can run native 4K games at 60fps, I'm out. If they try to sell it for more than $499, I'm out.

Go buy a Quadro P6000 (full GP102) for the best 4K/60 results; it's more than $499 USD, i.e. it's $4850 USD.

Without AMD competition, it's a return to 1990s SGI workstation GPU prices. All the pro-NVIDIA MSM (mainstream news media) who supported the GTX 680, which is now slower than the 7970 in recent games, have damaged AMD and helped create this current situation.

MS will not put a $4850 GPU in a $499 game console.


#183  Edited By ronvalencia
Member since 2008 • 29612 Posts

@ConanTheStoner said:
@Juub1990 said:
@ConanTheStoner said:

multiple graphics/performance options

Mmmh...reminds me of a platform. Can't remember which one lol.

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.

Sorry, MS will not put a $4850 GPU in a $499 games console. This current situation was created by NVIDIA.

I have supported AMD with 7950 OC, 7970 GE, R9-290 and R9-290X OC purchases. Live with a monopoly in high end GPUs.


#184 Juub1990
Member since 2013 • 12622 Posts

@ronvalencia said:
@ConanTheStoner said:
@Juub1990 said:
@ConanTheStoner said:

multiple graphics/performance options

Mmmh...reminds me of a platform. Can't remember which one lol.

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.

Sorry, MS will not put a $4850 GPU in a $499 games console. This current situation was created by NVIDIA.

@ConanTheStoner Do you know what he's on about? Cause I don't.


#185 ConanTheStoner
Member since 2011 • 23838 Posts

@ronvalencia said:
@ConanTheStoner said:
@Juub1990 said:
@ConanTheStoner said:

multiple graphics/performance options

Mmmh...reminds me of a platform. Can't remember which one lol.

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.

Sorry, MS will not put a $4850 GPU in a $499 games console. This current situation was created by NVIDIA.

Ron, you seem like an ok dude, but sometimes when you respond to me I feel as if you meant to quote somebody else.

All we're talking about here is console games getting options for graphics settings. Options that allow you to prioritize performance. And how funny it will be when console gamers embrace it, even though it's still one of their arguments about how PC gaming is a hassle.

Maybe the point of your post just flew right over my head? Idk. It just seems unrelated.


#186  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Juub1990 said:
@ronvalencia said:
@ConanTheStoner said:
@Juub1990 said:

Mmmh...reminds me of a platform. Can't remember which one lol.

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.

Sorry, MS will not put a $4850 GPU in a $499 games console. This current situation was created by NVIDIA.

@ConanTheStoner Do you know what he's on about? Cause I don't.

Your memory is short.

AMD reverted the Radeon group back to its original business plan, i.e. serving lower-cost PC OEMs first.

Regardless of the 7970 being faster than its GTX 680 counterpart in recent games, the result didn't benefit the Radeon group.

Regardless of the R9-290X being faster than its GTX 780 Ti counterpart in recent games, the result didn't benefit the Radeon group.

A lot of the mainstream news media was biased towards the GTX 680 and NVIDIA.

The Radeon group's plan of matching NVIDIA one-to-one is dead.


#187 Juub1990
Member since 2013 • 12622 Posts

@ConanTheStoner said:
@ronvalencia said:
@ConanTheStoner said:
@Juub1990 said:
@ConanTheStoner said:

multiple graphics/performance options

Mmmh...reminds me of a platform. Can't remember which one lol.

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.

Sorry, MS will not put a $4850 GPU in a $499 games console. This current situation was created by NVIDIA.

Ron, you seem like an ok dude, but sometimes when you respond to me I feel as if you meant to quote somebody else.

All we're talking about here is console games getting options for graphics settings. Options that allow you to prioritize performance. And how funny it will be when console gamers embrace it, even though it's still one of their arguments about how PC gaming is a hassle.

Maybe the point of your post just flew right over my head? Idk. It just seems unrelated.

And there he goes again...

I have a theory he lives in an alternate dimension and is a Gamespot user but somehow his account ended up in our dimension. In his dimension what we say means something completely different.


#188 ConanTheStoner
Member since 2011 • 23838 Posts

@Juub1990:

Ok, I really don't know.

I'm not even talking about GPUs and I'm certainly not talking about Nvidia vs AMD. I don't care for GPU wars, nor am I knowledgeable enough to even speak on that shit lol.


#189  Edited By Juub1990
Member since 2013 • 12622 Posts

@ConanTheStoner said:

@Juub1990:

Ok, I really don't know.

I'm not even talking about GPUs and I'm certainly not talking about Nvidia vs AMD. I don't care for GPU wars, nor am I knowledgeable enough to even speak on that shit lol.

Dude I have no idea what the hell he is talking about lol. He quoted us then started rambling about GPU architecture. We made no mention of that. Maybe if we ignore him he'll leave?


#190 ConanTheStoner
Member since 2011 • 23838 Posts

@Juub1990:

Sounds like a Kojima game btw.


#191  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Juub1990 said:
@ConanTheStoner said:
@ronvalencia said:
@ConanTheStoner said:

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.

Sorry, MS will not put a $4850 GPU in a $499 games console. This current situation was created by NVIDIA.

Ron, you seem like an ok dude, but sometimes when you respond to me I feel as if you meant to quote somebody else.

All we're talking about here is console games getting options for graphics settings. Options that allow you to prioritize performance. And how funny it will be when console gamers embrace it, even though it's still one of their arguments about how PC gaming is a hassle.

Maybe the point of your post just flew right over my head? Idk. It just seems unrelated.

And there he goes again...

I have a theory he lives in an alternate dimension and is a Gamespot user but somehow his account ended up in our dimension. In his dimension what we say means something completely different.

My point: MS and Sony are not to blame for the current game console hardware power.

NVIDIA's game console offering with the Switch is only breadcrumbs from NVIDIA's GPU table.

The alternate universe is your NVIDIA GTX 1080 mindset. Sorry pal, your higher-priced GTX 1080 GPU mindset is in the minority.


#192 ConanTheStoner
Member since 2011 • 23838 Posts
@ronvalencia said:

My point: MS and Sony are not to blame for the current game console hardware power.

That's cool, I wasn't blaming Sony or MS to begin with and I don't think Juub was either.

@ronvalencia said:

The alternate universe is your NVIDIA GTX 1080 mindset. Sorry pal, your higher-priced GTX 1080 GPU mindset is in the minority.

I also have a GTX 1080 and I do not care about whether or not I'm in the minority. I bought a product I wanted and I'm enjoying it.


#193 ronvalencia
Member since 2008 • 29612 Posts

@ConanTheStoner said:
@ronvalencia said:
@ConanTheStoner said:
@Juub1990 said:
@ConanTheStoner said:

multiple graphics/performance options

Mmmh...reminds me of a platform. Can't remember which one lol.

Funny thing is, I still see them calling this one of the "hassles" of PC gaming haha.

Just wait until it becomes standard for consoles, they'll embrace it (as they should), but the hypocrisy will be unreal.

The road to consoles becoming more and more like PCs is littered with flip flops.

Sorry, MS will not put a $4850 GPU in a $499 games console. This current situation was created by NVIDIA.

Ron, you seem like an ok dude, but sometimes when you respond to me I feel as if you meant to quote somebody else.

All we're talking about here is console games getting options for graphics settings. Options that allow you to prioritize performance. And how funny it will be when console gamers embrace it, even though it's still one of their arguments about how PC gaming is a hassle.

Maybe the point of your post just flew right over my head? Idk. It just seems unrelated.

For the most part, console games are the developer's pre-set vision for a given level of hardware performance.


#194 deactivated-5d6bb9cb2ee20
Member since 2006 • 82724 Posts

I'm confused as to what any of this has to do with Nvidia and AMD.


#195 Xplode_games
Member since 2011 • 2540 Posts

@Ghost_Dub said:
@commander said:
@Ghost_Dub said:
@commander said:
@dynamitecop said:

We do, people like him do not.

You're not only insulting my intelligence, you're insulting yourself. Of course it's not power alone; there need to be games. But do you really think there won't be any games?

The sun may not come up tomorrow either, but that's not really something I take into consideration, because it's too far-fetched, to say the least.

lol it's not games, it's time. Chronology.

what does it have to do with time? or even chronology?

Game generations are determined, for the most part, by time. Things become too complicated when you have to account for things like the Sega CD, the N64 Expansion Pak, the PS4 Pro, etc. Therefore, game generations are broken up and categorized by timelines instead of power.

That's absurd! Of course one of the most defining characteristics of a new generation is significantly improved hardware. Can you tell me a time when Sony, for example, released a next-generation console that was not significantly more powerful than its predecessor? History has shown that when a next-gen console is underpowered compared to its peers, it doesn't do very well; for example the Xbox One, Nintendo Wii U, Sega Dreamcast and the TurboGrafx-16, to name a few. The anomaly that has you so confused, the Wii, is easy to explain (hindsight is 20/20). First, it was more powerful than the GameCube, just not a serious jump like Sony and MS were making. However, it did have a radically advanced control scheme that was extremely innovative at the time, and it arrived at a rock-bottom $250 launch price, backed by the highly reputable company Nintendo. The market had never seen anything like that, and I remember it being sold out for over a year after its release.

To summarize: of course a next-generation console must offer a dramatically upgraded play experience to compel consumers to upgrade. If not, why would anyone move up and pay the premium if it's practically the same as its predecessor?


#196 mazuiface
Member since 2016 • 1617 Posts

@Ghost_Dub said:

It is for me. I don't care how much someone is emotionally attached to pixels and game tech, these games on eighth gen consoles look fantastic.

I agree. Just make sure the game runs at 60 fps and it's fine. The rush to make games 2160p is going to be a disaster and an insult to gamers.


#197  Edited By ronvalencia
Member since 2008 • 29612 Posts

@charizard1605 said:

I'm confused as to what any of this has to do with Nvidia and AMD.

@ConanTheStoner said:

I'd be fine with vanilla PS4 graphics, or even WiiU graphics for that matter.

My only beef with console gaming on the technical end is performance. And this is something that won't be fixed with better hardware. A lot of console games perform like ass because developers are constantly trying to push these machines beyond their limits in an attempt to appease gamers with pretty pictures. Priorities are fucked.

Hoping stuff like Nioh with its multiple graphics/performance options catches on.

Wii U's graphics quality translated into sales failure for the hardware. PS4's bias toward graphics quality at its given hardware power had better sales results than Wii U's.

Asking for good frame rate performance in current PS4/PS4 Pro games requires very expensive GPUs, hence my AMD vs NVIDIA comments.


#198  Edited By Gatygun
Member since 2010 • 2709 Posts

@Xplode_games said:
@Ghost_Dub said:
@commander said:
@Ghost_Dub said:

lol it's not games, it's time. Chronology.

what does it have to do with time? or even chronology?

Game generations are determined, for the most part, by time. Things become too complicated when you have to account for things like the Sega CD, the N64 Expansion Pak, the PS4 Pro, etc. Therefore, game generations are broken up and categorized by timelines instead of power.

That's absurd! Of course one of the most defining characteristics of a new generation is significantly improved hardware. Can you tell me a time when Sony, for example, released a next-generation console that was not significantly more powerful than its predecessor? History has shown that when a next-gen console is underpowered compared to its peers, it doesn't do very well; for example the Xbox One, Nintendo Wii U, Sega Dreamcast and the TurboGrafx-16, to name a few. The anomaly that has you so confused, the Wii, is easy to explain (hindsight is 20/20). First, it was more powerful than the GameCube, just not a serious jump like Sony and MS were making. However, it did have a radically advanced control scheme that was extremely innovative at the time, and it arrived at a rock-bottom $250 launch price, backed by the highly reputable company Nintendo. The market had never seen anything like that, and I remember it being sold out for over a year after its release.

To summarize: of course a next-generation console must offer a dramatically upgraded play experience to compel consumers to upgrade. If not, why would anyone move up and pay the premium if it's practically the same as its predecessor?

People bought the Wii because it was the only affordable console on the market for the audience consoles had at the time. It was also easy to sell to your parents because of the fitness software, if you were a kid. Much like the PS2 was sellable because of the DVD player, and later the EyeToy.

Generations are defined by the market. If tomorrow the Scorpio comes out and the market drops all support for PS4/PS4 Pro and only makes its games for Scorpio + PC, then Scorpio launches the new generation.

Time has nothing to do with it. Generations will always involve more horsepower, but the exact multiple of that horsepower isn't what's interesting.

But that isn't going to happen, because Scorpio and PS4 Pro will have to run games from the PS4 and Xbox One.

This generation didn't start with the Wii U; it started with the PS4, and it will end the moment the market shifts away from the PS4.

About 4K:

4K really isn't very interesting to most of the market, and let me explain why.

For consoles:

TV tech is too expensive, and only a minimal number of channels and software even support it, which makes it completely useless for most people.

For PC:

Going for 4K means:

1) sacrificing huge amounts of performance, i.e. needing loads more money and hardware to get your framerates back (see the pixel-count sketch below)

2) no G-Sync/FreeSync

3) bye-bye 1 ms

4) 144 Hz: I don't think any piece of hardware on the market would even be able to push it, if such a screen even existed.

People often forget that 30/60 fps is entirely a console thing; PCs, and shooters especially with mouse controls, have to hit 100+ fps to feel anywhere near smooth on a screen with as low a response time as possible.

Any enthusiast who has to choose between 4K at 60 Hz or a 1080p/1440p screen at 144 Hz will choose the second one.

4K is nothing more than a nice buzzword at this point in time. You will also have to reduce your settings, no matter how much hardware you have in your PC, to get that resolution running at any decent speed, which is another sacrifice.
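To put rough numbers behind point 1: fragment-shading cost scales roughly with the number of pixels drawn, so the raw resolution ratios alone show the size of the sacrifice. A quick host-side sketch:

```
#include <cstdio>

// Pixel counts per frame at common resolutions; shading cost scales
// roughly linearly with these (bandwidth and geometry costs vary).
int main()
{
    const double p1080 = 1920.0 * 1080.0;
    const double p1440 = 2560.0 * 1440.0;
    const double p2160 = 3840.0 * 2160.0;
    printf("4K vs 1080p: %.2fx the pixels\n", p2160 / p1080); // 4.00x
    printf("4K vs 1440p: %.2fx the pixels\n", p2160 / p1440); // 2.25x
    return 0;
}
```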

So 4K, in my view, is a complete waste at the moment.

Sony and Microsoft would have done better to go with a 1080p picture and push a rock-solid 60 fps in every game without a single hiccup, with some downsampling going on. But instead they push a resolution that kills the GPU and makes things run all kinds of wonky.


#199  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Gatygun: Scorpio's marketing game plan is similar to the original PS4's approach (higher res, higher details, faster fps than the direct competition) but with better hardware. PS4's approach is the sales winner over Wii U and XBO.


#200  Edited By BassMan
Member since 2002 • 18741 Posts

@ronvalencia: With all due respect, your reading comprehension is lacking these days and you post things that are not relevant to the immediate conversation and are confusing people. Like I mentioned before, you should go easy on the charts as nobody wants to take the time to go through that shit unless it is absolutely essential. Best to keep things short, sweet and to the point.