Dev - "PS4 Pro Is Like A 5 Year Old PC And It's Holding Developers Back"


#151 lexxluger
Member since 2017 • 599 Posts

@Sam3231 said:

What is it with you guys and "meltdowns"

Apparently, whenever someone posts a response to any other user's post here on this board, they are having a "meltdown"!

Oops :( I just replied to your post, you know what that means?


#152 Sam3231
Member since 2008 • 3221 Posts

@lexxluger: Ah ha! Look at this meltdown! Fucking cow lem. Clem.

I think I'm getting the hang of this.


#153 Ghost120x
Member since 2009 • 6060 Posts

I don't want a new console to come out every time a new graphics chip is made. That would be a nightmare.


#154 xantufrog  Moderator
Member since 2013 • 17898 Posts

God damn - there are only so many times you can read "butthurt" and "alt" and "meltdown". This is getting bad, guys


#155 NFJSupreme
Member since 2005 • 6605 Posts

This generation has been the weakest competitively in console history


#156 deactivated-5a8875b6c648f
Member since 2015 • 954 Posts

@leonkennedy97:God of War and Spiderman haven't even released yet, and Spiderman definitely doesn't look top 10 from recent gameplay.


#157 suicidesn0wman
Member since 2006 • 7490 Posts

I feel dirty just for having read through this thread. Not sure why it wasn't locked yet.


#158 UNcartMe
Member since 2011 • 725 Posts

Looks at Get Out graphics then looks at Horizon, God of War, and TLOU 2. Goes back to sleep.


#159 leonkennedy97
Member since 2017 • 83 Posts

@phantomfire335: Compared to full open world games? Are you kidding?


#160 Dr_Vancouver
Member since 2017 • 1046 Posts

That's a load of BS. Any game on PS4 Pro is also on PS4, and likely on the original XBone and XBoneS, and soon to be on XBoneX. Point being, the systems "holding devs back" are the lowest-performing ones, so any gripes anybody has with the lack of power on XBoneX or PS4 Pro are invalid, since the original XBone, XBoneS and PS4 are the starting point for multiplat games. Insulting the Pro only further insults everything behind it. Of the soon-to-be 5 consoles between M$ and Sony on the market, the Pro will be second most powerful. Last place is what holds devs back, not first or second of five. Talented devs can get BotW running on a Wii U; if you can't hack it on a PS4 Pro, maybe think of a new line of work...


#161 kingtito
Member since 2003 • 11775 Posts

@sirk1264 said:

@kingtito: lmao so you called me quads alt and now you call me a cow when I'm far from it. We all called your bullshit and you claim everyone else is having a meltdown. You're so obsessed with quad. It's sad. Let it go. You'll feel better kid.

Hey if it walks like a duck, sounds like a duck by golly it must be a duck. Even if you aren't a cow, I doubt it, you're still in here melting down because I said quack is having a meltdown. It obviously went above your head son but I'm curious as to why you felt the need to come in trying to defend the meltdown king quack?


#162 Gaming-Planet
Member since 2008 • 21106 Posts

Isn't the base Xbox worse?

I don't get why he's ranting.


#163 kingtito
Member since 2003 • 11775 Posts
@quadknight said:

@kingtito: ^ ^ ? What a meltdown.

Another thread another KingButthurt meltdown. You're more predictable than Old Faithful at this point.

You were saying quack?


#164 DaVillain  Moderator
Member since 2014 • 58693 Posts

@kingtito: You are trying way too hard.


#165 PSP107
Member since 2007 • 18983 Posts

@aroxx_ab: "you dont need top highend PC to make/play good games"

NES/SNES/PS1/PS2 exclusives>>>>> PC


#166 Basinboy
Member since 2003 • 14559 Posts

@leonkennedy97: resolution and framerate do matter. Also, refusing to release games on other platforms creates a false artifice that props up the position that "console games" look better. There is no such thing.


#167 FLOPPAGE_50
Member since 2004 • 4500 Posts

Well said devs, we been saying this for ages.

TCHBO


#168 lrdfancypants
Member since 2014 • 3850 Posts

@goldenelementxl:

Well it's the most powerful console ever released till November.

Weird statement.


#169 leonkennedy97
Member since 2017 • 83 Posts

@Basinboy: There is no such thing as what? The games I mentioned are easily some of the best-looking games available, period, end of story. The advantage is that they spend their time on one spec as opposed to 5. I don't care what kind of PC snob you are. The PS4 Pro and X1X deliver superb IQ.


#170 Basinboy
Member since 2003 • 14559 Posts

@leonkennedy97: and they'd all perform better on PC given the chance


#171 PutASpongeOn
Member since 2014 • 4897 Posts

Devs will develop around the base ps4 since that's where sales are, ps4 pro if anything is a positive thing for devs.

What an idiot, is he new to video games? This has been the case for generations.


#172  Edited By X_CAPCOM_X
Member since 2004 • 9625 Posts

@Sam3231 said:

@lexxluger: Ah ha! Look at this meltdown! Fucking cow lem. Clem.

I think I'm getting the hang of this.

You need more emojis, graphs, and exclamation marks.


#173 ermacness
Member since 2005 • 10956 Posts

@nintendoboy16:

How many "other" devs have said anything along the lines of what this particular dev is stating? One dev doesn't speak for all.


#174 waahahah
Member since 2014 • 2462 Posts

This also applies to about 90% of gaming PCs, where most people game on a budget. Targeting the super high end reduces the pool to a limited number of people. This developer probably wants to do some cool things, and when he's not being "held back" by hardware he'll probably be held back once he discovers the complexity of these alleged games he wants to make. Or he's talking about high-res assets and stuff, and at this point... I just don't care... please give me fun games. Games like Sea of Thieves actually look like they're bringing something a bit new to the table instead of more mundane crap with prettier textures.


#175 ronvalencia
Member since 2008 • 29612 Posts

@NFJSupreme said:

This generation has been the weakest competitively in console history

The 8800 GTX was released a few days ahead of the PS3.

AMD's current absence from PC's high-end GPU market mirrors ATI's absence from PC's high-end GPU market in 2005/2006.


#176  Edited By waahahah
Member since 2014 • 2462 Posts

@Basinboy said:

@leonkennedy97: and they'd all perform better on PC given the chance

That may not be entirely true. One of the things consoles actually do better than PCs is a unified memory architecture. ND uses a lot of GPU compute power, and PCs have a bottleneck pushing data back and forth over the PCI Express bus. This is one of the areas where consoles have a significant advantage if developers lean on it heavily.

FOR INSTANCE: what if the Xbox One X used its extra GPU power not for 4K, but for physics or something? Those settings might work significantly worse on PC on any hardware. This is something a developer can choose to do.
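A back-of-the-envelope sketch of why that bus hop matters: even ignoring latency, just moving a per-frame working set across PCIe eats real frame time, while on a unified-memory console the same data is already visible to both processors. The bandwidth and working-set figures below are illustrative assumptions, not measurements.

```python
# Time to round-trip a hypothetical physics working set between CPU and GPU
# memory over PCIe 3.0 x16, compared against a 60 fps frame budget.

PCIE3_X16_GBPS = 16.0            # assumed practical one-way bandwidth, GB/s
FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps

def transfer_ms(megabytes: float, gbps: float = PCIE3_X16_GBPS) -> float:
    """Milliseconds to move `megabytes` across the bus one way."""
    return megabytes / (gbps * 1024.0) * 1000.0

working_set_mb = 64.0                              # hypothetical per-frame data
round_trip_ms = 2.0 * transfer_ms(working_set_mb)  # upload plus readback
print(f"round trip: {round_trip_ms:.2f} ms, "
      f"{100.0 * round_trip_ms / FRAME_BUDGET_MS:.0f}% of a 60 fps frame")
```

Under these assumptions the round trip alone costs about 7.8 ms, nearly half the frame; with unified memory that particular cost is roughly zero.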


#177 CTR360
Member since 2007 • 9217 Posts

Another theory for unknown developer and studio


#178 Jebus213
Member since 2010 • 10056 Posts

I always ask myself "Do these people actually know what they're talking about?"


#179  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Jebus213 said:

I always ask myself "Do these people actually know what they're talking about?"

waahahah's viewpoint is OK.

It's the latency of CPU --> CPU memory --(PCI-E)--> GPU memory --> GPU --> GPU memory --(PCI-E)--> CPU memory --> CPU process.

On consoles, it's CPU --> memory --> GPU --> memory --> CPU process.

To compensate, Intel's AVX2 gained GPU-style gather instructions.

Intel Haswell quad-core with a 4GHz Turbo mode will offer 512 GFLOPS FP32 via AVX_v2-256

Intel Skylake X quad-core with a 4GHz Turbo mode will offer 1 TFLOPS FP32 via AVX_v3-512

Intel Skylake X eight-core with a 4GHz Turbo mode will offer 2 TFLOPS FP32 via AVX_v3-512

The problem with XBO is the lack of spare GCN CUs for physics work.

If X1X's 10 CUs (1.5 TFLOPS at 1172 MHz) out of its 40 CU resource were allocated to physics work, XBO couldn't run that game.

If PS4 Pro's 9 CUs (1 TFLOPS at 914 MHz) out of its 36 CU resource were allocated to physics work, the PS4 version's rendering would be gimped.
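The peak numbers quoted above all come from the same kind of arithmetic: SIMD lanes x FMA units x 2 ops per FMA x cores x clock for the CPUs, and 64 lanes x 2 ops x CU count x clock for GCN. A quick sanity check (the two-FMA-units-per-core figure is the usual assumption for these Intel cores):

```python
def cpu_peak_gflops(cores: int, ghz: float, simd_bits: int,
                    fma_units: int = 2) -> float:
    """Peak FP32 GFLOPS: lanes per unit * units * 2 ops per FMA * cores * GHz."""
    lanes = simd_bits // 32  # FP32 lanes per SIMD unit
    return cores * ghz * lanes * fma_units * 2

def gcn_peak_gflops(cus: int, mhz: float) -> float:
    """Peak FP32 GFLOPS for GCN: 64 lanes * 2 ops per FMA * CUs * clock."""
    return cus * 64 * 2 * mhz / 1000.0

print(cpu_peak_gflops(4, 4.0, 256))  # Haswell, AVX2      -> 512.0 GFLOPS
print(cpu_peak_gflops(4, 4.0, 512))  # Skylake-X, AVX-512 -> 1024.0 (~1 TFLOPS)
print(cpu_peak_gflops(8, 4.0, 512))  # 8-core Skylake-X   -> 2048.0 (~2 TFLOPS)
print(gcn_peak_gflops(10, 1172.0))   # 10 CUs @ 1172 MHz  -> ~1500 (~1.5 TFLOPS)
print(gcn_peak_gflops(9, 914.0))     # 9 CUs @ 914 MHz    -> ~1053 (~1 TFLOPS)
```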

Another problem....

If the PS4 is dropped, PS4 Pro's maximum software optimization involves tiling compute shaders into the GPU's 2MB L2 cache and using PRT to fake a large texture memory store. PS4 Pro hardware-profile games wouldn't work on the PS4.

On NVIDIA Maxwell/Pascal, compute/shader/render tiles automatically change size according to the hardware's GPU L2 cache size.

Atm, the delay for RX Vega is mostly down to drivers for the brand-new tile-render-to-L2-cache feature.


#180 Basinboy
Member since 2003 • 14559 Posts

@waahahah: but to depend on the premise that a developer's expertise can yield greater performance/fidelity requires assuming that they cannot or will not exercise the same expertise with respect to the greater resource pool that modular devices allow for.

But my intention was not to further reinforce the arbitrary distinction between consoles and PC. All game systems are PCs and software is simply coded for certain structures. But I started getting flamed and gave up on trying to argue and resorted to being snarky.


#181 waahahah
Member since 2014 • 2462 Posts

@Basinboy said:

@waahahah: but to depend on the premise that a developer's expertise can result in greater performance/fidelity requires assuming that they cannot or will not do exercise the same with respect to the greater resource pool modular devices allow for.

But my intention was not to further reinforce the arbitrary distinction between consoles and PC. All game systems are PCs and software is simply coded for certain structures. But I started getting flamed and gave up on trying to argue and resorted to being snarky.

The thing is, you can't design around a bottleneck without designing for inferior data flow. There is no magic dev talent that can get around an inferior hardware setup. All the raw power isn't going to take away consoles' unified memory space. PCs are hindered by the PCI Express bus when it comes to GPU compute and will not be able to do as much as consoles as things move forward. This is NOT an arbitrary distinction. Consoles and desktop PCs do have a clear distinction in hardware, and with the Xbox One X's DirectX command processor there are special bits of hardware that PCs will likely never have.


#182 SinjinSmythe
Member since 2008 • 1049 Posts

Wrong. It is holding Destiny 2 back so this thread is /


#183 Phazevariance
Member since 2003 • 12356 Posts

Lol, dev mentions PS4 Pro is outdated, cows complain "what is the xbox one doing then?"

PS4 - holding back games
PS4 Pro - holding back games
Xbox One - holding back games
Xbox One X - bringing games to the future with current(ish) hardware

The real problem here is Sony's half ass mid-gen upgrade.


#184 Basinboy
Member since 2003 • 14559 Posts

@waahahah: nm, you're missing what I'm hitting at and addressing an argument I'm not making.


#185  Edited By waahahah
Member since 2014 • 2462 Posts

@Basinboy said:

@waahahah: nm, you're missing what I'm hitting at and addressing an argument I'm not making.

I don't think your point matters. The point is that these aren't generic PCs; they share the CPU/GPU in common but are actually different architectures. While the pool of raw resources will always be bigger on PCs, consoles have the ability to make games in a different way. Everything PCs can do can be scaled down to fit on consoles, so a PC of matching hardware performance will generally be equal with consoles, but the reverse is not true. There are things developers can leverage on consoles without even trying that give more efficiency and better utilization, where an equally equipped PC will have to be scaled down; and as long as the bus is the limit on most PCs, generally even a superior PC will have to be scaled down. GPU compute isn't something developers won't utilize, but passing data between the CPU and GPU is basically free on consoles.

Also note that the mass of PCs generally aren't that good; most are equivalent to or slightly better/worse than consoles. The added resources can't be used to do anything fundamentally different from what consoles can/can't do.


#186  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Phazevariance said:

Lol, dev mentions PS4 Pro is outdated, cows complain "what is the xbox one doing then?"

PS4 - holding back games

PS4 Pro - holding back games

Xbox One - holding back games

Xbox One X - bringing games to the future with current(ish) hardware

The real problem here is Sony's half ass mid-gen upgrade.

MS guided X1X's design around existing 3D engines, hence the improvements to the AMD GPU's Pixel Engine path, i.e. 60-odd changes deep in the graphics pipeline. The keywords are "graphics pipeline".

-----------

PS4 Pro was designed around PSVR (two 1920x1080 screens) and a narrow optimization path, i.e. PS4 Pro was designed for extreme Sony CELL SPU-style workloads using the Compute Engine-to-L2-cache path. But designing compute shader tiles for the GPU's 2MB L2 cache breaks on the PS4, which is missing the 2MB L2 cache feature. Sony's Vega NCU selection reflects this extreme SPU-style workload with the Compute Engine-to-L2-cache path.

On XBO and X1X, XBO's ESRAM can fake X1X's GPU 2MB L2 cache with lower tiling performance.

Read http://www.playstationlifestyle.net/2017/03/10/horizon-zero-dawn-ps4-pro-utilization/

The machine has a lot of power, so we understand what it can do, but the game was already in place when we learnt about the Pro and got the dev kit. So what we’ve done is taken the power and tried to make the best improvements we think for our existing game. We didn’t design the game from the ground up for the Pro, but we tried to use that processing power to do the best things we could to make the experience look or feel better for you.

I don’t think it’s fair to say we haven’t used it fully, but I do think it’s fair to say that if we’d designed the game from the ground up for the Pro, we’d have probably used it differently.

Horizon Zero Dawn's developers weren't ready for the PS4 Pro's unexpected features. PS4 Pro 3D engines need to be redesigned around heavy compute tiling into the 2MB L2 cache and heavy PRT usage (faking a larger memory store with tiled textures).

PS4 Pro was designed for Sony's needs, while MS has heavy 3rd party 3D engine needs e.g. Unreal Engine 4 for CrackDown 3, Sea of Thieves, Gears of War 4 and State of Decay 2. MS's heavy Unreal Engine 4 usage and related hardware optimizations leads to other benefits for other NVIDIA Gameworks games.

Sony picked the subset of Vega IP blocks that suits its needs.

MS picked the subset of Vega IP blocks that suits its needs.

PS4 Pro's double-rate FP16 enables it to reach ~70 percent of my GTX 980 Ti's results in Mantis Burn Racing. Don't underestimate the PS4 Pro, but there's a very narrow software optimization path for extracting PS4 Pro's TFLOPS power.

PS4 Pro devs have to heavily tile their math problems and can't rely on an easy unified-memory-bus-with-large-storage programming model. If tiling with XBO's 32 MB ESRAM was hard, try tiling with a GPU's 2 MB L2 cache.
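To make "try tiling with a GPU's 2 MB L2 cache" concrete, here is the largest power-of-two square tile that stays resident in 2 MB (the pixel formats are assumptions for illustration):

```python
CACHE_BYTES = 2 * 1024 * 1024  # 2 MB GPU L2 cache

def max_square_tile(bytes_per_pixel: int, cache_bytes: int = CACHE_BYTES) -> int:
    """Largest power-of-two square tile edge whose pixels fit in the cache."""
    edge = 1
    while (edge * 2) ** 2 * bytes_per_pixel <= cache_bytes:
        edge *= 2
    return edge

print(max_square_tile(8))  # 64-bit FP16 RGBA target -> 512 (512*512*8 = 2 MB)
print(max_square_tile(4))  # 32-bit RGBA8 target     -> 512 (1024 would need 4 MB)
```

So a renderer tiling into L2 is juggling roughly 512x512 chunks of a 1080p-or-larger frame, which is why it is a substantial engine redesign rather than a flipped switch.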


#187 Basinboy
Member since 2003 • 14559 Posts

@waahahah: Your argument is confined to the preconceptions I am disputing, and you're addressing something I am not. Regardless, I disagree with the assumptions you are making within those preconceptions, attempting to equate differing architectures by evaluating comparable hardware within each structure. I am not disputing the efficiency of a unified memory architecture, but the model of being restricted to static hardware when the other model is not and can compensate for its architectural disadvantages by other means.

I have no desire to carry on this back and forth since you'd rather dismiss my contributions than address them and bullheadedly assert your correctness. I wish you well.


#188 waahahah
Member since 2014 • 2462 Posts

@Basinboy said:

@waahahah: Your arument is confined in the preconceptions I am disputing and you're addressing something I am not. But regardless, I disagree with the assumptions you are making within those preconceptions, attempting to equate differing architectures by evaluating comparable hardware within each structure. I am not disputing the efficiency of unified memory architecture, but the model of being restricted to static hardware when the other model is not and can compensate for its architectural disadvantages by other means.

I have no desire to carry on this back and forth since you'd rather dismiss my contributions than address them and bullheadedly assert your correctness. I wish you well.

I addressed your issues; to me it sounds like you think raw resources can overcome everything. So your assertion that all console games would perform better on PC is not true, given the architectural differences of a unified memory architecture. You tried to dismiss this difference as "arbitrary".

You'll always get better resolution and textures, but at the end of the day the unified memory architecture lets developers utilize GPU compute much more heavily than PC will ever be able to until it moves to a unified memory space. The PCI Express bus becomes too much of a bottleneck. And consoles have been moving more and more toward HSA, distinguishing themselves from PC architecture even more.

@ronvalencia said:
@Jebus213 said:

I always ask myself "Do these people actually know what they're talking about?"

waahahah's view point is OK.

It's the latency with CPU -->CPU memory --(PCI-E)--> GPU memory ---> GPU ----> GPU memory --(PCI-E)--> CPU memory ---> CPU process.

On consoles, CPU -->memory --> GPU ----> memory ---> CPU process.

To compensate, Intel AVX v2 gains GPU style gather instructions.

Intel Haswell quad-core with a 4GHz Turbo mode will offer 512 GFLOPS FP32 via AVX_v2-256

Intel Skylake X quad-core with a 4GHz Turbo mode will offer 1 TFLOPS FP32 via AVX_v3-512

Intel Skylake X eight-core with a 4GHz Turbo mode will offer 2 TFLOPS FP32 via AVX_v3-512

The problem with XBO is the lack of spare GCN CUs for physics work.

If X1X's 10 CU (1.5 TFLOPS at 1172 Mhz ) from 40 CU resource was allocated for physics work, XBO can't run this game.

If PS4 Pro 9 CU (1 TFLOPS at 914 Mhz ) from 36 CU resource was allocated for physics work, PS4's render is gimped.

Another problem....

If PS4 is dumped, PS4 Pro's maximum software optimizations involves compute shader tile to GPU 2MB L2 cache and PRT to fake a large texture memory storage. PS4 Pro hardware profile games wouldn't work on PS4.

On NVIDIA Maxwell/Pascal, compute/shader/render tile automatically changes size according to hardware's GPU L2 cache size.

Atm, the delay for RX-Vega is mostly due to drivers with brand new tile render to L2 cache feature.

Yah, the one thing I'm not saying is that these systems can budget much for compute, but future generations will likely be held back by PCs. An NVIDIA engineer on Reddit was commenting on the unified memory space and just how much PCIe hurts compute in games. Granted, at some point PCs might move to a unified architecture; even disk storage is getting tied directly to the memory bus, and you can get SSDs that slot into memory slots.


#189 ronvalencia
Member since 2008 • 29612 Posts

@waahahah said:
@Basinboy said:

@waahahah: Your arument is confined in the preconceptions I am disputing and you're addressing something I am not. But regardless, I disagree with the assumptions you are making within those preconceptions, attempting to equate differing architectures by evaluating comparable hardware within each structure. I am not disputing the efficiency of unified memory architecture, but the model of being restricted to static hardware when the other model is not and can compensate for its architectural disadvantages by other means.

I have no desire to carry on this back and forth since you'd rather dismiss my contributions than address them and bullheadedly assert your correctness. I wish you well.

I addressed your issues, to me it sounds like you think the raw resources can over come everything. So your assertion that all console games would perform better on PC is not true given the architecture differences with unified memory architecture. You tried to dismiss this difference as "arbitrary"

You'll always get better resolution and textures but at the end of the day the unified memory architecture allows developers utilize GPU compute much heavier than PC will ever be able to do until it moves into a unified memory space. The pci-express bus becomes too much of a bottle neck. And consoles have been moving more and more towards HSA, distinguishing themselves from PC architecture even more.


Yah the one thing I'm not saying is that these systems probably can't budget much for compute, but future generations will likely be held back by PC's. An nvidia engineer on reddit was commenting about the unified memory space and just how much pci-e hurts compute on games. Granted at some point PC's might move to a unified architecture, even disk space is getting tied directly to the memory bus, you can get ssd's that insert in memory slots.

There's the incoming PCI-E version 4.0 and NVIDIA's POV with NVlink vs current PCI-E version 3.0.


#190 waahahah
Member since 2014 • 2462 Posts

@ronvalencia said:

There's the incoming PCI-E version 4.0 and NVIDIA's POV with NVlink vs current PCI-E version 3.0.

Right, but is it still as good as free? That's what the NVIDIA developer was alluding to. When you have no bus to shuffle data over, you can do a lot more within a frame time.


#191 deactivated-5a8875b6c648f
Member since 2015 • 954 Posts
@leonkennedy97 said:

@phantomfire335: Compared to full open world games? Are you kidding?

You didn't say open world games before, you just said it was top 10, which it ain't. If you're talking open world games then it may squeeze in.

(Witcher 3 and AC Unity look MUCH better)


#192 leonkennedy97
Member since 2017 • 83 Posts

@phantomfire335: Errr, no they don't. And it could count in the top 10 regardless. Especially with all that's going on.


#193 DrLostRib
Member since 2017 • 5931 Posts

@lexxluger said:
@Sam3231 said:

What is it with you guys and "meltdowns"

Apparently, whenever someone posts a response to any other users post here on this board, they are having a "meltdown"!

Oopse :( I just replied to your post, you know what that means?

There have been users that have actual "meltdowns" where they post long tirades

But it's now been diluted to anytime fanboys disagree with each other


#194 kvally
Member since 2014 • 8445 Posts

PS4 Pro Is Like A 5 Year Old PC And It's Holding Developers Back


#195  Edited By Angryduck67
Member since 2004 • 272 Posts

Did this guy see the Anthem demo? Or that Uncharted expansion? Or Days Gone rendering nearly 100 zombies on screen at once? Or the Beyond Good and Evil 2 tech demo, where they rendered entire planets at once with no load times between them? Or the GoW 4 trailer?

This developer really thinks what's holding us back is GPUs? What???

It's a classic trick that's been around for at least 15 years: make so-so multiplatform games, and when people unfavorably compare them to vastly superior games, blame the consoles both of your games are on for being underpowered and ruining the entire industry.

Hey, don't look at me, it's obviously Sony's fault (please buy our games).


#196  Edited By ronvalencia
Member since 2008 • 29612 Posts

@waahahah said:
@ronvalencia said:

There's the incoming PCI-E version 4.0 and NVIDIA's POV with NVlink vs current PCI-E version 3.0.

Right, but is it still as good as free? Which is what the nvidia developer was alluding too. When you have no bus to shuffle data on you can do a lot more during a frame time.

I agree. A proper fusion example would be powered by a ZEN plus NAVI 11 based APU, not factoring in any of AMD's server-class ZEN + Greenland monster APUs.


#197  Edited By PimpHand_Gamer
Member since 2014 • 3048 Posts

@nfamouslegend said:

@pimphand_gamer: What's a GTX 880?

https://www.techpowerup.com/199750/nvidia-geforce-gtx-880-detailed

  • 20 nm GM204 silicon
  • 7.9 billion transistors
  • 3,200 CUDA cores
  • 200 TMUs
  • 32 ROPs
  • 5.7 TFLOP/s single-precision floating-point throughput
  • 256-bit wide GDDR5 memory interface
  • 4 GB standard memory amount
  • 238 GB/s memory bandwidth
  • Clock speeds of 900 MHz core, 950 MHz GPU Boost, 7.40 GHz memory
  • 230W board power

#198 Aki2017
Member since 2017 • 817 Posts

Consoles have always held PC games back, but that's not new. Good-looking, fun games can still be had on these systems. The Switch is coming out with gorgeous-looking games, and the PS4 Pro is even more powerful than that. It's just annoying tweaking that developers need to do to get games to work on console. More work means more money.


#199  Edited By ronvalencia
Member since 2008 • 29612 Posts

@pimphand_gamer said:
@nfamouslegend said:

@pimphand_gamer: What's a GTX 880?

https://www.techpowerup.com/199750/nvidia-geforce-gtx-880-detailed

  • 20 nm GM204 silicon
  • 7.9 billion transistors
  • 3,200 CUDA cores
  • 200 TMUs
  • 32 ROPs
  • 5.7 TFLOP/s single-precision floating-point throughput
  • 256-bit wide GDDR5 memory interface
  • 4 GB standard memory amount
  • 238 GB/s memory bandwidth
  • Clock speeds of 900 MHz core, 950 MHz GPU Boost, 7.40 GHz memory
  • 230W board power

Real GM204 silicon refers to the GTX 980 and GTX 970.

https://www.techpowerup.com/gpudb/2621/geforce-gtx-980

Shading units: 2048. That's 2048 CUDA cores with a 256-bit wide GDDR5 memory interface.

My GTX 980 Ti is GM200 silicon, with 2816 CUDA cores and a 384-bit wide GDDR5 memory interface: https://www.techpowerup.com/gpudb/2724/geforce-gtx-980-ti
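The single-precision figures for these cards follow from core count times clock times 2 ops per fused multiply-add; the boost clocks below are assumed reference values from public spec sheets:

```python
def nvidia_peak_tflops(cuda_cores: int, boost_mhz: float) -> float:
    """Peak FP32 TFLOPS: cores * 2 ops per fused multiply-add * clock."""
    return cuda_cores * 2 * boost_mhz / 1e6

print(round(nvidia_peak_tflops(2048, 1216.0), 2))  # GTX 980 (GM204)    -> 4.98
print(round(nvidia_peak_tflops(2816, 1075.0), 2))  # GTX 980 Ti (GM200) -> 6.05
```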


#200 Dark_man123
Member since 2005 • 4012 Posts

@bigblunt537: Exactly. I mean, seriously, how could he compare consoles that can't even be upgraded?