How and why does anyone think the PS5 is even remotely as capable as the Series X

#51 ConanTheStoner
Member since 2011 • 23837 Posts
@uninspiredcup said:

PC has Fakefactory.

Well shit.

Nvm Yayas, I take it all back.

#52 tormentos
Member since 2003 • 33793 Posts

@Pedro said:

That is false and was clearly stated not to be true in Mark Cerny's own words.

Excuse me, it will run at 2.23GHz most of the time.

However, again, while 2.23GHz is the limit and also the typical speed

https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-specs-and-tech-that-deliver-sonys-next-gen-vision

The only time the GPU will not run at that speed is when the CPU requires more power, in other words in CPU-bound scenarios where the CPU needs extra power; otherwise that will be the speed.

@EG101 said:

The PS5 is a 9.2 TF console boosted to 10.3 TF.

The 7nm process can only sustain speeds around 1.8 GHz, and the 7nm enhanced (7nmE) process can sustain speeds up to 2 GHz, not above it. MS already told us the XSX is on 7nm enhanced.

Unless the PS5 is built on the expensive and fairly new 7nm+ process, those 2.23GHz claims are not realistic.

I highly doubt the PS5 was designed around a different process than the XSX. More likely both are built on the 7nmE process, with 2 GHz sustained clocks being the safe speeds for those GPUs.

GitHub was correct with its information. 9.2 TF is likely the sustained output of the PS5's GPU, with the ability to boost to 10.3 TF when power allows.

No.

It's RDNA2, please stop.

However, again, while 2.23GHz is the limit and also the typical speed

https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-specs-and-tech-that-deliver-sonys-next-gen-vision

Again, the speed will drop below 2.23GHz only when the PS5's CPU requires more power, and in the worst case the GPU will drop 10% in power, which means an even smaller percentage drop in frequency.
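A quick sanity check on that power-to-frequency relationship, as a minimal sketch: assume the textbook approximation that GPU dynamic power scales with frequency times voltage squared, and that voltage tracks frequency near the top of the curve, so power goes roughly as the cube of the clock. The 2.23GHz cap is from the DF article above; everything else here is illustrative rather than Sony's actual power model.

```python
# Rough sketch: how far the clock must drop to shed 10% of GPU power,
# assuming dynamic power ~ f * V^2 with V ~ f near the top of the curve,
# i.e. power ~ f^3. Illustrative only.
peak_clock_ghz = 2.23          # PS5 GPU cap / typical clock (DF article)
power_cut = 0.10               # the "worst case 10% power" reduction

# Solve (f_new / f_peak)^3 = 1 - power_cut for f_new.
clock_scale = (1.0 - power_cut) ** (1.0 / 3.0)
new_clock = peak_clock_ghz * clock_scale

print(f"clock scale: {clock_scale:.3f} -> {new_clock:.2f} GHz "
      f"({(1.0 - clock_scale) * 100:.1f}% frequency drop)")
# ~0.966 -> ~2.15 GHz: a ~3.4% clock drop for a 10% power cut, so the
# frequency drop is indeed much smaller than the power drop.
```

Cerny's own figure in that article, a couple of percent of frequency for roughly 10% of power, implies an even steeper curve than this simple cube law, so the claim holds either way.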

GitHub never stated that. The GitHub leak showed an early test of the chip, which, by the way, was already running at 2.0GHz more than a year ago.

Second, you can't just take a GPU that is already running at 2GHz and raise it another 200+MHz without problems. Hell, people here claimed the PS5 wasn't RDNA2, when it clearly is, since RDNA1 would never sustain those clocks, period; RDNA1 was even having thermal problems with small overclocks over stock.

I ask you, what do you fanboys get from this? Why invent crap to try to make the Xbox gap bigger than it is? You people did the same thing this gen but in reverse, trying to downplay a 40% gap; now you want to make a roughly 18% one seem bigger, and you are not the only one, oddly using the same style, claiming false crap that has already been disproven by Sony's own words and, worse, that DF itself addresses in its article.

The gap will be what it is, period; it will not grow or shrink because of arguments.

Hell, there is a bigger disparity in the SSDs than in GPU power.

@Sagemode87 said:

@EG101: No, it's not. But keep saying that to further your little narrative.

Either he has 10 accounts or the lemmings have joined forces to make the same false claims and see if they stick. 😂

Then they claim I am damage controlling, but somehow what they are doing is not inventing crap.

#53 EG101
Member since 2007 • 2091 Posts

@Martin_G_N said:

@EG101: the 5700 XT has been overclocked to 2.2GHz without issues. There will be no problem running the PS5 GPU at those speeds consistently if needed, or else Cerny wouldn't have told devs that. But keeping power consumption down when games don't use all of it is important, especially these days when the environment matters so much. There is no point in the X1X using 300 watts to run Netflix.

The small differences between these consoles aren't much to argue over. The X1 ran most games at 720p to 900p at 25-30fps and no Xbox fans complained then. Both of these next-gen consoles are impressive, but Sony looks to have the best combination of power, innovation, and cost. People who really miss a few pixels will have a gaming PC anyway.

Thanks, that's good to hear. The PS5 should be able to sustain 2.23GHz without any issues, then.

When there are no current GPUs selling at that clock speed, it's difficult to believe the PR speak that 2.23GHz is sustainable.

#54 uninspiredcup
Member since 2013 • 62759 Posts

@ConanTheStoner said:
@uninspiredcup said:

PC has Fakefactory.

Well shit.

Nvm Yayas, I take it all back.

No idea what a yaya is but the man goes above and beyond the call of duty.

#55 EG101
Member since 2007 • 2091 Posts

@tormentos said:

Either he has 10 accounts or the lemmings have joined forces to make the same false claims and see if they stick. 😂

Then they claim I am damage controlling, but somehow what they are doing is not inventing crap.

What a silly thing to say.

Anyway, the PS5 will be the first mass-market device with a 36 CU AMD GPU clocked this high. Hope it works out well.

#56 tormentos
Member since 2003 • 33793 Posts
@EG101 said:
@Sagemode87 said:

@EG101: No, it's not. But keep saying that to further your little narrative.

Name one other stock AMD GPU running above 2GHz right now on a 7nmE process.

If you can, I'll admit I'm wrong.

Name one RDNA2 GPU bigger than the Series X's. Oh, and both have ray tracing; again, no AMD variant with these specs exists on PC.

Just because something is new and not on PC doesn't mean it's not real; both the PS5 and the Xbox have GPUs with no equivalent from AMD on PC yet.

Basically what you are doing is damage controlling the fact that the PS5 is not far behind; the Xbox being stronger doesn't fulfill you unless it is by a lot. 😂

#57 nepu7supastar7
Member since 2007 • 6773 Posts

@Martin_G_N:

"The differences between the X1 and PS4 made me think we'd get almost different games, but basically it's just resolution on the X1 that's lower."

In the end, that's all we're really going to see between the PS5 and the Xbox Series X. I dunno what all the hubbub is about. And the PS5 will still have the better exclusives.

#58  Edited By EG101
Member since 2007 • 2091 Posts

@tormentos said:

Name one RDNA2 GPU bigger than the Series X's. Oh, and both have ray tracing; again, no AMD variant with these specs exists on PC.

Basically what you are doing is damage controlling the fact that the PS5 is not far behind; the Xbox being stronger doesn't fulfill you unless it is by a lot. 😂

We'll find out the truth when the DF comparisons make the rounds.

I still don't trust that high clock as a sustainable clock. Sounds like it will get very hot to me.

#59  Edited By ronvalencia
Member since 2008 • 29612 Posts

@EG101 said:

What a silly thing to say.

Anyway PS5 will be the first mass market device that will have a 36 CU AMD GPU clocked so highly. Hope it works out well.

Lisa Su has already stated that the RX 5700/RX 5700 XT will be refreshed.

Read https://www.tweaktown.com/news/70277/amd-promises-navi-refresh-next-gen-rdna2-graphics-cards-2020/index.html

Lisa Su said: "In 2019, we launched our new architecture in GPUs, it's the RDNA architecture, and that was the Navi based products. You should expect that those will be refreshed in 2020 - and we'll have a next generation RDNA architecture that will be part of our 2020 lineup".

She continued: "So we're pretty excited about that, and we'll talk more about that at our financial analyst day. On the data centre GPU side, you should also expect that we'll have some new products in the second half of this year"

RDNA 2 with DXR 1.1 has already been demoed by AMD, in its wasteful "ray-traced reflections everywhere" demo.

Mobile 36 CU/40 CU RDNA 2 GPUs will be important for the laptop PC market.

#60 tormentos
Member since 2003 • 33793 Posts

@EG101 said:

We'll find out the truth when the DF comparisons make the rounds.

I still don't trust that high clock as a sustainable clock. Sounds like it will get very hot to me.

Probably, but I am sure they will not tell you the actual speed of the GPU.

Well, according to what is being said, Sony has a hell of a good cooling system in place.

#61 Zero_epyon
Member since 2004 • 20499 Posts

PS5 should still be a 4K/60 console. I'm not worried about the next gen consoles in terms of visuals and quality. The difference will be in performance. Will that extra bit of GPU power really make a noticeable difference in performance? What cutbacks will multiplatform devs really have to make because the PS5 has a slightly less capable GPU?

#62 EG101
Member since 2007 • 2091 Posts

@Zero_epyon said:

PS5 should still be a 4K/60 console. I'm not worried about the next gen consoles in terms of visuals and quality. The difference will be in performance. Will that extra bit of GPU power really make a noticeable difference in performance? What cutbacks will multiplatform devs really have to make because the PS5 has a slightly less capable GPU?

Imo, the differences between the two will likely just be in the ray tracing implementation.

XSX will be able to be a bit more aggressive there. Games should run the same.

#63 Zero_epyon
Member since 2004 • 20499 Posts

@EG101 said:

Imo, the differences between the two will likely just be in the ray tracing implementation.

XSX will be able to be a bit more aggressive there. Games should run the same.

That's one thing I missed. What's the difference between the Series X and the PS5 in terms of ray tracing capabilities?

#64  Edited By EG101
Member since 2007 • 2091 Posts

@Zero_epyon said:

That's one thing I missed. What's the difference between the Series X and the PS5 in terms of ray tracing capabilities?

The rumor is that in RDNA2's architecture, the more CUs you have, the better ray tracing can be implemented. Clock speed doesn't necessarily scale ray tracing performance linearly, but CU count does.

Not sure how accurate that is, but that's the rumor from the same guy who released the GitHub specs for both consoles.

Edit: Also, ray tracing is a resource hog, so the extra TFLOPS will help with running your game while implementing RT.
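For a rough sense of scale on that rumor, here is a back-of-envelope sketch assuming, as AMD has described RDNA 2, one ray accelerator per CU capable of up to 4 ray/box intersection tests per clock. Peak intersection rate is a crude proxy: real ray tracing cost also depends on BVH quality, divergence, and memory behaviour, so treat this as illustrative only.

```python
# Peak ray/box intersection throughput, assuming 4 box tests per CU per
# clock (the RDNA 2 ray accelerator figure). Crude upper bound only.
def peak_box_tests_per_sec(cus: int, clock_ghz: float, tests_per_cu_clock: int = 4) -> float:
    return cus * clock_ghz * 1e9 * tests_per_cu_clock

xsx = peak_box_tests_per_sec(52, 1.825)   # Series X: 52 CUs @ 1.825 GHz
ps5 = peak_box_tests_per_sec(36, 2.23)    # PS5: 36 CUs @ 2.23 GHz

print(f"XSX {xsx / 1e9:.0f} Gtests/s vs PS5 {ps5 / 1e9:.0f} Gtests/s "
      f"({(xsx / ps5 - 1) * 100:.0f}% advantage)")
# ~380 vs ~321 Gtests/s, about 18%: under this naive model the RT gap
# tracks CUs x clock (i.e. the TFLOPS gap), not CU count alone.
```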

#65 Pedro
Member since 2002 • 73940 Posts

@EG101 said:

The rumor is in RDNA2's architecture the more CU's you have the better Ray Tracing can be Implemented. Clock speed doesn't necessarily scale linearly with Ray Tracing but CU count does.

Not sure how accurate that is but that's the rumor from the same guy that released the GitHub specs for both consoles.

That sounds like something. ;)

#66  Edited By thereal25
Member since 2011 • 2074 Posts

People need to be more forward-thinking. While they both may be able to run games at 4K/60fps initially, obviously the SeX box will be able to maintain higher and more stable settings for longer.

#67  Edited By Uruz7laevatein
Member since 2009 • 160 Posts

@EG101:

The GitHub leak refers to both consoles as being roughly Zen 1.5 and RDNA1 (older devkits, or just plain guessing?). I wouldn't place too much faith in those predictions, given the current info that both are a baseline of Zen 2, RDNA 2, and GDDR6.

#68  Edited By ronvalencia
Member since 2008 • 29612 Posts
@tormentos said:

The gap will be what it is, period; it will not grow or shrink because of arguments.

Hell, there is a bigger disparity in the SSDs than in GPU power.

Claim: The memory bandwidth gap between XSX and PS5 increases with higher CPU usage

XSX CPU: ~937 GFLOPS at 3.66 GHz with SMT, or ~973 GFLOPS at 3.8 GHz with SMT disabled.

PS5 CPU: ~896 GFLOPS at 3.5 GHz (variable) with SMT.

GFLOPS in FP32.

Notes: AVX2 has GPU-like gather instructions. Each Zen 2 core has dual 256-bit AVX2 FMA3 pipes (IBM PowerPC can get lost). AVX2 can also handle FP16/INT16 packing into its 256-bit registers, e.g. https://stackoverflow.com/questions/39413328/load-16-bit-integers-in-avx2-vector

Scenario 1: Near GPU-only memory access, with maximum CPU-to-GPU fusion link usage and programming tricks that maximally respect CPU cache boundaries, i.e. an x86 CPU can be programmed like a CELL SPE. This has been in Intel's CPU optimization guides since x86 CPUs gained L2 caches.

For example, SwiftShader 3.0's Direct3D 9c software JIT renderer requires a proper CPU cache size configuration or software 3D rendering performance will suffer, whereas a CELL SPE would trigger an exception; the x86 CPU is more forgiving, i.e. the programmer can decide whether the performance penalty is acceptable.

Manual tiled cache compute programming is not new for high-performance x86 server apps. L3 caches are usually large on x86 server and workstation CPUs.

XSX GPU has a 25% memory bandwidth advantage over the PS5 GPU (560 GB/s vs 448 GB/s).

Scenario 2: The Zen 2 CPU consumes 40 GB/s of memory bandwidth, which exceeds the PS4 CPU's 20 GB/s memory access IO.

Equivalent to a PC CPU with 128-bit DDR3-2600.

XSX GPU: 520 GB/s

XSX CPU: 40 GB/s

VS

PS5 GPU: 408 GB/s

PS5 CPU: 40 GB/s

XSX GPU has 27.45% memory bandwidth advantage over PS5 GPU.

Scenario 3: The Zen 2 CPU consumes 60 GB/s of memory bandwidth, which exceeds the PS4 CPU's 20 GB/s memory access IO.

Equivalent to a PC CPU with 128-bit DDR4-3800.

XSX GPU: 500 GB/s

XSX CPU: 60 GB/s

VS

PS5 GPU: 388 GB/s

PS5 CPU: 60 GB/s

XSX GPU has 28.86% memory bandwidth advantage over PS5 GPU.

Scenario 4: The Zen 2 CPU consumes 80 GB/s of memory bandwidth, which exceeds the PS4 CPU's 20 GB/s memory access IO.

Equivalent to a PC CPU with 128-bit DDR4-5000, e.g. Corsair's Vengeance LPX DDR4-5000 kit (a pair of 8GB modules).

XSX GPU: 480 GB/s

XSX CPU: 80 GB/s

VS

PS5 GPU: 368 GB/s

PS5 CPU: 80 GB/s

XSX GPU has 30.4% memory bandwidth advantage over PS5 GPU.

Only the XSX can brute-force like a gaming PC with similar CPU and GPU specs.

CPU game-world simulation can operate independently from texture streaming (aka tiled resources). LOL
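The scenario arithmetic above can be reproduced with a simple model: start from each console's headline GPU bandwidth (560 GB/s for the XSX's 10 GB GPU-optimal pool, 448 GB/s for the PS5's unified pool) and subtract the assumed CPU traffic one-for-one. The one-for-one subtraction is the post's simplifying assumption, not a measured figure.

```python
# Remaining GPU bandwidth once the CPU consumes a share of the shared GDDR6 bus.
HEADLINE_BW = {"XSX": 560.0, "PS5": 448.0}   # GB/s: GPU-optimal pool vs unified pool

def gpu_bw_after_cpu(console: str, cpu_gbps: float) -> float:
    # Simplifying assumption: CPU traffic subtracts one-for-one from GPU bandwidth.
    return HEADLINE_BW[console] - cpu_gbps

for cpu_gbps in (0, 40, 60, 80):
    xsx = gpu_bw_after_cpu("XSX", cpu_gbps)
    ps5 = gpu_bw_after_cpu("PS5", cpu_gbps)
    print(f"CPU {cpu_gbps:>2} GB/s -> XSX {xsx:.0f} vs PS5 {ps5:.0f} GB/s "
          f"({(xsx / ps5 - 1) * 100:.2f}% advantage)")
# Prints 25.00%, 27.45%, 28.87%, 30.43%, matching the scenarios above within rounding.
```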

#69  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Pedro said:
@EG101 said:

The rumor is in RDNA2's architecture the more CU's you have the better Ray Tracing can be Implemented. Clock speed doesn't necessarily scale linearly with Ray Tracing but CU count does.

Not sure how accurate that is but that's the rumor from the same guy that released the GitHub specs for both consoles.

That sounds like something. ;)

Well, it's the same with Turing, i.e. more SMs = more RT cores and more cache.

Overclock a TU106 and it still wouldn't beat a TU104 or TU102.

AMD should have learned from Vega's BS TFLOPS scaling by RDNA 2's release, because NVIDIA is rumored to increase the GPC count (raster engine and geometry input) from six to seven and eight with RTX Ampere.

I'm looking forward to an RTX 3080 Ti with seven GPCs, about 40 percent higher raster throughput (16 percent from the seventh GPC plus a 33 percent clock speed increase, e.g. 2.39 GHz) and double the DXR performance.

#70 Gifford38
Member since 2020 • 7902 Posts

@sealionact: lol no just in a rush when I wrote it.

#71 Gifford38
Member since 2020 • 7902 Posts

@BlackShirt20: there was an article saying Sony is working with Microsoft to get xCloud. Sony will have its own Game Pass.

#72 FinalFighters
Member since 2013 • 3410 Posts

All I know is that whichever next-gen console runs Cyberpunk 2077 best will be getting my money day one.

#74 BlackShirt20
Member since 2005 • 2631 Posts

@gifford38: Incorrect. Microsoft will be handling all of Sony's online functions, such as PSN and multiplayer.

xCloud is 100% exclusive to Xbox and allows Xbox users to stream all their content on their mobile devices.

#75 BlackShirt20
Member since 2005 • 2631 Posts

@FinalFighters: Xbox Series X will run all games better than any other console.

#76  Edited By ButDuuude
Member since 2013 • 1907 Posts

Can’t wait for the game comparisons from Digital Foundry...

#78 Tessellation
Member since 2009 • 9297 Posts

@boxrekt: lol "pureplaystation", "play station universe" gotta love your sources

#79 tormentos
Member since 2003 • 33793 Posts

@BlackShirt20 said:

@FinalFighters: Xbox Series X will run all games better than any other console.

Well, it depends on what you consider better: faster frames, resolution, or effects, sure; faster loading, less pop-in, and a faster system, probably not.

@ronvalencia said:

Claim: The memory bandwidth gap between XSX and PS5 increases with higher CPU usage.

XSX GPU has a 25% memory bandwidth advantage over the PS5 GPU (no CPU contention).

XSX GPU has a 27.45% memory bandwidth advantage over the PS5 GPU (CPU at 40 GB/s).

XSX GPU has a 28.86% memory bandwidth advantage over the PS5 GPU (CPU at 60 GB/s).

XSX GPU has a 30.4% memory bandwidth advantage over the PS5 GPU (CPU at 80 GB/s).

Only the XSX can brute-force like a gaming PC with similar CPU and GPU specs.

There are some things missing here.

1. Where is the 3.5GB of memory that is slower than the PS5's illustrated in there?

2. Did you notice the trend in those memory calculations you made?

In all of them the Xbox shows a gap much wider than its actual power advantage over the PS5, which is basically 18%.

So basically it is overkill; having more bandwidth than compute will not increase your power. The only way more bandwidth increases performance is if you are bandwidth-bound, which doesn't seem to be the case, so that 18% power gap will not suddenly transform into a 30% gap just because the Xbox has more bandwidth.

If a game uses 13.5GB of memory, the Xbox will have mixed results, because its bandwidth is not equally fast across the whole pool.
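To put a number on those "mixed results", here is a deliberately naive sketch that blends the two Series X pools by how much of the bus time is spent serving each. The example splits are invented purely for illustration; a real memory controller's arbitration will not follow a simple weighted average.

```python
# Naive blended-bandwidth model for a working set that spills past the
# 10 GB fast pool on Series X. Illustration only.
def blended_bw(fast_time_share: float, fast_bw: float = 560.0, slow_bw: float = 336.0) -> float:
    """fast_time_share = fraction of bus time spent serving the 560 GB/s pool."""
    return fast_time_share * fast_bw + (1.0 - fast_time_share) * slow_bw

for share in (1.0, 0.9, 0.75, 0.5):
    print(f"{share:.0%} of bus time in the fast pool -> ~{blended_bw(share):.0f} GB/s effective")
# 100% -> 560, 90% -> 538, 75% -> 504, 50% -> 448 GB/s (the PS5's uniform figure).
```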

#80  Edited By R-Gamer
Member since 2019 • 2221 Posts

@kazhirai: The problem is you pulled the worst-case scenario out of your ass. Multiple sources have said the GPU will never drop below 10 TF. Multiple devs have said the gap will be smaller than the numbers on paper suggest, and yet you have Xbox fanboys saying it will somehow be larger.

#81 tormentos
Member since 2003 • 33793 Posts

@BlackShirt20 said:

@gifford38: Incorrect. Microsoft will be handling all of Sony's online functions, such as PSN and multiplayer.

xCloud is 100% exclusive to Xbox and allows Xbox users to stream all their content on their mobile devices.

MS is not handling it per se; Sony pays to use MS's cloud, which is not the same thing. Sony will be like any other customer.

@gifford38 said:

@BlackShirt20: there was an article saying Sony is working with Microsoft to get xCloud. Sony will have its own Game Pass.

Sony already has its own Game Pass: PS Now. Game Pass is a copy of PS Now, and Games with Gold is a copy of PS Plus.

#82 R-Gamer
Member since 2019 • 2221 Posts

@bluestars: Hellblade doesn't look better than God of War.

#83 Sagemode87
Member since 2013 • 3438 Posts

@bluestars: Hellblade looks the same on Pro and X. Get it out of your head that a few extra pixels make a game look better lol. Lmao at Hellblade looking better than GOW. Xbox has no identity, all you guys care about is better multiplats because you KNOW the exclusives suck.

#84 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

@r-gamer said:

@kazhirai: The problem is you pulled the worse case scenario out of your ass. Multiple sources have said the GPU will never drop below 10tf. Multiple devs have said the gap will be smaller then the numbers on paper and you have Xbox fanboys saying it will somehow be larger.

Who are these "sources" and what full next gen games have they tested this on?

#85  Edited By BlackShirt20
Member since 2005 • 2631 Posts

@tormentos: Gamepass is PSN done right. PSN is a complete joke.

#86  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

There are some things missing here.

1. Where is the 3.5GB of memory that is slower than the PS5's illustrated in there?

2. Did you notice the trend in those memory calculations you made? In all of them the Xbox shows a gap much wider than its actual power advantage over the PS5, which is basically 18%.

So basically it is overkill; having more bandwidth than compute will not increase your power. The only way more bandwidth increases performance is if you are bandwidth-bound, which doesn't seem to be the case, so that 18% power gap will not suddenly transform into a 30% gap just because the Xbox has more bandwidth.

If a game uses 13.5GB of memory, the Xbox will have mixed results, because its bandwidth is not equally fast across the whole pool.

1. The game's 3.5 GB at 336 GB/s is treated like non-GPU memory, e.g. for CPU and audio data.

Based on PS4's Killzone Shadow Fall memory breakdown, the GPU dominates memory usage.

This is why I cited the PC example of 128-bit DDR4-3800 at 60 GB/s.

From the PS4 example, it effectively ends up like a gaming PC setup: the tiny CPU+GPU shared data set is the equivalent of a PC's CPU-to-GPU PCI-E link traffic.

CPU does its assigned workload.

GPU does its assigned workload.

Small CPU-GPU shared workload.

A two-week raw Gears 5 benchmark port at PC ultra settings already shows the XSX rivaling an RTX 2080-class GPU, hence the XSX GPU's 12 TFLOPS is scaling, backed by the memory bandwidth increase. The XSX is acting like a gaming PC with similar CPU and GPU specs.

Also, the XSX's memory layout allows an easy last-minute change in the future, e.g. 20 GB of GDDR6 with ten 2GB chips.

2. TFLOPS calculations

XSX GPU: 2 * (52 x 64) * 1.825 GHz = 12,147 GFLOPS, or 12.147 TFLOPS, at its fixed clock.

PS5 GPU: 2 * (36 x 64) * 2.23 GHz = 10,276 GFLOPS, or 10.276 TFLOPS, at its variable (capped) clock.

PS5 GPU at -2% clock: 2 * (36 x 64) * 2.1854 GHz = 10,070 GFLOPS, or 10.070 TFLOPS.

12,147 / 10,276 = 1.182, i.e. the XSX GPU has 18% on top of the PS5's GPU.

12,147 / 10,070 = 1.206, i.e. the XSX GPU has 20.6% on top of the PS5's GPU.

3. Each hardware advantage impacts render time, and the effects can be cumulative, but each advantage has dependencies, e.g. high memory bandwidth without enough TFLOPS to paint with is nearly pointless, as with the CU-bound XBO.
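Those figures fall straight out of the standard peak-FP32 formula (2 FLOPs per FMA x 64 shader lanes per CU x CU count x clock). A quick reproduction follows; the minus-2% PS5 clock is taken from the post above as a what-if, not a confirmed figure.

```python
# Peak FP32 throughput: 2 FLOPs (FMA) x 64 shader lanes per CU x CUs x clock.
def tflops(cus: int, clock_ghz: float) -> float:
    return 2 * cus * 64 * clock_ghz / 1000.0

xsx = tflops(52, 1.825)               # Series X at its fixed clock
ps5 = tflops(36, 2.23)                # PS5 at its capped/typical clock
ps5_minus2 = tflops(36, 2.23 * 0.98)  # what-if: PS5 clock down 2%

print(f"XSX {xsx:.3f} TF, PS5 {ps5:.3f} TF ({xsx / ps5 - 1:.1%} gap), "
      f"PS5 -2% {ps5_minus2:.3f} TF ({xsx / ps5_minus2 - 1:.1%} gap)")
# XSX 12.147 TF, PS5 10.276 TF (~18.2% gap); at -2% clock ~10.070 TF (~20.6% gap).
```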

#87  Edited By tormentos
Member since 2003 • 33793 Posts

@ronvalencia said:

1. The game's 3.5 GB at 336 GB/s is treated like non-GPU memory, e.g. for CPU and audio data. Based on PS4's Killzone Shadow Fall memory breakdown, the GPU dominates memory usage.

Wait, the Xbox Series X is not the PS4; you can't base how the Xbox will use that memory on Killzone. MS and Sony use totally different approaches.

But if that memory is treated as non-GPU memory, that means the Xbox Series X's GPU is capped at 10GB of RAM, which means it could have less memory available to the GPU than the PS5 does.

But from what I am seeing and reading, most tech sites compare it to the Xbox One's memory setup, with two pools of memory at two different speeds.

And some even claim that when hitting that 3.5GB pool the Xbox Series X will be slower.

@BlackShirt20 said:

@tormentos: Game Pass is PSN done right. PSN is a complete joke.

No, it is a copy; that is what it is.

Yeah, PSN is such a joke that it is basically printing money for Sony; nobody uses it, right, because it sucks so much.

It's not 2002, man, it's 2020. PSN and Live are basically the same where it counts, and most games are still P2P.

@i_p_daily said:

Who are these "sources" and what full next gen games have they tested this on?

Sony says the drop is small, so unless you have a better source than Cerny himself, you are damage controlling again.

#88  Edited By R-Gamer
Member since 2019 • 2221 Posts

@i_p_daily: Well, first we have Cerny saying it would rarely drop below 10.3 TF.

Then devs saying the gap is smaller than it appears.

http://www.pushsquare.com/news/2020/03/ps5_superior_to_xbox_series_x_in_a_lot_of_ways_but_devs_seem_disappointed_by_sonys_communication

Also, if you listened to the presentation, it takes only a couple of percent drop in clock to save about 10% power. Dropping all the way down to 9.2 TF would mean roughly a 10% drop in clock speed, which would correspond to a far larger power reduction than the design ever needs. Not happening.

People are still using the GitHub leak's 9.2 TF figure, which came from an RDNA1-era test, to guess that number. It was a test sample, not final silicon.
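The 9.2 TF number itself is consistent with that reading: inverting the same peak-FP32 formula, 9.2 TF on a 36 CU part works out to roughly a 2.0 GHz test clock, i.e. the older test silicon rather than the shipping 2.23 GHz cap. This is a back-of-envelope check, not inside information.

```python
# What clock does a 36 CU part need to hit a given peak-FP32 figure?
# Peak TFLOPS = 2 * CUs * 64 * clock_GHz / 1000, solved for the clock.
def clock_for_tflops(target_tf: float, cus: int) -> float:
    return target_tf * 1000.0 / (2 * cus * 64)

print(f"9.2 TF on 36 CUs   -> {clock_for_tflops(9.2, 36):.2f} GHz")    # ~2.00 GHz
print(f"10.28 TF on 36 CUs -> {clock_for_tflops(10.28, 36):.2f} GHz")  # ~2.23 GHz
# The GitHub number lines up with an early ~2.0 GHz test clock rather than
# a separate "sustained" tier below the announced 2.23 GHz.
```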

#89 sealionact
Member since 2014 • 10038 Posts

@tormentos: "Well it depends on what you consider as better,faster frames,resolution or effects sure,faster loading less pop in and faster system probably not."

The experience i have actually playing the game is more important than the time spent waiting for it to load. Especially since that 2-3 second advantage will only be on 1st party Sony games which mostly dont interest me.

#91  Edited By deactivated-5f2b4872031c2
Member since 2018 • 2683 Posts

Are lems really this insecure?

Yes, yes they are.

#92 JoshRMeyer
Member since 2015 • 12773 Posts

Developers are saying the PS5 is the better machine.

https://wccftech.com/schreier-multiple-devs-are-telling-me-ps5-is-superior-to-xbox-series-x-in-several-ways-despite-spec/amp/

#93 Juub1990
Member since 2013 • 12622 Posts

@joshrmeyer said:

Developers are saying the ps5 is the better machine.

https://wccftech.com/schreier-multiple-devs-are-telling-me-ps5-is-superior-to-xbox-series-x-in-several-ways-despite-spec/amp/

Jason Schreier, whose sources are usually sound, clearly said he heard from several different developers that while both consoles are undoubtedly impressive, the Xbox Series X isn't actually that much more powerful; in fact, some even said the PS5 is superior in several ways. As such, Schreier commented during the podcast that Sony dropped the ball hard in terms of communication.

Same dude who claimed that his sources told him the PS5 would have a faster GPU than the 2080? This Jason?

Lol.

These insiders are a joke and are all taking you for a ride. They just care about boosting their followers count on Twitter.

#94 R-Gamer
Member since 2019 • 2221 Posts

@Juub1990: Jason never said that. Another leaker said that, and yes, the performance will be about on par with a 2080, so it's not really that far off.

#95 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

@Sagemode87 said:

@bluestars: Hellblade looks the same on Pro and X. Get it out of your head that a few extra pixels make a game look better lol. Lmao at Hellblade looking better than GOW. Xbox has no identity, all you guys care about is better multiplats because you KNOW the exclusives suck.

Death Stranding is the best-looking console game; nothing comes close. Don't listen to me, ask Digital Foundry.

#96 Gifford38
Member since 2020 • 7902 Posts

@r-gamer: Also, Hellblade is like a game such as Onimusha, where the world is a corridor, which lets them put in more detail. Hellblade does look good, but technically God of War's scale and size blow it away.

#97 Pedro
Member since 2002 • 73940 Posts

So it's confirmed. The PS5 is the most powerful console for next gen.

#98  Edited By tormentos
Member since 2003 • 33793 Posts

@sealionact said:

@tormentos: "Well it depends on what you consider as better,faster frames,resolution or effects sure,faster loading less pop in and faster system probably not."

The experience i have actually playing the game is more important than the time spent waiting for it to load. Especially since that 2-3 second advantage will only be on 1st party Sony games which mostly dont interest me.

Yeah i guess you didn't touch the xbox one from 2013 to 2017.

Thats like saying the only graphical advantage the xbox will have in on exclusives,what make you think that developer just like the pushed the PS4,xbox one X they will not push the xbox series X and PS5 as well?

Pathetic.

@Juub1990 said:

Jason Schreier, whose sources are usually sound, clearly said he heard from several different developers that while both consoles are undoubtedly impressive, the Xbox Series X isn't actually that much more powerful; in fact, some even said the PS5 is superior in several ways. As such, Schreier commented during the podcast that Sony dropped the ball hard in terms of communication.

Same dude who claimed that his sources told him the PS5 would have a faster GPU than the 2080? This Jason?

Lol.

These insiders are a joke and are all taking you for a ride. They just care about boosting their followers count on Twitter.

Well, if what he is saying is true, that would mean it is faster than a 2080, which the Series X should be.

But I don't see that happening. People need to come to terms with the fact that the Xbox will be more powerful, period; not by much, but it will be more powerful. Cows can't fall as low as lemmings did at the start of this gen.

fu** the secret sauce.

@Pedro said:

So it's confirmed. The PS5 is the most powerful console for next gen.

No, and people should not say that or quote people implying that; if that happened I would eat crow. But the Xbox will be stronger. Not by much, but it will be stronger, period.

#99 Juub1990
Member since 2013 • 12622 Posts
@r-gamer said:

@Juub1990: Jason never said that. Another leak said that and yes the performance will be about on par with a 2080 so not really that far off.

That thing will be on the level of a 5700 XT, but a bit faster. Still far from a 2080, let alone stronger. He was dead wrong.

Yes he did, in a podcast.

Source

Summary of transcript

#100  Edited By R-Gamer
Member since 2019 • 2221 Posts

@Juub1990: A 5700 XT is weaker even when overclocked, and it has no ray tracing.

RDNA 2 also does better when overclocked than RDNA 1 does.

Sorry, you are clueless.