3080 > XSX > 2080 > 2070 > PS5

#101  Edited By Sagemode87
Member since 2013 • 3438 Posts

@i_own_u_4ever: still spreading FUD huh? This gen really messed you up I guess. Sucks for you that PS5 will be top dog as usual. Games will look about the same; there's only so much a 16 percent difference can do. Keep lying to yourself though. PlayStation will be easier to develop for and will be the lead platform, since more money will come from it. I feel sorry for you as an Xbox fanboy.

#102 04dcarraher
Member since 2004 • 23858 Posts

@tormentos said:
@pc_rocks said:

Nope. Split RAM is where the CPU or GPU can't access one of the pools. Here both the CPU and GPU access the same pool of RAM simultaneously; it's just that the two regions differ in speed.

From what I have read the CPU can't access the GPU-optimal memory at 560GB/s, so effectively it is split in speed. Even worse, the GPU has access to 13.5GB of which 3.5GB is slower, which again basically creates a sort of split scenario: while total access to the memory is there (well, not total, there are reservations), the speed is effectively different.

I don't think 10GB is enough to run all processes plus ray tracing, and I think accessing that 3.5GB at the lower speed will create a problem. The only reason MS used this method is the same reason Sony chose a 36CU GPU: saving money.

From the XSX memory layout diagram it looks like the whole APU can access both types of memory. However, the design could cause issues since devs will have to allocate and prioritize memory usage. If they start running out of the 10GB pool for VRAM and have to spill over into the slower 6GB, it could cause some headaches.

When it comes to CPU functions, GDDR6 isn't the best solution to begin with. The 6GB used for the OS, system features and game cache wouldn't be hurt by the 336GB/s bandwidth. GPU performance would be affected once that 10GB buffer gets saturated.
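
For reference, the asymmetric bandwidth figures being argued about here fall straight out of the chip layout Microsoft has described (ten 14Gbps GDDR6 chips on a 320-bit bus: six 2GB chips plus four 1GB chips). A rough sketch of the arithmetic, in Python, using those published numbers:

```python
# Series X memory bandwidth, derived from the published chip layout.
# Assumes ten 14 Gbps GDDR6 chips, each on its own 32-bit channel (320-bit bus total).
data_rate_gbps = 14   # per-pin data rate
chip_bus_bits = 32    # channel width per GDDR6 chip

# The "GPU-optimal" 10 GB is striped across the first 1 GB of all ten chips,
# so it sees the full 320-bit bus.
optimal_bw = 10 * chip_bus_bits * data_rate_gbps / 8    # 560 GB/s

# The "standard" 6 GB lives only on the six 2 GB chips, i.e. a 192-bit slice.
standard_bw = 6 * chip_bus_bits * data_rate_gbps / 8    # 336 GB/s

print(optimal_bw, standard_bw)  # 560.0 336.0
```

Either way it is one physical bus, so CPU traffic to the "standard" range still occupies channels the GPU would otherwise be using.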

#103 Bluestars
Member since 2019 • 2789 Posts

@zaryia:

Nope

But on release the Xbox Series X won't be mid tier, given what PC graphics cards are hitting the shelves. HaH

Will it be top of the tree? No, but it will be high end.

#104 Sagemode87
Member since 2013 • 3438 Posts

@bluestars: high end held back by current gen design, lmao. Don't kid yourself, PS5 is in the same ballpark with its own advantages. Can't wait for the reaction when games look the same. Rude awakening incoming.

#105 tormentos
Member since 2003 • 33798 Posts

@04dcarraher said:

From the XSX memory layout diagram it looks like the whole APU can access both types of memory. However, the design could cause issues since devs will have to allocate and prioritize memory usage. If they start running out of the 10GB pool for VRAM and have to spill over into the slower 6GB, it could cause some headaches.

When it comes to CPU functions, GDDR6 isn't the best solution to begin with. The 6GB used for the OS, system features and game cache wouldn't be hurt by the 336GB/s bandwidth. GPU performance would be affected once that 10GB buffer gets saturated.

From what I read, the GPU-optimal memory and the standard memory offer identical performance for the CPU; it's only the GPU that actually sees the different speeds. So if the CPU were to take memory from the GPU-optimal pool it would still see it at 336GB/s; only the GPU is able to use the 560GB/s, plus part of the 336GB/s.

But I don't think the CPU would need to take memory from the 10GB optimal side, because there is already 3.5GB of standard memory plus the 2.5GB for the OS.

#106 tormentos
Member since 2003 • 33798 Posts
@bluestars said:

@zaryia:

Nope

But on release the Xbox Series X won't be mid tier, given what PC graphics cards are hitting the shelves. HaH

Will it be top of the tree? No, but it will be high end.

Dude, it will probably be mid-to-high end, and that is OK. The Xbox will not cost you $1,200; it will probably cost you $500, so what it offers is more than good for the price, even if it ends up mid range.

#107 NfamousLegend
Member since 2016 • 1037 Posts

Did Ron get banned? Haven't seen him around in a while, thought for sure he would flock to a nerd battle this thread has become.

#108 Juub1990
Member since 2013 • 12622 Posts

@nfamouslegend said:

Did Ron get banned? Haven't seen him around in a while, thought for sure he would flock to a nerd battle this thread has become.

Yes and we're grateful for that.

#109  Edited By IMAHAPYHIPPO
Member since 2004 • 4213 Posts
@04dcarraher said:
@i_own_u_4ever said:
@04dcarraher said:
@i_own_u_4ever said:
@ocinom said:

Ubisoft: Assassin's Creed Valhalla will hit "at least" 30 FPS at 4K resolutions on the Xbox Series X

XSX>2080 my ass

Get real, the new AC game is going to be cross-gen, so the base PS4 and Xbox One need to run it. Use better common sense.

You should use common sense... all that extra horsepower and it will only target 4K at 30fps? With all that unused power they could have targeted 60fps easily, unless the game is an un-optimized mess.

Wow, you really are that clueless. First of all, we don't have final confirmation on any of this yet. Also, devs aren't going to commit millions of extra dollars in development for cross-gen games just to boost one version to 60fps. It's not likely just based on the money and dev time needed. Wait about a year and pretty much all games will be running at 60fps on both PS5 and XSX.

Use common sense.

You're the one who is clueless... all you need to do is change the fps cap and not lock it to 30. You do not have to commit millions of dollars... it takes one person with not even 5 minutes of work to change the number 30 to 60 in a config file. If you have the processing power available you can hit that number easily without having to spend all that time and money optimizing. The XSX is no different from a PC in this regard: it has the resources to drive 60fps even with current-gen limits.

04dcarraher, you're arguing with a brick wall, dude. Me3x12 (i_own_u_4ever) showed up near the last console launch and posted the exact same kind of nonsense threads until he got banned. His only response is, "you're clueless, you have no common sense," while incessantly posting pointless thread after pointless thread about the new Xbox with absolutely no substance or even any semblance of thought put into his posts.

carraher: I respect your desire to engage in rational conversation. me3: you were doing this in 2013. How have you not matured, even a little bit? This is sad.
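
As an aside, the "just flip the cap in a config file" claim above is easy to illustrate, and so is the catch. A minimal sketch, assuming a hypothetical engine.ini with a MaxFPS key (the file name and key are made up for the example; real engines differ):

```python
# Hypothetical illustration of raising a frame-rate cap in a config file.
# "engine.ini" and the "MaxFPS" key are invented for this sketch.
from configparser import ConfigParser

cfg = ConfigParser()
cfg.read("engine.ini")                 # returns nothing useful if the file doesn't exist
if not cfg.has_section("Render"):
    cfg.add_section("Render")
cfg.set("Render", "MaxFPS", "60")      # previously "30"
with open("engine.ini", "w") as f:
    cfg.write(f)

# The catch: the cap only changes the ceiling. At 60 fps every frame
# (CPU simulation + GPU render) must finish in ~16.7 ms instead of ~33.3 ms,
# and hitting that consistently is an optimization job, not a one-line edit.
print(round(1000 / 30, 1), round(1000 / 60, 1))  # 33.3 16.7
```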

#110 Djoffer123
Member since 2016 • 2377 Posts

Right now any GeForce is infinitely stronger than the next Xbox since, you know, the damn thing isn't released yet...

#111 PC_Rocks
Member since 2018 • 8611 Posts

@tormentos said:
@pc_rocks said:

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Gears 5 wasn't even the most demanding game on PC. RDR 2 would have been a far better candidate to judge it.

Not saying it's impossible just there's not enough data on either side.

Dude, the Max-Q is not even better than the 5700, let alone the 5700 XT, so common sense tells me the PS5 would beat it without problems.

This is one of the reasons I say you are a console hater, period. I don't think the PS5 will have any problem topping the 5700; somehow you do.

The PS5 has been beaten by a year-old laptop for the UE5 demo. Cry more!

Also, for the 48th time: where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

#112  Edited By PC_Rocks
Member since 2018 • 8611 Posts

@wervenom said:
@pc_rocks said:

The demo was up to 1440, meaning it was dropping resolution and we don't know how much. So, no it wouldn't have gone above 30FPS at all.

The engineer specifically said the same demo ran at 40+FPS on a year old laptop.

https://mobile.twitter.com/timsweeneyepic/status/1261834093521776641

Also I'm not sure how a 2070 Max-Q would beat what's in the PS5 unless the PS5 is somehow weaker than current AMD tech.

Don't know what that quote is supposed to discredit. If he's saying it could run above 30FPS, then why was the resolution variable? The official briefing did say it was up to 1440p and dropping resolution, so yeah, you can't have it both ways.

Oh, and it's factually wrong that consoles can't run at an unlocked frame rate. KZ:SF and Infamous: SS both released with unlocked FPS going above 30 but under 40, and GoW 3 on PS3 was in the 40s range.

The post seems like an after-the-fact cover-up, just like the previous 'the engineer was talking about video quality' line, which they now seem to have given up on after realizing how stupid it sounded.

Lastly, since you are so focused on the 5700, do tell me about the RT, ML and the rest of the DX12 Ultimate capabilities of the 5700 in comparison to the Max-Q.

#113 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

@Juub1990 said:
@nfamouslegend said:

Did Ron get banned? Haven't seen him around in a while, thought for sure he would flock to a nerd battle this thread has become.

Yes and we're grateful for that.

Speak for yourself. I thought Ron was at least entertaining because he riled up all you hermits, probably why dav banned him and yet keeps the PC loving alt in casual4ever around.

#114 DaVillain
Member since 2014 • 58743 Posts

@i_p_daily said:

Speak for yourself. I thought Ron was at least entertaining because he riled up all you hermits, probably why dav banned him and yet keeps the PC loving alt in casual4ever around.

Ron's outburst is what led to his ban in the first place, and I warned him about it, but know this: I had nothing to do with his banning. So yes, I stand with Juub on this, and it's not my fault; that was done by other mods.

#115 PC_Rocks
Member since 2018 • 8611 Posts
@i_own_u_4ever said:
@pc_rocks said:

Okay, then I misunderstood. I thought you were talking about those RT cores also being able to help with rasterization performance. Yeah, it might be possible to not stall the GPU when RT is enabled. The other units can start working on a different frame while the RT is being calculated. That sounds plausible. Nevertheless, RT will have an impact on overall performance and we don't yet know the capabilities of the PS5 and XSX GPUs.

As for the XSX's different RAM speeds, I never claimed it will hamper it much one way or the other. Just explained it's KINDA like the 970.

With 52 CUs the XSX will have far superior RT over the PS5; with that many cores they have the legroom and dedicated RT hardware to clearly have the better RT solution. We will likely see in just about every DF comparison between PS5 and XSX that the XSX wins in RT as well as texture detail and resolution.

Not relevant to my post or argument. If you want to brag about XSX vs PS5, then take it up to cows. But against PC, it's weaker than 2 year old GPUs.

#116 Juub1990
Member since 2013 • 12622 Posts

@i_p_daily: Nah. Dude kept editing his posts to add insults to them. Sometimes two or three times.

#117 SecretPolice
Member since 2007 • 45721 Posts

Cap Ron was banned?

Say it ain't so. :P

#118 Gifford38
Member since 2020 • 7910 Posts

@i_own_u_4ever said:
@Juub1990 said:
@i_own_u_4ever said:

Maybe not a 3060 but likely a 3070 > XSX

And it's been mostly said that the XSX beats a standard 2080, and that the XSX is closer to a 2080 Ti; not beating the Ti, but closer to it.

And depending on the PC's CPU and RAM, and given the PS5 has variable clocks and down-clocks as low as the mid-9 TFLOPS range, a 2070 in a PC with good parts likely pulls ahead of the PS5.

It most definitely isn't closer to the 2080 Ti. In the only direct comparison we have, it trailed behind the 2080 by a tiny amount. It's either neck and neck with the 2080 or 2080S which puts it closer to them than the 2080 Ti unless you believe it beats those cards by >10%

We need to keep in mind that the XSX is early in its dev kits, and MS is notorious for not releasing big performance-gain dev kit updates until around launch and shortly after. This is how MS kinda fooled devs months ago, and likely Sony too, by having the XSX kits running slow; then more recently we obviously saw some dev kit updates and all the devs kinda changed their tone, now saying wow, the XSX has more power.

Wow, you make up crap just to be heard. MS is notorious for not releasing big performance-gain dev kit updates? Lol, they all do that. The games Sony has been making for the last 2 years for the PS5 got new dev kits as well. I have proof: in the Spider-Man video the SSD was only running at half its speed.

Proof number 2: new dev kits came out for the PS5, and now Epic has to change the Unreal Engine 5 code to be able to take full advantage of the PS5's I/O throughput. The SSD is even faster than in the old dev kits, with more custom hardware. You don't get what Mark is doing with the PS5. The PS5's custom main chip decompresses the data before it hits your GPU or CPU; it goes straight into system RAM. The Series X CPU has to do all the work to decompress the data and check it in. Sony even said they made changes to the GPU, but they have not revealed them yet. Mark made last-minute changes to the GPU, so we have to wait on that.

Plus third parties are saying the PS5 is the better system, plus you've got Epic saying it too. You have Carmack saying it as well. The lead industry guys are saying the PS5 is better because it is a more balanced system and the easiest console to dev for.

The only ones saying the Series X is better don't have PS5 dev kits; they're working on the Series X only.
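
For what it's worth, the publicly quoted I/O figures for both machines are simple to compare, and both consoles decompress in dedicated hardware blocks rather than on the CPU. A quick sketch using the vendors' own numbers; the compression ratios are their quoted "typical" values (Kraken for PS5, BCPack for Series X), so treat the effective figures as estimates:

```python
# Publicly quoted SSD throughput for both next-gen consoles.
# Raw figures are the vendors' specs; "typical" ratios are their own quoted averages.
ps5_raw_gbs, ps5_typical_ratio = 5.5, 1.6   # Sony quotes ~8-9 GB/s typical compressed
xsx_raw_gbs, xsx_typical_ratio = 2.4, 2.0   # Microsoft quotes ~4.8 GB/s compressed

print(ps5_raw_gbs * ps5_typical_ratio)  # ~8.8 GB/s effective
print(xsx_raw_gbs * xsx_typical_ratio)  # ~4.8 GB/s effective
```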

#119  Edited By Zaryia
Member since 2016 • 21607 Posts

@i_p_daily said:
@Juub1990 said:
@nfamouslegend said:

Did Ron get banned? Haven't seen him around in a while, thought for sure he would flock to a nerd battle this thread has become.

Yes and we're grateful for that.

Speak for yourself. I thought Ron was at least entertaining because he riled up all you hermits, probably why dav banned him and yet keeps the PC loving alt in casual4ever around.

Ron hated PC so much that he played 99% of games on PC this gen.

He's like Giovela. Can't quit greatness.

#120 deactivated-5efed3ebc2180
Member since 2006 • 923 Posts

@Juub1990 said:

@pc_rocks: No such thing as a RTX 2080 Max Q performing as well as a PS5. The Max Q model is a piece of crap compared to even a regular desktop 2070.

Yup, it's basically equal to non-S RTX 2060.

#121 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

@davillain- said:
@i_p_daily said:

Speak for yourself. I thought Ron was at least entertaining because he riled up all you hermits, probably why dav banned him and yet keeps the PC loving alt in casual4ever around.

Ron's outburst is what led to his ban in the first place, and I warned him about it, but know this: I had nothing to do with his banning. So yes, I stand with Juub on this, and it's not my fault; that was done by other mods.

I stand corrected, my bad. The reason I said you was because it was you who gave him his last 2 warnings.

#122  Edited By deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

@i_own_u_4ever said:
@Juub1990 said:
@goldenelementxl said:

LMAO! Series X will not match a 2080 Super. This is AMD GPU tech we’re talking about here.

Why not? It has 52 CUs at 1,825MHz. The 5700 XT with 40 CUs at around 1.9GHz comes reasonably close to the 2070S. More CUs scale better than higher clocks, so it's really not hard to believe the SX can match the 2080S, which isn't even that much faster than the regular 2080 anyway. We're talking a 5-10% difference. The Gears 5 demo already has the SX coming within spitting distance of a 2080, so a 2080S is definitely something it could beat by a tiny amount if we go by what we know.

Exactly with this many CU's and future performance updates the XSX has some leg stretching capabilities.

The 5700XT has a 225W TDP (suggested PSU 550W) and thermals that hit the mid 80's when gaming. That's not going in a console box... ESPECIALLY a console with only 1 fan (not to cool, but displace hot air) Remember, the CPU is clocked much higher than any console CPU we saw this gen. TDP and heat are gonna be 2 concerns. So caution needs to be raised when talking increased CU counts and higher clocks. Comparisons to PC parts won't make much sense still. For example, the Xbox One X maxes out at 245W. The Series X won't go much higher than that if at all. We are talking 250W maybe. You aren't getting RTX 2080 performance with a Ryzen 7 comparable CPU at 250W. Especially not from AMD. They haven't shown us ANYTHING that would lead me to believe they're capable of that. And I'm gonna say it again. I wish that Series X (*not) cooling fan the best of luck...

As for that Gears demo, be careful. The last time we used a demo to predict real world performance, @ronvalencia had a group of Lems thinking the Scorpio would beat a GTX 1070.

#123 Juub1990
Member since 2013 • 12622 Posts
@goldenelementxl said:

The 5700XT has a 225W TDP (suggested PSU 550W) and thermals that hit the mid 80's when gaming. That's not going in a console box... ESPECIALLY a console with only 1 fan (not to cool, but displace hot air) Remember, the CPU is clocked much higher than any console CPU we saw this gen. TDP and heat are gonna be 2 concerns. So caution needs to be raised when talking increased CU counts and higher clocks. Comparisons to PC parts won't make much sense still. For example, the Xbox One X maxes out at 245W. The Series X won't go much higher than that if at all. We are talking 250W maybe. You aren't getting RTX 2080 performance with a Ryzen 7 comparable CPU at 250W. Especially not from AMD. They haven't shown us ANYTHING that would lead me to believe they're capable of that. And I'm gonna say it again. I wish that Series X (*not) cooling fan the best of luck...

As for that Gears demo, be careful. The last time we used a demo to predict real world performance, @ronvalencia had a group of Lems thinking the Scorpio would beat a GTX 1070.

Is there a reason you're ignoring the 50% increase in power efficiency of RDNA2?

The CPU is pretty much a 3700X with lower clocks.

#124  Edited By WeRVenom
Member since 2020 • 479 Posts

@pc_rocks said:
@wervenom said:
@pc_rocks said:

The demo was up to 1440, meaning it was dropping resolution and we don't know how much. So, no it wouldn't have gone above 30FPS at all.

The engineer specifically said the same demo ran at 40+FPS on a year old laptop.

https://mobile.twitter.com/timsweeneyepic/status/1261834093521776641

Also I'm not sure how a 2070 Max-Q would beat what's in the PS5 unless the PS5 is somehow weaker than current AMD tech.

Don't know what that quote is supposed to discredit. If he's saying it could run above 30FPS, then why was the resolution variable? The official briefing did say it was up to 1440p and dropping resolution, so yeah, you can't have it both ways.

Oh, and it's factually wrong that consoles can't run at an unlocked frame rate. KZ:SF and Infamous: SS both released with unlocked FPS going above 30 but under 40, and GoW 3 on PS3 was in the 40s range.

The post seems like an after-the-fact cover-up, just like the previous 'the engineer was talking about video quality' line, which they now seem to have given up on after realizing how stupid it sounded.

Lastly, since you are so focused on the 5700, do tell me about the RT, ML and the rest of the DX12 Ultimate capabilities of the 5700 in comparison to the Max-Q.

Just telling you what it said: that it could go well above 30.

I also didn't claim games couldn't run at an uncapped framerate. I said when they use Vsync (which they did) it will only run at 30 or 60. My guess is they couldn't get it to 60 and the framerate was variable, so they locked it at 30.

The 5700 doesn't have RT, but the PS5 supposedly has it in every CU. I will be curious to see the RT performance myself. But saying a laptop beats a PS5 based on a quick blurb from an engineer is disingenuous. Were the settings the same? Was it the exact same demo? These are things I would ask myself before making that claim. And I would also not base all performance expectations on one demo.

#125 Zaryia
Member since 2016 • 21607 Posts

Ugh consoles are gonna be so shitty. Again.

#126 Juub1990
Member since 2013 • 12622 Posts
@zaryia said:

Ugh consoles are gonna be so shitty. Again.

Aren't you tired of the same trolling antics?

Jesus trolls here freakin suck.

#127 deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

@Juub1990 said:

Is there a reason you're ignoring the 50% increase in power efficiency of RDNA2?

The CPU is pretty much a 3700X with lower clocks.

The reason I'm ignoring AMD's claims? Around 24 years of AMD GPU's

The RDNA vs RDNA 2 50% slide also shows that there was supposed to be a 50% increase between GCN and RDNA. That didn't happen... (unless they're talking about GCN 1.0 that launched in 2011 instead of GCN 4.0 from 2016) So why would you take this next claim at face value?

Every time AMD makes a GPU claim, folks bite but end up disappointed.

The next-gen CPUs are pretty much 3700Xs from what we know. The advertised clocks almost match the base clock of the 3700X, right? That's a 65W chip right there that isn't exactly cool temperature-wise. CPUs with clocks this high have been avoided in the past because of heat and power. So now we have a competent console CPU (finally!!!) competing with the console GPU for power and cooling. Sure, console APUs are more efficient in both respects, but saying RTX 2080+ and Ryzen 7 3700X performance is a given seems pretty outlandish to me. Especially with console TDP and a single (non) cooling fan. Something doesn't jibe here.

#128 tormentos
Member since 2003 • 33798 Posts

@pc_rocks:

No it wasn't. The whole laptop bullshit was a joke; that laptop wasn't even running the game, it was playing a video.😂😂😂

You are just green that for a small amount of money people will top your imaginary PC.😂😂

But but but the 2080max Q.😂

#129  Edited By Juub1990
Member since 2013 • 12622 Posts
@goldenelementxl said:

The reason I'm ignoring AMD's claims? Around 24 years of AMD GPU's

The RDNA vs RDNA 2 50% slide also shows that there was supposed to be a 50% increase between GCN and RDNA. That didn't happen... (unless they're talking about GCN 1.0 that launched in 2011 instead of GCN 4.0 from 2016) So why would you take this next claim at face value?

Every time AMD makes a GPU claim, folks bite but end up disappointed.

The next-gen CPUs are pretty much 3700Xs from what we know. The advertised clocks almost match the base clock of the 3700X, right? That's a 65W chip right there that isn't exactly cool temperature-wise. CPUs with clocks this high have been avoided in the past because of heat and power. So now we have a competent console CPU (finally!!!) competing with the console GPU for power and cooling. Sure, console APUs are more efficient in both respects, but saying RTX 2080+ and Ryzen 7 3700X performance is a given seems pretty outlandish to me. Especially with console TDP and a single (non) cooling fan. Something doesn't jibe here.

Never claimed it was a given. I said based on what we know it's entirely possible.

The GCN claims are also out the window because we know the SX packs 52 CUs at 1,825MHz. This lines up with the 50% increase in efficiency, because if it weren't true they would never have gotten anywhere near this many CUs with clocks this high. 2080-class performance is very possible if not likely. Would I be shocked if it fell short? Nah, but I expect it to be at least equal to a 2070S.

#130 tormentos
Member since 2003 • 33798 Posts

@pc_rocks said:

Not relevant to my post or argument. If you want to brag about XSX vs PS5, then take it up to cows. But against PC, it's weaker than 2 year old GPUs.

Yeah, it's going to be weaker than a 2-year-old $1,200 GPU. What a shock.

Man, even if the Series X ends up being a 2080, it is superb value for what you get. That 2080 still goes for $600+, so it's much, much better value than a PC; the amount of money you save getting a Series X over building a PC to match it is huge.

The same goes for the PS5, even with its weaker GPU.

@goldenelementxl said:

The reason I'm ignoring AMD's claims? Around 24 years of AMD GPU's

The RDNA vs RDNA 2 50% slide also shows that there was supposed to be a 50% increase between GCN and RDNA. That didn't happen... (unless they're talking about GCN 1.0 that launched in 2011 instead of GCN 4.0 from 2016) So why would you take this next claim at face value?

Every time AMD makes a GPU claim, folks bite but end up disappointed.

The next-gen CPU's are pretty much 3700X's from what we know. The advertised clocks almost match the base clock of the 3700X, right? That's a 65W chip right there that isn't exactly cool temperature wise. A CPU with clocks this high have been avoided in the past because of heat and power. So now we have a competent console CPU (finally!!!) competing with the console GPU for power and cooling. Sure console APU's are more efficient in both respects, but saying RTX 2080+ and Ryzen 7 3700X performance is a given seems pretty outlandish to me. Especially with console TDP and a single (non) cooling fan. Something doesn't jive here.

Dude, I remember a time when ATI had a stronger GPU than Nvidia, and AMD are basically owning Intel on the CPU side now.

But it's easy to see that the whole 50% improvement is real; the Xbox Series X has 52 CUs and runs at 1,800+MHz. Man, several 5700 XT models don't even run that high at stock.

The PS5 will run at up to 2.23GHz; even if they lowered the clocks 5% it would still be over 2.0GHz, so yeah, that gain is real.

Worse, you're using the 5700 in this argument, which doesn't represent RDNA2; it's like saying Nvidia's 3000 series will have the same pitfalls as the 2000 series. It makes no sense.

@zaryia said:

Ugh consoles are gonna be so shitty. Again.

You poor fella how is your imaginary PC?😂

#131 Fedor
Member since 2015 • 11829 Posts

@bluestars said:

@zaryia:

Nope

But on release the Xbox Series X won't be mid tier, given what PC graphics cards are hitting the shelves. HaH

Will it be top of the tree? No, but it will be high end.

Ole mid range Andy is having a crisis.

#132 PC_Rocks
Member since 2018 • 8611 Posts

@WESTBLADE said:

Yup, it's basically equal to non-S RTX 2060.

@WESTBLADE said:
@Juub1990 said:

@pc_rocks: No such thing as a RTX 2080 Max Q performing as well as a PS5. The Max Q model is a piece of crap compared to even a regular desktop 2070.

Yup, it's basically equal to non-S RTX 2060.

Well, if you have a problem with that then take it up with Epic's engineer, not me.

#133 I_own_u_4ever
Member since 2020 • 647 Posts

@Sagemode87 said:

@i_own_u_4ever: still spreading FUD huh? This gen really messed you up I guess. Sucks for you that PS5 will be top dog as usual. Games will look about the same, only so much 16 percent difference will do. Keep lying to yourself though. Playstation will be easier to develop for and will be lead platform as more money will come from it. I feel sorry for you as an Xbox fanboy.

Both systems will be easy to develop for, but a dev actually came out recently and said the XSX was the easiest console of all time to develop for.

https://www.windowscentral.com/chorvs-developer-says-xbox-series-x-easier-develop-any-other-console#:~:text=The%20game's%20Head%20of%20Core,X%20than%20any%20other%20console.%22

#134 flashn00b
Member since 2006 • 3961 Posts

Wonder where an RX 6600 would fit in.

#135 Juub1990
Member since 2013 • 12622 Posts

@pc_rocks: The Epic engineer said none of that. Stop spreading misinformation.

#136 PC_Rocks
Member since 2018 • 8611 Posts

@wervenom said:

Just telling you what it said that it could go well above 30.

I also didn't claim games couldn't run at an uncapped framerate. I said when they use Vsync ( which they did) it will only run at 30 or 60. My guess is because they couldn't get it to 60 and the framerate was variable so they locked it at 30.

5700 doesn't have RT but a PS5 supposedly has it in every CU. I will be curious to see the RT performance myself. But saying a laptop beats a PS5 based off a quick blurb from an engineer is disengenious. Were the settings the same? Was it the exact same demo? These are things that I would ask myself before I make that claim. And I would also not base all performance standards by one demo.

And I also debunked that claim. There was no need to drop resolution if it could go above 30; the resolution would have been locked. Also, he said frame time, not frame rate. He's being as vague as he can.

So, you don't know about the RT/ML capabilities of the PS5 yet you like to claim it's better than whatever RTX GPU you're putting it against? Who's disingenuous now? You're claiming the PS5 should be better than the 5700 and in the same breath telling me it has RT capabilities. RT doesn't come free; it would have taken a chunk of the die area.

As for you now changing the argument to settings and unknown variables: do you know how it compared to the PS5 demo? If you do, then enlighten me, but if you don't, why should I give you the benefit of the doubt over that engineer? He was specifically talking about the demo, so it's pretty reasonable to assume everything was the same, or he wouldn't have talked about 40+ FPS. It doesn't make any sense otherwise.

Nevertheless, if you want to prove something then take it up with Epic to say the following:

1) The engineer on the video was wrong and the demo didn't run at 40+ FPS on laptop
2) Or the official press release was wrong where the demo was running with variable resolution

Oh, I'm not saying the PS5 might not be able to outperform the 5700 or the Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said are just laughable and don't make any sense. At this point in time, based on what he said, the PS5 is beaten by a year-old laptop in the UE5 demo until proven otherwise.

#137 deactivated-5efed3ebc2180
Member since 2006 • 923 Posts
@tormentos said:
@pc_rocks said:

Not relevant to my post or argument. If you want to brag about XSX vs PS5, then take it up to cows. But against PC, it's weaker than 2 year old GPUs.

Yeah, it's going to be weaker than a 2-year-old $1,200 GPU. What a shock.

Man, even if the Series X ends up being a 2080, it is superb value for what you get. That 2080 still goes for $600+, so it's much, much better value than a PC; the amount of money you save getting a Series X over building a PC to match it is huge.

The same goes for the PS5, even with its weaker GPU.

Maybe you're not aware of it, but new Nvidia and AMD GPUs are (supposed to) launch a few months before the XSX/PS5, throwing even "teh almighty XseXXX" into the midrange category (almost a given for Nvidia's GPUs; I don't have much hope for AMD, but I'd be glad if they turn out to be competitive).

Also, the PS5 being on equal footing with an RTX 2070/2070S (well, at least according to Epic), which I've had since launch (mine is OC'd beyond an RTX 2070 Super, mind you) and which is already out of breath in most games... well, good luck being stuck with that for the next 5+ years. That overhyped SSD won't do any miracles when it comes to computing, and "optimization" is just a myth... *cringes*

#138 PC_Rocks
Member since 2018 • 8611 Posts

@tormentos said:

@pc_rocks:

No it wasn't. The whole laptop bullshit was a joke; that laptop wasn't even running the game, it was playing a video.😂😂😂

You are just green that for a small amount of money people will top your imaginary PC.😂😂

But but but the 2080max Q.😂

Two weeks late with this damage control, which has already been debunked in the original thread about the Epic engineer. Also funny that even the person who actually said something about the 'video quality' has retracted the statement; go and ask your fellow cow WeRVenom about the latest tweet from Sweeney.

Remain mad that your precious yet-to-be-released PS5 is beaten by a year-old laptop.

I also noticed you removed my quote. No problem, I'll repeat it again: for the 49th time, where are the sources for GDDR being better than DDR for CPUs? Still waiting after two months.

#139 PC_Rocks
Member since 2018 • 8611 Posts

@tormentos said:
@pc_rocks said:

Not relevant to my post or argument. If you want to brag about XSX vs PS5, then take it up to cows. But against PC, it's weaker than 2 year old GPUs.

Yeah, it's going to be weaker than a 2-year-old $1,200 GPU. What a shock.

Man, even if the Series X ends up being a 2080, it is superb value for what you get. That 2080 still goes for $600+, so it's much, much better value than a PC; the amount of money you save getting a Series X over building a PC to match it is huge.

The same goes for the PS5, even with its weaker GPU.

Thank you for agreeing that the yet-to-be-released next-gen consoles are weaker than 2+ year-old GPUs, and that one of them has been beaten by a year-old laptop in the UE5 demo.

For the 50th time: where are the sources for GDDR being better than DDR for CPUs? Still waiting after two months.

#140 PC_Rocks
Member since 2018 • 8611 Posts

@Juub1990 said:

@pc_rocks: The Epic engineer said none of that. Stop spreading misinformation.

Since you're claiming that I said something the Epic engineer didn't say, feel free to quote this misinformation.

#141  Edited By Zaryia
Member since 2016 • 21607 Posts

@fedor said:
@bluestars said:

@zaryia:

Nope

But on release the Xbox Series X won't be mid tier, given what PC graphics cards are hitting the shelves. HaH

Will it be top of the tree? No, but it will be high end.

Ole mid range Andy is having a crisis.

Mid range on launch. Low after 2 years. Very low for 5 more. He's gonna be playing on low-med/30 fps for 8 years lol.

Again lol.

#142  Edited By WeRVenom
Member since 2020 • 479 Posts

@pc_rocks said:
@wervenom said:

Just telling you what it said that it could go well above 30.

I also didn't claim games couldn't run at an uncapped framerate. I said when they use Vsync ( which they did) it will only run at 30 or 60. My guess is because they couldn't get it to 60 and the framerate was variable so they locked it at 30.

5700 doesn't have RT but a PS5 supposedly has it in every CU. I will be curious to see the RT performance myself. But saying a laptop beats a PS5 based off a quick blurb from an engineer is disengenious. Were the settings the same? Was it the exact same demo? These are things that I would ask myself before I make that claim. And I would also not base all performance standards by one demo.

And I also debunked that claim. There was no need to drop resolution if it could go above 30; the resolution would have been locked. Also, he said frame time, not frame rate. He's being as vague as he can.

So, you don't know about the RT/ML capabilities of the PS5 yet you like to claim it's better than whatever RTX GPU you're putting it against? Who's disingenuous now? You're claiming the PS5 should be better than the 5700 and in the same breath telling me it has RT capabilities. RT doesn't come free; it would have taken a chunk of the die area.

As for you now changing the argument to settings and unknown variables: do you know how it compared to the PS5 demo? If you do, then enlighten me, but if you don't, why should I give you the benefit of the doubt over that engineer? He was specifically talking about the demo, so it's pretty reasonable to assume everything was the same, or he wouldn't have talked about 40+ FPS. It doesn't make any sense otherwise.

Nevertheless, if you want to prove something then take it up with Epic to say the following:

1) The engineer on the video was wrong and the demo didn't run at 40+ FPS on laptop

2) Or the official press release was wrong where the demo was running with variable resolution

Oh, I'm not saying the PS5 might not be able to outperform the 5700 or the Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said are just laughable and don't make any sense. At this point in time, based on what he said, the PS5 is beaten by a year-old laptop in the UE5 demo until proven otherwise.

I'm not sure how the claim was debunked. I'm also not discrediting the engineer. But there were many assumptions made off a fairly vague translation. But we will have more information soon enough.

#143 WeRVenom
Member since 2020 • 479 Posts
@pc_rocks said:
@WESTBLADE said:

Yup, it's basically equal to non-S RTX 2060.

@WESTBLADE said:
@Juub1990 said:

@pc_rocks: No such thing as a RTX 2080 Max Q performing as well as a PS5. The Max Q model is a piece of crap compared to even a regular desktop 2070.

Yup, it's basically equal to non-S RTX 2060.

Well, if you have a problem with that then take it up with the Epic's engineer not me.

I'm going to bookmark this. We have PC gamers saying the PS5 will be on par with a 2060. Lol!

#144 deactivated-5efed3ebc2180
Member since 2006 • 923 Posts

@wervenom said:
@pc_rocks said:
@WESTBLADE said:

Yup, it's basically equal to non-S RTX 2060.

@WESTBLADE said:
@Juub1990 said:

@pc_rocks: No such thing as a RTX 2080 Max Q performing as well as a PS5. The Max Q model is a piece of crap compared to even a regular desktop 2070.

Yup, it's basically equal to non-S RTX 2060.

Well, if you have a problem with that then take it up with the Epic's engineer not me.

I'm going to bookmark this. We have PC gamers saying the PS5 will be on par with a 2060. Lol!

Eh, I hope you are not referring to me; I was agreeing with Juub1990 about the mentioned 2080 Max-Q's "performance" (or lack thereof)...

#145  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@Juub1990 said:
@goldenelementxl said:

The reason I'm ignoring AMD's claims? Around 24 years of AMD GPU's

The RDNA vs RDNA 2 50% slide also shows that there was supposed to be a 50% increase between GCN and RDNA. That didn't happen... (unless they're talking about GCN 1.0 that launched in 2011 instead of GCN 4.0 from 2016) So why would you take this next claim at face value?

Every time AMD makes a GPU claim, folks bite but end up disappointed.

The next-gen CPUs are pretty much 3700Xs from what we know. The advertised clocks almost match the base clock of the 3700X, right? That's a 65W chip right there that isn't exactly cool temperature-wise. CPUs with clocks this high have been avoided in the past because of heat and power. So now we have a competent console CPU (finally!!!) competing with the console GPU for power and cooling. Sure, console APUs are more efficient in both respects, but saying RTX 2080+ and Ryzen 7 3700X performance is a given seems pretty outlandish to me. Especially with console TDP and a single (non) cooling fan. Something doesn't jibe here.

Never claimed it was a given. I said based on what we know it's entirely possible.

The GCN claims are also out the window because we know the SX packs 52 CUs at 1,825MHz. This lines up with the 50% increase in efficiency, because if it weren't true they would never have gotten anywhere near this many CUs with clocks this high. 2080-class performance is very possible if not likely. Would I be shocked if it fell short? Nah, but I expect it to be at least equal to a 2070S.

Here's a point I'd like to make... I have no idea if it's true, but rumor has it that Nvidia's RTX 3080 is using a 102 die instead of a 104 or the rumored "103" die. That would mean Nvidia is concerned enough about what AMD's upcoming RDNA 2.0 can do that it stepped up and is using better, less cut-down chips to compete with or slaughter RDNA.

I'd also like to point out that the XSX's power supply just uses a two-prong power cord like a laptop's. So it's going to draw less than 300W for a system sporting a 3.6GHz Ryzen 7 Zen 2 CPU, a GPU with 52 CUs at 1.8GHz, and all the other items inside. This kinda tells us that AMD's power efficiency has really improved over their current GPUs... That's not to say it won't be a hot APU, but there is no reason not to believe, to some degree, that the XSX is in the ballpark of RTX 2080-level performance at the very least.
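
The "+50% performance per watt" claim can at least be sanity-checked against the numbers in this thread. A rough estimate, assuming RDNA1 efficiency is roughly the RX 5700 XT's paper TFLOPS over its board power (these are spec-sheet figures, not measurements):

```python
# Back-of-the-envelope check of the RDNA2 "+50% perf/W" claim for a console-sized power budget.
rdna1_tflops = 9.75        # RX 5700 XT peak FP32
rdna1_board_power = 225    # RX 5700 XT board power, watts
xsx_tflops = 12.15         # Series X GPU: 52 CUs * 128 FLOPs/clock * 1.825 GHz

perf_per_watt_rdna1 = rdna1_tflops / rdna1_board_power
perf_per_watt_rdna2 = perf_per_watt_rdna1 * 1.5   # the claimed uplift

print(round(xsx_tflops / perf_per_watt_rdna1))  # ~280 W for the GPU alone at RDNA1 efficiency
print(round(xsx_tflops / perf_per_watt_rdna2))  # ~187 W with the uplift
```

If the uplift is real, a 52CU GPU plus an 8-core Zen 2 plausibly fits under a roughly 300W supply; if it isn't, it doesn't.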

#146 Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

Since you're claiming that I said something the Epic engineer didn't say, feel free to quote this misinformation.

The 2080Q delivering performance on par with the UE5 demo is false and was never said. This was a bunch of misquotes from people who misunderstood the feed.

#147 Juub1990
Member since 2013 • 12622 Posts
@wervenom said:

I'm going to bookmark this. We have PC gamers saying the PS5 will be on par with a 2060. Lol!

You mean we have one guy, @pc_rocks, claiming that. The PS5's GPU, even if its clocks only maintain 2GHz, will stomp a 2060. If it can sustain anywhere near 2.23GHz, it'll perform in the ballpark of a 2070S.
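
The clock argument is easy to put in numbers. Peak FP32 throughput for these RDNA parts is just CUs x 64 ALUs x 2 ops per clock x clock speed; a quick sketch with the published clocks (plus one hypothetical sustained clock):

```python
# Paper FP32 throughput for the GPUs being compared. Peak TFLOPS only - real game
# performance also depends on bandwidth, the clocks actually sustained, and drivers.
def rdna_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000  # 64 ALUs per CU, 2 ops per FMA

print(round(rdna_tflops(52, 1.825), 2))  # Series X, fixed clock: ~12.15
print(round(rdna_tflops(36, 2.23), 2))   # PS5 at its peak variable clock: ~10.28
print(round(rdna_tflops(36, 2.00), 2))   # hypothetical: PS5 if it only held 2.0 GHz: ~9.22
print(round(rdna_tflops(40, 1.905), 2))  # RX 5700 XT boost clock, for scale: ~9.75
```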

#148 Grey_Eyed_Elf
Member since 2011 • 7971 Posts

Funny how the only benchmark results we have put the XSX at "close to RTX 2080" performance... yet people are still claiming something different.

#149 Grey_Eyed_Elf
Member since 2011 • 7971 Posts
@04dcarraher said:
@pc_rocks said:
@04dcarraher said:
@pc_rocks said:

Which benchmark put XSX better than 2080/S?

The Gears 5 port job, which resulted in 2080-like performance with little optimizing.

So, nothing at all. I've said countless times that Gears 5 isn't the best benchmark to judge it by, and that result was under controlled settings too.

It was nearly a straight-up copy-and-paste job from the X1X, and the XSX GPU was able to perform as well as a 2080 did at the same settings. It's not hard to believe: we know a 40CU RX 5700 XT is able to compete with an RTX 2070, so why is it so hard to believe that a 52CU RDNA 2.0 GPU would match or beat a 2080/2080S? We are talking about 30% more CUs without even taking architecture improvements into account; a 52CU RX 5700 would be in 2080 range as is.

CUs don't scale 1:1 with performance; a 25% increase in CUs won't yield a 25% performance boost.

Navi responds better to clocks and bandwidth than to CU count.

Look at a 5600 XT vs a 5700... cap the bandwidth and you lose serious performance with the same CU count.

Then apply an overclock and a power mod to a 5700 and you can match the 5700 XT that has more CUs.

The fact that the XSX only came close to a 2080 in that benchmark indicates this pretty clearly; on paper a 52CU Navi card should be at 2080 Super levels of performance, but it's not.

Worst-case scenario it will be at 2070 Super performance, and if the game is AMD-optimized it can beat a 2080... just like the X1X, which performed mostly at overclocked RX 580 levels and occasionally at GTX 1070 levels.

The X1X has a GPU with more CUs than an RX 580/590, yet depending on the game it merely matched them.

#150 tormentos
Member since 2003 • 33798 Posts

@pc_rocks said:

Thank you for agreeing that the yet-to-be-released next-gen consoles are weaker than 2+ year-old GPUs, and that one of them has been beaten by a year-old laptop in the UE5 demo.

For the 50th time: where are the sources for GDDR being better than DDR for CPUs? Still waiting after two months.

You must have confused me with some blind fanboy who believes consoles are stronger than PCs; maybe those tears in your eyes clouded your judgment.

So yeah, it will be weaker than a 2-year-old $1,200 GPU that not even 1% of the PC market owns, according to Steam's own stats.

Stop deflecting; you were shown the data and you chose to ignore it. That is your problem, not mine.

@Grey_Eyed_Elf said:

Funny how the only benchmark results we have put the XSX at "close to RTX 2080" performance... yet people are still claiming something different.

I am not saying it is more than that, but a demo ported by a minimal number of people in a short period of time is not a representation of what the machine can do.