3080 > XSX > 2080 > 2070 > PS5

#151 sakaiXx
Member since 2013 • 16611 Posts

$500 for 2080 or $450 for 2070S performance seems like a steal. Those GPUs alone cost $600-$800 on Newegg.

#152 Fedor
Member since 2015 • 11829 Posts

@sakaixx said:

$500 for 2080 or $450 for 2070S performance seems like a steal. Those GPUs alone cost $600-$800 on Newegg.

Assuming the consoles are listed at those prices, sure. But they don't come out until fall and the new GPU lineups will be out by then, lowering the cost of GPUs at that performance level.

#153 PC_Rocks
Member since 2018 • 8611 Posts

@Juub1990 said:
@wervenom said:

I'm going to bookmark this. We have PC gamers saying the PS5 will be on par with a 2060. Lol!

You mean we have one guy, @pc_rocks, claiming that. The PS5's GPU, if its clocks can even maintain 2GHz, will stomp the 2060. If it can sustain anywhere near 2.23GHz, it'll perform in the ballpark of a 2070S.

Feel free to quote where I said that. Should be easy enough since you're claiming I'm saying something the Epic engineer didn't say.

Oh, and sure, bookmark it. It will be fun looking back at all the damage control the cows did regarding that quote from the Epic engineer: from claiming he was discussing the 'video quality' despite being a rendering engineer, to saying the 30FPS was just due to VSync even though the demo was dropping resolution.

#154  Edited By PC_Rocks
Member since 2018 • 8611 Posts
@Juub1990 said:
@pc_rocks said:

Since you're claiming that I said anything the Epic engineer didn't say, then feel free to quote this misinformation.

The 2080Q delivering performance on par with the UE5 demo is false and was never said. This was a bunch of misquotes from people who misunderstood the feed.

So, go ahead and post specifically what was misunderstood. Just rebutting it without evidence isn't going to fly. That's tormentos and tgob levels of bad.

#155 PC_Rocks
Member since 2018 • 8611 Posts
@wervenom said:
@pc_rocks said:
@wervenom said:

Just telling you what it said: that it could go well above 30.

I also didn't claim games couldn't run at an uncapped framerate. I said when they use VSync (which they did) it will only run at 30 or 60. My guess is they couldn't get it to 60 and the framerate was variable, so they locked it at 30.

The 5700 doesn't have RT, but the PS5 supposedly has it in every CU. I will be curious to see the RT performance myself. But saying a laptop beats a PS5 based off a quick blurb from an engineer is disingenuous. Were the settings the same? Was it the exact same demo? These are things I would ask myself before making that claim. And I would also not base all performance standards on one demo.

And I also debunked that claim. There was no need to drop resolution if it could go above 30; the resolution would have been locked. Also, he said frame time, not frame rate. He's being as vague as he can.

So, you don't know about the RT/ML capabilities of the PS5 and yet you make a statement that it's better than whatever RTX-line GPU you're putting it against? Who's disingenuous now? So, you're claiming that the PS5 should be better than the 5700 and then in the same breath telling me it has RT capabilities. RT doesn't come free; it would have taken a chunk of the die size.

As for you now changing the argument to settings and unknown variables: do you know how it compared to the PS5 demo? If you do, then enlighten me, but if you don't, why should I give you the benefit of the doubt over that engineer? He was specifically talking about the demo, so it's pretty reasonable to assume that everything was the same, or he wouldn't have talked about 40+ FPS. It doesn't even make sense otherwise.

Nevertheless, if you want to prove something then take it up with Epic to say the following:

1) The engineer on the video was wrong and the demo didn't run at 40+ FPS on laptop

2) Or the official press release was wrong where the demo was running with variable resolution

Oh, I'm not saying that the PS5 might not be able to outperform the 5700 or Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said is just laughable and doesn't make any sense. At this point in time, based on what he said, the PS5 is beaten by a year-old laptop in the UE5 demo until proven otherwise.

I'm not sure how the claim was debunked. I'm also not discrediting the engineer. But there were many assumptions made off a fairly vague translation. But we will have more information soon enough.

Pretty simple. He's making conflicting arguments. Saying it could go above 30 FPS is factually false when the official press release said the demo ran at up to 1440p. Why was it dropping resolution if it could go above 30 FPS?

That's basic resource management.

For sure, there are many assumptions about it, but all of those pale in comparison to the damage control used to discredit what he said. If there was an objective way to discredit him, it would have been offered by now, instead of taking down the video with stupid explanations like it being about 'video quality', or a video that ran at 40+ FPS, or this nonsense of 30FPS+ while dropping resolution. At least stick to one argument for consistency. You guys are jumping all over the place.

#156 PC_Rocks
Member since 2018 • 8611 Posts
@tormentos said:
@pc_rocks said:

Thank you for agreeing that the next-gen, yet-to-be-released consoles are weaker than 2+ year old GPUs, and that one of them has been beaten by a year-old laptop in the UE5 demo.

For the 50th time, where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

You must have confused me with some blind fanboy who believes consoles are stronger than PCs. Maybe those tears in your eyes clouded your judgment.

So yeah, it will be weaker than a 2-year-old $1,200 GPU that not even 1% of the PC market owns, according to Steam's own stats.

Stop deflecting. You were shown the data and chose to ignore it; that is your problem, not mine.

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

For the 51st time, where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

#157 JoyFreakcom
Member since 2019 • 19 Posts

I'm absolutely looking forward to the PS5. The specs are fantastic compared to XSX.

#158 I_own_u_4ever
Member since 2020 • 647 Posts

@joyfreakcom said:

I'm absolutely looking forward to the PS5. The specs are fantastic compared to XSX.

PS5: faster loading times with a weaker CPU/GPU and slower memory? Looks up, rolls eyes, scratches head.

#159 I_own_u_4ever
Member since 2020 • 647 Posts
@pc_rocks said:
@wervenom said:
@pc_rocks said:
@wervenom said:

Just telling you what it said: that it could go well above 30.

I also didn't claim games couldn't run at an uncapped framerate. I said when they use VSync (which they did) it will only run at 30 or 60. My guess is they couldn't get it to 60 and the framerate was variable, so they locked it at 30.

The 5700 doesn't have RT, but the PS5 supposedly has it in every CU. I will be curious to see the RT performance myself. But saying a laptop beats a PS5 based off a quick blurb from an engineer is disingenuous. Were the settings the same? Was it the exact same demo? These are things I would ask myself before making that claim. And I would also not base all performance standards on one demo.

And I also debunked that claim. There was no need to drop resolution if it could go above 30; the resolution would have been locked. Also, he said frame time, not frame rate. He's being as vague as he can.

So, you don't know about the RT/ML capabilities of the PS5 and yet you make a statement that it's better than whatever RTX-line GPU you're putting it against? Who's disingenuous now? So, you're claiming that the PS5 should be better than the 5700 and then in the same breath telling me it has RT capabilities. RT doesn't come free; it would have taken a chunk of the die size.

As for you now changing the argument to settings and unknown variables: do you know how it compared to the PS5 demo? If you do, then enlighten me, but if you don't, why should I give you the benefit of the doubt over that engineer? He was specifically talking about the demo, so it's pretty reasonable to assume that everything was the same, or he wouldn't have talked about 40+ FPS. It doesn't even make sense otherwise.

Nevertheless, if you want to prove something then take it up with Epic to say the following:

1) The engineer on the video was wrong and the demo didn't run at 40+ FPS on laptop

2) Or the official press release was wrong where the demo was running with variable resolution

Oh, I'm not saying that the PS5 might not be able to outperform the 5700 or Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said is just laughable and doesn't make any sense. At this point in time, based on what he said, the PS5 is beaten by a year-old laptop in the UE5 demo until proven otherwise.

I'm not sure how the claim was debunked. I'm also not discrediting the engineer. But there were many assumptions made off a fairly vague translation. But we will have more information soon enough.

Pretty simple. He's making conflicting arguments. Saying it could go above 30 FPS is factually false when the official press release said the demo ran at up to 1440p. Why was it dropping resolution if it could go above 30 FPS?

That's basic resource management.

For sure, there are many assumptions about it, but all of those pale in comparison to the damage control used to discredit what he said. If there was an objective way to discredit him, it would have been offered by now, instead of taking down the video with stupid explanations like it being about 'video quality', or a video that ran at 40+ FPS, or this nonsense of 30FPS+ while dropping resolution. At least stick to one argument for consistency. You guys are jumping all over the place.

With lower-specced, variable downclocks, we will get lower res and FPS on the PS5. Devs will have to pick higher res and lower FPS, or higher FPS and lower res.

#160 SecretPolice
Member since 2007 • 45721 Posts

XseX >>>>>>>>>> All. :P

#161 Juub1990
Member since 2013 • 12622 Posts

@pc_rocks: How about you post what was said in the first place, verbatim? Then we can go to the source of the confusion.

#162 BassMan
Member since 2002 • 18754 Posts

@joyfreakcom said:

I'm absolutely looking forward to the PS5. The specs are fantastic compared to XSX.

#163 tormentos
Member since 2003 • 33798 Posts

@pc_rocks said:

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

For the 51st time, where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

Well, you were already called out by other people who even argue pro-PC: a 2.0GHz 36CU RDNA2 should have no problem beating a 2080 Max-Q. Even the 5700 beats it, so a 2.3GHz one will even more so.

The only crying is done by you, zeeshanhaider. You were a crybaby troll on that account, as much as you are on this one.

The PS5 will probably cost $500 or less and will beat your imaginary PC..😂

@pc_rocks said:

Feel free to quote where I said that. Should be easy enough since you're claiming I'm saying something the Epic engineer didn't say.

Oh, and sure, bookmark it. It will be fun looking back at all the damage control the cows did regarding that quote from the Epic engineer: from claiming he was discussing the 'video quality' despite being a rendering engineer, to saying the 30FPS was just due to VSync even though the demo was dropping resolution.

🤣😂😂😂😂🤣🤣🤣

You continue to make an ass of yourself. That laptop sure matched the PS5 video, but even a dual-core one with no dedicated GPU would have; it was a VIDEO of the PS5 demo.

#164 DaVillain  Moderator
Member since 2014 • 58743 Posts

@joyfreakcom said:

I'm absolutely looking forward to the PS5. The specs are fantastic compared to XSX.

#165  Edited By tormentos
Member since 2003 • 33798 Posts

@pc_rocks said:

So, go ahead and post specifically what was misunderstood. Just rebutting it without evidence isn't going to fly. That's tormentos and tgob levels of bad.

Just to make you look like an ass again, zeeshanhaider.

But we don't even need that. We know for a fact the 5700 can beat the 2080 Max-Q, so a 2.3GHz RDNA2 version with 36CUs should beat it no problem as well, man. You would know this if you were a real PC gamer and not just a sad console hater.

Not that using a freaking $2.5K-and-up PC to beat a $500-or-less console makes it any better: you pay $3,000 in some cases for that laptop and you will still be behind..lol

#166  Edited By PC_Rocks
Member since 2018 • 8611 Posts

@tormentos said:
@pc_rocks said:

So, go ahead and post specifically what was misunderstood. Just rebutting it without evidence isn't going to fly. That's tormentos and tgob levels of bad.

Just to make you look like an ass again, zeeshanhaider.

But we don't even need that. We know for a fact the 5700 can beat the 2080 Max-Q, so a 2.3GHz RDNA2 version with 36CUs should beat it no problem as well, man. You would know this if you were a real PC gamer and not just a sad console hater.

Not that using a freaking $2.5K-and-up PC to beat a $500-or-less console makes it any better: you pay $3,000 in some cases for that laptop and you will still be behind..lol

That was already debunked in the original Epic engineer thread as well as this one. Ask your buddy WeRVenom for a follow-up tweet. You are the one who has made an a$$ of yourself (as usual).

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

For the 52nd time, where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

#167 PC_Rocks
Member since 2018 • 8611 Posts
@Juub1990 said:

@pc_rocks: How about you post what was said in the first place, verbatim? Then we can go to the source of the confusion.

No. You said I was spreading misinformation and adding something to what the deleted video said. You also claimed that I said the Max-Q is better than the PS5. So the onus is on you to provide proof for those two claims.

#168  Edited By Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

No. You said I was spreading misinformation and adding something to what the deleted video said. You also claimed that I said the Max-Q is better than the PS5. So the onus is on you to provide proof for those two claims.

Bullshit it is, don't try to shift the burden. You made this claim:

Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Which you first of all have to provide proof of for me to even counter. This is how argumentation works. You support your postulate with evidence. You don't fucking ask the opposing party to provide it for you to counter it themselves. If you can't provide proof for your claims, you are spreading misinformation which I already accused you of doing.

#169 IgGy621985
Member since 2004 • 5922 Posts

That cooler alone on the RTX 3080 is rumored to cost $150 to make. Damn.

Speaking of coolers: the industry currently rumors a manufacturing price of a whopping 150 USD for the entire construction of the FE, i.e. the housing and cooler in one piece. This also hints at the positioning and pricing of the Founders Edition, which they would like to upgrade significantly.

#170 PC_Rocks
Member since 2018 • 8611 Posts

@Juub1990 said:
@pc_rocks said:

No. You said I was spreading misinformation and adding something to what the deleted video said. You also claimed that I said the Max-Q is better than the PS5. So the onus is on you to provide proof for those two claims.

Bullshit it is, don't try to shift the burden. You made this claim:

Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Which you first of all have to provide proof of for me to even counter. This is how argumentation works. You support your postulate with evidence. You don't fucking ask the opposing party to provide it for you to counter it themselves. If you can't provide proof for your claims, you are spreading misinformation which I already accused you of doing.

Yup, and that's what I claimed. Now where did I lie? The demo was running at 40+ FPS. Prove the contrary.

  • 53:00, he said his notebook could run at 1440p 40fps+, optimization target is 60fps for next-gen.

So, yeah, still waiting for the proof of me spreading misinformation while adding something to what the deleted video said. Also still need the proof that I claimed the Max-Q is better than the PS5.

#171 rmpumper
Member since 2016 • 2330 Posts

So the rumors so far are as follows:

  • big navi is set to compete with 2080Ti
  • nvidia is upgrading 3080 specs to be able to beat big navi
  • new stock cooling will cost $150 for the RTX3000 series

So I guess a 3070 will be released with a $699 price tag (the bullshit FE 2070 was $599; add the additional cost of the new heatsink and you end up with $699). What a load of shit.

#172 tormentos
Member since 2003 • 33798 Posts

@pc_rocks said:

That was already debunked in the original Epic engineer thread as well as this one. Ask your buddy WeRVenom for a follow-up tweet. You are the one who has made an a$$ of yourself (as usual).

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

For the 52nd time, where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

I already PROVED my point. You were owned..🤣

Now keep asking about RAM; this is a clear example of how data is presented to you and you ignore it.

The 5700 beats the 2080 Max-Q; a 36CU RDNA2 GPU at 2GHz will beat it even more easily. It's common sense. You have none, and that is the problem.😂

That laptop wasn't running the demo, it was playing a video of it..😂😂😂😂

Now please, by all means, buy a 2080 Max-Q laptop and show it to us, so we can laugh at how you're stuck with a barely upgradable machine that cost you close to $3,000 and is weaker than a $400-$500 console..

@Juub1990 said:
@pc_rocks said:

No. You said I was spreading misinformation and adding something to what the deleted video said. You also claimed that I said the Max-Q is better than the PS5. So the onus is on you to provide proof for those two claims.

Bullshit it is, don't try to shift the burden. You made this claim:

Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Which you first of all have to provide proof of for me to even counter. This is how argumentation works. You support your postulate with evidence. You don't fucking ask the opposing party to provide it for you to counter it themselves. If you can't provide proof for your claims, you are spreading misinformation which I already accused you of doing.

And that he will not provide, period, because he doesn't have it. Common sense alone should have resolved the issue, as the 5700 beats the 2080 Max-Q.

Not to mention that he is using a close-to-$3,000 laptop to make fun of a $400 to $500 console. Hell, I could build a 2080 Ti system with that money and still have cash to spare before buying a laptop like that, which is basically a console, because you can't change its CPU or GPU like you would on a desktop PC.

#173  Edited By Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

Yup, and that's what I claimed. Now where did I lie? The demo was running at 40+ FPS. Prove the contrary.

  • 53:00, he said his notebook could run at 1440p 40fps+, optimization target is 60fps for next-gen.

So, yeah, still waiting for the proof of me spreading misinformation while adding something to what the deleted video said. Also still need the proof that I claimed the Max-Q is better than the PS5.

Do you even follow? This is what happened:

I'm going to bookmark this. We have PC gamers saying the PS5 will be on par with a 2060. Lol!

To which I replied

You mean we have one guy, @pc_rocks, claiming that. The PS5's GPU, if its clocks can even maintain 2GHz, will stomp the 2060. If it can sustain anywhere near 2.23GHz, it'll perform in the ballpark of a 2070S.

Because the 2080 Max-Q gets mopped up by a desktop 2070 and is in the ballpark of a desktop 2060, so if you believe the demo is representative of the PS5's capabilities, you are equating its performance to roughly what amounts to a desktop 2060.

As for the rest, you missed a whole bunch of shit regarding the stream. The engineer never even gave the specs of his notebook (and if he did, it wasn't even a 2080 Max-Q, from memory). They also claimed it ran on a mobile 2080, which would put its performance on the level of a desktop 2070S.

From the reddit thread where all this shit started, different reports were then made about the specs, and ultimately the guy never even gave the specs of his laptop.

And according to a Chinese user on beyond3d forums, the engineer also said:

"Actually for the scene you saw previously, we were able to run it @ 40fps on our laptop inside the editor tool where the assets were not completely cooked".

Then that same user, who apparently watched the whole thing, claimed this:

The evidence for the laptop specs comes from a Chinese forum the engineer was said to be part of. Word is a user asked him his laptop's specs and it sported an RTX 2080 (not a Max-Q), but there is a myriad of conflicting reports about its actual hardware. Then it turned out the engineer wasn't on that forum at all but was merely paraphrased by a user there who claimed to have asked him.

Then you had this mega confusion about him playing a video of the demo vs. him running the actual demo, something which even Tim Sweeney chimed in on, as you posted.

What you posted simply states the dev said his notebook could run it at 1440p/40fps+. It doesn't tell us it does that on a 2080 Max-Q, and according to the latest reports it was a mobile 2080 (y'know, the desktop model but clocked a lot lower, giving it performance on par with a 2070 to 2070S depending on the temps), not a Max-Q limited to like 95W.

#174 tormentos
Member since 2003 • 33798 Posts

And PC_Troll vanishes after being owned..🤣🤣🤣

#175 PC_Rocks
Member since 2018 • 8611 Posts

@tormentos said:
@pc_rocks said:

That was already debunked in the original Epic engineer thread as well as this one. Ask your buddy WeRVenom for a follow-up tweet. You are the one who has made an a$$ of yourself (as usual).

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

For the 52nd time, where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

I already PROVED my point. You were owned..🤣

Now keep asking about RAM; this is a clear example of how data is presented to you and you ignore it.

The 5700 beats the 2080 Max-Q; a 36CU RDNA2 GPU at 2GHz will beat it even more easily. It's common sense. You have none, and that is the problem.😂

That laptop wasn't running the demo, it was playing a video of it..😂😂😂😂

Now please, by all means, buy a 2080 Max-Q laptop and show it to us, so we can laugh at how you're stuck with a barely upgradable machine that cost you close to $3,000 and is weaker than a $400-$500 console..

Nah, you didn't at all. You just owned yourself.

For the 53rd time, where are the sources for GDDR being better than DDR for CPUs? Still waiting for them after two months.

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

#176  Edited By PC_Rocks
Member since 2018 • 8611 Posts
@Juub1990 said:

Do you even follow? This is what happened:

I'm going to bookmark this. We have PC gamers saying the PS5 will be on par with a 2060. Lol!

To which I replied

You mean we have one guy, @pc_rocks, claiming that. The PS5's GPU, if its clocks can even maintain 2GHz, will stomp the 2060. If it can sustain anywhere near 2.23GHz, it'll perform in the ballpark of a 2070S.

Because the 2080 Max-Q gets mopped up by a desktop 2070 and is in the ballpark of a desktop 2060, so if you believe the demo is representative of the PS5's capabilities, you are equating its performance to roughly what amounts to a desktop 2060.

As for the rest, you missed a whole bunch of shit regarding the stream. The engineer never even gave the specs of his notebook (and if he did, it wasn't even a 2080 Max-Q, from memory). They also claimed it ran on a mobile 2080, which would put its performance on the level of a desktop 2070S.

From the reddit thread where all this shit started, different reports were then made about the specs, and ultimately the guy never even gave the specs of his laptop.

And according to a Chinese user on beyond3d forums, the engineer also said:

"Actually for the scene you saw previously, we were able to run it @ 40fps on our laptop inside the editor tool where the assets were not completely cooked".

Then that same user, who apparently watched the whole thing, claimed this:

The evidence for the laptop specs comes from a Chinese forum the engineer was said to be part of. Word is a user asked him his laptop's specs and it sported an RTX 2080 (not a Max-Q), but there is a myriad of conflicting reports about its actual hardware. Then it turned out the engineer wasn't on that forum at all but was merely paraphrased by a user there who claimed to have asked him.

Then you had this mega confusion about him playing a video of the demo vs. him running the actual demo, something which even Tim Sweeney chimed in on, as you posted.

What you posted simply states the dev said his notebook could run it at 1440p/40fps+. It doesn't tell us it does that on a 2080 Max-Q, and according to the latest reports it was a mobile 2080 (y'know, the desktop model but clocked a lot lower, giving it performance on par with a 2070 to 2070S depending on the temps), not a Max-Q limited to like 95W.

Exactly. WeRVenom said, "We have PC gamers saying the PS5 will be on par with a 2060. Lol!" To which you replied: yes, we have PC_Rocks claiming that. So I'm still waiting: where did I ever claim that?

I already said this in my reply to WeRVenom: "Oh, I'm not saying that the PS5 might not be able to outperform the 5700 or Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said is just laughable and doesn't make any sense."

All I claimed was that the PS5 was outperformed by the Max-Q in the UE5 demo. I admit the only thing I got wrong was the Max-Q part, which I actually didn't say in the previous thread; I was calling it a mobile 2080, but once NoddleFighter called it a Max-Q, I ran with it without verifying.

Also, the quote about running in the editor doesn't bode well for the PS5 either, nor is it any knock against the laptop. If anything, it's even more alarming. You do know that running anything in an editor is far more taxing than running the actual demo/game?

Now let's get extremely nitpicky. First, neither you nor I know if it was a Max-Q or not, so how could anyone say otherwise? Don't the Max-Q and the mobile 2080 just differ in clock speeds and power envelope, with the Max-Q being a little better than a mobile 2070? That puts it above a desktop 2060 and somewhere in the range of a 2060S to 2070. Nevertheless, mobile 2080 it is.

In the rest of my replies I was debunking the 'it was just a video running on the laptop' myth, which Sweeney himself later corrected, or the factually wrong claim that 'the demo was running above 30FPS outside of VSync on the PS5'.

#177 Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

Exactly. WeRVenom said, "We have PC gamers saying the PS5 will be on par with a 2060. Lol!" To which you replied: yes, we have PC_Rocks claiming that. So I'm still waiting: where did I ever claim that?

I already said this in my reply to WeRVenom: "Oh, I'm not saying that the PS5 might not be able to outperform the 5700 or Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said is just laughable and doesn't make any sense."

All I claimed was that the PS5 was outperformed by the Max-Q in the UE5 demo. I admit the only thing I got wrong was the Max-Q part, which I actually didn't say in the previous thread; I was calling it a mobile 2080, but once NoddleFighter called it a Max-Q, I ran with it without verifying.

Also, the quote about running in the editor doesn't bode well for the PS5 either, nor is it any knock against the laptop. If anything, it's even more alarming. You do know that running anything in an editor is far more taxing than running the actual demo/game?

Now let's get extremely nitpicky. First, neither you nor I know if it was a Max-Q or not, so how could anyone say otherwise? Don't the Max-Q and the mobile 2080 just differ in clock speeds and power envelope, with the Max-Q being a little better than a mobile 2070? That puts it above a desktop 2060 and somewhere in the range of a 2060S to 2070. Nevertheless, mobile 2080 it is.

In the rest of my replies I was debunking the 'it was just a video running on the laptop' myth or the factually wrong claim that 'the demo was running above 30FPS outside of VSync on the PS5'.

Because you said this:

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now-deleted video about the Max-Q performing better than the PS5 in the UE5 demo is anything to go by, it doesn't paint a very good picture of the XSX either.

And I said "no such thing," yet you doubled down and vehemently argued that the UE5 demo was running on a 2080 Max-Q, which is patently false, as all the sources who translated the quote from the Chinese forum said a 2080 mobile. So if you weren't defending the PS5 having a performance profile comparable to a 2060 in the UE5 demo, why bring up the demo to discredit it if it's not even representative?

A Max Q is a LOT worse than a mobile 2080 mainly because of its extremely limited power profile. The thing is capped at 90W for Christ's sake.

I mean look at this:

The mobile 2080 is clocked a whopping 45% higher than the Max Q. The difference in performance between them is enormous. It's useless to even bring up the 2080 Max Q. It's not even comparable to the 5700XT which the PS5 should match if not exceed with its insane clocks and reportedly better IPC.
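To put rough numbers on that clock gap, here is a quick sketch using Nvidia's published boost clocks for both parts (the figures are approximations and vary by laptop vendor; the formula is the standard peak-FP32 estimate of cores × 2 ops per FMA × clock):

```python
# Rough theoretical FP32 throughput comparison between the RTX 2080 Max-Q
# and the full mobile RTX 2080. Both use the same TU104 die (2944 CUDA
# cores); the published boost clocks below are assumptions for illustration.

def tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 TFLOPs: cores x 2 ops (FMA) x clock in GHz."""
    return cuda_cores * 2 * boost_ghz / 1000

CARDS = {
    "RTX 2080 Max-Q (90W)":    (2944, 1.095),
    "RTX 2080 mobile (150W+)": (2944, 1.590),
}

for name, (cores, clock) in CARDS.items():
    print(f"{name}: {tflops(cores, clock):.2f} TFLOPs @ {clock:.3f} GHz")

clock_gap = 1.590 / 1.095 - 1
print(f"mobile clock advantage: {clock_gap:.0%}")  # ~45%
```

Same silicon, very different sustained clocks: that is why lumping the two together under "2080" muddies the whole argument.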


#178 Pedro
Member since 2002 • 74003 Posts

@davillain- said:
@joyfreakcom said:

I'm absolutely looking forward to the PS5. The specs are fantastic compared to XSX.

That actually made me LOL.


#179 tormentos
Member since 2003 • 33798 Posts

@pc_rocks said:

Nah, you didn't at all. You just owned yourself.

For the 53rd time: where are the sources for GDDR being better than DDR for CPUs? Still waiting after two months.

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

Yes I did, and you are grasping badly.

Don't worry, the machine will come out and we will see if that GPU is stronger than the PS5. Please don't change to a new alt when that happens.😂

See, look at that bolded part, which is exactly what Juub is referring to.

Please prove that to me, because I can prove the 2080 Max-Q can be beaten by the 5700; now prove to me that an OLD laptop can beat the PS5.🤣

By the way, explain to me how the 2080 Max-Q is old? Laptops with that GPU have been out for 1 year and 2 months and retail for $2.5k and up, some models over $3k.


#181 I_own_u_4ever
Member since 2020 • 647 Posts

@joyfreakcom said:

I'm absolutely looking forward to the PS5. The specs are fantastic compared to XSX.


#182 PC_Rocks
Member since 2018 • 8611 Posts

@tormentos said:
@pc_rocks said:

Nah, you didn't at all. You just owned yourself.

For the 53rd time: where are the sources for GDDR being better than DDR for CPUs? Still waiting after two months.

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!

Yes I did, and you are grasping badly.

Don't worry, the machine will come out and we will see if that GPU is stronger than the PS5. Please don't change to a new alt when that happens.😂

See, look at that bolded part, which is exactly what Juub is referring to.

Please prove that to me, because I can prove the 2080 Max-Q can be beaten by the 5700; now prove to me that an OLD laptop can beat the PS5.🤣

By the way, explain to me how the 2080 Max-Q is old? Laptops with that GPU have been out for 1 year and 2 months and retail for $2.5k and up, some models over $3k.

Keep owning yourself as usual. Sweeney himself corrected the video argument.

For the 54th time: where are the sources for GDDR being better than DDR for CPUs? Still waiting after two months.

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!


#183 PC_Rocks
Member since 2018 • 8611 Posts
@Juub1990 said:
@pc_rocks said:

Exactly. WeRVenom said "We have PC gamers saying the PS5 will be on par with a 2060. Lol!", to which you replied that yes, we have PC_Rocks claiming that. So I'm still waiting: where did I ever claim that?

I already said this in my reply to WeRVenom: "Oh, I'm not saying that the PS5 might not be able to outperform the 5700 or Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said are just laughable and don't make any sense."

All I claimed was that the PS5 was outperformed by a Max-Q in the UE5 demo. I admit the one thing I got wrong was the Max-Q part, which I actually didn't say in the previous thread; I was calling it a mobile 2080, but once NoddleFighter called it a Max-Q, I ran with it without verifying.

Also, the quote about running in-editor doesn't bode well for the PS5 either, nor is it any knock against the laptop. If anything it's even more alarming. You do know that running anything in an editor is far more taxing than running the actual demo/game, right?

Now let's get extremely nitpicky. First, neither you nor I know whether it was a Max-Q or not, so how could anyone say otherwise? Isn't the difference between a Max-Q and a mobile 2080 just clock speed and power envelope, with the Max-Q sitting a little above a mobile 2070? That would put it above a desktop 2060 and somewhere in the 2060S-2070 range. Nevertheless, mobile 2080 it is.

The rest of my replies were debunking the 'it was just a video running on a laptop' myth or the factually wrong claim that 'the demo was running above 30FPS outside of VSync on PS5'.

Because you said this:

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now-deleted video about the Max-Q performing better than the PS5 in the UE5 demo is anything to go by, it doesn't paint a very good picture of the XSX either.

And I said "no such thing," yet you doubled down and vehemently argued that the UE5 demo was running on a 2080 Max-Q, which is patently false, as all the sources who translated the quote from the Chinese forum said a 2080 mobile. So if you weren't defending the PS5 having a performance profile comparable to a 2060 in the UE5 demo, why bring up the demo to discredit it if it's not even representative?

A Max Q is a LOT worse than a mobile 2080 mainly because of its extremely limited power profile. The thing is capped at 90W for Christ's sake.

I mean look at this:

The mobile 2080 is clocked a whopping 45% higher than the Max Q. The difference in performance between them is enormous. It's useless to even bring up the 2080 Max Q. It's not even comparable to the 5700XT which the PS5 should match if not exceed with its insane clocks and reportedly better IPC.

You said I claimed an RTX 2060 is better than a PS5. Feel free to quote where I did that. I said a Max-Q outperformed the PS5 in the UE5 demo. I also posted the actual quote where I specifically said it may very well not be the case.

The only thing I was off about is the Max-Q vs RTX 2080 mobile. I already explained how that happened: I didn't even look at the Max-Q's specs, I just assumed it was another brand name. If you look at all the quotes, not even the cows challenged the Max-Q part, because they also assumed it was just a brand name for another line of laptops. They were all jumping up and down about the engineer supposedly talking about the video, or the PS5 demo supposedly running at a higher FPS, etc.

Also, not really related to our argument, but the UE5 demo could genuinely have performed better on a Max-Q than on the PS5. Maybe it scales better with core count than clock speed; it all depends on the workload.

Lastly, if I wasn't right, can you prove it wasn't a Max-Q either? Looking at notebook listings, Max-Q variants also come under the 'RTX 2080 mobile' label. How was I the only one spreading misinformation when you, on the contrary, also claimed it wasn't a Max-Q?


#184 WeRVenom
Member since 2020 • 479 Posts

@pc_rocks said:
@wervenom said:
@pc_rocks said:
@wervenom said:

Just telling you what it said: that it could go well above 30.

I also didn't claim games couldn't run at an uncapped framerate. I said that when they use VSync (which they did), it will only run at 30 or 60. My guess is they couldn't get it to 60, and the framerate was variable, so they locked it at 30.

The 5700 doesn't have RT, but the PS5 supposedly has it in every CU. I will be curious to see the RT performance myself. But saying a laptop beats a PS5 based off a quick blurb from an engineer is disingenuous. Were the settings the same? Was it the exact same demo? These are things I would ask myself before making that claim. And I would also not base all performance expectations on one demo.

And I also debunked that claim. There was no need to drop resolution if it could go above 30; the resolution would have been locked. Also, he said frame time, not frame rate. He's being as vague as he can.

So you don't know about the RT/ML capabilities of the PS5, yet you like to state that it's better than whatever RTX GPU you're putting it against? Who's disingenuous now? You're claiming the PS5 should be better than a 5700 and, in the same breath, telling me it has RT capabilities. RT doesn't come free; it would have taken a chunk of the die size.

As for you now changing the argument to settings and unknown variables: do you know how it compared to the PS5 demo? If you do, then enlighten me; if you don't, why should I give you the benefit of the doubt over that engineer? He was specifically talking about the demo, so it's reasonable to assume everything was the same, or he wouldn't have talked about 40+ FPS. It doesn't make any sense otherwise.

Nevertheless, if you want to prove something, take it up with Epic to say one of the following:

1) The engineer in the video was wrong and the demo didn't run at 40+ FPS on the laptop

2) Or the official press release was wrong about the demo running at a variable resolution

Oh, I'm not saying that the PS5 might not be able to outperform the 5700 or Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said are just laughable and don't make any sense. At this point in time, based on what he said, the PS5 is beaten by a year-old laptop in the UE5 demo until proven otherwise.

I'm not sure how the claim was debunked. I'm also not discrediting the engineer. But there were many assumptions made off a fairly vague translation. But we will have more information soon enough.

Pretty simple. He's making conflicting arguments. Saying it could go above 30 FPS is factually false when the official press release said the demo ran at up to 1440p. Why was it dropping resolution if it could go above 30 FPS?

That's basic resource management.

For sure, there are many assumptions about it, but all of those pale in comparison to the DC used to discredit what he said. If there were an objective way to discredit it, that would have been stated by now, instead of the video being taken down with stupid explanations like it being about 'video quality', or a video that ran at 40+ FPS, or this nonsense of 30FPS+ while dropping resolution. At least stick to one argument for consistency. You guys are jumping all over the place.

Ok, so it's safe to say you think a 2060 will outperform a PS5. I'll bookmark this and we'll come back to it later.
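For anyone following the dynamic-resolution dispute above: a 30FPS-capped game with dynamic resolution drops render resolution precisely when the GPU can't hold the frame budget at full resolution, while the output still presents at 30FPS. A minimal sketch of such a scaler (all thresholds and names are hypothetical, purely illustrative):

```python
# Minimal sketch of dynamic resolution scaling under a 30FPS VSync cap.
# If the last GPU frame time gets too close to the 33.3ms budget, the
# render height drops; with comfortable headroom it climbs back toward
# the 1440p ceiling. Thresholds are made up for illustration.

TARGET_MS = 1000 / 30            # ~33.3ms budget for a 30FPS cap
MAX_HEIGHT, MIN_HEIGHT = 1440, 1080

def next_height(height: int, gpu_frame_ms: float) -> int:
    """Pick the render height for the next frame from the last GPU time."""
    if gpu_frame_ms > TARGET_MS * 0.95:      # too close to budget: scale down
        height = int(height * 0.9)
    elif gpu_frame_ms < TARGET_MS * 0.80:    # plenty of headroom: scale up
        height = int(height * 1.05)
    return max(MIN_HEIGHT, min(MAX_HEIGHT, height))

print(next_height(1440, 34.0))  # heavy frame: drops below 1440 -> 1296
print(next_height(1296, 20.0))  # light frame: recovers toward 1440 -> 1360
```

Under this kind of loop, "it dropped below 1440p" and "it could sometimes exceed 30FPS internally" are not contradictory claims; it depends entirely on where the heavy frames land.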


#185 NfamousLegend
Member since 2016 • 1037 Posts

Are people now arguing over whether or not a 2080 Max-Q is better than a PS5/XSX? If that's the case, lol: a 2080 Max-Q is not even comparable to a vanilla 5700, and given the rumored IPC gains and clock speeds of RDNA2, it's not even in the same plane performance-wise. Even the recently revealed 2080 Super Max-Q is nothing impressive, and all for $3,000. Get yours today.


#186 Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

You said I claimed an RTX 2060 is better than a PS5. Feel free to quote where I did that. I said a Max-Q outperformed the PS5 in the UE5 demo. I also posted the actual quote where I specifically said it may very well not be the case.

The only thing I was off about is the Max-Q vs RTX 2080 mobile. I already explained how that happened: I didn't even look at the Max-Q's specs, I just assumed it was another brand name. If you look at all the quotes, not even the cows challenged the Max-Q part, because they also assumed it was just a brand name for another line of laptops. They were all jumping up and down about the engineer supposedly talking about the video, or the PS5 demo supposedly running at a higher FPS, etc.

Also, not really related to our argument, but the UE5 demo could genuinely have performed better on a Max-Q than on the PS5. Maybe it scales better with core count than clock speed; it all depends on the workload.

Lastly, if I wasn't right, can you prove it wasn't a Max-Q either? Looking at notebook listings, Max-Q variants also come under the 'RTX 2080 mobile' label. How was I the only one spreading misinformation when you, on the contrary, also claimed it wasn't a Max-Q?

It's not a Max-Q because none of the people who watched the video and understood what he was saying reported hearing anything about a Max-Q. The only source we have is some third-party guy on a Chinese forum claiming he asked the engineer about his specs and was told it was an RTX 2080 mobile. That's all we've got. Saying it's a 2080 Max-Q is as valid as saying it's a 2080 Ti, i.e. nothing except the '2080' part was present in the post about the specs. Furthermore, from the specs alone we KNOW the PS5 will destroy a 2060, no ifs or buts.

As for the first line: the RTX 2080 Max-Q's performance profile lands it in the ballpark of a desktop 2060, and you seemed to be buying that and even used it to mock tormentos and this other guy because of it.


#187 Uruz7laevatein
Member since 2009 • 160 Posts

I see, somebody's nonexistent manhood felt threatened by a gaming console......... ahahahahaha


#188  Edited By tormentos
Member since 2003 • 33798 Posts

@pc_rocks:

Post proof of the PS5 being beaten by a 1-year-old $3,000 laptop.🤣🤣🤣

Go ahead, I'll wait.


#189 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

Wow, never seen a meltdown as bad as this. Move on chump, you are getting destroyed.


#190 PC_Rocks
Member since 2018 • 8611 Posts

@wervenom said:
@pc_rocks said:

Pretty simple. He's making conflicting arguments. Saying it could go above 30 FPS is factually false when the official press release said the demo ran at up to 1440p. Why was it dropping resolution if it could go above 30 FPS?

That's basic resource management.

For sure, there are many assumptions about it, but all of those pale in comparison to the DC used to discredit what he said. If there were an objective way to discredit it, that would have been stated by now, instead of the video being taken down with stupid explanations like it being about 'video quality', or a video that ran at 40+ FPS, or this nonsense of 30FPS+ while dropping resolution. At least stick to one argument for consistency. You guys are jumping all over the place.

Ok, so it's safe to say you think a 2060 will outperform a PS5. I'll bookmark this and we'll come back to it later.

No, that's you putting words in my mouth because you know you have no leg to stand on and all your arguments so far have been debunked. Do bookmark it, but I would also like a quote of where I said this.


#191  Edited By PC_Rocks
Member since 2018 • 8611 Posts
@Juub1990 said:

It's not a Max-Q because none of the people who watched the video and understood what he was saying reported hearing anything about a Max-Q. The only source we have is some third-party guy on a Chinese forum claiming he asked the engineer about his specs and was told it was an RTX 2080 mobile. That's all we've got. Saying it's a 2080 Max-Q is as valid as saying it's a 2080 Ti, i.e. nothing except the '2080' part was present in the post about the specs. Furthermore, from the specs alone we KNOW the PS5 will destroy a 2060, no ifs or buts.

As for the first line: the RTX 2080 Max-Q's performance profile lands it in the ballpark of a desktop 2060, and you seemed to be buying that and even used it to mock tormentos and this other guy because of it.

That's precisely my point: neither you nor I know whether it was a Max-Q or not. So your argument that I was spreading misinformation is as wrong as your claim that it wasn't a Max-Q. The only thing I did wrong was claim it was a Max-Q.

Ah, no. Nowhere did I compare the Max-Q to a 2060, or even say it has the same performance profile. There's not a single post of mine saying that. I'm still mocking this other guy and tormentos over a laptop beating their precious PS5 in the UE5 demo, which is indeed correct. You see, even they didn't know the Max-Q's performance profile is close to a 2060's until you said so. They never once raised an objection about Max-Q vs mobile 2080. All they are really mad about is the PS5 being weaker than whatever laptop it was, and they contest that based on stupid/misunderstood tweets from Sweeney.

As I said, it could very well have been a Max-Q for this particular demo. Lumen most probably works with voxel cone/ray tracing plus temporal accumulation. That workload needs high bandwidth and, for the same TFLOPs, scales well with core count. The PS5 is definitely at a disadvantage with its clock-biased approach.

Lastly, saying you KNOW the PS5 will destroy the 2060 based on specs is as bad as you accusing me of saying the 2060 will be better than the PS5. Neither you nor I know the RT/DX12U capabilities of the PS5 or even RDNA2. And if you do know them, feel free to link it and I'll happily admit you're right.

So yeah, still waiting for a quote of me claiming the 2060 is better than the PS5, from both you and wervenom. Especially when I already said this well before the 2060 even appeared in any of the arguments:

Oh, I'm not saying that the PS5 might not be able to outperform the 5700 or Max-Q. It very well could, but the mental gymnastics to discredit what the Epic engineer said are just laughable and don't make any sense.
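For reference on the clocks-vs-CUs point: peak compute for the GPUs being argued about can be sketched with the standard RDNA formula, CUs × 64 shaders × 2 ops (FMA) × clock. The clocks below are the publicly stated peak figures (the PS5's is a variable boost ceiling, so sustained numbers may be lower):

```python
# Peak FP32 compute for the RDNA-class GPUs in this argument.
# Clocks are published peak figures, treated here as assumptions;
# sustained clocks (especially the PS5's variable clock) can be lower.

def rdna_tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPs: CUs x 64 shaders x 2 ops (FMA) x clock in GHz."""
    return cus * 64 * 2 * clock_ghz / 1000

GPUS = {
    "PS5 (36 CU @ up to 2.23 GHz)": (36, 2.23),
    "RX 5700 XT (40 CU @ ~1.905 GHz boost)": (40, 1.905),
    "XSX (52 CU @ 1.825 GHz fixed)": (52, 1.825),
}

for name, (cus, clock) in GPUS.items():
    print(f"{name}: {rdna_tflops(cus, clock):.2f} TFLOPs")
```

The formula makes the trade-off explicit: the PS5 reaches its number mostly through clock, the XSX mostly through CU count, so a workload that scales with core count rather than frequency would favor the wider design at equal TFLOPs.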


#192  Edited By PC_Rocks
Member since 2018 • 8611 Posts
@tormentos said:

@pc_rocks:

Post proof of the PS5 being beaten by a 1-year-old $3,000 laptop.🤣🤣🤣

Go ahead, I'll wait.

Don't have to; the Epic engineer did it for me. And most definitely not when you have changed your argument four times in a row, owning yourself each time. ROFLMAO!

For the 55th time: where are the sources for GDDR being better than DDR for CPUs? Still waiting after two months.

Your precious yet-to-be-released PS5 is being beaten by a year-old laptop in the UE5 demo. Cry more!


#193  Edited By Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

That's precisely my point: neither you nor I know whether it was a Max-Q or not. So your argument that I was spreading misinformation is as wrong as your claim that it wasn't a Max-Q. The only thing I did wrong was claim it was a Max-Q.

Ah, no. Nowhere did I compare the Max-Q to a 2060, or even say it has the same performance profile. There's not a single post of mine saying that. I'm still mocking this other guy and tormentos over a laptop beating their precious PS5 in the UE5 demo, which is indeed correct. You see, even they didn't know the Max-Q's performance profile is close to a 2060's until you said so. They never once raised an objection about Max-Q vs mobile 2080. All they are really mad about is the PS5 being weaker than whatever laptop it was, and they contest that based on stupid/misunderstood tweets from Sweeney.

As I said, it could very well have been a Max-Q for this particular demo. Lumen most probably works with voxel cone/ray tracing plus temporal accumulation. That workload needs high bandwidth and, for the same TFLOPs, scales well with core count. The PS5 is definitely at a disadvantage with its clock-biased approach.

Lastly, saying you KNOW the PS5 will destroy the 2060 based on specs is as bad as you accusing me of saying the 2060 will be better than the PS5. Neither you nor I know the RT/DX12U capabilities of the PS5 or even RDNA2. And if you do know them, feel free to link it and I'll happily admit you're right.

There is nothing supporting that it is a Max-Q. Nothing AT ALL. Meanwhile, there is a guy purportedly in the know about the specs who said it was a mobile 2080, and the performance of the mobile 2080 is roughly in the ballpark of the PS5's GPU. So I have A LOT more to go on than you do.

And don't be obtuse. The 5700XT destroys the 2060, and the PS5 has similar specs with 4 disabled CUs but much higher clocks, so yes, we know. Don't be an idiot.

Furthermore, yes, you were spreading misinformation, because you were the one going around saying a 2080 Max-Q beat a PS5 in UE5. I wasn't going around saying anything.


#194 PC_Rocks
Member since 2018 • 8611 Posts

@Uruz7laevatein said:

I see, somebody's nonexistent manhood felt threatened by a gaming console......... ahahahahaha

I commend your bravery. It takes guts to come out and admit that. I truly feel sorry for your loss, (not)man.


#195  Edited By PC_Rocks
Member since 2018 • 8611 Posts

@Juub1990 said:

There is nothing supporting that it is a Max-Q. Nothing AT ALL. Meanwhile, there is a guy purportedly in the know about the specs who said it was a mobile 2080, and the performance of the mobile 2080 is roughly in the ballpark of the PS5's GPU. So I have A LOT more to go on than you do.

And don't be obtuse. The 5700XT destroys the 2060, and the PS5 has similar specs with 4 disabled CUs but much higher clocks, so yes, we know. Don't be an idiot.

Furthermore, yes, you were spreading misinformation, because you were the one going around saying a 2080 Max-Q beat a PS5 in UE5. I wasn't going around saying anything.

Repeating yourself doesn't make it true. You claimed I was spreading misinformation by saying the 2060 is/will be better than the PS5. You have yet to provide proof of that.

The Max-Q part being wrong has already been addressed by me.

If there is nothing supporting that it was a Max-Q, then there's nothing supporting that it wasn't, either. And you're still asserting that in this very post, so you are indeed also spreading misinformation. If you want to claim something like that, then put IMO after it.

What are the 5700XT's ML/RT/DX12U capabilities? Who's the one now claiming things he doesn't know? Regardless, the 5700XT destroying the 2060 has nothing to do with anything I said, so I don't know why you keep bringing it up. I never once used the 5700/XT or the 2060 in any of my arguments. If you are claiming that I did, then provide proof of that too.


#196 Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

Keep repeating your self doesn't make it true. You claimed two things that I was spreading misinformation by saying 2060 is/will be better than PS5. You have yet to provide the proof of that.

MaxQ being wrong part has already been addressed by me.

If there was nothing supporting it that it was MaxQ then there's nothing supporting it that it wasn't either. And you're still saying that in this very post so you indeed are also spreading misinformation. If you want to claim something like that then put IMO after it.

What are the 5700XT's MT/RT/DX12U capabilities? Who's the one now claiming things he doesn't know. Regardless 5700XT destroying 2060 has nothing to do with whatever I said so I don't know why you keep bringing that up.Never once used 5700/XT or 2060 in any of my arguments at all.

Stop wasting my fucking time with your dumbshit stonewalling. There is evidence supporting it is NOT a 2080 Max-Q. I was NOT the one claiming anything YOU were and YOU asked me to refute YOUR claims which I did and now you go "hur hur, there is nothing proving the opposite either". News flash cretin, if you make a claim you have to back it up, which you fucking didn't.

I'm not spreading misinformation because I'm not fucking going around saying a 2080 mobile beat a PS5. I corrected your utter bullshit by refuting it.

Now stop fucking deflecting.


#197  Edited By PC_Rocks
Member since 2018 • 8611 Posts

@Juub1990 said:

Stop wasting my fucking time with your dumbshit stonewalling. There is evidence supporting it is NOT a 2080 Max-Q. I was NOT the one claiming anything YOU were and YOU asked me to refute YOUR claims which I did and now you go "hur hur, there is nothing proving the opposite either". News flash cretin, if you make a claim you have to back it up, which you fucking didn't.

I'm not spreading misinformation because I'm not fucking going around saying a 2080 mobile beat a PS5. I corrected your utter bullshit by refuting it.

Now stop fucking deflecting.

Feel free to quote the point where I deflected. I'll be waiting.

Let me again quote what you claimed: "yes, we have PC_Rocks claiming that" (referring to the 2060 being better than the PS5). Still waiting for the quote.

Where is the evidence specifically stating it was not a Max-Q? Not what YOU THINK, or some comparison of specs, or blah blah. If there isn't any, then you're the one who's deflecting and spreading misinformation even now.

I did back up all my claims. You keep going back to the word 'Max-Q' over and over, even after I admitted that part was wrong, just to hide the fact that every other accusation of yours is false.

Oh, and if you're claiming that a non-Max-Q 2080 didn't beat the PS5 in the UE5 demo (referring to the bolded part above), then provide proof of that as well.

Now, let's see if you can back up your claims.


#198 I_own_u_4ever
Member since 2020 • 647 Posts

If the PS5 SSD is so amazing, then why was the UE5 demo 30fps at 1440p?


#200 Fedor
Member since 2015 • 11829 Posts

I just came to reiterate:

3060 > (maybe =) XSX = 2080 > PS5 = 2070 S

Carry on with the laptop arguments.