3080 > XSX > 2080 > 2070 > PS5


#51 04dcarraher
Member since 2004 • 23858 Posts

@pc_rocks said:
@04dcarraher said:
@pc_rocks said:

So, nothing at all. I've said it countless times that Gears 5 isn't the best benchmark to judge it by, and that too under controlled settings.

It was nearly a straight copy-and-paste job from the X1X, and the XSX GPU was able to perform as well as a 2080 did at the same settings. It's not hard to believe: we know a 40 CU RX 5700 XT can compete with an RTX 2070, so why is it hard to believe that a 52 CU RDNA 2 GPU would match or beat a 2080/2080S? We are talking about roughly 30% more CUs without even taking architecture improvements into account. A 52 CU RX 5700 would be in 2080 range as is.

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Gears 5 wasn't even the most demanding game on PC. RDR 2 would have been a far better candidate to judge it.

Not saying it's impossible just there's not enough data on either side.

In pure rasterization performance we can make an educated guess about what to expect, and the Gears 5 port is a good peek at that.

The only RT performance info we have is the Minecraft RT demo. At the same time, AMD said in one of its RDNA 2 slides "select lighting effects for real time gaming", which tells me AMD's RT will handle a single effect at a time, just like current RTX cards do, rather than multiple RT effects at once. But I suspect AMD's solution won't stall its shader processors the way RTX does, leaving them under-utilized when RT is on, which Ampere is supposed to fix.


#52 Gatygun
Member since 2010 • 2709 Posts

@i_own_u_4ever said:

Simple breakdown in power and what to expect from PC and consoles.

https://www.windowscentral.com/xbox-series-x-gpu-may-be-more-powerful-nvidia-geforce-rtx-2080-super-and-other-pc-gpus#:~:text=Microsoft%20revealed%20that%20the%20Xbox,that%20should%20also%20be%20considered.

https://www.tomsguide.com/news/ps5-performance-could-be-trounced-by-a-decent-gaming-pc

That Windows Central guy is completely clueless. He honestly thinks a 2080 Super is an 11 TFLOP card, compares it to AMD TFLOPs as if they were the same architecture, and then just asserts RDNA 2 = better performance, because reasons.

What a joke.

DF tested the box and it seems to be around 2080 performance, which is probably, as rumors suggest, 3060 territory. The 3060 will most likely also have far better ray tracing.


#53 Juub1990
Member since 2013 • 12622 Posts

Once again, unless the 3060's RT demolishes the 2080 Ti's, I don't see it matching it. When did a 60-series card outperform the 80 Ti from the generation prior? That's usually the job of the 70. People might be disappointed if they expect a 3060 to beat an SX in raster performance.


#54 ellos
Member since 2015 • 2532 Posts

A good speculation?

[embedded video]


#55 R4gn4r0k
Member since 2004 • 49125 Posts

@Juub1990 said:

I know you guys are semi-trolling, but let's not pretend this is because of the SX. That's a design decision enforced by Ubisoft. A 2080-class GPU is nothing to scoff at and the machine overall is beautifully engineered (except for that split RAM, which makes me grimace).

What's this split RAM you mentioned?

AFAIK the current consoles, Xbox One and PS4, also use split RAM between GPU and CPU.


#56 Zaryia
Member since 2016 • 21607 Posts

@ocinom said:
@zaryia said:
@ocinom said:
@i_own_u_4ever said:
@ocinom said:

Lame. PS4 and Xbox One launch titles are optimized from the start of the gen. Therefore XSX is not > 2080.

You really have no idea what you are saying. Consoles nowadays get updates, and yes, they get performance updates throughout the life cycle of the system. You're either trolling really badly or genuinely have no idea what you're talking about.

No amount of console patches can make a game go from 30fps to 60fps. XSX is not > 2080.

Ugh I can't imagine running at console fps.

Imagine having the latest 4k HDR 240hz monitor just to play a game that can't even run beyond 50fps. LOL

34" 1440p Ultrawide at 120+ fps ;-)

Or I can do 30fps on low settings with consoles. Sounds cool, if I like shitty stuff.


#57 PC_Rocks
Member since 2018 • 8611 Posts

@04dcarraher said:

In pure rasterization performance we can make an educated guess about what to expect, and the Gears 5 port is a good peek at that.

The only RT performance info we have is the Minecraft RT demo. At the same time, AMD said in one of its RDNA 2 slides "select lighting effects for real time gaming", which tells me AMD's RT will handle a single effect at a time, just like current RTX cards do, rather than multiple RT effects at once. But I suspect AMD's solution won't stall its shader processors the way RTX does, leaving them under-utilized when RT is on, which Ampere is supposed to fix.

I don't think that's possible. The whole point of RT cores was to accelerate the ray/triangle intersection tests and BVH traversal. It's not that RT couldn't be done using compute, it's just slower. Whether you have a separate RT section on the die or bundle it inside the CUDA cores/CUs, you are still dedicating silicon to it that will sit stalled unless they really pull a rabbit out of their hats.
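
To make that concrete, here is a minimal, hypothetical CPU-side sketch (a standalone illustration, not code from any actual engine or driver) of the Moller-Trumbore ray/triangle test that RT cores run in fixed-function hardware; doing this per ray on the general shader ALUs is exactly the slower compute fallback being described:

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y,
                                             a.z * b.x - a.x * b.z,
                                             a.x * b.y - a.y * b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Moller-Trumbore: returns the hit distance t along the ray, or nothing on a miss.
// RT cores perform this kind of triangle test (plus BVH box tests) in dedicated
// hardware; running it on the general shader ALUs is what makes a pure compute
// fallback slower.
std::optional<float> rayTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    const float eps = 1e-7f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < eps) return std::nullopt;   // ray parallel to the triangle
    float inv = 1.0f / det;
    Vec3 t0 = sub(orig, v0);
    float u = dot(t0, p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;   // outside barycentric range
    Vec3 q = cross(t0, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    float t = dot(e2, q) * inv;
    return t > eps ? std::optional<float>(t) : std::nullopt;
}
```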

@R4gn4r0k said:

What's the split RAM about that you mentioned?

AFAIK the current consoles, Xbox One and PS4, also use split RAM between GPU and CPU.

No. Current consoles don't have split RAM, unless you count the Pro's 1 GB of DDR3 dedicated to the OS and the original X1's 32MB of ESRAM, which acts like a fast cache for the GPU.

The XSX doesn't have split RAM either. It's just that one portion of the RAM is faster while the other is slower. As long as you can fit your assets in the faster pool you're fine, but you see decreased performance when you exceed it. Kinda like the GTX 970's 0.5GB.
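
For illustration, a hypothetical sketch of the "fits in the faster pool" idea (the allocator and all names are made up; only the 10 GB + 6 GB split mirrors the published Series X figures):

```cpp
#include <cstddef>
#include <new>

// Hypothetical bump allocator over two memory segments of different bandwidth.
struct TwoSpeedHeap {
    enum class Segment { Fast, Slow };

    std::size_t fastCapacity, slowCapacity;
    std::size_t fastUsed = 0, slowUsed = 0;

    // Place an allocation in the fast segment if it fits, otherwise spill to the
    // slow one -- the "fine until your assets exceed the fast pool" situation.
    Segment allocate(std::size_t bytes) {
        if (fastUsed + bytes <= fastCapacity) { fastUsed += bytes; return Segment::Fast; }
        if (slowUsed + bytes <= slowCapacity) { slowUsed += bytes; return Segment::Slow; }
        throw std::bad_alloc();
    }
};

int main() {
    constexpr std::size_t GiB = 1ull << 30;
    TwoSpeedHeap heap{10 * GiB, 6 * GiB};   // 10 GB fast pool + 6 GB slower pool
    heap.allocate(8 * GiB);                 // render targets/textures land in the fast pool
    auto where = heap.allocate(4 * GiB);    // overflow: this allocation spills to the slow pool
    return where == TwoSpeedHeap::Segment::Slow ? 0 : 1;
}
```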


#58  Edited By WeRVenom
Member since 2020 • 479 Posts

@pc_rocks said:
@04dcarraher said:
@pc_rocks said:

So, nothing at all. I've said it countless times that Gears 5 isn't the best benchmark to judge it by, and that too under controlled settings.

It was nearly a straight copy-and-paste job from the X1X, and the XSX GPU was able to perform as well as a 2080 did at the same settings. It's not hard to believe: we know a 40 CU RX 5700 XT can compete with an RTX 2070, so why is it hard to believe that a 52 CU RDNA 2 GPU would match or beat a 2080/2080S? We are talking about roughly 30% more CUs without even taking architecture improvements into account. A 52 CU RX 5700 would be in 2080 range as is.

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Gears 5 wasn't even the most demanding game on PC. RDR 2 would have been a far better candidate to judge it.

Not saying it's impossible just there's not enough data on either side.

They never really said it ran better than the PS5. The PS5 demo was capped at 30; Tim Sweeney said it could go well above that. But I have never seen them cap a game at 40 on consoles; it's either 30 or 60.


#59  Edited By Bluestars
Member since 2019 • 2789 Posts

@i_own_u_4ever:

Bu bu but pc gamers say the xbsx will be mid gen when the new card comes out

Mid gen my hoop

HAH


#60 04dcarraher
Member since 2004 • 23858 Posts

@pc_rocks said:
@04dcarraher said:

In pure rasterization performance we can make an educated guess about what to expect, and the Gears 5 port is a good peek at that.

The only RT performance info we have is the Minecraft RT demo. At the same time, AMD said in one of its RDNA 2 slides "select lighting effects for real time gaming", which tells me AMD's RT will handle a single effect at a time, just like current RTX cards do, rather than multiple RT effects at once. But I suspect AMD's solution won't stall its shader processors the way RTX does, leaving them under-utilized when RT is on, which Ampere is supposed to fix.

I don't think that's possible. The whole point of RT cores was to accelerate the ray/triangle intersection tests and BVH traversal. It's not that RT couldn't be done using compute, it's just slower. Whether you have a separate RT section on the die or bundle it inside the CUDA cores/CUs, you are still dedicating silicon to it that will sit stalled unless they really pull a rabbit out of their hats.

@R4gn4r0k said:

What's this split RAM you mentioned?

AFAIK the current consoles, Xbox One and PS4, also use split RAM between GPU and CPU.

No. Current consoles don't have split RAM, unless you count the Pro's 1 GB of DDR3 dedicated to the OS and the original X1's 32MB of ESRAM, which acts like a fast cache for the GPU.

The XSX doesn't have split RAM either. It's just that one portion of the RAM is faster while the other is slower. As long as you can fit your assets in the faster pool you're fine, but you see decreased performance when you exceed it. Kinda like the GTX 970's 0.5GB.

What I'm talking about isn't the size, or how much room is given over to dedicated RT hardware, or whether they're going to utilize the CUs alongside it... but the fact that their architecture may not stall shader cores the way the RTX 2000 series does, which is why performance tanks so much in games on current RTX cards. Nvidia's Ampere is supposed to fix the low GPU usage on the "normal" shader processors while RT is on.

Also, the XSX's two-speed RAM config will not have the same effect as the GTX 970's 512MB pool having only 24GB/s.
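
A rough sense of scale, using the commonly published figures (assumed here, not measured), shows why the two cases differ:

```cpp
#include <cstdio>

int main() {
    // Xbox Series X: 10 GB @ 560 GB/s and 6 GB @ 336 GB/s (published figures).
    const double xsxFast = 560.0, xsxSlow = 336.0;
    // GTX 970: ~3.5 GB @ ~196 GB/s main partition; the 512MB segment is usually
    // quoted in the mid-20s of GB/s (the post above says 24 GB/s).
    const double g970Fast = 196.0, g970Slow = 24.0;

    std::printf("XSX slow pool: %.0f%% of its fast pool\n",
                100.0 * xsxSlow / xsxFast);        // roughly 60%
    std::printf("970 slow segment: ~%.0f%% of its main partition\n",
                100.0 * g970Slow / g970Fast);      // roughly 12%
    return 0;
}
```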


#61 DragonfireXZ95
Member since 2005 • 26716 Posts

@fedor said:
@i_own_u_4ever said:
@04dcarraher said:
@i_own_u_4ever said:
@ocinom said:

Ubisoft: Assassin's Creed Valhalla will hit "at least" 30 FPS at 4K resolutions on the Xbox Series X

XSX>2080 my ass

Get real, the new AC game is going to be cross-gen, so the base PS4 and Xbox One need to run it. Use better common sense.

You should use common sense... all that extra horsepower and they'll only target 4K 30fps? With all that unused power they could easily have targeted 60fps, unless the game is an un-optimized mess.

Wow, you really are that clueless. First of all, we don't have final confirmation on any of this yet. Also, devs aren't going to commit millions of extra dollars in development for cross-gen games just to boost one version to 60fps. It's not likely just based on the money and dev time needed. Wait about a year and pretty much all games will be running at 60fps on both PS5 and XSX.

Use common sense.

Right, that's the opposite of common sense. Better video cards will be out and devs will focus on going graphics first, framerate second on the consoles just to keep the visual parity somewhat close to the PC versions.


#62  Edited By Juub1990
Member since 2013 • 12622 Posts
@R4gn4r0k said:

What's this split RAM you mentioned?

AFAIK the current consoles, Xbox One and PS4, also use split RAM between GPU and CPU.

See below.

@pc_rocks said:

The XSX doesn't have split RAM either. It's just that one portion of the RAM is faster while the other is slower. As long as you can fit your assets in the faster pool you're fine, but you see decreased performance when you exceed it. Kinda like the GTX 970's 0.5GB.

That's exactly what split RAM is.


#63 Fedor
Member since 2015 • 11829 Posts

@bluestars said:

@i_own_u_4ever:

Bu bu but pc gamers say the xbsx will be mid gen when the new card comes out

Mid gen my hoop

HAH

Because it will be mid range.


#64 Mozelleple112
Member since 2011 • 11293 Posts

Well, the RTX 2070 Super costs more than an entire PS5 will, so not really surprising. Now factor in an 8-core CPU, 4K Blu-ray, M.2 NVMe SSD, etc. Great value.


#65  Edited By I_own_u_4ever
Member since 2020 • 647 Posts

@04dcarraher said:
@i_own_u_4ever said:
@04dcarraher said:
@i_own_u_4ever said:

Sigh, why do I even bother with you. Not if the devs chose from the start of development to offer higher fidelity and more graphical effects, and designed the game with 30fps in mind. Could devs possibly offer 60fps on the XSX? Possibly; this is why Ubisoft has said they haven't finished development yet. But again, if the devs went with 30fps in mind from the onset, because it's cross-gen and they wanted to go for higher fidelity, then that's what we are likely getting.

lol... really? If they are making a cross-platform game they will be using the same assets as the PC version. And yet the PC isn't limited to a 30fps cap... nor should the XSX be. There are only a few reasons why they would stick to 30fps:

1. Keeping parity across the consoles.

2. Shoddy work, creating an un-optimized version for XSX/PS5 that doesn't leverage the unused potential.

You are clueless; you have no idea about development, money and resources.

Apparently you don't...

Because if you did, you would know the Xbox and PC share many of the same development tools and other aspects, so the extra work and money needed is negligible. They are only dealing with three specs when it comes to Xbox and three with Sony's consoles, versus the PC's slew of configs.

For the last time, your troll game is really weak. If the engine was built around 30fps with higher fidelity, then that's what they will do, because in the console world most people don't care; only the hardcore even know what it's like to have 60fps. Don't question my tech knowledge, I'm far beyond you; I used to build PCs and gaming PCs for work. This is why I'm not going to get dragged into a weak troll rant with you, because you're butthurt that your PS5 is getting stomped by the XSX.


#66  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@i_own_u_4ever said:

For the last time, your troll game is really weak. If the engine was built around 30fps with higher fidelity, then that's what they will do, because in the console world most people don't care; only the hardcore even know what it's like to have 60fps. Don't question my tech knowledge, I'm far beyond you; I used to build PCs and gaming PCs for work. This is why I'm not going to get dragged into a weak troll rant with you, because you're butthurt that your PS5 is getting stomped by the XSX.

And yet the engine isn't capped on PC... it's only capped because the current consoles can't handle it. The fact that they're still keeping the cap at 30fps is a slap in the face for anyone who buys the game and plays it on XSX or PS5. 4K 30fps is a joke on high-end hardware that could handle the workload the current consoles can't. It would be better to use X1X settings and run at 60fps than to use full ultra settings for minor improvements; 60fps would change the whole feel of the game for the better.

lol at "questioning your tech knowledge" when the whole premise of this thread is wrong lol

The PS5 is going to be stronger than a 2070 or 2070 Super...


#67 masonshoemocker
Member since 2003 • 740 Posts

I just wanted to say that prices in the GPU market are ridiculous right now.


#68 I_own_u_4ever
Member since 2020 • 647 Posts

@04dcarraher said:
@i_own_u_4ever said:

For the last time, your troll game is really weak. If the engine was built around 30fps with higher fidelity, then that's what they will do, because in the console world most people don't care; only the hardcore even know what it's like to have 60fps. Don't question my tech knowledge, I'm far beyond you; I used to build PCs and gaming PCs for work. This is why I'm not going to get dragged into a weak troll rant with you, because you're butthurt that your PS5 is getting stomped by the XSX.

And yet the engine isn't capped on PC... it's only capped because the current consoles can't handle it. The fact that they're still keeping the cap at 30fps is a slap in the face for anyone who buys the game and plays it on XSX or PS5. 4K 30fps is a joke on high-end hardware that could handle the workload.

No, it's likely that to save money and time they are just going to keep parity across the board on consoles.


#69 Juub1990
Member since 2013 • 12622 Posts
@Mozelleple112 said:

Well, the RTX 2070 Super costs more than an entire PS5 will, so not really surprising. Now factor in an 8-core CPU, 4K Blu-ray, M.2 NVMe SSD, etc. Great value.

Yeah, but its replacement, the RTX 3060, which will most likely beat it, will probably retail for $299-$349. Still great value, but you'll likely be able to build a PC that outclasses the PS5 for about $800 or so. Which is still a lot of money all things considered, but better than the $1200-ish you'd need today.


#70 R4gn4r0k
Member since 2004 • 49125 Posts

@pc_rocks said:

@R4gn4r0k said:

What's this split RAM you mentioned?

AFAIK the current consoles, Xbox One and PS4, also use split RAM between GPU and CPU.

No. Current consoles don't have split RAM, unless you count the Pro's 1 GB of DDR3 dedicated to the OS and the original X1's 32MB of ESRAM, which acts like a fast cache for the GPU.

The XSX doesn't have split RAM either. It's just that one portion of the RAM is faster while the other is slower. As long as you can fit your assets in the faster pool you're fine, but you see decreased performance when you exceed it. Kinda like the GTX 970's 0.5GB.

Maybe it was the Xbox 360 and PS3 that had shared RAM between CPU and GPU, or maybe I'm just misremembering lol.

Anyway thank you for the explanation. Wasn't really too thrilled that the GTX 970 did that, but still a banging GPU for its price.


#71  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@i_own_u_4ever said:

No it's likely because to save money and time they are just going to keep it parity across the board on console.

Saving money... is an excuse; it takes hardly any effort to port from X1X to XSX. Hell, Epic ported Gears 5 over in a demo within a week with little to no optimization and had it performing on par with a 2080... Yet you repeated what I suggested before with "parity"...


#72  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@R4gn4r0k said:

Maybe it was the Xbox 360 and PS3 that had shared RAM between CPU and GPU, or maybe I'm just misremembering lol.

Anyway thank you for the explanation. Wasn't really too thrilled that the GTX 970 did that, but still a banging GPU for its price.

All but one of the consoles from 2005 onward use a shared memory pool between the CPU and GPU. With the PS4 Pro they included that 1GB to offload more resources off the 8GB pool, because 8GB shared between everything is getting a bit thin these days.


#73 xantufrog  Moderator
Member since 2013 • 17898 Posts

@bluestars: midRANGE, not mid-gen - and if you had bothered following the conversation above, yes - the debate is between 3060 and 3070 performance, which is midrange.


#74 PC_Rocks
Member since 2018 • 8611 Posts

@Juub1990 said:

@pc_rocks said:

The XSX doesn't have split RAM either. It's just that one portion of the RAM is faster while the other is slower. As long as you can fit your assets in the faster pool you're fine, but you see decreased performance when you exceed it. Kinda like the GTX 970's 0.5GB.

That's exactly what split RAM is.

Nope. Split RAM is where the CPU or GPU can't access one of the pools. Here both the CPU and GPU can access the same RAM simultaneously; it's just that the two portions differ in speed.


#75 ButDuuude
Member since 2013 • 1907 Posts

Smells like desperation in here.


#76  Edited By PC_Rocks
Member since 2018 • 8611 Posts
@wervenom said:
@pc_rocks said:
@04dcarraher said:
@pc_rocks said:

So, nothing at all. I've said it countless times that Gears 5 isn't the best benchmark to judge it by, and that too under controlled settings.

It was nearly a straight copy-and-paste job from the X1X, and the XSX GPU was able to perform as well as a 2080 did at the same settings. It's not hard to believe: we know a 40 CU RX 5700 XT can compete with an RTX 2070, so why is it hard to believe that a 52 CU RDNA 2 GPU would match or beat a 2080/2080S? We are talking about roughly 30% more CUs without even taking architecture improvements into account. A 52 CU RX 5700 would be in 2080 range as is.

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Gears 5 wasn't even the most demanding game on PC. RDR 2 would have been a far better candidate to judge it.

Not saying it's impossible just there's not enough data on either side.

They never really said it ran better than the PS5. The PS5 demo was capped at 30; Tim Sweeney said it could go well above that. But I have never seen them cap a game at 40 on consoles; it's either 30 or 60.

The demo was "up to 1440p", meaning it was dropping resolution and we don't know by how much. So no, it wouldn't have gone above 30FPS at all.

The engineer specifically said the same demo ran at 40+ FPS on a year-old laptop.


#77 Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

Nope. Split RAM is where the CPU or GPU can't access one of the pools. Here both the CPU and GPU can access the same RAM simultaneously; it's just that the two portions differ in speed.

Yeah, and the DRAM is different, i.e. split. Split RAM allocation/pool. That's literally what they call it.


#78 Pedro
Member since 2002 • 74003 Posts

Isn't the PS3 RAM split?


#79 PC_Rocks
Member since 2018 • 8611 Posts

@R4gn4r0k said:

Maybe it was the Xbox 360 and PS3 that had shared RAM between CPU and GPU, or maybe I'm just misremembering lol.

Anyway thank you for the explanation. Wasn't really too thrilled that the GTX 970 did that, but still a banging GPU for its price.

360 had a unified memory pool but PS3 was split in half between CPU and GPU.

@Juub1990 said:
@pc_rocks said:

Nope. Split RAM is where the CPU or GPU can't access one of the pools. Here both the CPU and GPU can access the same RAM simultaneously; it's just that the two portions differ in speed.

Yeah and the DRAM is different ie split. Split RAM allocation/pool. That's literally what they call it.

Well, we are really arguing semantics now. I think what R4gn4r0k meant was whether the memory is split between CPU and GPU, which it isn't. I think "dedicated RAM" is probably a better term for what I'm describing.


#80  Edited By PC_Rocks
Member since 2018 • 8611 Posts
@04dcarraher said:

What I'm talking about isn't the size, or how much room is given over to dedicated RT hardware, or whether they're going to utilize the CUs alongside it... but the fact that their architecture may not stall shader cores the way the RTX 2000 series does, which is why performance tanks so much in games on current RTX cards. Nvidia's Ampere is supposed to fix the low GPU usage on the "normal" shader processors while RT is on.

Also, the XSX's two-speed RAM config will not have the same effect as the GTX 970's 512MB pool having only 24GB/s.

Okay, then I misunderstood. I thought you were talking about those RT cores also being able to help with rasterization performance. Yeah, it might be possible not to stall the GPU when RT is enabled: the other units can start working on a different frame while the RT is being calculated. That sounds plausible. Nevertheless, RT will have an impact on overall performance, and we don't yet know the capabilities of the PS5 and XSX GPUs.

As for the XSX's different RAM speeds, I never claimed it will hamper it that much, or the other way around. I just explained it's KINDA like the 970.
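
A toy sketch of that overlap idea (the functions are hypothetical placeholders, not a real renderer): while ray traversal for frame N is in flight, the other units keep working on the next frame.

```cpp
#include <cstdio>
#include <future>

void traceRays(int frame) { std::printf("RT work for frame %d\n", frame); }
void rasterize(int frame) { std::printf("raster/shading work for frame %d\n", frame); }

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        // Kick off ray tracing for the current frame asynchronously...
        auto rt = std::async(std::launch::async, traceRays, frame);
        // ...and keep the "other units" busy with the next frame's raster work.
        rasterize(frame + 1);
        rt.wait();   // sync before presenting frame N
    }
    return 0;
}
```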


#81 Juub1990
Member since 2013 • 12622 Posts
@pc_rocks said:

360 had a unified memory pool but PS3 was split in half between CPU and GPU.

Well, we are really arguing semantics now. I think what R4gn4r0k meant was whether the memory is split between CPU and GPU, which it isn't. I think "dedicated RAM" is probably a better term for what I'm describing.

Pretty much yeah.


#82 WeRVenom
Member since 2020 • 479 Posts

@pc_rocks said:
@wervenom said:
@pc_rocks said:
@04dcarraher said:
@pc_rocks said:

So, nothing at all. I've said it countless times that Gears 5 isn't the best benchmark to judge it by, and that too under controlled settings.

It was nearly a straight copy-and-paste job from the X1X, and the XSX GPU was able to perform as well as a 2080 did at the same settings. It's not hard to believe: we know a 40 CU RX 5700 XT can compete with an RTX 2070, so why is it hard to believe that a 52 CU RDNA 2 GPU would match or beat a 2080/2080S? We are talking about roughly 30% more CUs without even taking architecture improvements into account. A 52 CU RX 5700 would be in 2080 range as is.

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Gears 5 wasn't even the most demanding game on PC. RDR 2 would have been a far better candidate to judge it.

Not saying it's impossible just there's not enough data on either side.

They never really said it ran better than the PS5. The PS5 demo was capped at 30; Tim Sweeney said it could go well above that. But I have never seen them cap a game at 40 on consoles; it's either 30 or 60.

The demo was "up to 1440p", meaning it was dropping resolution and we don't know by how much. So no, it wouldn't have gone above 30FPS at all.

The engineer specifically said the same demo ran at 40+ FPS on a year-old laptop.

https://mobile.twitter.com/timsweeneyepic/status/1261834093521776641

Also, I'm not sure how a 2070 Max-Q would beat what's in the PS5 unless the PS5 is somehow weaker than current AMD tech.


#83  Edited By tormentos
Member since 2003 • 33798 Posts

@i_own_u_4ever said:

Simple breakdown in power and what to expect from PC and consoles.

https://www.windowscentral.com/xbox-series-x-gpu-may-be-more-powerful-nvidia-geforce-rtx-2080-super-and-other-pc-gpus#:~:text=Microsoft%20revealed%20that%20the%20Xbox,that%20should%20also%20be%20considered.

https://www.tomsguide.com/news/ps5-performance-could-be-trounced-by-a-decent-gaming-pc

So do you have any proof for your claim?

You make a pretty specific claim that isn't backed up by either of those two sources. One is from March and made a claim comparing TFLOPs between AMD and Nvidia, which means total shit.

The second is a Tom's Guide link, which is nothing more than a clickbait site, and it talks about the UE5 demo, which we already know wasn't really running on that PC; it was being played back as a movie.

So if you have any real information please share it. I am not saying that couldn't be the case, although it's rather unlikely given the ~18% GPU gap, but yeah, back up your claim.


#84 I_own_u_4ever
Member since 2020 • 647 Posts

@04dcarraher said:
@i_own_u_4ever said:

No, it's likely that to save money and time they are just going to keep parity across the board on consoles.

Saving money... is an excuse; it takes hardly any effort to port from X1X to XSX. Hell, Epic ported Gears 5 over in a demo within a week with little to no optimization and had it performing on par with a 2080... Yet you repeated what I suggested before with "parity"...

For one, we won't have final confirmation on AC until several months from now. And again, none of us in here is a tech on the AC engine Ubi is using, so we don't know what the time, effort or cost would be if they chose to do it or not.


#85 tormentos
Member since 2003 • 33798 Posts

@i_own_u_4ever said:

We need to keep in mind that the XSX is early in dev kits, and MS is notorious for not releasing big performance-gain dev kit updates until around launch and shortly after. This is how MS kinda fooled devs months ago, and likely Sony too, by having the XSX kits running slow; then more recently we obviously saw some dev kit updates, and all devs kinda changed their tone, now saying wow, the XSX is more powerful.

Where did you pull this shit from^? We know for sure DX12 improved shit on Xbox One; what MS really gained on the Xbox One performance-wise came from giving back the system-reserved resources that were in place for Kinect, and the win was so small that literally nothing changed.

@i_own_u_4ever said:
@ocinom said:

@i_own_u_4ever: So what? XSX still is not >2080. Try harder next time with other trash post

Most devs say it is, or at worst equal to it, but we are so early in the dev kit cycle for the XSX. After launch the XSX will get big performance boosts from updates.

Please provide link to that bold part.

@pc_rocks said:

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Gears 5 wasn't even the most demanding game on PC. RDR 2 would have been a far better candidate to judge it.

Not saying it's impossible just there's not enough data on either side.

Dude, the Max-Q is not even better than the 5700, let alone the 5700 XT, so common sense tells me the PS5 would beat it without problems.

This is one of the reasons I say you are a console hater, period. I don't think the PS5 will have any problem topping the 5700; you somehow do.


#86  Edited By I_own_u_4ever
Member since 2020 • 647 Posts
@pc_rocks said:
@04dcarraher said:

What I'm talking about isn't the size, or how much room is given over to dedicated RT hardware, or whether they're going to utilize the CUs alongside it... but the fact that their architecture may not stall shader cores the way the RTX 2000 series does, which is why performance tanks so much in games on current RTX cards. Nvidia's Ampere is supposed to fix the low GPU usage on the "normal" shader processors while RT is on.

Also, the XSX's two-speed RAM config will not have the same effect as the GTX 970's 512MB pool having only 24GB/s.

Okay, then I misunderstood. I thought you were talking about those RT cores also being able to help with rasterization performance. Yeah, it might be possible not to stall the GPU when RT is enabled: the other units can start working on a different frame while the RT is being calculated. That sounds plausible. Nevertheless, RT will have an impact on overall performance, and we don't yet know the capabilities of the PS5 and XSX GPUs.

As for the XSX's different RAM speeds, I never claimed it will hamper it that much, or the other way around. I just explained it's KINDA like the 970.

With 52 CUs the XSX will have far superior RT over the PS5; with that many cores it has the legroom and dedicated hardware to clearly have the better RT solution. In just about every DF comparison between PS5 and XSX we will likely see the XSX win in RT as well as texture detail and resolution.


#87 lundy86_4
Member since 2003 • 62044 Posts

This is a stretch and a half, which is on point for TC.


#88 Bluestars
Member since 2019 • 2789 Posts

@fedor:

Erm nope

HaH


#89 tormentos
Member since 2003 • 33798 Posts

@R4gn4r0k said:
@Juub1990 said:

I know you guys are semi-trolling, but let's not pretend this is because of the SX. That's a design decision enforced by Ubisoft. A 2080-class GPU is nothing to scoff at and the machine overall is beautifully engineered (except for that split RAM, which makes me grimace).

What's this split RAM you mentioned?

AFAIK the current consoles, Xbox One and PS4, also use split RAM between GPU and CPU.

No, the PS4 doesn't have split RAM; the Xbox One does. The Xbox One X doesn't either. He is referring to the Series X having two different RAM speeds, which effectively makes it a split-RAM machine: there are two different speeds, as opposed to the PS5's single speed across all of its RAM. So on 10GB the Xbox is faster, and on 6GB it is slower.

@bluestars said:

@i_own_u_4ever:

Bu bu but pc gamers say the xbsx will be mid gen when the new card comes out

Mid gen my hoop

HAH

Dude, the Xbox Series X is weaker than the 2080 Ti NOW; this holiday the 3080 line comes out, so yeah, probably upper mid-range, which isn't bad at all by the way. If you want nothing but the best, buy a PC. Oh wait, didn't you claim to be a PC gamer as well, Kingtito?

@i_own_u_4ever said:
@04dcarraher said:

Apparently you don't...

Because if you did, you would know the Xbox and PC share many of the same development tools and other aspects, so the extra work and money needed is negligible. They are only dealing with three specs when it comes to Xbox and three with Sony's consoles, versus the PC's slew of configs.

For the last time, your troll game is really weak. If the engine was built around 30fps with higher fidelity, then that's what they will do, because in the console world most people don't care; only the hardcore even know what it's like to have 60fps. Don't question my tech knowledge, I'm far beyond you; I used to build PCs and gaming PCs for work. This is why I'm not going to get dragged into a weak troll rant with you, because you're butthurt that your PS5 is getting stomped by the XSX.

This post is pathetic; you know shit about PC. I don't have to defend @04dcarraher, he has been arguing PC here for years, and while we disagree on many things, I know he knows more about PC than you, hands down.

You don't even need to know computers to build a damn PC; it's basically a streamlined process that most people can do, it's not rocket science.

He is not a Sony fan, by the way, so your comment about his PS5 is mighty stupid.


#90  Edited By Bluestars
Member since 2019 • 2789 Posts

@xantufrog:

DF claims the XBSX is on par with the 2080, WC claims better than a 2080 Super; those are not mid-range grafix cards now and won't be when the 3080 hits shelves.

But keep the laughs coming

Hermits HaH


#91  Edited By Fedor
Member since 2015 • 11829 Posts

@bluestars said:

@fedor:

Erm nope

HaH

It is, and it's hilarious how insecure it makes you. The XSX won't even be high-end Navi; that must sting.


#92  Edited By Zaryia
Member since 2016 • 21607 Posts

@bluestars said:

@xantufrog:

DF claims the XBSX is on par with the 2080, WC claims better than a 2080 Super; those are not mid-range grafix cards now and won't be when the 3080 hits shelves.

But keep the laughs coming

Hermits HaH

You're super hyped about the mere chance of having better hardware for a month or two, and then being shit on for 8 years?


#93 Pedro
Member since 2002 • 74003 Posts

This thread definitely delivered.


#94  Edited By Fedor
Member since 2015 • 11829 Posts

@zaryia: It's already confirmed by AMD that they are releasing their GPU line before the consoles come out, and Nvidia is expected to follow suit. He won't even have a month or two.


#95 I_own_u_4ever
Member since 2020 • 647 Posts

@tormentos said:
@i_own_u_4ever said:

We need to keep in mind that the XSX is early in dev kits, and MS is notorious for not releasing big performance-gain dev kit updates until around launch and shortly after. This is how MS kinda fooled devs months ago, and likely Sony too, by having the XSX kits running slow; then more recently we obviously saw some dev kit updates, and all devs kinda changed their tone, now saying wow, the XSX is more powerful.

Where did you pull this shit from^? We know for sure DX12 improved shit on Xbox One; what MS really gained on the Xbox One performance-wise came from giving back the system-reserved resources that were in place for Kinect, and the win was so small that literally nothing changed.

@i_own_u_4ever said:
@ocinom said:

@i_own_u_4ever: So what? XSX still is not >2080. Try harder next time with other trash post

Most devs say it is, or at worst equal to it, but we are so early in the dev kit cycle for the XSX. After launch the XSX will get big performance boosts from updates.

Please provide link to that bold part.

@pc_rocks said:

Because we haven't seen anything about the RT/ML capabilities. How much die is dedicated to that? Also, if Epic's now deleted video about MaxQ performing better than PS5 for the UE5 demo is anything to go by, it doesn't paint a very good picture of XSX either.

Gears 5 wasn't even the most demanding game on PC. RDR 2 would have been a far better candidate to judge it.

Not saying it's impossible just there's not enough data on either side.

Dude, the Max-Q is not even better than the 5700, let alone the 5700 XT, so common sense tells me the PS5 would beat it without problems.

This is one of the reasons I say you are a console hater, period. I don't think the PS5 will have any problem topping the 5700; you somehow do.

With further updates the OG X1 was able to have far more 1080p games. The Kinect power being given back helped too, but clearly MS released numerous performance updates; this is why we started seeing many more games at 1080p on the OG X1.


#96 tormentos
Member since 2003 • 33798 Posts

@pc_rocks said:

Nope. Split RAM is where the CPU or GPU can't access one of the pools. Here both the CPU and GPU can access the same RAM simultaneously; it's just that the two portions differ in speed.

From what I have read the CPU can't access the GPU-optimal memory at 560GB/s, so effectively it is split in terms of speed. Even worse, the GPU has access to 13.5GB, of which 3.5GB is the slower kind, which again basically creates a split scenario of sorts; while access to (nearly) all the memory is there (not quite all, there are reservations), the speed is effectively different.

I don't think 10GB is enough to run everything plus ray tracing; I think accessing that 3.5GB at a lower speed will create a problem. The only reason MS used this method is the same reason Sony chose a 36 CU GPU: saving money.
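
As a rough model of that concern (published bandwidth figures, ignoring the OS reservation and real contention, so only a sketch), the blended bandwidth the GPU sees depends on how much of its traffic has to go to the slower portion:

```cpp
#include <cstdio>

// Weighted (harmonic) blend of the two pool bandwidths for a given traffic mix.
double effectiveBandwidth(double slowFraction) {
    const double fast = 560.0, slow = 336.0;   // GB/s, published Series X figures
    double timePerByte = (1.0 - slowFraction) / fast + slowFraction / slow;
    return 1.0 / timePerByte;
}

int main() {
    const double fractions[] = {0.0, 0.10, 0.25};   // share of traffic hitting the slow pool
    for (double f : fractions)
        std::printf("%2.0f%% slow traffic -> ~%.0f GB/s effective\n",
                    100.0 * f, effectiveBandwidth(f));   // ~560, ~525, ~480 GB/s
    return 0;
}
```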


#97 tormentos
Member since 2003 • 33798 Posts

@Pedro said:

This thread definitely delivered.

Yes, a bunch of misinformation, which I find odd you haven't questioned yet. ¯\_(ツ)_/¯

@i_own_u_4ever said:

With 52 CUs the XSX will have far superior RT over the PS5; with that many cores it has the legroom and dedicated hardware to clearly have the better RT solution. In just about every DF comparison between PS5 and XSX we will likely see the XSX win in RT as well as texture detail and resolution.

Dude, I would love to see this so-called far superior RT, which you pulled from deep, deep down your buns.

RT is a freaking demanding process; if you abuse it on the Xbox Series X you run the risk of having to lower resolution and frame rate well below the PS5. RT is expensive even on the top dog card on the market, in fact demanding enough to send the 2080 Ti down to 1080p, so yeah, I think that's a scenario which may hurt the Xbox. And before you say shit, look at the RE3 remake and how the Xbox One X version had considerably worse frame rates than the PS4 Pro because Capcom tried to push 4K or nothing; the game was patched to improve frame rates, and yeah, the resolution was dropped closer to the Pro's and the Xbox One X version improved greatly. And that is a machine with 45% more power, 4GB more memory and faster memory as well.

The Series X is just 18% stronger, with the same amount of memory and asymmetrical memory speeds.

At the same resolution the Xbox Series X should have slightly better RT than the PS5; if they push RT to be much better on the Series X, something will have to be traded, period: either frame rate or resolution.

@i_own_u_4ever said:

With further updates the OG X1 was able to have far more 1080p games. The Kinect power being given back helped too, but clearly MS released numerous performance updates; this is why we started seeing many more games at 1080p on the OG X1.

No, it did not. Stop inventing shit; most games on Xbox One simply aren't 1080p, and many are not even 900p.

The PS4 maintained its lead over the Xbox without problem all generation long, and no SDK changed that.

By the way, where is the link to prove this thread's claim, as none of your links prove it.


#98 I_own_u_4ever
Member since 2020 • 647 Posts

@tormentos said:
@Pedro said:

This thread definitely delivered.

Yes, a bunch of misinformation, which I find odd you haven't questioned yet. ¯\_(ツ)_/¯

@i_own_u_4ever said:

With 52 CUs the XSX will have far superior RT over the PS5; with that many cores it has the legroom and dedicated hardware to clearly have the better RT solution. In just about every DF comparison between PS5 and XSX we will likely see the XSX win in RT as well as texture detail and resolution.

Dude, I would love to see this so-called far superior RT, which you pulled from deep, deep down your buns.

RT is a freaking demanding process; if you abuse it on the Xbox Series X you run the risk of having to lower resolution and frame rate well below the PS5. RT is expensive even on the top dog card on the market, in fact demanding enough to send the 2080 Ti down to 1080p, so yeah, I think that's a scenario which may hurt the Xbox. And before you say shit, look at the RE3 remake and how the Xbox One X version had considerably worse frame rates than the PS4 Pro because Capcom tried to push 4K or nothing; the game was patched to improve frame rates, and yeah, the resolution was dropped closer to the Pro's and the Xbox One X version improved greatly. And that is a machine with 45% more power, 4GB more memory and faster memory as well.

The Series X is just 18% stronger, with the same amount of memory and asymmetrical memory speeds.

At the same resolution the Xbox Series X should have slightly better RT than the PS5; if they push RT to be much better on the Series X, something will have to be traded, period: either frame rate or resolution.

@i_own_u_4ever said:

With further updates the OG X1 was able to have far more 1080p games. The Kinect power being given back helped too, but clearly MS released numerous performance updates; this is why we started seeing many more games at 1080p on the OG X1.

No, it did not. Stop inventing shit; most games on Xbox One simply aren't 1080p, and many are not even 900p.

The PS4 maintained its lead over the Xbox without problem all generation long, and no SDK changed that.

By the way, where is the link to prove this thread's claim, as none of your links prove it.

Horse shit. After MS started updating the X1 we started seeing many more 1080p games. Not all, sure, a lot were still 900p, but we started seeing more and more 1080p games too.


#99 I_own_u_4ever
Member since 2020 • 647 Posts
@tormentos said:
@Pedro said:

This thread definitely delivered.

Yes, a bunch of misinformation, which I find odd you haven't questioned yet. ¯\_(ツ)_/¯

@i_own_u_4ever said:

With 52 CUs the XSX will have far superior RT over the PS5; with that many cores it has the legroom and dedicated hardware to clearly have the better RT solution. In just about every DF comparison between PS5 and XSX we will likely see the XSX win in RT as well as texture detail and resolution.

Dude, I would love to see this so-called far superior RT, which you pulled from deep, deep down your buns.

RT is a freaking demanding process; if you abuse it on the Xbox Series X you run the risk of having to lower resolution and frame rate well below the PS5. RT is expensive even on the top dog card on the market, in fact demanding enough to send the 2080 Ti down to 1080p, so yeah, I think that's a scenario which may hurt the Xbox. And before you say shit, look at the RE3 remake and how the Xbox One X version had considerably worse frame rates than the PS4 Pro because Capcom tried to push 4K or nothing; the game was patched to improve frame rates, and yeah, the resolution was dropped closer to the Pro's and the Xbox One X version improved greatly. And that is a machine with 45% more power, 4GB more memory and faster memory as well.

The Series X is just 18% stronger, with the same amount of memory and asymmetrical memory speeds.

At the same resolution the Xbox Series X should have slightly better RT than the PS5; if they push RT to be much better on the Series X, something will have to be traded, period: either frame rate or resolution.

@i_own_u_4ever said:

With further updates the OG X1 was able to have far more 1080p games. The Kinect power being given back helped too, but clearly MS released numerous performance updates; this is why we started seeing many more games at 1080p on the OG X1.

No, it did not. Stop inventing shit; most games on Xbox One simply aren't 1080p, and many are not even 900p.

The PS4 maintained its lead over the Xbox without problem all generation long, and no SDK changed that.

By the way, where is the link to prove this thread's claim, as none of your links prove it.

Not closer to the Pro; the X was still 1800p checkerboard, still a decent amount higher than the Pro. And you're a fool if you think 52 CUs are not gonna wipe the floor with the 36 CUs in the PS5. MS will have the better dedicated RT, period, with plenty of cores still free to offer a better image over the PS5. You'll see, it's going to be a massacre on DF next gen, with the XSX taking the title in almost all match-ups.


#100 Pedro
Member since 2002 • 74003 Posts

@tormentos said:
@Pedro said:

This thread definitely delivered.

Yes, a bunch of misinformation, which I find odd you haven't questioned yet. ¯\_(ツ)_/¯

I see you still can't get over me. It's time for you to move on. 😎