X1X vs GTX 1060....

#201  Edited By GioVela2010
Member since 2008 • 5566 Posts

@NoodleFighter said:

lol really trying to claim victory with that one game that has graphic features exclusive to the X

But but optimization doesn’t exist

oh wait it does, it’s rendering better graphics with a “weaker GPU”

[Embedded video]

#202  Edited By HalcyonScarlet
Member since 2011 • 13838 Posts

@Xplode_games said:

This quote is from one of the YouTube comments. I found it very interesting.

"GTX 1060 with a 2 GHz OC is around 10-15 fps faster, so it's already close to stock GTX 1070 results at this point. But even with a 2GHz OC the 1060 has problems matching Xbox X results; it's clearly a slower GPU. Below are some interesting results:

Wolfenstein 2 gamegpu.com/action-/-fps-/-tps/wolfenstein-ii-the-new-colossus-test-gpu-cpu GTX 1060 results: 25-30 fps at native 4K and max settings. But of course Xbox X is using dynamic 4K (although 2160p most of the time), and not maxed settings, so we should look for another test. So here's 4K dynamic gameplay https://youtu.be/ISvoRBR6JgI 2GHz OC GTX 1060 - 4K dynamic, LOW/MEDIUM settings, 45-55 fps. XBOX X - 4K dynamic, almost maxed out settings, 55-60 fps. The GTX 1060 is not even close, and that's even with a 2GHz OC (at this point the GTX 1060 is 10-15 fps faster). I don't want to upset GTX 1060 owners, but the results speak for themselves and clearly the GTX 1060 is not even close to Xbox X GPU results.

Rise Of The Tomb Raider https://youtu.be/r9q4V_eCx-Q GTX 1070 - 4K, Medium settings, dips below 30 fps. Xbox X also has dips below 30fps, but runs high settings.

When it comes to Forza 7, I have GTX 1060 gameplay; even in a one-car race that card barely holds 60fps, 55-62 fps, and during the more demanding scenarios it dips to around 45 fps. And here's an interesting gameplay video https://www.youtube.com/watch?v=4GwmE_fIzLY 40 fps dips in Forza 7 on a GTX 1060, and in another video the same GTX 1060 owner has said his GPU is simply not enough to match Xbox X results, he admitted that. The Xbox X version NEVER DIPS BELOW 60 fps, average fps is probably around 80-90fps as Digital Foundry has suggested. I'm not sure what settings Xbox X runs; PCMR guys suggest dynamic settings, but Digital Foundry suggests maxed out settings. The only difference they were able to find was MSAA instead of EQAA, but the thing is EQAA works exactly like CSAA on NV cards, so it's MSAA, but with additional coverage samples on top of that. That way it gives MSAAx8 quality at the cost of MSAAx4, so it's a much more efficient approach. Below you can read about EQAA and CSAA modes http://www.tomshardware.com/reviews/anti-aliasing-nvidia-geforce-amd-radeon,2868-4.html

When it comes to gears of war 4. I have seen GTX 1060 gameplay at 4K, that card indeed can run 4K and ultra settings, BUT WITH DYNAMIC RESOLUTION SCALING. Benchmark charts suggest lower fps https://cdn.wccftech.com/wp-content/uploads/2016/10/2160p.jpg 21-25fps at ultra settings without dynamic resolution, and the thing is benchmark scenarios are usually not that demanding.

People may ask, why does Xbox X get GTX 1070 results, while the Xbox X specification on paper suggests an RX 580? But the Xbox X GPU, unlike the RX 580, is totally custom built and at this point can't be compared to the RX 580 (and the RX 580 already has similar performance to the GTX 1060, especially in DX12 code) https://gpucuriosity.wordpress.com/2017/09/10/xbox-one-xs-render-backend-2mb-render-cache-size-advantage-over-the-older-gcns/ The Xbox X GPU has 7 billion transistors, which points to the GPU design not being RX-480/RX-580, and unlike the RX 580 the Xbox X GPU has additional features. The Xbox has a Polaris-based GPU with Vega features thrown in (for example delta color compression from Vega), but not an entire Vega GPU. There are also additional DX12 features built into it. The Digital Foundry article regarding Xbox X clearly suggests the Xbox X 6 tflops GPU is much faster than the numbers suggest. Below are interesting quotes from the DF article https://www.neogaf.com/threads/xbox-scorpio-dx12-built-directly-into-gpu.1358475/#post-233477389

With these improvements it should be no surprise to you that the Xbox X GPU is much faster than an RX 580. That's why developers suggest GTX 1070 performance level (the War Thunder developer even suggested a performance level close to a GTX 1080, as you already know). Looking at games, most Xbox X enhanced titles show GTX 1070-like results (it's not GTX 1060 performance level, for a fact, because even GTX 1060 owners themselves say they can't match Xbox X results). So IMO the Xbox GPU is comparable to GTX 1070 performance level, and I base my opinion on articles, game results on Xbox X, GTX 1060 users' opinions and even developer opinions (they have much more knowledge about technology compared to gamers, that's why they're making games!).

At $500 no PC can match Xbox X results. That guy in this video is using USED parts, and even keeping this in mind he still has just 8GB of RAM in that build (if you are buying used parts, you can very well buy a used and much cheaper Xbox X also). MaDz (the YouTuber who recorded that Forza 7 gameplay on his GTX 1060) spent $800 on that PC, and that build cost doesn't even include Windows 10 (around $100), not to mention he still has 8GB of RAM (that's not enough for new games). In order to match Xbox X results you have to buy a GTX 1070, and if you are buying a GPU like that, realistically speaking you will not pair it with an i3 or slower CPU; you have to buy at least an i5. You are looking at a $1000+ PC build at this point. And the thing is, in order to get much better results compared to Xbox X (60fps instead of 30fps on Xbox X) you have to build a high end PC with a 1080 Ti, which costs much more than an Xbox X, and even that PC will not get you 4K 60fps in the most demanding games. I have a 1080 Ti myself and my PC can run Xbox X games at the same settings at 60fps instead of 30fps, but I paid $800 for my GPU alone (Asus Strix OC version), not to mention other PC parts." - P Number8

I think that just proves how unoptimized console ports to the PC are. FM7 on the PC is still kind of messy, not as smooth as it should be, so I wouldn't read too much into that.

(for the bolded text) Or someone could pair it with an affordable AMD CPU.
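A side note on the EQAA claim in that quoted comment ("MSAAx8 quality at the cost of MSAAx4"): the trick is that extra coverage samples are far cheaper to store than full color/depth samples. Here's a rough per-pixel storage sketch; the byte sizes (4 bytes of color and 4 bytes of depth per color sample, 1 byte per extra coverage sample) are illustrative assumptions, not AMD's actual numbers:

```python
# Rough per-pixel AA storage estimate; all byte sizes are assumptions for illustration.
def aa_bytes_per_pixel(color_samples, coverage_samples,
                       color_bytes=4, depth_bytes=4, coverage_bytes=1):
    full = color_samples * (color_bytes + depth_bytes)   # full color + depth samples
    extra = coverage_samples * coverage_bytes            # cheap coverage-only samples
    return full + extra

print(aa_bytes_per_pixel(4, 0))  # plain 4x MSAA                         -> 32 bytes
print(aa_bytes_per_pixel(8, 0))  # plain 8x MSAA                         -> 64 bytes
print(aa_bytes_per_pixel(4, 8))  # EQAA/CSAA-style 4 color + 8 coverage  -> 40 bytes
```

Under those assumptions, the EQAA-style mode lands much closer to the 4x cost than the 8x cost while sampling coverage at 8x, which is the trade-off the comment is describing.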

#203 HalcyonScarlet
Member since 2011 • 13838 Posts
@xhawk27 said:
@HalcyonScarlet said:

Xbox gamers are really insecure. They really want to prove the XXbone is more powerful than the PC.

It's pathetic really. So sad.

You know what's more pathetic is trying to downplay the most powerful console ever made. For the price, the X1X cannot be beat. Xbox haters, so sad.

You lems aren't helping your case much. Your comment just reeks of insecurity. No one really cares about its power, only its games, or lack thereof. Now you could go back to playing with your xxbone, or you could continue with your insecurities about its power.

If power is that important to you, the original Xbone must have sent you into depression.

#204 xhawk27
Member since 2010 • 12194 Posts

@04dcarraher said:
@scatteh316 said:
@glez13 said:
@scatteh316 said:

Are people factoring in that an extremely large majority of GTX1060 owners overclock the card?

And that, generally, overclocked third-party cards are cheaper than Nvidia's FE?

So the gap is even larger as the benchmarks are only using FE cards.

The XBX GPU should be like an overclocked 580 power-wise. What will most probably make the difference is that those GPUs will be paired with a more capable CPU.

If you look at raw specs..... fill rate, texel rate..... Tflops... etc... etc..... the X is actually slower than a stock RX580.

RX 580 Top, X1X GPU bottom

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

#205  Edited By scatteh316
Member since 2004 • 10273 Posts

@xhawk27 said:
@04dcarraher said:
@scatteh316 said:
@glez13 said:
@scatteh316 said:

Are people factoring in that an extremely large majority of GTX1060 owners overclock the card?

And that, generally, overclocked third-party cards are cheaper than Nvidia's FE?

So the gap is even larger as the benchmarks are only using FE cards.

The XBX GPU should be like an overclocked 580 power-wise. What will most probably make the difference is that those GPUs will be paired with a more capable CPU.

If you look at raw specs..... fill rate, texel rate..... Tflops... etc... etc..... the X is actually slower than a stock RX580.

RX 580 Top, X1X GPU bottom

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Compute unit count means sweet **** all by itself.......

#206 PCgameruk
Member since 2012 • 2273 Posts

When the Forza video is playing, both are supposed to be 60fps, but I can easily see the PC version is smoother.

#207 NoodleFighter
Member since 2011 • 11897 Posts

@HalcyonScarlet: I get the feeling that FM7 was purposely gimped on PC so it wouldn't outshine the Xbox One X too much, as the devs did something so stupid even amateur programmers wouldn't do: FM7 maxes out only one CPU core. The devs basically admitted to not putting multithreading into the PC version. Wow, the power of DirectX12, and a Microsoft 1st party studio somehow does this crap? Yeah, I'm real suspicious.

@GioVela2010: The devs made those graphical features exclusive to the X; you're claiming victory over a technicality of the devs not putting those features in the PC version.

#208  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.
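For anyone wondering where the 6.001 vs 6.175 figures come from, it's just the standard GCN arithmetic: CUs × 64 shaders per CU × 2 FLOPs per shader per clock (FMA) × clock speed. A quick sketch:

```python
# FP32 throughput for GCN-style GPUs: CUs * 64 shaders/CU * 2 FLOPs/clock * clock.
def gcn_tflops(cus, clock_mhz):
    shaders = cus * 64                            # 64 stream processors per compute unit
    return shaders * 2 * clock_mhz / 1_000_000    # 2 FLOPs per clock from FMA

print(f"X1X   : {gcn_tflops(40, 1172):.3f} TFLOPS")  # ~6.001
print(f"RX 580: {gcn_tflops(36, 1340):.3f} TFLOPS")  # ~6.175 at boost clock
```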

#209 scatteh316
Member since 2004 • 10273 Posts

@04dcarraher said:
@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.

^^ He gets it!!

#211 scatteh316
Member since 2004 • 10273 Posts

@GioVela2010 said:
@04dcarraher said:
@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate.

Wait isn’t the RX580 just an overclocked RX480

Yep..... The X's GPU is actually closer to the RX480 than the RX580.

#212 xhawk27
Member since 2010 • 12194 Posts

@04dcarraher said:
@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.

It means something when apps that require more compute calculations are released for the 1X.

#213  Edited By xhawk27
Member since 2010 • 12194 Posts

@scatteh316 said:
@GioVela2010 said:
@04dcarraher said:
@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate.

Wait isn’t the RX580 just an overclocked RX480

Yep..... The X's GPU is actually closer to the RX480 than the RX580.

No, it's an underclocked RX 580 with more CUs.

#214  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@xhawk27 said:
@04dcarraher said:
@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.

It means something when apps that require more compute calculations are released for the 1X.

Huh? That statement makes no sense..... The X1X has slightly less total compute processing capability than a vanilla RX 580, and then the X1X has that inadequate Jaguar-based CPU holding back the GPU even further in situations that require more CPU resources. The more they throw CPU/compute tasks at the GPU, the fewer resources the GPU has to devote to gaming.... It's the same illusion die-hard cows held onto with the PS4 having a crapload of extra ACE units on its GPU to regulate compute workloads..... comparing the 7970's 2 ACEs at the time vs the PS4's 8 and claiming superiority. In the end those extra ACEs didn't do squat in allowing the PS4 to beat the 7970 series and its later revisions in the 200/300 series.

#215 04dcarraher
Member since 2004 • 23858 Posts

@xhawk27 said:

Yep..... The X's GPU is actually closer to the RX480 than the RX580.

No, it's an underclocked RX 580 with more CUs.

No, it's neither one.... They are all based on the same Polaris architecture. The vanilla RX 480, the RX 580 and the X1X have raw performance numbers that are very close to each other; we are talking about within 3% of each other.

RX 480 < X1X < RX 580

The X1X is about 3% faster than the RX 480, while the RX 580 is about 3% faster than the X1X. This isn't even taking into account non-reference models of the 480 and 580, which can give both another 5-10% gain on top of the vanilla cards.
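If you want to sanity-check that "within 3%" claim, the same shaders × 2 FLOPs × clock arithmetic works; the 1266MHz RX 480 boost clock below is the reference-card figure, so treat it as an assumption:

```python
# Back-of-the-envelope check of the RX 480 < X1X < RX 580 ordering (reference clocks assumed).
def gcn_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1_000_000

rx480 = gcn_tflops(36, 1266)  # ~5.83 TFLOPS
x1x   = gcn_tflops(40, 1172)  # ~6.00 TFLOPS
rx580 = gcn_tflops(36, 1340)  # ~6.17 TFLOPS

print(f"X1X over RX 480 : {100 * (x1x / rx480 - 1):.1f}%")   # ~2.9%
print(f"RX 580 over X1X : {100 * (rx580 / x1x - 1):.1f}%")   # ~2.9%
```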

#216  Edited By scatteh316
Member since 2004 • 10273 Posts

@xhawk27 said:
@04dcarraher said:
@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.

It means something when apps that require more compute calculations are released for the 1X.

Yes it will..... as the RX580 will perform better as it has more compute power.......

RX580 = 6.1Tflops

Xbox X = 6Tflops

It's not a huge difference but the RX580 IS faster....... why are you finding it so difficult to understand this?

It's computer basics.

#217 Juub1990
Member since 2013 • 12622 Posts

@04dcarraher: I love how these console guys are speaking out of their depth when it comes to hardware and you put them back in their place lol.

#218 QuadKnight
Member since 2015 • 12916 Posts

@Juub1990 said:

@04dcarraher: I love how these console guys are speaking out of their depth when it comes to hardware and you put them back in their place lol.

Well, they were trained by Ron. Anyone who spouts the same bullshit as Ron is bound to fail.

That dude is a complete joke.

#219  Edited By Xplode_games
Member since 2011 • 2540 Posts

@scatteh316 said:
@xhawk27 said:
@04dcarraher said:
@xhawk27 said:

Except the GPU in the X1X has 40 compute units compared to 36 for the RX580.

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.

It means something when apps that require more compute calculations are released for the 1X.

Yes it will..... as the RX580 will perform better as it has more compute power.......

RX580 = 6.1Tflops

Xbox X = 6Tflops

It's not a huge difference but the RX580 IS faster....... why are you finding it so difficult to understand this?

It's computer basics.

What the hell are you talking about? Now we are counting teraflops to determine performance? These noobs need more training. Why don't you learn what the hell you're talking about before making an ignorant post? The X1X GPU has a more advanced implementation of GCN with many Vega features, larger cache and a lot more memory bandwidth. But you think all of that is beat by a 0.1 teraflop advantage? WTF, and you say you're the expert?

#220  Edited By scatteh316
Member since 2004 • 10273 Posts

@Xplode_games said:
@scatteh316 said:
@xhawk27 said:
@04dcarraher said:

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.

It means something when apps that require more compute calculations are released for the 1X.

Yes it will..... as the RX580 will perform better as it has more compute power.......

RX580 = 6.1Tflops

Xbox X = 6Tflops

It's not a huge difference but the RX580 IS faster....... why are you finding it so difficult to understand this?

It's computer basics.

What the hell are you talking about? Now we are counting teraflops to determine performance? These noobs need more training. Why don't you learn what the hell you're talking about before making an ignorant post? The X1X GPU has a more advanced implementation of GCN with many Vega features, larger cache and a lot more memory bandwidth. But you think all of that is beat by a 0.1 teraflop advantage? WTF, and you say you're the expert?

Oh look, a noob....... I have to dumb down hardware talk for people in here (even you) as most noobs in here don't understand the hardware properly enough to have a deep and meaningful conversation about it.

People understand Tflops.... so that's just the easiest way to get people to understand.

But as you've piped up let me school you

1. Xbox X's GPU does have 'some' of Vega's improvements but not all of them, for example it doesn't have RPM ability, but the GPU in Pro does - So much for an advanced GPU LOL!!

2. A lot more memory bandwidth? I think you'll find 'a lot more' is way off and a big exaggeration.

3. Now you're showing what a noob you are; after all the advantages I've pointed out on the RX580, you think the 0.1 Tflop is the only thing it has going for it over the Xbox X? Lmao...

#221 04dcarraher
Member since 2004 • 23858 Posts

@Juub1990 said:

@04dcarraher: I love how these console guys are speaking out of their depth when it comes to hardware and you put them back in their place lol.

I remember the battles with tormentos and loosing ends..... Those two were some tough nuts to crack with facts lol

#222  Edited By Howmakewood
Member since 2015 • 7838 Posts

If the One X GPU is such more advanced tech than Polaris, then why is Vega pretty much just an upclocked Fury X (I mean, they have pretty much the same performance running at similar clocks)? Seriously, the differences between the Fury X (which also ran on HBM) and Vega are pretty laughable considering the latter is supposedly a whole new architecture.

#223 GioVela2010
Member since 2008 • 5566 Posts

HD 5970 = 4640 GFLOPS

HD 7970 = 3788 GFLOPS
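(For reference, both of those figures fall out of shaders × 2 FLOPs per clock × clock speed, using the reference clocks of 725MHz for the 5970 and 925MHz for the 7970; a quick sketch:)

```python
# GFLOPS = shader count * 2 FLOPs/clock * clock in MHz / 1000 (reference clocks assumed).
def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

print(gflops(3200, 725))  # HD 5970: 2 x 1600 SPs at 725MHz -> 4640.0
print(gflops(2048, 925))  # HD 7970: 2048 SPs at 925MHz     -> 3788.8
```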

#224  Edited By Xplode_games
Member since 2011 • 2540 Posts

@scatteh316 said:
@Xplode_games said:
@scatteh316 said:
@xhawk27 said:
@04dcarraher said:

Which means squat... The RX 580's 36 CUs have a base clock of 1257MHz and boost to 1340MHz, while the X1X's 40 CUs run at a static 1172MHz. The X1X's ~10% increase in CUs does not outpace the RX 580's ~13% higher boosted clock rate, which is why the X1X operates at 6.001 TFLOPS vs the RX 580's 6.175 TFLOPS.

It means something when apps that require more compute calculations are released for the 1X.

Yes it will..... as the RX580 will perform better as it has more compute power.......

RX580 = 6.1Tflops

Xbox X = 6Tflops

It's not a huge difference but the RX580 IS faster....... why are you finding it so difficult to understand this?

It's computer basics.

What the hell are you talking about? Now we are counting teraflops to determine performance? These noobs need more training. Why don't you learn what the hell you're talking about before making an ignorant post? The X1X GPU has a more advanced implementation of GCN with many Vega features, larger cache and a lot more memory bandwidth. But you think all of that is beat by a 0.1 teraflop advantage? WTF, and you say you're the expert?

Oh look, a noob....... I have to dumb down hardware talk for people in here (even you) as most noobs in here don't understand the hardware properly enough to have a deep and meaningful conversation about it.

People understand Tflops.... so that's just the easiest way to get people to understand.

But as you've piped up let me school you

1. Xbox X's GPU does have 'some' of Vega's improvements but not all of them, for example it doesn't have RPM ability, but the GPU in Pro does - So much for an advanced GPU LOL!!

2. A lot more memory bandwidth? I think you'll find 'a lot more' is way off and a big exaggeration.

3. Now you're showing what a noob you are; after all the advantages I've pointed out on the RX580, you think the 0.1 Tflop is the only thing it has going for it over the Xbox X? Lmao...

If that's the case, then why did you criticize another poster who pointed out that the X1X GPU has 40 CUs vs 36 on the RX 580? He was simplifying as well, but you went after him and then went on to simplify yourself.

Let's be honest here, the X1X GPU is a little faster than the RX 580 even though it has 0.1 fewer teraflops; that's clear as day. However, if you are looking for the PC GPU that is most similar to the X1X, then it's reasonable to say it's the RX 580.

#225  Edited By RyviusARC
Member since 2011 • 5708 Posts

@GioVela2010 said:

HD 5970 = 4640 GFLOPS

HD 7970 = 3788 GFLOPS

You tried to cherry-pick the worst-case scenario.

First of all, those cards are a few generations apart, with widely different architectures.

Second, the 5970 was a dual-GPU card, similar to two 5850s in Crossfire, and Crossfire/SLI will not give 100% scaling when two GPUs work as one.

So you are trying to skew the facts by using the graph in the wrong way.

Also, some games on the 5970 do run better than on the 7970 despite the inefficiency of using two GPUs in Crossfire, but it is very selective, since a lot of newer games don't take advantage of Crossfire well and they usually require much more VRAM than the 5970 has.

A more accurate comparison would be to take the benchmark results of a 5850, double the fps, and compare that to the 7970, since the 5970's on-paper stats are effectively a doubled 5850 even though Crossfire can't use two GPUs at 100% efficiency.
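To make that last point concrete, here's a rough sketch; the 80% scaling factor is an assumption standing in for typical Crossfire behaviour, not a measured number:

```python
# Paper specs double with a second GPU; delivered fps usually don't (scaling factor assumed).
def crossfire_fps_estimate(single_gpu_fps, scaling=0.80):
    """Estimate dual-GPU fps from a single-GPU result and an assumed scaling efficiency."""
    return single_gpu_fps * (1 + scaling)

single_5850_fps = 30.0
print(2 * single_5850_fps)                       # what the doubled paper specs imply: 60 fps
print(crossfire_fps_estimate(single_5850_fps))   # a more realistic estimate: ~54 fps
```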

#226 GioVela2010
Member since 2008 • 5566 Posts

@RyviusARC said:
@GioVela2010 said:

HD 5970 = 4640 GFLOPS

HD 7970 = 3788 GFLOPS

You tried to cherry-pick the worst-case scenario.

First of all, those cards are a few generations apart, with widely different architectures.

Second, the 5970 was a dual-GPU card, similar to two 5850s in Crossfire, and Crossfire/SLI will not give 100% scaling when two GPUs work as one.

So you are trying to skew the facts by using the graph in the wrong way.

Also, some games on the 5970 do run better than on the 7970 despite the inefficiency of using two GPUs in Crossfire, but it is very selective, since a lot of newer games don't take advantage of Crossfire well and they usually require much more VRAM than the 5970 has.

A more accurate comparison would be to take the benchmark results of a 5850, double the fps, and compare that to the 7970, since the 5970's on-paper stats are effectively a doubled 5850 even though Crossfire can't use two GPUs at 100% efficiency.

So what you're saying is GFLOPS isn't everything.

thanks

#227 RyviusARC
Member since 2011 • 5708 Posts

@GioVela2010 said:
@RyviusARC said:
@GioVela2010 said:

HD 5970 = 4640 GFLOPS

HD 7970 = 3788 GFLOPS

You tried to cherry-pick the worst-case scenario.

First of all, those cards are a few generations apart, with widely different architectures.

Second, the 5970 was a dual-GPU card, similar to two 5850s in Crossfire, and Crossfire/SLI will not give 100% scaling when two GPUs work as one.

So you are trying to skew the facts by using the graph in the wrong way.

Also, some games on the 5970 do run better than on the 7970 despite the inefficiency of using two GPUs in Crossfire, but it is very selective, since a lot of newer games don't take advantage of Crossfire well and they usually require much more VRAM than the 5970 has.

A more accurate comparison would be to take the benchmark results of a 5850, double the fps, and compare that to the 7970, since the 5970's on-paper stats are effectively a doubled 5850 even though Crossfire can't use two GPUs at 100% efficiency.

So what you're saying is GFLOPS isn't everything.

thanks

It isn't everything, but it is very reliable when it comes to measuring the performance of two GPUs of the same or similar architecture.

And your post was flawed because it was comparing a dual-GPU card to a single GPU. It makes your argument null. You lost this one.

#228 UnnDunn
Member since 2002 • 3981 Posts

I love how obsessed PC gamers are at proving they can beat the Xbox One X. Every week since the One X was announced, we've had a new one of these.

Of course, PC gamers are also obsessed with frame rates above everything else, so no-one would actually buy one of these $500 "potato-mashers." They just build them to make themselves feel superior to Xbox owners.

#229 scatteh316
Member since 2004 • 10273 Posts

@Xplode_games said:
@scatteh316 said:
@Xplode_games said:
@scatteh316 said:
@xhawk27 said:

It means something when apps that require more compute calculations are released for the 1X.

Yes it will..... as the RX580 will perform better as it has more compute power.......

RX580 = 6.1Tflops

Xbox X = 6Tflops

It's not a huge difference but the RX580 IS faster....... why are you finding it so difficult to understand this?

It's computer basics.

What the hell are you talking about? Now we are counting teraflops to determine performance? These noobs need more training. Why don't you learn what the hell you're talking about before making an ignorant post? The X1X GPU has a more advanced implementation of GCN with many Vega features, larger cache and a lot more memory bandwidth. But you think all of that is beat by a 0.1 teraflop advantage? WTF, and you say you're the expert?

Oh look, a noob....... I have to dumb down hardware talk for people in here (even you) as most noobs in here don't understand the hardware properly enough to have a deep and meaningful conversation about it.

People understand Tflops.... so that's just the easiest way to get people to understand.

But as you've piped up let me school you

1. Xbox X's GPU does have 'some' of Vega's improvements but not all of them, for example it doesn't have RPM ability, but the GPU in Pro does - So much for an advanced GPU LOL!!

2. A lot more memory bandwidth? I think you'll find 'a lot more' is way off and a big exaggeration.

3. Now you're showing what a noob you are; after all the advantages I've pointed out on the RX580, you think the 0.1 Tflop is the only thing it has going for it over the Xbox X? Lmao...

If that's the case, then why did you criticize another poster who pointed out that the X1X GPU has 40 CUs vs 36 on the RX 580? He was simplifying as well but you went after him then you go on to simplify yourself.

Let's be honest here, the X1X GPU is a little faster than the RX 580 even though it has 0.1 less teraflops, that's clear as day. However, if you are looking for the PC GPU that is most similar to the X1X, then it's reasonable to say it's the RX 580.

NO............ The other poster tried to say that the X is faster because it has more CUs.... which is just plain wrong..... and that wasn't him just trying to 'simplify' things.

Let's be honest here........ it's really not..... and it's a damn good chunk slower than a GTX1070.

#230 ShepardCommandr
Member since 2013 • 4939 Posts

Good luck finding a 1060 at MSRP.

#231 GioVela2010
Member since 2008 • 5566 Posts

@RyviusARC said:
@GioVela2010 said:
@RyviusARC said:
@GioVela2010 said:

HD 5970 = 4640 GFLOPS

HD 7970 = 3788 GFLOPS

You tried to cherry-pick the worst-case scenario.

First of all, those cards are a few generations apart, with widely different architectures.

Second, the 5970 was a dual-GPU card, similar to two 5850s in Crossfire, and Crossfire/SLI will not give 100% scaling when two GPUs work as one.

So you are trying to skew the facts by using the graph in the wrong way.

Also, some games on the 5970 do run better than on the 7970 despite the inefficiency of using two GPUs in Crossfire, but it is very selective, since a lot of newer games don't take advantage of Crossfire well and they usually require much more VRAM than the 5970 has.

A more accurate comparison would be to take the benchmark results of a 5850, double the fps, and compare that to the 7970, since the 5970's on-paper stats are effectively a doubled 5850 even though Crossfire can't use two GPUs at 100% efficiency.

So what you're saying is GFLOPS isn't everything.

thanks

It isn't everything, but it is very reliable when it comes to measuring the performance of two GPUs of the same or similar architecture.

And your post was flawed because it was comparing a dual-GPU card to a single GPU. It makes your argument null. You lost this one.

Oh please, like you said, FLOPS aren't everything. Especially when comparing different brands of GPUs. Nvidia cards are always more efficient per GFLOP than the more powerful AMD cards at first. But as time goes on, the AMD cards do better and better with modern games and the Nvidia cards start to fall behind.

With that said, the Xbox One X is 6 TFLOPS and has 9GB of VRAM available.

The RX 580 is 6 TFLOPS and has 8GB of VRAM.

The GTX 1060 is 5.1 TFLOPS and has 6GB of video RAM (we won't even mention the 4.6 TFLOP 1060 with 3GB of VRAM).

If you think the GTX 1060 is going to last as long with its 5.1 TFLOPS vs an X1X with 6 TFLOPS, then Nvidia fanboys are in for a rude awakening in a couple of years. (It's close now, but I still say the X1X comes out ahead, and the biased video reviewer did nothing to change my mind.)

But don't take my word for it.

https://linustechtips.com/main/topic/785948-do-amd-cards-actually-last-longer/
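For what it's worth, the 5.1 and 4.6 TFLOPS figures for the two 1060 variants follow from cores × 2 FLOPs per clock × clock speed if you assume a real-world boost of roughly 2.0GHz; the reference boost spec is 1709MHz, which gives a lower number:

```python
# FP32 TFLOPS = CUDA cores * 2 FLOPs/clock * clock in GHz (clocks are assumptions).
def pascal_tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000

print(pascal_tflops(1280, 2.000))  # GTX 1060 6GB at ~2GHz in-game boost -> ~5.1
print(pascal_tflops(1152, 2.000))  # GTX 1060 3GB at ~2GHz in-game boost -> ~4.6
print(pascal_tflops(1280, 1.709))  # GTX 1060 6GB at the reference boost -> ~4.4
```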

#232 QuadKnight
Member since 2015 • 12916 Posts

@GioVela2010: Nvidia TFLOPS != AMD TFLOPS.

You've already lost the argument dude, let it go.

#233 GioVela2010
Member since 2008 • 5566 Posts

2013: “WOW, Nvidia cards are so efficient, look how well the 5.3 TFLOP 780 Ti does against a 5.6 TFLOP 290X.”

2016: “Wow, look at the 780 Ti get slaughtered by the 290X.”
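(Those 5.3 and 5.6 TFLOP figures check out with the usual cores × 2 FLOPs × clock arithmetic at the reference boost clocks; a quick sketch:)

```python
# FP32 TFLOPS = shader cores * 2 FLOPs/clock * clock in MHz / 1e6 (reference clocks assumed).
def tflops(cores, clock_mhz):
    return cores * 2 * clock_mhz / 1e6

print(tflops(2880, 928))   # GTX 780 Ti at its 928MHz boost clock   -> ~5.3
print(tflops(2816, 1000))  # R9 290X at its 1000MHz "up to" clock   -> ~5.6
```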

#234  Edited By scatteh316
Member since 2004 • 10273 Posts

@GioVela2010 said:

2013: “WOW, Nvidia cards are so efficient, look how well the 5.3 TFLOP 780 Ti does against a 5.6 TFLOP 290X.”

2016: “Wow, look at the 780 Ti get slaughtered by the 290X.”

Talk about cherry picking....

#235  Edited By Gatygun
Member since 2010 • 2709 Posts

@GioVela2010 said:

2013: “WOW, Nvidia cards are so efficient, look how well the 5.3 TFLOP 780 Ti does against a 5.6 TFLOP 290X.”

2016: “Wow, look at the 780 Ti get slaughtered by the 290X.”

A 3GB card playing at 4K resolutions? Do you know how hardware works? And that's a 700-series-generation card on top of that.

If you honestly bought a 780 Ti for 4K in the future, you simply did everything wrong at that time.

#236  Edited By HalcyonScarlet
Member since 2011 • 13838 Posts

@UnnDunn said:

I love how obsessed PC gamers are at proving they can beat the Xbox One X. Every week since the One X was announced, we've had a new one of these.

Of course, PC gamers are also obsessed with frame rates above everything else, so no-one would actually buy one of these $500 "potato-mashers." They just build them to make themselves feel superior to Xbox owners.

That's what Xbone gamers try to hide behind. But they can't help trying to prove how the xxbone 'owns' PCs, and the minute PC gamers say something back, it's 'PC gamers that are insecure'. Xbone gamers have been quiet all gen, but now that they have a powerful console, they've been the ones making endless threads about it.

#237 GioVela2010
Member since 2008 • 5566 Posts

@Gatygun said:
@GioVela2010 said:

2013: “WOW, Nvidia cards are so efficient, look how well the 5.3 TFLOP 780 Ti does against a 5.6 TFLOP 290X.”

2016: “Wow, look at the 780 Ti get slaughtered by the 290X.”

A 3GB card playing at 4K resolutions? Do you know how hardware works? And that's a 700-series-generation card on top of that.

If you honestly bought a 780 Ti for 4K in the future, you simply did everything wrong at that time.

Keep the excuses coming. 290X benches better most of the time now, the opposite was true 4-5 years ago.

#238  Edited By Gatygun
Member since 2010 • 2709 Posts

@GioVela2010 said:
@Gatygun said:
@GioVela2010 said:

2013: “WOW, Nvidia cards are so efficient, look how well the 5.3 TFLOP 780 Ti does against a 5.6 TFLOP 290X.”

2016: “Wow, look at the 780 Ti get slaughtered by the 290X.”

A 3GB card playing at 4K resolutions? Do you know how hardware works? And that's a 700-series-generation card on top of that.

If you honestly bought a 780 Ti for 4K in the future, you simply did everything wrong at that time.

Keep the excuses coming. 290X benches better most of the time now, the opposite was true 4-5 years ago.

What excuse? If you bought a 780 Ti expecting to run 4K resolutions years down the road, you were doing it wrong.

Where is the excuse in anything?

That whole chart was junk.

And what does that chart have to do with my comment, or how does it matter to the performance of the Xbox One X?

That chart also doesn't prove anything in relation to the Xbox One X.

#239  Edited By GioVela2010
Member since 2008 • 5566 Posts

@Gatygun said:
@GioVela2010 said:
@Gatygun said:
@GioVela2010 said:

2013: “WOW, Nvidia cards are so efficient, look how well the 5.3 TFLOP 780 Ti does against a 5.6 TFLOP 290X.”

2016: “Wow, look at the 780 Ti get slaughtered by the 290X.”

A 3GB card playing at 4K resolutions? Do you know how hardware works? And that's a 700-series-generation card on top of that.

If you honestly bought a 780 Ti for 4K in the future, you simply did everything wrong at that time.

Keep the excuses coming. 290X benches better most of the time now, the opposite was true 4-5 years ago.

What excuse? If you bought a 780 Ti expecting to run 4K resolutions years down the road, you were doing it wrong.

Where is the excuse in anything?

That whole chart was junk.

And what does that chart have to do with my comment, or how does it matter to the performance of the Xbox One X?

That chart also doesn't prove anything in relation to the Xbox One X.

Literally in one ear and out the other

#240  Edited By Xplode_games
Member since 2011 • 2540 Posts

@scatteh316 said:
@Xplode_games said:
@scatteh316 said:
@Xplode_games said:

What the hell are you talking about? Now we are counting teraflops to determine performance? These noobs need more training. Why don't you learn what the hell you're talking about before making an ignorant post? The X1X GPU has a more advanced implementation of GCN with many Vega features, larger cache and a lot more memory bandwidth. But you think all of that is beat by a 0.1 teraflop advantage? WTF, and you say you're the expert?

Oh look, a noob....... I have to dumb down hardware talk for people in here (even you) as most noobs in here don't understand the hardware properly enough to have a deep and meaningful conversation about it.

People understand Tflops.... so that's just the easiest way to get people to understand.

But as you've piped up let me school you

1. Xbox X's GPU does have 'some' of Vega's improvements but not all of them, for example it doesn't have RPM ability, but the GPU in Pro does - So much for an advanced GPU LOL!!

2. A lot more memory bandwidth? I think you'll find 'a lot more' is way off and a big exaggeration.

3. Now you're showing what a noob you are; after all the advantages I've pointed out on the RX580, you think the 0.1 Tflop is the only thing it has going for it over the Xbox X? Lmao...

If that's the case, then why did you criticize another poster who pointed out that the X1X GPU has 40 CUs vs 36 on the RX 580? He was simplifying as well but you went after him then you go on to simplify yourself.

Let's be honest here, the X1X GPU is a little faster than the RX 580 even though it has 0.1 less teraflops, that's clear as day. However, if you are looking for the PC GPU that is most similar to the X1X, then it's reasonable to say it's the RX 580.

NO............ The other poster tried to say that the X is faster because it has more CUs.... which is just plain wrong..... and that wasn't him just trying to 'simplify' things.

Let's be honest here........ it's really not..... and it's a damn good chunk slower than a GTX1070.

And you tried to say that the RX 580 is faster because it has more teraflops, which is also just plain wrong. You were both simplifying. He said 40 CUs vs 36 CUs, which to anyone would look obviously faster. You were saying 6.1 vs 6.0, which again looks obviously faster to anyone.

Technically you were both wrong. In the end you were more wrong, because his point was that the X's GPU is faster than the RX 580 and he was right. You were claiming the RX 580 was faster than the X GPU, and the only thing you had to support your argument was the bogus 0.1 teraflop advantage. We both agreed it was bogus, and your only defense for it is that you were trying to simplify.

What makes it worse is that at the end of your last post, you drop the RX 580 and move on to the GTX 1070 to claim that it is much faster than the X. Well, nice moving of the goalposts, but you know what, no it's not. Sometimes it's a lot faster, sometimes it's a little faster, sometimes it's the same, and sometimes it's a bit slower. It depends on the game.

The Xbox One X is a powerful console and a spectacular value at $499, and even better with gift cards and games included if you find a deal. It is crazy to nitpick and say wait, maybe an RX 580 is 0.1 teraflops faster, when it's actually slower overall because of other things, LOL! Do you realize how absurd that is?

#241 Juub1990
Member since 2013 • 12622 Posts

@Xplode_games: I've never seen the 1070 slower than the X1X.

#242 UnnDunn
Member since 2002 • 3981 Posts

@HalcyonScarlet said:
@UnnDunn said:

I love how obsessed PC gamers are at proving they can beat the Xbox One X. Every week since the One X was announced, we've had a new one of these.

Of course, PC gamers are also obsessed with frame rates above everything else, so no-one would actually buy one of these $500 "potato-mashers." They just build them to make themselves feel superior to Xbox owners.

That's what Xbone gamers try to hide behind. But they can't help trying to prove how the xxbone 'owns' PCs, and the minute PC gamers say something back, it's 'PC gamers that are insecure'. Xbone gamers have been quiet all gen, but now that they have a powerful console, they've been the ones making endless threads about it.

Haha. Xbox gamers are excited about Xbox One X and they talk about it, but never in the context of it "beating PC" (except maybe here on System Wars.) But you can't have a comment thread talking about console gaming anywhere on the Internet without PC gamers crashing it, saying "buy a PC!!!!!" and linking to videos like this one "proving" how some low-spec PC beats the console.

You want to talk about insecure? PC gamers have an entire subreddit where they do nothing but brag about framerates and RGB, call console players "peasants" and declare PCs to be objectively better than consoles (without ever providing an objective way to measure how "good" they are.)

#243 hrt_rulz01
Member since 2006 • 22687 Posts

@HalcyonScarlet said:
@UnnDunn said:

I love how obsessed PC gamers are at proving they can beat the Xbox One X. Every week since the One X was announced, we've had a new one of these.

Of course, PC gamers are also obsessed with frame rates above everything else, so no-one would actually buy one of these $500 "potato-mashers." They just build them to make themselves feel superior to Xbox owners.

That's what Xbone gamers try to hide behind. But they can't help trying to prove how the xxbone 'owns' PCs, and the minute PC gamers say something back, it's 'PC gamers that are insecure'. Xbone gamers have been quiet all gen, but now that they have a powerful console, they've been the ones making endless threads about it.

Lmao... uhhh no. Cows are the ones that post a topic about XB1 X every fu*king day. They're obsessed with it (*cough*Quacknight *cough*).

#244 QuadKnight
Member since 2015 • 12916 Posts

@hrt_rulz01: Someone is obsessed with me?

#245  Edited By hrt_rulz01
Member since 2006 • 22687 Posts

@quadknight said:

@hrt_rulz01: Someone is obsessed with me ?

Yeah, I can't stop thinking about you all day and night...

Lol.

#246  Edited By QuadKnight
Member since 2015 • 12916 Posts

@hrt_rulz01: Keep lying to yourself, you sick bastard lol. I'm changing all my locks.

#247  Edited By hrt_rulz01
Member since 2006 • 22687 Posts

@quadknight: Your couch looks nice through the window from your front yard btw... lmao.

#248 QuadKnight
Member since 2015 • 12916 Posts

@hrt_rulz01: Oh shit.....

#249 hrt_rulz01
Member since 2006 • 22687 Posts

@quadknight: Lmao.

#250 clone01
Member since 2003 • 29844 Posts
[Embedded video]