Those who said the X1X GPU is a match for the 1070 come forward and apologize

#501  Edited By commander
Member since 2010 • 16217 Posts

@04dcarraher said:

A 5-20% difference is comparable... and the only way i5 hexacores will widen the gap over the i7 7700K is when the next generation comes out and IPC improves again through the typical tick-tock progression Intel tends to follow. Right now the i5 8600K isn't outright slaughtering the 7700K; it's on par at stock and slightly faster when both are overclocked.

The smaller differences are just because some games are optimized for quad cores with Hyper-Threading. That gives a warped picture; it's obvious if you look at benchmarks that target raw CPU power.

#502  Edited By deactivated-5ebd39d683340
Member since 2005 • 4089 Posts

@04dcarraher: Why recommend an 8600K over a 1600X at all? The 20-30% performance increase will flip the other way in two years when HT does come into play. Motherboards are cheaper as well. A 1700X with a decent AM4 mobo can be had for less than an 8600K on an LGA1151 mobo. Both OC'd show a negligible difference in speed while gaming, but a substantial difference in longevity and productivity.

#503 04dcarraher
Member since 2004 • 23858 Posts

Like which games? Because it sounds like you're grasping for more straws... if you looked at the pure power benchmarks you wouldn't be in denial...

Cinebench, Time Spy's CPU test, Blender, HandBrake, PCMark, POV-Ray, etc. show they're on par at stock, with a 20% or smaller difference when both are overclocked...

#504 commander
Member since 2010 • 16217 Posts

@Juub1990 said:

@commander: The fact they have to drop the resolution has nothing to do with the CPU. Get a clue.

You do realize the GPU is used for CPU tasks as well nowadays?

#505 Juub1990
Member since 2013 • 12622 Posts

@commander: You’re full of shit. This game has a GPU bottleneck. Not CPU.

#506 04dcarraher
Member since 2004 • 23858 Posts

@jahnee said:

@04dcarraher: Why recommend an 8600K over a 1600X at all? The 20-30% performance increase will flip the other way in two years when HT does come into play. Motherboards are cheaper as well. A 1700X with a decent AM4 mobo can be had for less than an 8600K on an LGA1151 mobo. Both OC'd show a negligible difference in speed while gaming, but a substantial difference in longevity and productivity.

I never said I would... it all depends on the budget and goals of the person. Right now Ryzen's IPC is on par with Intel's 4th-gen Core chips, while Intel's 8th gen is roughly 30% faster clock for clock. Personally I would pick the Ryzen 1600X over the i5 8600K because of the headroom from Ryzen's SMT ("Hyper-Threading").
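For what it's worth, a toy sketch of that clock-for-clock arithmetic (the IPC gap is the rough figure quoted above; the clock speeds are assumptions for illustration, not measured overclocks):

```python
# Toy model: per-core performance ~ IPC x clock.
# All figures are the rough numbers quoted above or labeled assumptions.
ryzen_ipc = 1.00       # baseline (said to be ~Intel 4th-gen Core level)
intel_8th_ipc = 1.30   # "roughly 30% faster clock for clock"

ryzen_clock = 4.0      # GHz, assumed all-core overclock for a 1600X
intel_clock = 5.0      # GHz, assumed all-core overclock for an 8600K

ratio = (intel_8th_ipc * intel_clock) / (ryzen_ipc * ryzen_clock)
print(f"Per-core ratio (8600K : 1600X) = {ratio:.2f}x")

# The SMT "headroom" argument: threaded workloads see 12 hardware
# threads on the 1600X versus 6 cores (6 threads) on the 8600K.
print(f"Hardware thread ratio (1600X : 8600K) = {12 / 6:.1f}x")
```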

#507  Edited By ronvalencia
Member since 2008 • 29612 Posts

@appariti0n said:
@commander said:
@Juub1990 said:

@commander: You're spreading nonsense. You got quoted spreading it, and now you're trying to save face by walking back the bullshit you said. "The 7700K gets murdered" has become "oh well, whoever said the difference was slight will be proven wrong in the future", lol. How is that not spreading nonsense?

You and your posse should just man up and admit you were wrong, but you can't. You even attempted to turn blatant losses into stalemates or moral victories. You've made an idiot of yourself in this very thread. Should have guessed as much when you showed you weren't smart enough to find an up-to-date benchmark and are still struggling to use Google.

Don't hurt your brain trying to wrap your head around my post now. I know it's very difficult for you to scroll down and read it, let alone understand it.

but it does get murdered...

and that's what we see right now; it's only going to get worse in the future.

In a random YouTube video that specifically focuses on games where six physical cores perform better than 4C/8T? Sure. This is your best-case, cherry-picked scenario right here, and it's what... 15-20% faster? Good god, no wonder you and Ron are so tight.

I love how eurogamer.net is suddenly not relevant, even though you authoritatively posted benchmarks from there claiming ownage when you erroneously thought they supported your case.

Until we pointed out that you weren't even benchmarking the correct CPU, and that the results actually work AGAINST you. Then you giggled like a schoolgirl, saying "hehe", when you knew you were caught like a deer in the headlights.

The 8600K being on a whole different level of performance has been proven to be bullshit.

Your claim of the 7700K's 8 virtual cores being dwarfed by 6 physical cores has been chopped off at the knees.

And the only thing being MURDERED around here is the last shred of credibility you might have had.

WTF? Leave me out of the 7700K vs 8xxx debate. My view on PC CPUs almost mirrors 04dcarraher's.

The main reason for many hardware threads is the thread context-switch overhead cost in the future, e.g. a PS5 or X1X.next would have mobile Ryzens.

#508  Edited By commander
Member since 2010 • 16217 Posts

@Juub1990 said:

@commander: You’re full of shit. This game has a GPU bottleneck. Not CPU.

lol, getting salty when confronted with the facts.

It's widely known that GPGPU tools are used this gen; why do you think the PS4 and Xbox One have such weak CPUs in the first place?

#509  Edited By Juub1990
Member since 2013 • 12622 Posts
@commander said:

lol, getting salty when confronted with the facts.

It's widely known that GPGPU tools are used this gen; why do you think the PS4 and Xbox One have such weak CPUs in the first place?

Please prove the game has a CPU bottleneck. Otherwise you're full of shit.

[Embedded video]

Here is the game running at max settings with a G4560 + a GTX 1060 at 768p (to prevent a GPU bottleneck). Look how easily it stays way above 30fps and even hits 50fps+ in desert areas. You previously said the X1X's CPU beats the G4560, so how the hell is this game bottlenecked by a CPU faster than a G4560, which has no issue maintaining 30+fps?
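The logic of that 768p test can be sketched as a simple decision rule (a crude heuristic with placeholder numbers, not measurements from the video):

```python
def likely_bottleneck(fps_low_res: float, fps_high_res: float,
                      tolerance: float = 0.05) -> str:
    """Crude heuristic: resolution mostly loads the GPU, so if the frame
    rate barely moves when you drop the resolution, the CPU was the
    limiter all along; if it jumps, the GPU was the limiter."""
    if fps_low_res <= fps_high_res * (1 + tolerance):
        return "CPU-bound (dropping resolution changed nothing)"
    return "GPU-bound at the higher resolution"

# Placeholder numbers in the spirit of the test above: 768p vs 1080p.
print(likely_bottleneck(fps_low_res=50.0, fps_high_res=32.0))
```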

#510 commander
Member since 2010 • 16217 Posts

@ronvalencia said:

WTF? Leave me out of the 7700K vs 8xxx debate. My view on PC CPUs almost mirrors 04dcarraher's.

So you would say the i5 8600K and i7 7700K are comparable when both are overclocked?

#511  Edited By mazuiface
Member since 2016 • 1617 Posts

Eleven pages, and no one who originally jumped on the hype has really admitted that the X1X does not perform like a 1070. Lots of goalpost-moving and talk about different PC CPUs (both of which are better than console CPUs) -- none of which argues anything about the console supposedly performing like a 1070.

This is going to be a long thread. All of this could have been avoided if Xbox fanboys hadn't hyped their new mid-gen upgrade as being as strong as a 1070 and had just waited.

#512 appariti0n
Member since 2009 • 5193 Posts

@commander Let's summarize then, shall we? (Oh boy, I love doing these.)

@commander said:

"It will be a cold day in hell before the 7700k comes close to 8600k performance"

"It will be significant enough to put the 8600k in whole different range of performance"

"the two extra cores on the I5 8600k will dwarf the two extra threads on the I7 7700 and blow it out of the water."

"The I5 8600k has 50 percent more cores, the I7 7700k 50 percent more threads. That's 50 percent vs the half of 15-30, so that's like what 8-15 percent.

Sure it narrows the gap a bit, but the I5 is still about 35-42 percent faster."

"But it does get murdered..."

You proceeded to post this gem of an image:

Suddenly comparing the 8700K, which has 6 cores and 12 threads. Not the 8600K. Oops.

So from the exact same website you used, eurogamer.net, I posted this, with the correct processors being compared: (exact same games you tried to cherry pick too!)

Oh, what do you know. Very... you know... COMPARABLE.

I think we all agree with @04dcarraher at this point when he says:

"What in the world is commander on?"

But wait, it gets better!

Here comes the backpedalling, while still claiming you're right somehow.

@commander said:

"when I5's hexacores become more mainstream the differences are going to get bigger."

"Sure the I5 8600k won't give that much performance difference today when you compare average benchmark results"

"I don't need benchmarks to know this"

"50 percent more cores is 50 percent more cpu power, it might not translate directly into real world applications like games"

"I may have exagerrated a bit when I said murder"

And then you post this:

With the wrong processor again, LOL. Showing what you thought was the 7700K getting MURDERED by the 8600K. It's actually the 7600K now: a quad core with no Hyper-Threading. OOPS.

I fixed it for you though, same website, same game again.

Oh, what do you know... the 7700K at the exact same clocks is pushing 50% more fps than the 7600K you tried to compare with. Imagine that. Rekt again.

And of course, this website, as well as eurogamer.net, is never referenced by you again; instead you switch to some random YouTube channel with more cherry-picked games.

@commander said:

"whoops, I already saw that btw hehe"

Oh, I get it, you were trying to lull us into a false sense of security by pretending to be a moron. Mission accomplished, I'm still fooled.

And now for your final desperate act: attempting to rewrite the definitions of well-established English words!

@commander said:

"It's not how this discussion started, he said they would be comparable."

@appariti0n said:

"So what is your definition of "comparable" then?"

@commander said:

"it's completely the same"

Here lemme help you.

com·pa·ra·ble | ˈkämp(ə)rəb(ə)l | adjective

1. (of a person or thing) able to be likened to another; similar. "flaked stone and bone tools comparable to Neanderthal man's tools"
   synonyms: similar, close, near, approximate, akin, equivalent, commensurate, proportional, proportionate

Oh, that's funny: "completely the same" isn't listed. Guess I need to talk to Webster's about this oversight.

#513  Edited By commander
Member since 2010 • 16217 Posts
@appariti0n said:

@commander: they'll be comparable. The 7700K will likely retain its single-core/overclock performance, yet still be OK in multi-threaded work thanks to HT.

@appariti0n said:

Here lemme help you.

com·pa·ra·ble | ˈkämp(ə)rəb(ə)l | adjective

1. (of a person or thing) able to be likened to another; similar. "flaked stone and bone tools comparable to Neanderthal man's tools"
   synonyms: similar, close, near, approximate, akin, equivalent, commensurate, proportional, proportionate

Oh, that's funny: "completely the same" isn't listed. Guess I need to talk to Webster's about this oversight.

equivalent: equal in value, amount, function.

Try again...

#514 commander
Member since 2010 • 16217 Posts

@Juub1990 said:
@commander said:

lol, getting salty when confronted with the facts.

It's widely known that GPGPU tools are used this gen; why do you think the PS4 and Xbox One have such weak CPUs in the first place?

Please prove the game has a CPU bottleneck. Otherwise you're full of shit.

[Embedded video]

Here is the game running at max settings with a G4560 + a GTX 1060 at 768p (to prevent a GPU bottleneck). Look how easily it stays way above 30fps and even hits 50fps+ in desert areas. You previously said the X1X's CPU beats the G4560, so how the hell is this game bottlenecked by a CPU faster than a G4560, which has no issue maintaining 30+fps?

Really, 768p?

#515  Edited By Juub1990
Member since 2013 • 12622 Posts
@commander said:

Really, 768p?

Yeah, so the GPU wouldn't bottleneck it. You can't read?

#516 waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@commander said:

lol, getting salty when confronted with the facts.

It's widely known that GPGPU tools are used this gen; why do you think the PS4 and Xbox One have such weak CPUs in the first place?

Please prove the game has a CPU bottleneck. Otherwise you're full of shit.

Here is the game running at max settings with a G4560 + a GTX 1060 at 768p (to prevent a GPU bottleneck). Look how easily it stays way above 30fps and even hits 50fps+ in desert areas. You previously said the X1X's CPU beats the G4560, so how the hell is this game bottlenecked by a CPU faster than a G4560, which has no issue maintaining 30+fps?

Your video shows the CPU basically tapped out. Plus, we know Origins is heavily biased toward Nvidia hardware... that makes this game a bad example for comparing anything Xbox One X related.

#517 Juub1990
Member since 2013 • 12622 Posts
@waahahah said:

Your video shows the CPU basically tapped out. Plus, we know Origins is heavily biased toward Nvidia hardware... that makes this game a bad example for comparing anything Xbox One X related.

And that tapped-out CPU, which @commander claims is slower than the X1X's, doesn't drop a single frame below 30 and has no issue punching way above 30, even hitting 60 at times. So how can the CPU be a bottleneck at 30fps on the X1X when a slower CPU isn't one on the unoptimized PC, lol?
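The 30fps target is just a frame-time budget; here is the arithmetic behind "handles 30fps easily" (the 43fps average is the G4560 figure from the benches linked later in the thread):

```python
# Frame-time budgets: a CPU that finishes its per-frame work well inside
# the budget is not the bottleneck at that frame-rate target.
for fps in (30, 43, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# A G4560 averaging 43fps is doing its per-frame CPU work in ~23 ms,
# comfortably inside the 33.3 ms budget that a 30fps cap allows.
```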

#518 commander
Member since 2010 • 16217 Posts

@Juub1990 said:
@waahahah said:

Your video shows the CPU basically tapped out. Plus, we know Origins is heavily biased toward Nvidia hardware... that makes this game a bad example for comparing anything Xbox One X related.

And that tapped-out CPU, which @commander claims is slower than the X1X's, doesn't drop a single frame below 30 and has no issue punching way above 30, even hitting 60 at times. So how can the CPU be a bottleneck at 30fps on the X1X when a slower CPU isn't one on the unoptimized PC, lol?

The Xbox One X solves the bottleneck with GPGPU tools and a DX12 chip, which puts a bigger strain on the GPU.

#519  Edited By waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@waahahah said:

Your video shows the CPU basically tapped out. Plus, we know Origins is heavily biased toward Nvidia hardware... that makes this game a bad example for comparing anything Xbox One X related.

And that tapped-out CPU, which @commander claims is slower than the X1X's, doesn't drop a single frame below 30 and has no issue punching way above 30, even hitting 60 at times. So how can the CPU be a bottleneck at 30fps on the X1X when a slower CPU isn't one on the unoptimized PC, lol?

Did you read the second point? That CPU IS tapped out, but the Nvidia drivers are handling Origins better. It's not a 1:1 CPU comparison.

https://segmentnext.com/2017/10/27/assassins-creed-origins-benchmarks/

Patch 1.03 helped bump Nvidia performance but not AMD's. This is one of those "it's complicated" comparisons; it's hard to make any meaningful determination out of it.

#520  Edited By Juub1990
Member since 2013 • 12622 Posts
@commander said:

The Xbox One X solves the bottleneck with GPGPU tools and a DX12 chip, which puts a bigger strain on the GPU.

I'm sure you can prove that. I'll wait.

@waahahah said:

Did you read the second point? That CPU IS tapped out, but the Nvidia drivers are handling Origins better. It's not a 1:1 CPU comparison.

You never played the game on PC, so you don't know what you're talking about. Even my 4790K shows maximum usage (and is constantly above 80%) in that game, and it's in a completely different league from the G4560. That's at 1080p. It has nothing to do with the NVIDIA drivers. This game will just hog the crap out of CPU resources no matter what you've got.

#521 waahahah
Member since 2014 • 2462 Posts
@Juub1990 said:
@commander said:

The Xbox One X solves the bottleneck with GPGPU tools and a DX12 chip, which puts a bigger strain on the GPU.

I'm sure you can prove that. I'll wait.

@waahahah said:

Did you read the second point? That CPU IS tapped out, but the Nvidia drivers are handling Origins better. It's not a 1:1 CPU comparison.

You never played the game on PC, so you don't know what you're talking about. Even my 4790K shows maximum usage (and is constantly above 80%) in that game, and it's in a completely different league from the G4560. That's at 1080p. It has nothing to do with the NVIDIA drivers. This game will just hog the crap out of CPU resources no matter what you've got.

And there is evidence this game has really good CPU scaling; anything below 8 threads gets hit.

Also, wtf does playing or not playing have to do with being able to read benchmarks online? Don't make that argument; it's actually dumb. A really, really dumb argument. Also, I only own it on PC, but you're right, I haven't played it yet. It was free with my motherboard. So don't assume dumb shit like that.

Secondly... the drivers matter. The 1070 is doing better than a Vega 64 on the same system. This game is seriously Nvidia-biased.

#522 Juub1990
Member since 2013 • 12622 Posts
@waahahah said:

And there is evidence this game has really good CPU scaling; anything below 8 threads gets hit.

Also, wtf does playing or not playing have to do with being able to read benchmarks online? Don't make that argument; it's actually dumb. A really, really dumb argument. Also, I only own it on PC, but you're right, I haven't played it yet. It was free with my motherboard. So don't assume dumb shit like that.

Secondly... the drivers matter. The 1070 is doing better than a Vega 64 on the same system. This game is seriously Nvidia-biased.

Because if you had played the game, you would know the CPU is tapped out no matter what you do. My 4790K at Ultra and 1080p is also tapped out. It shows a constant 80%+ usage and often gets up to 98-99%. It's tapped out whether at 1080p or at 4K.

#523 ronvalencia
Member since 2008 • 29612 Posts

@commander said:
@ronvalencia said:

WTF? Leave me out of the 7700K vs 8xxx debate. My view on PC CPUs almost mirrors 04dcarraher's.

So you would say the i5 8600K and i7 7700K are comparable when both are overclocked?

They are similar, but the i5 8600K has more FPU/SIMD resources, while the 7700K has more thread context storage, i.e. eight threads before a context switch.

I don't feel the i5-8600K is a justifiable upgrade from the 7700K. I'd upgrade from the i7-7700K to the i7-8700K, i.e. there's something missing on the i5-8600K.
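A rough way to put numbers on that trade-off (the SMT yield is a common rule of thumb, not a measured value, and real results vary heavily by workload):

```python
# Rule-of-thumb throughput model for cores vs SMT threads.
SMT_YIELD = 0.28   # an SMT sibling thread often adds ~25-30% of a core
                   # on mixed workloads (assumption, workload-dependent)

i7_7700k_effective = 4 + 4 * SMT_YIELD   # 4 cores + 4 SMT threads
i5_8600k_effective = 6.0                 # 6 full cores, no SMT

print(f"7700K effective cores: {i7_7700k_effective:.2f}")
print(f"8600K effective cores: {i5_8600k_effective:.2f}")
print(f"8600K advantage in threaded work: "
      f"{i5_8600k_effective / i7_7700k_effective - 1:.0%}")
# ~17% here, which squares with "similar, but the 8600K has more real
# execution resources while the 7700K can hold more thread contexts".
```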

#524 Juub1990
Member since 2013 • 12622 Posts
@ronvalencia said:

They are similar, but the i5 8600K has more FPU/SIMD resources, while the 7700K has more thread context storage, i.e. eight threads before a context switch.

I don't feel the i5-8600K is a justifiable upgrade from the 7700K. I'd upgrade from the i7-7700K to the i7-8700K, i.e. there's something missing on the i5-8600K.

Throw in the towel, @commander, it's a mercy killing. Even your pal Ron agrees with them, lol.

#525  Edited By commander
Member since 2010 • 16217 Posts

@Juub1990 said:
@commander said:

The Xbox One X solves the bottleneck with GPGPU tools and a DX12 chip, which puts a bigger strain on the GPU.

I'm sure you can prove that. I'll wait.

@waahahah said:

Did you read the second point? That CPU IS tapped out, but the Nvidia drivers are handling Origins better. It's not a 1:1 CPU comparison.

You never played the game on PC, so you don't know what you're talking about. Even my 4790K shows maximum usage (and is constantly above 80%) in that game, and it's in a completely different league from the G4560. That's at 1080p. It has nothing to do with the NVIDIA drivers. This game will just hog the crap out of CPU resources no matter what you've got.

Either way, one game is not enough to prove your point.

GPGPU is not new:

http://www.redgamingtech.com/ubisoft-gdc-presentation-of-ps4-x1-gpu-cpu-performance/

Ubisoft is actually one of the first that started with this on consoles.

You haven't heard about the DX12 chip in the Xbox One X?
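As a toy illustration of the GPGPU idea being argued about here, this sketch uses NumPy vectorization as a stand-in for a GPU: the point in both cases is batching a data-parallel CPU task (the culling example is hypothetical) into one wide operation instead of a scalar per-object loop:

```python
import numpy as np

# Hypothetical data-parallel "CPU task": crude visibility culling of
# many objects. The scalar loop is per-object CPU work; the batched
# mask is the GPGPU-style formulation of the same job.
rng = np.random.default_rng(0)
positions = rng.uniform(-100, 100, size=(100_000, 3))

# Scalar path: one object at a time.
visible_scalar = [p for p in positions
                  if abs(p[0]) < 50 and abs(p[2]) < 50]

# Batched path: one wide operation over every object at once.
mask = (np.abs(positions[:, 0]) < 50) & (np.abs(positions[:, 2]) < 50)

assert len(visible_scalar) == int(mask.sum())
print(f"{int(mask.sum())} of {len(positions)} objects pass the cull")
```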

#526 Juub1990
Member since 2013 • 12622 Posts
@commander said:

Either way, one game is not enough to prove your point.

GPGPU is not new:

http://www.redgamingtech.com/ubisoft-gdc-presentation-of-ps4-x1-gpu-cpu-performance/

Ubisoft is actually one of the first that started with this on consoles.

You haven't heard about the DX12 chip in the Xbox One X?

No, I'm asking you to prove resources are offloaded from the GPU to help the CPU maintain the target 30fps. Something you won't do.

#527  Edited By waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@waahahah said:

And there is evidence this game has really good CPU scaling; anything below 8 threads gets hit.

Also, wtf does playing or not playing have to do with being able to read benchmarks online? Don't make that argument; it's actually dumb. A really, really dumb argument. Also, I only own it on PC, but you're right, I haven't played it yet. It was free with my motherboard. So don't assume dumb shit like that.

Secondly... the drivers matter. The 1070 is doing better than a Vega 64 on the same system. This game is seriously Nvidia-biased.

Because if you had played the game, you would know the CPU is tapped out no matter what you do. My 4790K at Ultra and 1080p is also tapped out. It shows a constant 80%+ usage and often gets up to 98-99%. It's tapped out whether at 1080p or at 4K.

Right, because it's an older processor.

You pretty much just lost all credibility trying to make this argument: "I played it on a particular bit of hardware, so I can comment on all hardware." It's not tapped out on all hardware, but settings of High or above seem to stress the CPU; the game has really good CPU scaling and uses at least 8 threads. Although I'm not sure if they are using fixed threads or a thread pool... Either way, Jaguars aren't going to cut it with their IPC...

http://digiworthy.com/2017/10/29/assassins-creed-origins-cpu-benches/

You don't have to play the game if you can see thorough testing on multiple hardware configurations. This should be a trivial concept. And in fact, if you're basing your argument on your own experience, it's... mentally challenged...

#528 appariti0n
Member since 2009 • 5193 Posts

@Juub1990 said:
@ronvalencia said:

They are similar, but the i5 8600K has more FPU/SIMD resources, while the 7700K has more thread context storage, i.e. eight threads before a context switch.

I don't feel the i5-8600K is a justifiable upgrade from the 7700K. I'd upgrade from the i7-7700K to the i7-8700K, i.e. there's something missing on the i5-8600K.

Throw in the towel, @commander, it's a mercy killing. Even your pal Ron agrees with them, lol.

I feel like this is the part in DBZ where Vegeta unexpectedly switches sides to help the good guys.

Thanks Ron!

#529  Edited By commander
Member since 2010 • 16217 Posts

@ronvalencia said:
@commander said:
@ronvalencia said:

WTF? Leave me out of the 7700K vs 8xxx debate. My view on PC CPUs almost mirrors 04dcarraher's.

So you would say the i5 8600K and i7 7700K are comparable when both are overclocked?

They are similar, but the i5 8600K has more FPU/SIMD resources, while the 7700K has more thread context storage, i.e. eight threads before a context switch.

I don't feel the i5-8600K is a justifiable upgrade from the 7700K. I'd upgrade from the i7-7700K to the i7-8700K, i.e. there's something missing on the i5-8600K.

You do realize the i7 7700K gets beaten across the board when both are overclocked, and a 20 percent difference is not the exception. Six cores also means six threads, while having 50 percent more cores.

[Embedded video]

#530  Edited By Juub1990
Member since 2013 • 12622 Posts
@waahahah said:

Right, because it's an older processor.

You pretty much just lost all credibility trying to make this argument: "I played it on a particular bit of hardware, so I can comment on all hardware." It's not tapped out on all hardware, but settings of High or above seem to stress the CPU; the game has really good CPU scaling and uses at least 8 threads. Although I'm not sure if they are using fixed threads or a thread pool... Either way, Jaguars aren't going to cut it with their IPC...

http://digiworthy.com/2017/10/29/assassins-creed-origins-cpu-benches/

You still don't get it, do you? The GPU usage in % shows it's nearly maxed out when it isn't. When I'm running the game at 4K/Ultra and drop to 40fps, it shows the GPU usage at 100% but it clearly isn't the case. The GPU is the sole bottleneck at those settings and resolution. It doesn't stop my 4790K from going way above 60fps at lower resolutions regardless. The benches you linked make no mention of GPU usage and simply show the frame rates, which isn't what we're arguing about.

We're arguing about how much the CPU is being taxed; then you argue the NVIDIA drivers are handling Origins better, which has **** all to do with what we're discussing. We're discussing whether the X1X has a CPU bottleneck or a GPU bottleneck in Origins. The G4560 easily handles 30fps and much, much more, so how is the X1X, which supposedly sports a faster CPU, bottlenecked at a mere 30fps, needing to offload CPU tasks to the GPU and stressing it so much that it is forced to bump down the resolution by an average of 40% to maintain the frame rate? How shitty is the X1X CPU in that case, if its GPU, which is supposedly comparable to a 1070, needs to cut things back so badly that resolution is decreased by 40%+ and settings are also dropped to help out the CPU? Without GPGPU, would the CPU bog that game down to 10fps?

This makes no sense.

Edit: And those benches you linked have the G4560 averaging 43fps and dropping to 29fps at the lowest, further cementing the notion that this game needing a monster CPU at 30fps is bullshit.
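The "40% resolution" figure above is a pixel-count comparison; a quick check of the arithmetic (the dynamic resolution below is an illustrative value, since Origins scales dynamically, not a measured one):

```python
# Pixel-count drop from native 4K to an illustrative dynamic resolution.
native_4k = 3840 * 2160
dynamic = 2944 * 1656   # example dynamic-res step, an assumption

drop = 1 - dynamic / native_4k
print(f"Pixel count reduced by {drop:.0%}")   # ~41%
```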

#531 commander
Member since 2010 • 16217 Posts

@appariti0n said:
@Juub1990 said:
@ronvalencia said:

They are similar, but the i5 8600K has more FPU/SIMD resources, while the 7700K has more thread context storage, i.e. eight threads before a context switch.

I don't feel the i5-8600K is a justifiable upgrade from the 7700K. I'd upgrade from the i7-7700K to the i7-8700K, i.e. there's something missing on the i5-8600K.

Throw in the towel, @commander, it's a mercy killing. Even your pal Ron agrees with them, lol.

I feel like this is the part in DBZ where Vegeta unexpectedly switches sides to help the good guys.

Thanks Ron!

So he must be right about 1070 performance then?

#532  Edited By ronvalencia
Member since 2008 • 29612 Posts

@commander said:
@ronvalencia said:
@commander said:
@ronvalencia said:

WTF? Leave me out of the 7700K vs 8xxx debate. My view on PC CPUs almost mirrors 04dcarraher's.

So you would say the i5 8600K and i7 7700K are comparable when both are overclocked?

They are similar, but the i5 8600K has more FPU/SIMD resources, while the 7700K has more thread context storage, i.e. eight threads before a context switch.

I don't feel the i5-8600K is a justifiable upgrade from the 7700K. I'd upgrade from the i7-7700K to the i7-8700K, i.e. there's something missing on the i5-8600K.

You do realize the i7 7700K gets beaten across the board when both are overclocked, and a 20 percent difference is not the exception

Overclocking is not the general case. Again, I didn't upgrade to the i5-8600K since it's missing something, i.e. it's i7-8700K or bust. I upgraded to an i7-7820X (8C/16T) at 4.5 GHz on all CPU cores instead, and I play games at 1440p to 4K, not some low-resolution 1080p bullshit.

If I game at 1080p, I'll stick with my old MSI Gaming X GTX 980 Ti OC (7.7 TFLOPS**) with the i7-4790K combo.

**I can still overclock this GPU towards 8.1 TFLOPS.

#533  Edited By Juub1990
Member since 2013 • 12622 Posts
@commander said:

So he must be right about 1070 performance then?

Further proof you're full of shit:

You're the one spreading misinformation here; comparing the Xbox One X chip with the G4560 is ridiculous. This is a custom Jaguar CPU; it murders the G4560.

Source

So supposedly it murders the G4560, which has no issue maintaining 30fps and a lot more in AC Origins, yet it's so slow it needs to offload tasks to the GPU (bogging down its performance so much that it's absolutely unable to compete with a 1070)? Yeah, you've got some explaining to do, bro.

#534  Edited By commander
Member since 2010 • 16217 Posts

@ronvalencia said:
@commander said:
@ronvalencia said:
@commander said:

So you would say the i5 8600K and i7 7700K are comparable when both are overclocked?

They are similar, but the i5 8600K has more FPU/SIMD resources, while the 7700K has more thread context storage, i.e. eight threads before a context switch.

I don't feel the i5-8600K is a justifiable upgrade from the 7700K. I'd upgrade from the i7-7700K to the i7-8700K, i.e. there's something missing on the i5-8600K.

You do realize the i7 7700K gets beaten across the board when both are overclocked, and a 20 percent difference is not the exception

Overclocking is not the general case. Again, I didn't upgrade to the i5-8600K since it's missing something, i.e. it's i7-8700K or bust. I upgraded to an i7-7820X (8C/16T) at 4.5 GHz on all CPU cores instead, and I play games at 1440p to 4K, not some low-resolution 1080p bullshit.

The comparison is when overclocked; we were not discussing the difference when both CPUs run at stock clocks. I can understand you wouldn't upgrade to it, though. I wouldn't either; I mean, you'd already need to buy a new board for it. But I wouldn't say they're similar or comparable in performance when overclocked.

Nice CPU, that i7-7820X. I'm planning on buying a new CPU as well, and I did consider Haswell-E or socket 2066, but those extreme boards are expensive. I'm considering Ryzen too; I need it to feed my Oculus Rift, and for games of course.

The i7-7820X would be above my budget though; that's a $600 CPU, lol, but quite cheap if you consider that 8-core Intel CPUs cost $1000 or even $2000 not so long ago.

I have been eyeballing the 6800K; it seems quite cheap for what it is, but again, the price of the boards is what's holding me back. I may buy one second-hand though.

#535  Edited By waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@waahahah said:

Right, because it's an older processor.

You pretty much just lost all credibility trying to make this argument: "I played it on a particular bit of hardware, so I can comment on all hardware." It's not tapped out on all hardware, but settings of High or above seem to stress the CPU; the game has really good CPU scaling and uses at least 8 threads. Although I'm not sure if they are using fixed threads or a thread pool... Either way, Jaguars aren't going to cut it with their IPC...

http://digiworthy.com/2017/10/29/assassins-creed-origins-cpu-benches/

You still don't get it, do you? The GPU usage in % shows it's nearly maxed out when it isn't. When I'm running the game at 4K/Ultra and drop to 40fps, it shows the GPU usage at 100% but it clearly isn't the case. The GPU is the sole bottleneck at those settings and resolution. It doesn't stop my 4790K from going way above 60fps at lower resolutions regardless. The benches you linked make no mention of GPU usage and simply show the frame rates, which isn't what we're arguing about.

We're arguing about how much the CPU is being taxed; then you argue the NVIDIA drivers are handling Origins better, which has **** all to do with what we're discussing. We're discussing whether the X1X has a CPU bottleneck or a GPU bottleneck in Origins. The G4560 easily handles 30fps and much, much more, so how is the X1X, which supposedly sports a faster CPU, bottlenecked at a mere 30fps, needing to offload CPU tasks to the GPU and stressing it so much that it is forced to bump down the resolution by an average of 40% to maintain the frame rate? How shitty is the X1X CPU in that case, if its GPU, which is supposedly comparable to a 1070, needs to cut things back so badly that resolution is decreased by 40%+ and settings are also dropped to help out the CPU? Without GPGPU, would the CPU bog that game down to 10fps?

This makes no sense.

The link is explicitly testing the CPU; everything else is fixed, or as much as it can be. The vast majority of CPUs are bottlenecked in Origins.

Also, resolution has nothing to do with the CPU; it's one of the few settings you change to see where the GPU turns into a bottleneck, since it doesn't add load to the CPU.

So why is the resolution lower? Because the other link I pointed out showed how badly AMD hardware in general performs in this game. The Vega 64/56 have both ended up performing better than a 1070 in many cases on PC, yet both are under a 1070 in this game.

Many games have come in very close to the 1070; some haven't, where there are obvious CPU issues maintaining 30fps but not 60fps. Other times, like Origins, we see a clear issue with the PC version running on AMD hardware. Do I know why it's running slow? No, but there are now too many factors in this comparison. Something you seem to be failing to grasp, since it's your best "proof" that the AMD GPU isn't close to the 1070 level. Yet we can clearly see it near the 1070 in other tests, so... the short answer is it's complicated. It's definitely close, but the differences we are seeing can be summed up as poor drivers, poor optimization for AMD hardware, a CPU bottleneck, GPGPU taking up additional resources, or the architecture just not working well with certain types of game loads. We won't know, because we don't have access to the game engine or a detailed analysis of where Origins is falling over on Xbox.

#536 Juub1990
Member since 2013 • 12622 Posts
@waahahah said:

The link is explicitly testing the CPU; everything else is fixed, or as much as it can be. The vast majority of CPUs are bottlenecked in Origins.

Also, resolution has nothing to do with the CPU; it's one of the few settings you change to see where the GPU turns into a bottleneck, since it doesn't add load to the CPU.

So why is the resolution lower? Because the other link I pointed out showed how badly AMD hardware in general performs in this game. The Vega 64/56 have both ended up performing better than a 1070 in many cases on PC, yet both are under a 1070 in this game.

Many games have come in very close to the 1070; some haven't, where there are obvious CPU issues maintaining 30fps but not 60fps. Other times, like Origins, we see a clear issue with the PC version running on AMD hardware. Do I know why it's running slow? No, but there are now too many factors in this comparison. Something you seem to be failing to grasp, since it's your best "proof" that the AMD GPU isn't close to the 1070 level. Yet we can clearly see it near the 1070 in other tests, so... the short answer is it's complicated. It's definitely close, but the differences we are seeing can be summed up as poor drivers, poor optimization for AMD hardware, a CPU bottleneck, GPGPU taking up additional resources, or the architecture just not working well with certain types of game loads. We won't know, because we don't have access to the game engine or a detailed analysis of where Origins is falling over on Xbox.

I know resolution has nothing to do with the CPU. That's why I showed the game running at 768p: to make sure the CPU wasn't held back by the GPU.

"Many games have come in very close to the 1070." Please list those games.

#537 waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@waahahah said:

The link is explicitly testing the CPU; everything else is fixed, or as much as it can be. The vast majority of CPUs are bottlenecked in Origins.

Also, resolution has nothing to do with the CPU; it's one of the few settings you change to see where the GPU turns into a bottleneck, since it doesn't add load to the CPU.

So why is the resolution lower? Because the other link I pointed out showed how badly AMD hardware in general performs in this game. The Vega 64/56 have both ended up performing better than a 1070 in many cases on PC, yet both are under a 1070 in this game.

Many games have come in very close to the 1070; some haven't, where there are obvious CPU issues maintaining 30fps but not 60fps. Other times, like Origins, we see a clear issue with the PC version running on AMD hardware. Do I know why it's running slow? No, but there are now too many factors in this comparison. Something you seem to be failing to grasp, since it's your best "proof" that the AMD GPU isn't close to the 1070 level. Yet we can clearly see it near the 1070 in other tests, so... the short answer is it's complicated. It's definitely close, but the differences we are seeing can be summed up as poor drivers, poor optimization for AMD hardware, a CPU bottleneck, GPGPU taking up additional resources, or the architecture just not working well with certain types of game loads. We won't know, because we don't have access to the game engine or a detailed analysis of where Origins is falling over on Xbox.

I know resolution has nothing to do with the CPU. That's why I showed the game running at 768p: to make sure the CPU wasn't held back by the GPU.

"Many games have come in very close to the 1070." Please list those games.

They've already been listed in this thread.

#538 Juub1990
Member since 2013 • 12622 Posts
@waahahah said:

They've already been listed in this thread.

They haven't. Link the post. I know it's not true, simply because not many games come close to the performance of the 1070. The only games that do are the ones held back by the CPU, which is irrelevant, because those games are so undemanding that they leave a shitload of overhead on a PC sporting a 1070.

#539  Edited By waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@waahahah said:

They've already been listed in this thread.

They haven't. Link the post. I know it's not true, simply because not many games come close to the performance of the 1070. The only games that do are the ones held back by the CPU, which is irrelevant, because those games are so undemanding that they leave a shitload of overhead on a PC sporting a 1070.

The example that I used earlier is Gears 4: a 1070 at 4K Ultra was 30fps, same as the Xbox One X. If that's not close, then I don't know what is... And it had a bottleneck at the lower resolution: if you wanted 60fps instead, it didn't maintain 60fps like the 1070 did on PC. When the GPU is no longer the limiting factor, the CPU becomes a bottleneck on Xbox.

I don't think you understood the argument from the CPU-bottleneck perspective.

Edit:

Based on some of your previous statements, you also don't seem to get that different settings can affect CPU/GPU load differently.

#540 Juub1990
Member since 2013 • 12622 Posts
@waahahah said:

The example that I used earlier is Gears 4: a 1070 at 4K Ultra was 30fps, same as the Xbox One X. If that's not close, then I don't know what is... And it had a bottleneck at the lower resolution: if you wanted 60fps instead, it didn't maintain 60fps like the 1070 did on PC. When the GPU is no longer the limiting factor, the CPU becomes a bottleneck on Xbox.

I don't think you understood the argument from the CPU-bottleneck perspective.

Gears of War 4 averages 36fps at 4K/Ultra settings on a 1070. That's a 20% bump over the X1X, so no, not comparable. I already mentioned it's pointless to compare games with overhead left. We don't even know how well the X1X would do in Gears 4 with an uncapped frame rate, nor has the analysis been done to test the frame rate and see whether the resolution is dynamic or not.
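The 20% figure is just the relative frame-rate difference between the two averages cited above:

```python
x1x_fps = 30.0       # X1X's capped frame rate
gtx_1070_fps = 36.0  # cited 4K/Ultra average for the 1070

print(f"1070 over X1X: {gtx_1070_fps / x1x_fps - 1:.0%}")  # 20%
```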

#541  Edited By waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@waahahah said:

The example that I used earlier is Gears 4: a 1070 at 4K Ultra was 30fps, same as the Xbox One X. If that's not close, then I don't know what is... And it had a bottleneck at the lower resolution: if you wanted 60fps instead, it didn't maintain 60fps like the 1070 did on PC. When the GPU is no longer the limiting factor, the CPU becomes a bottleneck on Xbox.

I don't think you understood the argument from the CPU-bottleneck perspective.

Gears of War 4 averages 36fps at 4K/Ultra settings on a 1070. That's a 20% bump over the X1X, so no, not comparable. I already mentioned it's pointless to compare games with overhead left. We don't even know how well the X1X would do in Gears 4 with an uncapped frame rate, nor has the analysis been done to test the frame rate and see whether the resolution is dynamic or not.

It's totally comparable. Do you understand what that means? You literally just compared it.

And it's not worlds apart. And as you pointed out, it's locked at 30fps, so there is a margin of error plus an unknown amount bringing it closer to the 1070. One of the links in this thread also landed around 30fps for the 1070, so again: for the Gears comparison we can see it's either on par or close depending on the benchmark. The benchmarks also probably aren't even the same scene; we'd have to verify that to really make sure. They also said they are fairly certain Gears is not dynamic res.

With that said, this one game proves that the X1X GPU lands pretty damn close to a 1070 in performance. In other games there have been identifiable factors that end up causing it to fall short, primarily the CPU. Origins... has more factors.

#542 Zero_epyon
Member since 2004 • 20502 Posts

@waahahah: Gears 4 doesn't run on Ultra on the X1X.

#543  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Juub1990 said:
@commander said:

Really, 768p?

Yeah, so the GPU wouldn't bottleneck it. You can't read?

Your low-resolution test shows sequential/serial performance characteristics; at higher resolutions (more instances for wider parallelism), Vega 64 rivals the GTX 1080. 768p is not a realistic workload for the X1X.
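Ron's point is about how much independent per-pixel work each resolution exposes; the raw numbers (a simplification that ignores everything except pixel count):

```python
# Independent per-pixel work items at each resolution. GPUs hide memory
# latency by keeping far more items in flight than they have ALUs, so a
# very low resolution starves a wide GPU of parallel work.
res_768p = 1366 * 768
res_4k = 3840 * 2160

print(f"768p pixels: {res_768p:,}")
print(f"4K pixels:   {res_4k:,} ({res_4k / res_768p:.1f}x wider)")
```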

#544  Edited By Juub1990
Member since 2013 • 12622 Posts
@waahahah said:

It's totally comparable. Do you understand what that means? You literally just compared it.

And it's not worlds apart. And as you pointed out, it's locked at 30fps, so there is a margin of error plus an unknown amount bringing it closer to the 1070. One of the links in this thread also landed around 30fps for the 1070, so again: for the Gears comparison we can see it's either on par or close depending on the benchmark. The benchmarks also probably aren't even the same scene; we'd have to verify that to really make sure.

With that said, this one game proves that the X1X GPU lands pretty damn close to a 1070 in performance. In other games there have been identifiable factors that end up causing it to fall short, primarily the CPU. Origins... has more factors.

A 20% difference is a tier of GPU. So no, it's absolutely not comparable.

Second, we don't even know how well the X1X version stacks up against the PC version yet; no analysis has been done. Depth of field going from the second-highest preset to Ultra destroys performance in Gears of War 4, for example.

Here you have SSR and depth of field killing performance. Dropping these two down one level from the highest can yield gains of 20%. The X1X version may as well be running the Very High preset for these two and Ultra everywhere else. That would put the PC version at a 40fps average with these two settings dialed back, which would effectively be a 33% difference in frame rate and make them not comparable. Until we get the analysis, it's pointless to compare.

#545  Edited By waahahah
Member since 2014 • 2462 Posts

@Juub1990 said:
@waahahah said:

It's totally comparable. Do you understand what that means? You literally just compared it.

And it's not worlds apart. And as you pointed out, it's locked at 30fps, so there is a margin of error plus an unknown amount bringing it closer to the 1070. One of the links in this thread also landed around 30fps for the 1070, so again: for the Gears comparison we can see it's either on par or close depending on the benchmark. The benchmarks also probably aren't even the same scene; we'd have to verify that to really make sure.

With that said, this one game proves that the X1X GPU lands pretty damn close to a 1070 in performance. In other games there have been identifiable factors that end up causing it to fall short, primarily the CPU. Origins... has more factors.

A 20% difference is a tier of GPU. So no, it's absolutely not comparable.

Second, we don't even know how well the X1X version stacks up against the PC version yet; no analysis has been done. Depth of field going from the second-highest preset to Ultra destroys performance in Gears of War 4, for example.

Here you have SSR and depth of field killing performance. Dropping these two down one level from the highest can yield gains of 20%. The X1X version may as well be running the Very High preset for these two and Ultra everywhere else. That would put the PC version at a 40fps average with these two settings dialed back, which would effectively be a 33% difference in frame rate and make them not comparable. Until we get the analysis, it's pointless to compare.

Do you listen to yourself? You're stating 20% like a fact, then saying we don't know...

Also, a quick Google search shows that adaptive resolution will only be used in MP at 4K.

https://www.neowin.net/news/gears-of-war-4s-enhancements-for-the-xbox-one-x-detailed

Also, your argument is dumb: saying the X1X doesn't do something because a graph states that a feature has too big a performance hit, and then actually continuing to type after you created a hypothetical situation... after saying we can't compare... then stating it again after your hypothetical comparison made it look worse. Biased much?

Don't play the "pointless to compare" card either, Mr. Origins. Still, this is a much better comparison, and it shows the Xbox is a lot closer than many speculated it would be.

@Zero_epyon said:

@waahahah: Gears 4 doesn't run on Ultra on the X1X.

That's right; it also has HDR. Ultra+.

#546  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Juub1990 said:
@waahahah said:

It's totally comparable. Do you understand what that means? You literally just compared it.

And it's not worlds apart. And as you pointed out, it's locked at 30fps, so there is a margin of error plus an unknown amount bringing it closer to the 1070. One of the links in this thread also landed around 30fps for the 1070, so again: for the Gears comparison we can see it's either on par or close depending on the benchmark. The benchmarks also probably aren't even the same scene; we'd have to verify that to really make sure.

With that said, this one game proves that the X1X GPU lands pretty damn close to a 1070 in performance. In other games there have been identifiable factors that end up causing it to fall short, primarily the CPU. Origins... has more factors.

A 20% difference is a tier of GPU. So no, it's absolutely not comparable.

Second, we don't even know how well the X1X version stacks up against the PC version yet; no analysis has been done. Depth of field going from the second-highest preset to Ultra destroys performance in Gears of War 4, for example.

Here you have SSR and depth of field killing performance. Dropping these two down one level from the highest can yield gains of 20%. The X1X version may as well be running the Very High preset for these two and Ultra everywhere else. That would put the PC version at a 40fps average with these two settings dialed back, which would effectively be a 33% difference in frame rate and make them not comparable. Until we get the analysis, it's pointless to compare.

http://www.eurogamer.net/articles/digitalfoundry-2016-gears-of-war-4-face-off

All settings retain the same level of edge coverage which can be modified instead by the temporal AA sharpening option. These settings have a minimal impact on performance and, as such, we recommend using the highest possible anti-aliasing quality with your own desired level of temporal sharpening. The differences are extremely subtle to the point where it's not clear where Xbox One falls but it does appear to at least match the high setting.

...

Character level of detail determines the complexity of characters within the scene. Modifying this setting impacts how many on-screen characters will be displayed at full detail. At the lowest setting, for instance, only the player character is displayed at full detail. That said, in practice, the difference is incredibly subtle as is the performance cost. The final setting in this group is foliage draw distance which controls, of course, how much visible foliage is presented on screen. Interestingly, Xbox One seems to use a custom value here with results closer to the high setting but still falling short in a few areas.

....

Lighting and shadows

The first half of this cluster focuses on lighting the game world with options such as light shaft quality, light scattering quality and bloom quality. Gears 4 sports both screen-space style light shafts and true volume lights. The quality of the light shafts, often referred to as crepuscular rays, are used at various junctions throughout the game and can be completely disabled if desired. Increasing this setting simply improves the depth and quality of the individual rays and, on Xbox One, we're looking at the high setting once again.

Then we have light scattering quality, which controls the volumetric lighting used in various portions of the game. Increasing this option improves the precision of the effect resulting in cleaner results without additional artefacting along its edges. Xbox One actually appears to use the medium option here, which still looks quite nice, but lacks some of the precision of the higher quality settings. Bloom and lens flare quality are two rather subtle options, then, that influence the intensity of said effects and both of these appear to operate at the high setting on Xbox One.

When it comes to shadows, Gears 4 leverages the strengths of Unreal Engine 4 in order to present very high quality shadow representation, even on Xbox One. The first setting focuses on the resolution of the shadow maps and the number of dynamic shadows used throughout the game and is pretty typical for a modern release. Impressively, the Xbox One version is actually a match for the ultra settings on PC which was an unexpected surprise but there is a reason for it, as we'll touch on shortly.

Capsule shadows are another nice feature and something that was introduced in Unreal Engine 4.11. Essentially, these act as indirect shadows which help root characters more firmly in the scene. The high and ultra settings produce the same quality of capsule shadows but the former limits the number of characters per scene using the higher quality effect. Xbox One appears to be a match for the high setting.

...

We believe that this technique has allowed the team to make use of higher quality shadow maps on Xbox One.

Lastly, we have the all-important ambient occlusion which is designed to handle contact shadows throughout the scene... Xbox One is interesting in that it doesn't appear to completely match any of these results on the PC with very subtle differences in coverage. We'd peg it somewhere between high and medium but it's not entirely clear.

Advanced visuals

Screen space reflections have become a common method for displaying scene accurate reflections with a reasonable performance cost by utilising screen space information. The downside is that, when relevant data is occluded from view, the reflections lose the detail as well. We see this same behavior up through the ultra setting as higher settings simply display more refined, complex reflections. After examining a number of different areas, Xbox One appears to fall around the medium quality for this setting.

Real-time cinematics and advanced settings

There are just two options in this category - depth of field and sub-surface scattering. Depth of field controls the quality of the associated effect and is used exclusively in real-time cut-scenes. Xbox One appears to fall around the high setting

Sub-surface scattering, then, simply determines the quality of light playing off of skin and other fleshy materials. Xbox One appears to utilise the high setting here and the results do look excellent during the real-time cutscenes...

Temporal AA sharpening: XBO has high settings

Character level of detail: XBO has high settings

Foliage draw distance: XBO has high settings

Light shaft quality: XBO has high settings

Light scattering quality (volumetric lighting): XBO has medium settings

Bloom and lens flare quality: XBO has high settings

Dynamic shadows: XBO has ultra settings

Contact shadows: XBO has medium-high settings

Screen space reflections: XBO has medium settings

Depth of field: XBO has high settings

Sub-surface scattering: XBO has high settings

Closest 4K benchmark with XBO settings:

A stock GTX 1070 FE is slower than a GTX 1070 SC (Super Overclock).

#547  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@waahahah: What's Ultra+? Also, it would be interesting to see a benchmark of the 1070 running at console settings.

And HDR shouldn't have much of an effect on the GPU.

#548  Edited By commander
Member since 2010 • 16217 Posts

@Juub1990 said:
@waahahah said:

It's totally comparable. Do you understand what that means? You literally just compared it.

And it's not worlds apart. And as you pointed out, it's locked at 30fps, so there is a margin of error plus an unknown amount bringing it closer to the 1070. One of the links in this thread also landed around 30fps for the 1070, so again: for the Gears comparison we can see it's either on par or close depending on the benchmark. The benchmarks also probably aren't even the same scene; we'd have to verify that to really make sure.

With that said, this one game proves that the X1X GPU lands pretty damn close to a 1070 in performance. In other games there have been identifiable factors that end up causing it to fall short, primarily the CPU. Origins... has more factors.

A 20% difference is a tier of GPU. So no, it's absolutely not comparable.

Second, we don't even know how well the X1X version stacks up against the PC version yet; no analysis has been done. Depth of field going from the second-highest preset to Ultra destroys performance in Gears of War 4, for example.

Here you have SSR and depth of field killing performance. Dropping these two down one level from the highest can yield gains of 20%. The X1X version may as well be running the Very High preset for these two and Ultra everywhere else. That would put the PC version at a 40fps average with these two settings dialed back, which would effectively be a 33% difference in frame rate and make them not comparable. Until we get the analysis, it's pointless to compare.

@appariti0n also thinks a 20 percent performance difference is comparable, which is ridiculous. So I certainly have to agree here that a 20 percent performance difference is not comparable. It's actually a 'whole different range of performance'.

#549  Edited By Juub1990
Member since 2013 • 12622 Posts

@waahahah: We have to wait until we get the whole rundown. As it stands, we don't know how the X1X version stacks up, not to mention the 1070 still performs better regardless. It's 20% better.

#550  Edited By commander
Member since 2010 • 16217 Posts

@Juub1990 said:

@waahahah: We have to wait until we get the whole rundown. As it stands, we don't know how the X1X version stacks up, not to mention the 1070 still performs better regardless. It's 20% better.

Sorry, but there's no proof the 1070 is 20 percent better at this time.