Those who said the X1X GPU is a match for the 1070 come forward and apologize

#401 Dark_sageX
Member since 2003 • 3561 Posts

@commander said:

Well, when people say that the Xbox One X will match GTX 1070 performance, they are obviously not talking about the CPU; the fact that it is all on one chip doesn't change much. Now you're moving the goalposts.

It's quite obvious that AC Origins likes beefy CPUs.

More cores definitely help, but CPU speed stays a major factor across the board, and the CPU in the Xbox One X is a lot slower than the CPUs they use in benchmarks.

I have an i3 4170 paired with a GTX 970. It powers my VR headset quite well. If the Xbox One X can power its games well enough, then what is there to complain about? A Ryzen CPU would have driven up the price, and they have workarounds so the CPU doesn't bottleneck the GPU.

That's why this was a discussion you could never win. Like you said, if you used a CPU similar to the Xbox One X's, it would choke the 1070. Even if you compared 4K benchmarks, there's no way of knowing how many CPU tasks the Xbox One X has offloaded to the GPU. You can only do it by pairing the GTX 1070 with a similar CPU, and then we come back to the choking part...

But even then, it's obvious that the Xbox One X's custom GPU is stronger than a 1060 when you look at the difference between the PS4 Pro and the Xbox One X. It pushes resolutions as much as 96 percent higher, while it should only be about 40 percent stronger in raw power; the Xbox One X has optimizations in place, something MS does with every console to push it beyond its raw specs.

It's something that DF saw and that the ARK dev saw as well. Digital Foundry backpedaled, but apparently they don't know Microsoft's history. There's a reason they didn't mention their PC specs in this video. The word is that AC Origins is the most demanding game to date.

And maybe you should have listened more closely: the dynamic resolution goes from 1800p all the way up to above 2000p.

Ron declared that the Xbox One X would perform on par with a PC that has a GTX 1070. It did not, and that's the reality of it. Again, look at the benchmark and tell me the Xbox One X can perform the same as a GTX 1070. Like I said, I only care about real-world results. I don't care whether the GPU component in the Scorpio chip can match an actual GTX 1070, or even that the CPU is the bottleneck, because that shit doesn't matter; what matters is what the product delivers. What we are looking at is an APU, not a CPU and GPU combo. Again, it is a single unit mounted on a PCB, so there is no point in comparing the chip's CPU to a standalone quad-core processor or anything above it, because there is nothing you can do to change it. You cannot upgrade the X1X's CPU to rectify the bottleneck; it is what it is, and the benchmarks have spoken, so please spare me. This discussion is over.

And OK, you are right, it hovers between 1800p and 2160p on the X1X. That still doesn't make a difference: a system with a GTX 1070 does native 2160p (at all times) and pumps out higher fps at similar graphical settings (averaging 40 fps according to the video I saw the other day; I'll link it when I can be bothered). The X1X's performance is still, as I said, on par with a system with a GTX 1060 (or more accurately an RX 580). It is still a considerable distance away from a GTX 1070 or anything above that, so Ron was still wrong.
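(For anyone wanting to sanity-check the resolution argument above, here is a quick back-of-the-envelope pixel count; a rough sketch assuming standard 16:9 frames, purely illustrative arithmetic rather than anything from the benchmarks being discussed.)

```python
# Back-of-the-envelope pixel counts for the resolutions being argued about.
# Assumes standard 16:9 frames; illustrative arithmetic only, not benchmark data.

def pixels(height: int) -> int:
    """Total pixels for a 16:9 frame with the given vertical resolution."""
    return (height * 16 // 9) * height

p1800 = pixels(1800)   # 3200 x 1800
p2160 = pixels(2160)   # 3840 x 2160, native 4K

print(f"1800p: {p1800:,} pixels")
print(f"2160p: {p2160:,} pixels")
print(f"Native 4K pushes {p2160 / p1800 - 1:.0%} more pixels per frame than 1800p")
```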

#402  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@jahnee:

Games are not always specifically coded for consoles first. Multiple examples show PC leading consoles, both game- and graphics-engine-wise, but it's also the nature of using modern APIs and newer hardware. Consoles only set the baseline at the lowest common denominator, and the fact that 3rd-gen+ Intel quad cores still decimate these consoles shows how slow they are and that they are not the sole basis for game development.

#403  Edited By deactivated-5ebd39d683340
Member since 2005 • 4089 Posts

@04dcarraher: The set pieces in Frostbite 3, Unreal Engine 4 and CryEngine 3 multiplatform games are all designed with the 8-core Jaguar handling them in mind. Any increase in scope for physics simulations, animations or CPU-calculated AI would crush the consoles, and thus the sales of those games. The engines can handle more, and are designed on PCs, sure. But the main reason for the massive downgrades is still that Jaguar CPU. The PC version in turn is also downgraded, since the rest of the game was developed with, again, that Jaguar CPU in mind. While an increase in CPU power would gain you 10-30% in performance depending on the game, CPUs have taken an enormous backseat in game development. It's not surprising, though; CPU coding in general takes a lot more talent, hence the lack of good AI in almost all video games.

#404  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@jahnee said:

@04dcarraher: The set pieces in Frostbite 3, Unreal Engine 4 and CryEngine 3 multiplatform games are all designed with the 8-core Jaguar handling them in mind. Any increase in scope for physics simulations, animations or CPU-calculated AI would crush the consoles, and thus the sales of those games. The engines can handle more, and are designed on PCs, sure. But the main reason for the massive downgrades is still that Jaguar CPU. The PC version in turn is also downgraded, since the rest of the game was developed with, again, that Jaguar CPU in mind.

Actually, no. Frostbite 3 and CryEngine 3 were created before the X1 and PS4 were even released, while the 360 and PS3 were still the primary consoles. Those engines took advantage of modern hardware by using more than four threads; even Frostbite 2.0 used 8 threads. Unreal Engine 4 is a multiplatform engine that started on PC as a tech demo in 2012... Many of these engines are being scaled back because of console CPU limitations. The only time games are designed around consoles first is when a dev is console-centric and is porting the work over to PC. Most of these modern multiplatform engines are not focused on consoles; they adjust to each platform's abilities.

#405  Edited By appariti0n
Member since 2009 • 5193 Posts

@commander said:

The i5 8600K has 50 percent more cores; the i7 7700K has 50 percent more threads. That's 50 percent versus half of 15-30, so that's what, 8-15 percent.

Sure, it narrows the gap a bit, but the i5 is still about 35-42 percent faster.

Oh, well when you put it like that, I guess you're right. 50% more cores means 50% better performance then?

BRB, going to buy a threadripper for gaming.

When people like @04dcarraher explain this shit, you should take notes, and learn, instead of attempting to argue from a place of complete ignorance.

But, I guess Dale Carnegie was right. Show someone how wrong they are, and they'll just entrench further into their position anyways.

Go ahead and keep believing in your theoretical 35-42% difference, I'll continue to use actual benchmarks.

#406 FLOPPAGE_50
Member since 2004 • 4500 Posts

@waahahah: damn you rekt quacknight

#407 Zero_epyon
Member since 2004 • 20502 Posts

Just a heads up: COD WW2 uses a dynamic scaler and temporal reconstruction to get to a 4K target. The GTX 1070 can run the game at native 4K without reconstruction and averages 57 fps on ultra.

#408  Edited By QuadKnight
Member since 2015 • 12916 Posts
@Zero_epyon said:

Just a heads up: COD WW2 uses a dynamic scaler and temporal reconstruction to get to a 4K target. The GTX 1070 can run the game at native 4K without reconstruction and averages 57 fps on ultra.

Dang, but @waahahah was just telling me how the GTX 1070 had been matched by the GPU on the XboneL, since it matches the GTX 1070 in resolution in everything but framerates lol.

@FLOPPAGE_50 said:

@waahahah: damn you rekt quacknight

Nah, he rekt himself with his first sentence.

You're just as clueless as him, and you lems have already lost the argument before it even began. His arguments are baseless and weak.

@waahahah said:

If you say you're a "smart" PC gamer... you should realize the comparison you're making. You know a CPU-limited system that is likely not going to stack up against 60 fps is... not stacking up against 60 fps.

You know your video clearly explains the CPU limitation: the console isn't drawing that much power, at least not as much as it should be, so the GPU is being underutilized.

So unfortunately, if we are looking at it video card to video card, your example is dumb. It's better to compare the 4K locked 30 fps to the desktop, which won't be as bottlenecked by the CPU...

https://www.pcworld.com/article/3128346/software-games/tested-gears-of-wars-4-pc-benchmarks-yield-glorious-graphics-options-galore.html

What's this? A 1080 is only 15 fps more than the Xbox? Double on High?

What are the Xbox settings for this game again? I'm not actually sure. I couldn't find a 1070 version of this benchmark, nor do I care about owning fanboys online that much. But this illustrates that the Xbox is closer to a 1070 than most PC gamers are comfortable with... clearly. And since you can't help reacting to every bit of information without thinking it through, it's not helping your image as a 'smart' PC gamer rather than an unintelligent console fanboy.

You're just as bad as Ron, maybe even worse. The problem with you clowns is that you post charts without knowing how to read them. You would have an argument if the games you were comparing were running at Ultra PC settings on the XboneL, but they aren't.

"b...bu...bu....but teh CPU bottlenecks!!1"

When you're failing to match the GTX 1070 not only in resolution but in performance as well, it's more than just "teh CPU bottlenecks!!!"; you have been completely rekt. Are you gonna tell me that a game like COD WW2 is limited by the CPU on the XboneL because it's so CPU intensive? If the XboneL were as powerful as you claim, it would have no issue running the game at native 4K like the GTX 1070.

#409 appariti0n
Member since 2009 • 5193 Posts

@quadknight: except when you rag on the shitty cpu, then lems claim there isn't a bottleneck lol.

#410  Edited By QuadKnight
Member since 2015 • 12916 Posts

@appariti0n said:

@quadknight: except when you rag on the shitty cpu, then lems claim there isn't a bottleneck lol.

Exactly.

The CPU is only a bottleneck when they are choking in a multiplatform game like ACO. When they hit a stable 4K30 it's no longer a bottleneck. What a bunch of flip-flopping hypocrites. No wonder they are in last place. They and MS can't stick to a single story, and they end up confusing themselves and the whole market.

#411 Legend002
Member since 2007 • 13405 Posts

The X is a $500 box that also includes the gamepad, CPU, OS, case, power cable, Blu-ray UHD drive and HDMI cable. The 1070 is what? $450? lol

#412 appariti0n
Member since 2009 • 5193 Posts

@Legend002 said:

The X is a $500 box that also includes the gamepad, CPU, OS, case, power cable, Blu-ray UHD drive and HDMI cable. The 1070 is what? $450? lol

Which makes it all the more ridiculous that anyone could think it would match a 1070.....

#413  Edited By waahahah
Member since 2014 • 2462 Posts

@quadknight said:

Dang, but @waahahah was just telling me how the GTX 1070 had been matched by the GPU on the XboneL, since it matches the GTX 1070 in resolution in everything but framerates lol.

I never said it matched... I said it's closer than what people expected. And it's clearly making people like you uncomfortable...

Nah, he rekt himself with his first sentence.

You're just as clueless as him, and you lems have already lost the argument before it even began. His arguments are baseless and weak.

Baseless would assume I didn't use your own evidence against you to point out that the GPU is limited significantly by the CPU, so in examples that reduce the CPU bottleneck it's easier to see the GPU comparing much better to a 1070.

You're just as bad as Ron, maybe even worse. The problem with you clowns is that you post charts without knowing how to read them. You would have an argument if the games you were comparing were running at Ultra PC settings on the XboneL, but they aren't.

How did I misread a metric that shows a 1080 running Gears 4 at ultra 4K at 44 fps, versus the link you gave showing the Xbox performs at about 30 fps at ultra-ish settings?

Can you explain where I misinterpreted this?

"b...bu...bu....but teh CPU bottlenecks!!1"

When you're failing to match the GTX 1070 not only in resolution but in performance as well, it's more than just "teh CPU bottlenecks!!!"; you have been completely rekt. Are you gonna tell me that a game like COD WW2 is limited by the CPU on the XboneL because it's so CPU intensive? If the XboneL were as powerful as you claim, it would have no issue running the game at native 4K like the GTX 1070.

Well yeah, the Xbox has CPU bottlenecks... clearly, considering PC gamers here have been claiming for weeks that it'll struggle with anything above 30 fps... and your link shows it. Not to mention there is a good chance that, because of the CPU, the gap may even widen when it comes to visuals as devs move to more GPU compute workloads to offset the CPU, adding more to the GPU workload.

The moral of the story is that in some games it's closer to the 1070 than the 1060... bear in mind these aren't low-end video cards... some games suffer because of the CPU, some don't match up as well and we can't be sure why, but the ones that do show it's a capable GPU, provided the CPU load is managed well.

Edit: wording. I meant to say it's capable of reaching a certain metric, but it sounded like I said it's capable of matching, and I'm sure you're already slamming away at the keyboard over this one sentence and ignoring everything else I said...

I never claimed it will run like a PC with a 1070 paired with an i7... or that it will even match a 1070... You're so quick to try to shit on someone that you can't even understand basic English.

#414 Legend002
Member since 2007 • 13405 Posts

@appariti0n: Exactly. People don't think nor use common sense.

#415  Edited By waahahah
Member since 2014 • 2462 Posts

@quadknight said:
@appariti0n said:

@quadknight: except when you rag on the shitty cpu, then lems claim there isn't a bottleneck lol.

Exactly.

The CPU is only a bottleneck when they are choking in a multiplatform game like ACO. When they hit a stable 4K30 it's no longer a bottleneck. What a bunch of flip-flopping hypocrites. No wonder they are in last place. They and MS can't stick to a single story, and they end up confusing themselves and the whole market.

When it doesn't appear to be a bottleneck... yes, it stops being a bottleneck... do you understand how bottlenecks work?

For instance, let's say the CPU struggles to complete its per-frame work above 40 fps... if you do something like increase the resolution, which doesn't affect the CPU's time to prepare a frame, then the GPU will take longer.

In one scenario the GPU stalls waiting for the CPU, and the GPU is underutilized. In the other, the CPU stalls waiting for the GPU to finish a frame.

Edit: To point out... at 4K 30 fps the GPU becomes the bottleneck because it is the limiting factor when producing a frame. At 60 fps the CPU is. It's not my fault the bottleneck can actually flip-flop depending on the workload.
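To make that frame-time reasoning concrete, here is a minimal sketch (the numbers are made up for illustration, not measured from any console or card): whichever of the CPU or GPU takes longer per frame caps the frame rate, and raising the resolution only grows the GPU side.

```python
# Toy frame-time model: the slower of the CPU and GPU stages caps the fps.
# All frame costs are hypothetical numbers chosen only to illustrate the idea.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when the CPU and GPU each work on a frame in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 25.0         # hypothetical CPU cost per frame (~40 fps ceiling)
gpu_1080p_ms = 16.0   # hypothetical GPU cost at 1080p
gpu_4k_ms = 33.0      # same GPU with roughly four times the pixels

print(fps(cpu_ms, gpu_1080p_ms))  # ~40 fps: CPU-bound, GPU underutilized
print(fps(cpu_ms, gpu_4k_ms))     # ~30 fps: GPU-bound, CPU now has headroom
```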

#416 QuadKnight
Member since 2015 • 12916 Posts

@Legend002 said:

@appariti0n: Exactly. Lems don't think nor use common sense.

FTFY

#417  Edited By deactivated-5ebd39d683340
Member since 2005 • 4089 Posts

@04dcarraher: I understand your point, but point me towards that CPU power being used in the actual gameplay elements of a PC game. The skeleton has to be playable on consoles. They'll add some extra stuff in for PC, but they won't redesign it. That was my whole point. The difference in CPU power between PC and console is so immense that we are talking about more than a generational gap. You can't just tell me the engine simply 'scales back' to console hardware. There is a reason we are seeing pretty reskinned games and not innovative interactive worlds. Crysis from 2007 still showcases advanced physics, for example. I am very well aware these engines can do much more than what these multiplatform games are showing, but multiplatform developers earn a ton more on consoles, so that's where they'll target the campaign sequences and CPU-calculated events.

Think big-scale events, man. Games' battle arenas are still the same size as in the X360/PS3 era. What's happening is still fighting room by room. Battlefield 1's scale is still at Battlefield 3's / Bad Company 2's level. Gears of War 4 still uses 'on the rails' events instead of complex interaction.

If games were truly coded around a high-end CPU, with a 7700K as the bare minimum, games could play very differently instead of just offering scalable graphical settings. But the market isn't there for it.

#418  Edited By waahahah
Member since 2014 • 2462 Posts

@jahnee said:

@04dcarraher: I understand your point, but point me towards that CPU power being used in the actual gameplay elements of a game. The skeleton has to be playable on consoles. They'll add some extra stuff in for PC, but they won't redesign it. That was my whole point. I am very well aware these engines can do much more than what these multiplatform games are showing.

You don't need a redesign for consoles/PC. If your game engine supports threaded pipelining, then all you have to do is scale your workload to fit the hardware available.
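As a minimal illustration of that idea (a generic sketch, not any particular engine's code; the NPC-update "job" and the 25-per-core budget are made-up stand-ins), a job system can simply size its worker pool and per-frame workload to the core count it finds at runtime:

```python
# Generic sketch of scaling a per-frame workload to the cores available.
# Not from any real engine; simulate_npc stands in for AI/physics/animation jobs.
import os
from concurrent.futures import ThreadPoolExecutor

def simulate_npc(npc_id: int) -> int:
    """Placeholder for one unit of CPU work (one NPC's update)."""
    return sum(i * npc_id for i in range(10_000))

cores = os.cpu_count() or 4     # detect the hardware we are running on
npc_budget = cores * 25         # scale the workload: more cores, more NPCs

with ThreadPoolExecutor(max_workers=cores) as pool:
    results = list(pool.map(simulate_npc, range(npc_budget)))

print(f"{cores} cores -> updated {len(results)} NPCs this frame")
```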

#419 Diddies
Member since 2007 • 2415 Posts

@Xplode_games said:

@waahahah: Do you now understand what it's like to present facts and a logical argument to these rabid fanboys? I was literally hitting the TC over the head with facts for pages in this thread and he just ignored it and went back to his fanboy talking points.

You are dumb. You really got owned in this. I know you are blinded, but damn, are you really blinded this badly?

#420 scatteh316
Member since 2004 • 10273 Posts

@Legend002 said:

The X is a $500 box that also includes the gamepad, CPU, OS, case, power cable, Blu-ray UHD drive and HDMI cable. The 1070 is what? $450? lol

And if you already have a PC that's a few years old with a decent CPU, that 1070 upgrade is cheaper and superior ;)

#421  Edited By xantufrog  Moderator
Member since 2013 • 17898 Posts

@waahahah: I think a lot of people on here don't really understand the concept of bottlenecks. It's pretty easy to take some uncapped PC games and move the dials around to literally go from a CPU to a GPU bottleneck. Not all games give enough freedom or push hardware enough for it. But if you've got a midrange machine like I do, one that can't just brute-force everything into the ground without breaking a sweat, you can experiment like that.

#422 ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:
@Legend002 said:

The X is a $500 box that also includes the gamepad, CPU, OS, case, power cable, Blu-ray UHD drive and HDMI cable. The 1070 is what? $450? lol

And if you already have a PC that's a few years old with a decent CPU, that 1070 upgrade is cheaper and superior ;)

At 4K resolution, the GTX 1070 itself becomes the bottleneck rather than an Intel Core i5/i7 Ivy Bridge/Haswell-class CPU.

An Intel Core i5/i7 Ivy Bridge/Haswell-class CPU with a GTX 1070 is ideal for >60 Hz/90 Hz/100 Hz/120 Hz/144 Hz gaming at 1080p/1440p, but it's mostly GPU-bound at 4K resolution, like the X1X.

#423 ronvalencia
Member since 2008 • 29612 Posts

@appariti0n said:
@Legend002 said:

The X is a $500 box that also includes the gamepad, CPU, OS, case, power cable, Blu-ray UHD drive and HDMI cable. The 1070 is what? $450? lol

Which makes it all the more ridiculous that anyone could think it would match a 1070.....

The X1X did match the 1070 with Killer Instinct Season 3.

#424  Edited By deactivated-5ebd39d683340
Member since 2005 • 4089 Posts

@waahahah: Then why are PC games running at 200 fps at 1080p? Crysis never did this. CPU-intensity-wise there has been no new Crysis. Assassin's Creed Unity? Look at the number of NPCs and the AI! But later Assassin's Creed games got scaled back hard because it was too CPU intensive. Metro gets an honorable mention but also came to consoles eventually. But those few are a small number compared to the total game library, are they not?

http://www.pcgamer.com/the-most-demanding-pc-games-right-now/

I understand how bottlenecking works. But also understand that progress has only taken place in ridiculously high resolutions, frame rates and pretty reskins. Not a bad progression at all, just one that is becoming stale after 10 years.

I am calling it right now: give consoles a Ryzen CPU and you'll see true next gen. Engines scaling to cores =/= innovative gameplay. We need a raw increase in CPU performance to see the next-gen change.

#425  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Dark_sageX said:
@commander said:

Well, when people say that the Xbox One X will match GTX 1070 performance, they are obviously not talking about the CPU; the fact that it is all on one chip doesn't change much. Now you're moving the goalposts.

It's quite obvious that AC Origins likes beefy CPUs.

More cores definitely help, but CPU speed stays a major factor across the board, and the CPU in the Xbox One X is a lot slower than the CPUs they use in benchmarks.

I have an i3 4170 paired with a GTX 970. It powers my VR headset quite well. If the Xbox One X can power its games well enough, then what is there to complain about? A Ryzen CPU would have driven up the price, and they have workarounds so the CPU doesn't bottleneck the GPU.

That's why this was a discussion you could never win. Like you said, if you used a CPU similar to the Xbox One X's, it would choke the 1070. Even if you compared 4K benchmarks, there's no way of knowing how many CPU tasks the Xbox One X has offloaded to the GPU. You can only do it by pairing the GTX 1070 with a similar CPU, and then we come back to the choking part...

But even then, it's obvious that the Xbox One X's custom GPU is stronger than a 1060 when you look at the difference between the PS4 Pro and the Xbox One X. It pushes resolutions as much as 96 percent higher, while it should only be about 40 percent stronger in raw power; the Xbox One X has optimizations in place, something MS does with every console to push it beyond its raw specs.

It's something that DF saw and that the ARK dev saw as well. Digital Foundry backpedaled, but apparently they don't know Microsoft's history. There's a reason they didn't mention their PC specs in this video. The word is that AC Origins is the most demanding game to date.

And maybe you should have listened more closely: the dynamic resolution goes from 1800p all the way up to above 2000p.

Ron declared that the Xbox One X would perform on par with a PC that has a GTX 1070. It did not, and that's the reality of it. Again, look at the benchmark and tell me the Xbox One X can perform the same as a GTX 1070. Like I said, I only care about real-world results. I don't care whether the GPU component in the Scorpio chip can match an actual GTX 1070, or even that the CPU is the bottleneck, because that shit doesn't matter; what matters is what the product delivers. What we are looking at is an APU, not a CPU and GPU combo. Again, it is a single unit mounted on a PCB, so there is no point in comparing the chip's CPU to a standalone quad-core processor or anything above it, because there is nothing you can do to change it. You cannot upgrade the X1X's CPU to rectify the bottleneck; it is what it is, and the benchmarks have spoken, so please spare me. This discussion is over.

And OK, you are right, it hovers between 1800p and 2160p on the X1X. That still doesn't make a difference: a system with a GTX 1070 does native 2160p (at all times) and pumps out higher fps at similar graphical settings (averaging 40 fps according to the video I saw the other day; I'll link it when I can be bothered). The X1X's performance is still, as I said, on par with a system with a GTX 1060 (or more accurately an RX 580). It is still a considerable distance away from a GTX 1070 or anything above that, so Ron was still wrong.

Notice the resolution: it's 1080p, not 4K. Games like ARK: Survival Evolved (UE4) overload the GPU even at 1080p, hence it's GPU-bound instead of CPU-bound.

https://www.gamespot.com/articles/ark-dev-talks-xbox-one-x-and-says-sony-wont-allow-/1100-6452662/

On the subject of the Xbox One X's horsepower, Stieglitz said Ark can run at the equivalent of "Medium" or "High" settings on PC. It can run at 1080p/60fps (Medium) or 1440p/30fps (High), and it sounds like developer Studio Wildcard may offer an option to switch between them.

For the GTX 1070, the medium-settings 1080p run with a 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20

https://youtu.be/SoHfywz2fqA?t=136 shows the RX 480's 1920x1080 medium result, which failed to hold 60 fps.

http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

#426  Edited By appariti0n
Member since 2009 • 5193 Posts

@ronvalencia: Wow, you're really sticking to your guns on this one huh?

#427 ronvalencia
Member since 2008 • 29612 Posts

@appariti0n said:

@ronvalencia: Wow, you're really sticking to your guns on this one huh?

This topic was started from a single game's example (ACO), and the X1X doesn't have an actual GTX 1070.

#428 QuadKnight
Member since 2015 • 12916 Posts

@appariti0n said:

@ronvalencia: Wow, you're really sticking to your guns on this one huh?

lol, you really expecting him to change after being proven wrong so many times?

#429 ronvalencia
Member since 2008 • 29612 Posts

@quadknight said:
@appariti0n said:

@ronvalencia: Wow, you're really sticking to your guns on this one huh?

lol, you really expecting him to change after being proven wrong so many times?

What's the matter? They're counterexamples against your argument. Stay salty.

#430 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:
@appariti0n said:

@ronvalencia: Wow, you're really sticking to your guns on this one huh?

This topic was started from a single game's example (ACO), and the X1X doesn't have an actual GTX 1070.

So you're not apologizing for

1. Being wrong

2. The threads you made about it being like a 1070

3. All the times you quoted the Ark devs

4. All the times you brought up Forza's wet tracks

It's not a GTX 1070, and the more videos that come out, the more the evidence stacks up.

It performs like an RX 580 at best.

#431  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@jahnee said:

@waahahah: Then why are PC games running at 200 fps at 1080p? Crysis never did this. CPU-intensity-wise there has been no new Crysis. Assassin's Creed Unity? Look at the number of NPCs and the AI! But later Assassin's Creed games got scaled back hard because it was too CPU intensive. Metro gets an honorable mention but also came to consoles eventually. But those few are a small number compared to the total game library, are they not?

http://www.pcgamer.com/the-most-demanding-pc-games-right-now/

I understand how bottlenecking works. But also understand that progress has only taken place in ridiculously high resolutions, frame rates and pretty reskins. Not a bad progression at all, just one that is becoming stale after 10 years.

I am calling it right now: give consoles a Ryzen CPU and you'll see true next gen. Engines scaling to cores =/= innovative gameplay. We need a raw increase in CPU performance to see the next-gen change.

The framerate mostly comes from the CPU being able to feed the GPU data quickly enough, and the GPU being strong enough to drive that example to 200 fps. Which Crysis? If you're talking about the first one, that's because it was limited by a graphics engine using DirectX 9, and the game only supported two threads. The Assassin's Creed Unity example follows suit with The Witcher 3: the consoles could only support 70-some NPCs in the cities, while PC settings allow hundreds to be rendered. And you rarely ever see 70 NPCs at one time anyway.

Here is a question for you: what can they innovate gameplay-wise? More NPCs on screen for immersion, better physics? That has already been done before.

#432 QuadKnight
Member since 2015 • 12916 Posts

@ronvalencia said:
@quadknight said:
@appariti0n said:

@ronvalencia: Wow, you're really sticking to your guns on this one huh?

lol, you really expecting him to change after being proven wrong so many times?

What's the matter? They're counterexamples against your argument. Stay salty.

You're the one that's salty and rekt in this thread. This whole thread destroys you and your foolish claims lol.

#433 ronvalencia
Member since 2008 • 29612 Posts

@quadknight said:
@ronvalencia said:
@quadknight said:
@appariti0n said:

@ronvalencia: Wow, you're really sticking to your guns on this one huh?

lol, you really expecting him to change after being proven wrong so many times?

What's the matter? They're counterexamples against your argument. Stay salty.

You're the one that's salty and rekt in this thread. This whole thread destroys you and your foolish claims lol.

What rekt? I rekt your BS arguments. You can't claim the X1X has a "GTX 1060" when some games' results are greater than a GTX 1060 and rival a GTX 1070!

#434 Zero_epyon
Member since 2004 • 20502 Posts

@ronvalencia: there are no games with results similar to the 1070's. The games that seem close aren't running the same quality presets. Either the X version is running on ultra but the resolution is scaled back, or the resolution is locked or reconstructed 4K and the quality is scaled back compared to the 1070 benchmarks.

See Call of Duty.

#435  Edited By Juub1990
Member since 2013 • 12622 Posts
@ronvalencia said:

What rekt? I rekt your BS arguments. You can't claim the X1X has a "GTX 1060" when some games' results are greater than a GTX 1060 and rival a GTX 1070!

"Some games," lol. The only game is Killer Instinct Season 3, which the 1070 runs at 4K/60fps anyway, and we all know fighting games are usually capped at 60 fps, so the 1070 may actually have a lot of headroom. We don't even know how well it would do uncapped. But by all means, stick to a four-year-old game to prove your point.

#436  Edited By commander
Member since 2010 • 16217 Posts

@appariti0n said:

Oh, well when you put it like that, I guess you're right. 50% more cores means 50% better performance then?

BRB, going to buy a threadripper for gaming.

When people like @04dcarraher explain this shit, you should take notes, and learn, instead of attempting to argue from a place of complete ignorance.

But, I guess Dale Carnegie was right. Show someone how wrong they are, and they'll just entrench further into their position anyways.

Go ahead and keep believing in your theoretical 35-42% difference, I'll continue to use actual benchmarks.

@04dcarraher said:
@Xplode_games said:

I don't understand why he's even arguing this. A 4-core processor is low-end now. OK, it has hyperthreading, but so does a Ryzen 1500; it has 4 cores, 8 threads, and is only $150 now.

These people can't admit when they're wrong. I personally wouldn't buy an 8600K, because I think it's moronic not to buy the 8700K, but I think it's obvious it's a lot better than the obsolete 7700K.

You're right, hyperbullshit can't overcome a 50% increase in actual cores. No one would argue against that other than someone who's just trolling or maybe legitimately ignorant on the subject.

So wrong... you're clearly ignorant on the subject as well.

I'll quote my earlier post.....

"Let's look at Battlefront 2 SP. With a GTX 1080 Ti, an i7 6700 gets the same performance as an i7 6850K, and the game uses 8 threads well. Those extra two cores and 4 threads didn't do anything for the i7 6850K over the quad-core i7 in SP. Even with Battlefield 1 CPU testing, where it uses up to 16 threads in 64-player multiplayer, both CPUs at 4 GHz (the 10-core/20-thread i7 6950X vs the 8-core/16-thread i7 6900K) showed only a 2% difference in min and avg. Again, two extra real cores didn't really help.

While in the new Call of Duty: WW2, which can use 16 threads, the performance difference between the i7 6850K and the i7 6700 was only around 10% in min and avg framerates, even with two extra cores and 4 threads... i.e. 50% more processing power did not translate into 25+ percent more performance."

Even with synthetic benchmarks, the i7 7700K beat out the 6-core i5 8400, while the i5 8600K (clocked higher) only edged out the i7 7700K by 8% on average. In most games the difference is virtually nil, while in a few it's less than a 10% difference between the 8600K and the i7 7700K... CPU load and the multithreading coding in games mean that 50% more cores and threads do not equal a lot more performance, if any at all.

@04dcarraher said:

HT performance totally depends on the CPU load and the multithreading coding done. You can see up to a 50% increase in minimum framerate going from an i5 to an i7 of the same generation. For example, in Gears of War 4 with a GTX 1080 at 1080p (making it more CPU-bound), an i5 4670K gets a 52 minimum and 118 average while an i7 4770K gets a 102 minimum and 148 average, so we see a 50% increase in min and a 21% increase in avg, when both CPUs have nearly the same processing power per core and the same number of real cores.

Having those extra 2 cores doesn't mean you're going to perform better, or majorly better. Take a 6-core i5 vs a quad-core i7 with HT: if a game is coded to use 8 or more threads, the i7 can concurrently handle eight threads of work at one time while the 6-core i5 can only handle 6 threads at one time, meaning the next waiting task will have to wait a bit longer on the i5 until one thread is done with its current work.

Let's look at Battlefront 2 SP. With a GTX 1080 Ti, an i7 6700 gets the same performance as an i7 6850K, and the game uses 8 threads well. Those extra two cores and 4 threads didn't do anything for the i7 6850K over the quad-core i7 in SP. Even with Battlefield 1 CPU testing, where it uses up to 16 threads in 64-player multiplayer, both CPUs at 4 GHz (the 10-core/20-thread i7 6950X vs the 8-core/16-thread i7 6900K) showed only a 2% difference in min and avg. Again, two extra real cores didn't really help.

While in the new Call of Duty: WW2, which can use 16 threads, the performance difference between the i7 6850K and the i7 6700 was only around 10% in min and avg framerates, even with two extra cores and 4 threads... i.e. 50% more processing power did not translate into 25+ percent more performance.

Like I said, it depends on the CPU load and the multithreading balancing. Your estimates of performance between HT and real cores are wrong because it varies case by case.

So your argument is that hyperthreading can give an increase of up to 50 percent compared with a quad-core CPU without hyperthreading, if the game is optimized for 8 threads. I don't disagree with that, but the problem is that you are not comparing a quad core with a quad core here. Hyperthreading will never make up for 50 percent more cores, since the hexa-cores can handle 6 full threads instead of 4 while also having 50 percent more CPU power.

Not to mention these scenarios are not that common. You're talking about best-case scenarios, and that is mostly not the case, not only because most games just don't prefer 8 threads, but because they try to scale across all CPUs as much as they can. So they will go from 2 threads to 4, 6 and 8 threads. A lot of people have 6-core AMDs, and now we have 6-core Intel i5s.

But there are downsides to hyperthreading as well. Running two threads on the same core will result in cache misses and have an impact on performance. Of course, hyperthreading will always give you better overall performance if the game is optimized for more threads than the CPU without hyperthreading can handle.

Still, it will never outmatch the i5 8600K running at the same clock speeds. The extra two threads from HT will never make up for the extra two cores.

Sure, the i5 8600K won't show that much of a performance difference today when you compare average benchmark results, mainly because games are made with quad cores, or even i7s, in mind. But there are already games that make use of the extra CPU power, and it will only happen more and more. Let's also not forget that CPU benchmarks at resolutions higher than 1080p are mostly GPU-bound as well, so you won't see much difference between CPUs that can provide sufficient CPU power.

So if you have an i7 7700K there's not much reason to go to Coffee Lake at this time, but this is not how this discussion started. I said it would be a cold day in hell before the i7 7700K matches the i5 8600K, and appariti0n said that hyperthreading would make up for it, and that's just not the case, whether with somewhat older games like Crysis 3 or newer games like Watch Dogs 2.

#437 waahahah
Member since 2014 • 2462 Posts

@jahnee said:

@waahahah: Then why are PC games running at 200 fps at 1080p? Crysis never did this. CPU-intensity-wise there has been no new Crysis. Assassin's Creed Unity? Look at the number of NPCs and the AI! But later Assassin's Creed games got scaled back hard because it was too CPU intensive. Metro gets an honorable mention but also came to consoles eventually. But those few are a small number compared to the total game library, are they not?

http://www.pcgamer.com/the-most-demanding-pc-games-right-now/

I understand how bottlenecking works. But also understand that progress has only taken place in ridiculously high resolutions, frame rates and pretty reskins. Not a bad progression at all, just one that is becoming stale after 10 years.

I am calling it right now: give consoles a Ryzen CPU and you'll see true next gen. Engines scaling to cores =/= innovative gameplay. We need a raw increase in CPU performance to see the next-gen change.

Or some of those things don't take as much as you think; Dead Rising had tons of AI characters on screen. There are many things you have to take into consideration, like: what did the massive crowds do for Unity's gameplay? Wading through crowds against timers was the gameplay in Dead Rising.

Also, next gen has had diminishing returns; just throwing more and more at something doesn't make it proportionally better. Like the crowds in Unity... or details that take a developer months to do that no one will notice. And we've been stuck on 1080p while the jump to 4K is completely brutal on hardware. Gears is 44 fps on a 1080... so I don't know what you think 200 fps has to do with anything. Games released today trying to hit 4K are falling over; it's 60 fps at 1080p... otherwise you start reaching into indie markets or games that aren't trying to push graphics, and that's how you end up with 200 fps.

People that grew up from the '80s to now are probably going to view technology the wrong way. We'll have to start looking at games like we look at cars. We've hit the optimized-design threshold. While there is definitely more wiggle room in video games, Elder Scrolls 10 is probably going to be more of a sidegrade on something like Skyrim in terms of gameplay; maybe they'll develop the combat system or reintroduce a bit more complexity in the RPG mechanics, none of which is next gen, just picking known elements and remixing them to make something old new again.

#438  Edited By commander
Member since 2010 • 16217 Posts

@Dark_sageX said:
@commander said:

Well, when people say that the Xbox One X will match GTX 1070 performance, they are obviously not talking about the CPU; the fact that it is all on one chip doesn't change much. Now you're moving the goalposts.

It's quite obvious that AC Origins likes beefy CPUs.

More cores definitely help, but CPU speed stays a major factor across the board, and the CPU in the Xbox One X is a lot slower than the CPUs they use in benchmarks.

I have an i3 4170 paired with a GTX 970. It powers my VR headset quite well. If the Xbox One X can power its games well enough, then what is there to complain about? A Ryzen CPU would have driven up the price, and they have workarounds so the CPU doesn't bottleneck the GPU.

That's why this was a discussion you could never win. Like you said, if you used a CPU similar to the Xbox One X's, it would choke the 1070. Even if you compared 4K benchmarks, there's no way of knowing how many CPU tasks the Xbox One X has offloaded to the GPU. You can only do it by pairing the GTX 1070 with a similar CPU, and then we come back to the choking part...

But even then, it's obvious that the Xbox One X's custom GPU is stronger than a 1060 when you look at the difference between the PS4 Pro and the Xbox One X. It pushes resolutions as much as 96 percent higher, while it should only be about 40 percent stronger in raw power; the Xbox One X has optimizations in place, something MS does with every console to push it beyond its raw specs.

It's something that DF saw and that the ARK dev saw as well. Digital Foundry backpedaled, but apparently they don't know Microsoft's history. There's a reason they didn't mention their PC specs in this video. The word is that AC Origins is the most demanding game to date.

And maybe you should have listened more closely: the dynamic resolution goes from 1800p all the way up to above 2000p.

Ron declared that the Xbox One X would perform on par with a PC that has a GTX 1070. It did not, and that's the reality of it. Again, look at the benchmark and tell me the Xbox One X can perform the same as a GTX 1070. Like I said, I only care about real-world results. I don't care whether the GPU component in the Scorpio chip can match an actual GTX 1070, or even that the CPU is the bottleneck, because that shit doesn't matter; what matters is what the product delivers. What we are looking at is an APU, not a CPU and GPU combo. Again, it is a single unit mounted on a PCB, so there is no point in comparing the chip's CPU to a standalone quad-core processor or anything above it, because there is nothing you can do to change it. You cannot upgrade the X1X's CPU to rectify the bottleneck; it is what it is, and the benchmarks have spoken, so please spare me. This discussion is over.

And OK, you are right, it hovers between 1800p and 2160p on the X1X. That still doesn't make a difference: a system with a GTX 1070 does native 2160p (at all times) and pumps out higher fps at similar graphical settings (averaging 40 fps according to the video I saw the other day; I'll link it when I can be bothered). The X1X's performance is still, as I said, on par with a system with a GTX 1060 (or more accurately an RX 580). It is still a considerable distance away from a GTX 1070 or anything above that, so Ron was still wrong.

Well, he didn't say what PC, and comparing it with a PC with a much stronger CPU is just not a fair comparison. Even this thread's title is:

'Those who said the X1X GPU is a match for the 1070 come forward and apologize'

If I run The Witcher 3 on my i3 4170 and my GTX 970 I cannot get a steady 60 fps; when I put an i5 in there, I get 60 fps. The Xbox One X is different: if they aim for 60 fps, they will use GPGPU tools to alleviate the CPU and use dynamic resolution. That doesn't mean the GPU doesn't have the performance of a GTX 1070.

If your question is whether the product delivers, just ask yourself this: how can you build a PC that achieves the same performance as the Xbox One X for $500? Good luck with that.
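The dynamic-resolution behaviour being described can be sketched as a simple control loop (a generic illustration, not Microsoft's or any engine's actual code; the frame costs and thresholds are made-up numbers): measure how long frames are taking and nudge the render scale so the GPU stays inside the fps budget.

```python
# Generic dynamic-resolution control loop, with simulated frame costs.
# A real engine would measure GPU frame time instead of faking it here.

TARGET_MS = 1000.0 / 60.0   # 60 fps budget
scale = 1.0                  # 1.0 = native resolution

def gpu_frame_ms(render_scale: float, base_ms: float = 22.0) -> float:
    """Pretend GPU cost grows with pixel count (render scale squared)."""
    return base_ms * render_scale ** 2

for frame in range(8):
    cost = gpu_frame_ms(scale)
    print(f"frame {frame}: {cost:5.1f} ms at {scale:.2f}x render scale")
    if cost > TARGET_MS:                  # over budget: drop resolution a notch
        scale = max(0.6, scale - 0.05)
    elif cost < TARGET_MS * 0.85:         # comfortably under: raise it back up
        scale = min(1.0, scale + 0.05)
```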

#439  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@commander:

.....you did not read my post carefully enough, then. The first example was a quad-core i5 vs a quad-core i7 of the same generation; the rest of the examples were a 6-core/12-thread CPU vs a quad-core i7 of the same gen... and one was an 8-core vs 10-core example. The gains from having 50% more cores and threads do not translate into matching performance gains.

There is no downside to HT or SMT; it allows each core to handle two threads, which lets more concurrent tasks be processed. Your idea that HT sharing a core's cache has an impact on performance is wrong in the context of multithreading: starting two tasks at the same time is faster than working on one task and then having to wait to start the next. As an example of good multithreading code, take Gears of War 4: AMD's FX 8350 had a more than 10% better minimum framerate than Intel's i5 6600, which is nearly 2x faster per core.

At stock, an i7 7700K won't beat an i5 8600K; however, those two extra cores don't mean 50% more performance over the quad-core i7... The Cinebench R15 synthetic multithreaded benchmark shows the i5 8600K only beating the i7 7700K by 8%, and even the 3DMark Time Spy CPU test shows only a 9% difference. In a modern game like Rise of the Tomb Raider the i5 8600K is only 6% faster on average, and that game will use 16 threads. Most game engines these days scale to what is available, making use of 8 to 16 threads. Multiple games and programs on the market that can use 8 threads at full tilt show that the i5 8600K won't cream the i7 7700K even in the future. Now, once future generations of 6-core i5s increase their IPC by another 20% or so, it will be a different ballgame.
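One way to see why 50% more cores rarely shows up as 50% more fps is Amdahl's law, sketched below (a generic illustration, not a calculation either poster made; the 70% "parallel fraction" is an assumed number):

```python
# Amdahl's law: overall speedup from n cores when only a fraction p of the
# per-frame work runs in parallel. The 0.70 parallel fraction is assumed.

def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.70
quad = speedup(p, 4)   # ~2.11x over a single core
hexa = speedup(p, 6)   # ~2.40x over a single core

print(f"4 cores: {quad:.2f}x, 6 cores: {hexa:.2f}x")
print(f"50% more cores -> only {hexa / quad - 1:.0%} more throughput here")
```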

#440 Xtasy26
Member since 2008 • 5594 Posts

There were rumors of a slightly more powerful version of Polaris, with more stream processors, that AMD never released. It seems the Xbox One X is just that: a slightly more powerful version of Polaris, i.e. of the RX 580. In other words, it sits between a GTX 1060 and a GTX 1070. Even with the optimizations games get on consoles, it will not reach the performance of a GTX 1070. If they had put something like a Vega 56 GPU in there, then it might have beaten a GTX 1070. To say that it will beat a GTX 1070 is delusional at best.

#441 Juub1990
Member since 2013 • 12622 Posts
@commander said:

If your question is whether the product delivers, just ask yourself this: how can you build a PC that achieves the same performance as the Xbox One X for $500? Good luck with that.

So you're basically admitting that it has a piece of shit CPU that starts to be a bottleneck at around 30fps? Great console indeed.

#442 Diddies
Member since 2007 • 2415 Posts

@appariti0n said:
@Legend002 said:

The X is a $500 box that also includes the gamepad, CPU, OS, case, power cable, Blu-ray UHD drive and HDMI cable. The 1070 is what? $450? lol

Which makes it all the more ridiculous that anyone could think it would match a 1070.....

You can find one for $390. But the Xbox One X can't compete with a 1070. If you have some sort of knowledge (unlike Ron, who just spews charts and graphs in an attempt to make you lems believe he is right), then you know that the Xbox One X can't reach the 1070. I know you try to stick up for your brand with the Xbox One X, and I will tell you I am glad they are moving forward with more powerful hardware, but please admit that you believed it would be equivalent and move on. No one is going to think less of you. We think less of you when you act like Xplode and keep fighting while looking like a complete idiot. He really has posted some idiotic stuff in the last month (at least from what I saw; could be longer) and gets made fun of constantly. It is quite hilarious, but just give it up.

#443 deactivated-5ebd39d683340
Member since 2005 • 4089 Posts

@04dcarraher said:
@jahnee said:

@waahahah: Then why are PC games running at 200 fps at 1080p? Crysis never did this. CPU-intensity-wise there has been no new Crysis. Assassin's Creed Unity? Look at the number of NPCs and the AI! But later Assassin's Creed games got scaled back hard because it was too CPU intensive. Metro gets an honorable mention but also came to consoles eventually. But those few are a small number compared to the total game library, are they not?

http://www.pcgamer.com/the-most-demanding-pc-games-right-now/

I understand how bottlenecking works. But also understand that progress has only taken place in ridiculously high resolutions, frame rates and pretty reskins. Not a bad progression at all, just one that is becoming stale after 10 years.

I am calling it right now: give consoles a Ryzen CPU and you'll see true next gen. Engines scaling to cores =/= innovative gameplay. We need a raw increase in CPU performance to see the next-gen change.

The framerate mostly comes from the CPU being able to feed the GPU data quickly enough, and the GPU being strong enough to drive that example to 200 fps. Which Crysis? If you're talking about the first one, that's because it was limited by a graphics engine using DirectX 9, and the game only supported two threads. The Assassin's Creed Unity example follows suit with The Witcher 3: the consoles could only support 70-some NPCs in the cities, while PC settings allow hundreds to be rendered. And you rarely ever see 70 NPCs at one time anyway.

Here is a question for you: what can they innovate gameplay-wise? More NPCs on screen for immersion, better physics? That has already been done before.

+ @waahahah

I didn't know that the first Crysis was only coded for dual-threaded CPU operation. Thank you for the information. I don't claim to know everything regarding the subject matter, and I am open-minded enough to learn and be proven wrong where needed.

I believe the biggest innovation that can come from a CPU upgrade in consoles is the kind of immersion that only pre-rendered E3 game trailers previously exhibited. The number of animations calculated simultaneously can increase, the scope of level design can increase, AI complexity (not just the numbers) can increase, game-world interaction without load times can improve, and atmospheric effects can bounce off surfaces and become gameplay elements, like volumetric mist/water or simulated wind affecting particle-based objects. In a sense, scripted events can be replaced by dynamic code, introducing more randomness in open-world games or campaigns that rely on AI being simulated within a bigger radius (draw distance) than in a game with arena-style level design.

I don't know, man, maybe I am just a sucker for open-world games or games with atmospheric open level design. Half-Life 2: Episode Two was such a winner in this regard; it's a shame it didn't get a sequel. And I do wish the next Elder Scrolls innovates just as much as Oblivion did back in the day. Or that Anthem, Cyberpunk 2077, Star Citizen, the next Metro or Half-Life 3 (confirmed), for that matter, innovate.

Imagine some scenarios, for example: Skyrim without load times when opening doors or gates into new areas, never encountering a single loading screen. Some mods already do this with the main city gates. Imagine then seeing creatures walk around in the dense forests from a mile away, while the leaves and mist obstruct the view just enough to hinder your aim. I want the game world to become more alive than it currently is; not much has changed in this regard. Most resources in the last 10 years have gone to aesthetics.

I agree with the user 'waahahah' and his statement that we are hitting an optimized-design threshold, but only on the condition that next-gen consoles arrive with Ryzen CPUs included. Gears of War 4 hitting 164 fps at stock 1080 Ti clocks at 1080p wasn't far off from my 200 fps mark. That's one heck of a CPU headroom we've got there. Now, Gears 4 isn't the most CPU-intensive game, but a game like Hitman hits 154 fps at 1080p. Such a framerate isn't a necessity for that kind of game; a lot more cinematic levels could have been designed if it were PC exclusive.

We have the GPU TFLOPS, cheaper 4K HDR UHD TVs, memory bandwidth, high SSD speeds and UHD Blu-ray discs (which could potentially hold up to 200 GB of data) all available to fill next-gen consoles with in 2020. The last piece of the puzzle is a Ryzen-based CPU, and we can finally greenlight a potentially revolutionary gaming cycle like 2005's. I would then wholeheartedly welcome an incremental console cycle that lasts another 10 years, having discarded the Jaguar CPU for a Ryzen CPU.

Avatar image for soulitane
soulitane

15091

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#444 soulitane
Member since 2010 • 15091 Posts

@ronvalencia said:

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

"kind of" "maybe" Still going on about this? lol

Avatar image for Diddies
Diddies

2415

Forum Posts

0

Wiki Points

0

Followers

Reviews: 16

User Lists: 0

#445  Edited By Diddies
Member since 2007 • 2415 Posts

@waahahah said:
@quadknight said:

Dang, but @waahahah was just telling me how the GTX 1070 had been matched by the GPU on the XboneL, since it matches the GTX 1070 in resolution in everything but framerates lol.

I never said it matched... I said it's closer than what people expected. And it's clearly making people like you uncomfortable...

Nah, he rekt himself with his first sentence.

You're just as clueless as him, and you lems have already lost the argument before it even began. His arguments are baseless and weak.

Baseless would assume I didn't use your own evidence against you to point out that the GPU is limited significantly by the CPU, so in examples that reduce the CPU bottleneck it's easier to see the GPU comparing much better to a 1070.

You're just as bad as Ron, maybe even worse. The problem with you clowns is that you post charts without knowing how to read them. You would have an argument if the games you were comparing were running at Ultra PC settings on the XboneL, but they aren't.

How should I read a metric that shows a 1080 running Gears 4 at 4K ultra at 44 fps, versus the link you gave that showed the Xbox performing at about 30 fps at ultra-ish settings?

Can you explain where I misinterpreted this?

"b...bu...bu....but teh CPU bottlenecks!!1"

When you're not only failing to match the GTX 1070 in resolution but in performance as well, it's more than just "teh CPU bottlenecks!!!"; you have been completely rekt. Are you gonna tell me that a game like COD WW2 is limited by the CPU on the XboneL because it's so CPU intensive? If the XboneL was as powerful as you claim, it would have no issue running the game at native 4K like the GTX 1070.

Well yeah, the Xbox clearly has CPU bottlenecks, considering PC gamers here have been claiming for weeks that it'll struggle with anything above 30fps... and then there's your link. Not to mention there's a good chance that, because of the CPU, the gap may even widen visually as devs move to more GPU compute workloads to offset the CPU, adding more to the GPU's workload.

Moral of the story is, in some games it's closer to the 1070 than to the 1060... bear in mind these aren't low-end video cards. Some games suffer because of the CPU, some don't match up as well and we can't be sure why, but the ones that do show it's a capable GPU, provided the CPU load is managed well.

edit: wording. I meant to say it's capable of reaching a certain metric, but it sounded like I was saying it's capable of matching, which I'm sure you're already slamming the keyboard away about over this one sentence while ignoring everything else I said...

I never claimed it will run like a PC with a 1070 paired with an i7... or that it will even match a 1070... You're so quick to try to shit on someone that you can't even understand basic English.

It isn't near the 1070. How do you come up with this? It's running at 50% less resolution and half the frames, and pushing more frames really adds a lot of burden to that GPU.

Also, everyone has been saying that the CPU will limit the Xbox, but you lems didn't believe it and believed Ron's charts instead. Either way, that GPU still doesn't compare with the 1070.

And given that MS were complete idiots to pair the most powerful GPU in console history with a CPU that's a joke, somehow you're still sticking up for them, saying: yeah, we know the GPU is good but we're okay with our shitty CPU.

Honestly, I think they should have spent a little more money on the Xbox One X and put a better CPU in there. I would rather have seen a $550 or $600 console, since the Xbox One X wasn't targeted at the casual gamer anyway. They said as much themselves.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#446  Edited By commander
Member since 2010 • 16217 Posts

@04dcarraher said:

@commander:

.....you did not read my post closely enough. The 1st example was a quad-core i5 vs a quad-core i7 of the same generation; the rest of the examples were a 6-core/12-thread chip vs i7 and quad-core parts of the same gen, and one example was 8 cores vs 10 cores. The gains from having 50% more cores and threads do not correlate to a matching 50% jump in performance.

There is no real downside to HT or SMT: it allows each core to handle two threads, which lets more concurrent tasks be processed. Your idea that HT eating into a core's cache hurts performance is wrong in the context of multithreading; starting two tasks at the same time is faster than working on one task and then having to wait to start the next. For an example of good multithreaded coding, take Gears of War 4: AMD's FX 8350 posted more than 10% better minimum framerates than Intel's i5 6600, which is nearly 2x faster per core.

At the same range of clock rates an i7 quad core won't beat a 6-core i5, however those two extra cores don't mean 50% more performance over the i7 quad. The Cinebench R15 synthetic multithreaded benchmark shows the i5 8600K only beating the i7 7700K by 8%, and even the 3DMark Time Spy CPU test shows only a 9% difference. In a modern game like Rise of the Tomb Raider the i5 8600K is only 6% faster on average, and that game will use 16 threads. Most game engines these days scale to what is available, making use of 8 to 16 threads. Multiple games and programs on the market that can use 8 threads at full tilt show that the i5 8600K won't cream the i7 7700K even in the future. Once future generations of 6-core i5s increase their IPC by another 20% or so, it will be a different ballgame.
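
On the "starting two tasks at once beats doing them back to back" point above, here's a tiny C++ sketch of the difference (my own illustration; how much SMT actually recovers versus real cores depends entirely on the workload):

    #include <cstdio>
    #include <future>

    // Stand-in for a chunk of per-frame work (physics, audio mixing, ...).
    long busy_work(long n) {
        long acc = 0;
        for (long i = 0; i < n; ++i) acc += i % 7;
        return acc;
    }

    int main() {
        // Sequential: the second task waits for the first to finish.
        long a = busy_work(50000000L);
        long b = busy_work(50000000L);

        // Concurrent: both tasks are in flight at once; a second hardware
        // thread (SMT) or a second core can make progress in parallel.
        auto fa = std::async(std::launch::async, busy_work, 50000000L);
        auto fb = std::async(std::launch::async, busy_work, 50000000L);
        long c = fa.get() + fb.get();

        std::printf("%ld %ld\n", a + b, c);
        return 0;
    }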

I did read it well, but we never compared the 8600K and i7 7700K at stock clocks. When we started this discussion we assumed the user would overclock the chip. I mean, there isn't even a non-K version of the 8600.

The 8600K vs 7700K benches you're using are at stock clocks, but the 8600K is just as good an overclocker as the i7 7700K. The stock clocks on the i7 7700K are way higher (3.6 GHz vs 4.2 GHz).

The other benchmarks you're using are warped as well: an i7 6850K runs at 3.8 GHz and has 12 threads, not 6, unless you disable hyperthreading. There are games that are GPU-bound above 1080p, and there are games that don't make use of the extra threads and core count either.

The fact is that a 6-core CPU at the same clock speeds is a much better CPU than a quad core with hyperthreading. I don't need benchmarks to know this; it's just plain common sense. 4 cores/8 threads is no match for 6 real cores at the same clock speed.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#447 commander
Member since 2010 • 16217 Posts

@Juub1990 said:
@commander said:

If your question is whether the product delivers, well, just ask yourself this: how can you build a PC that achieves the same performance as the Xbox One X for $500? Good luck with that.

So you're basically admitting that it has a piece of shit CPU that starts to be a bottleneck at around 30fps? Great console indeed.

It won't bottleneck; there are workarounds for this, namely the DX12 hardware and GPGPU tools.

For $500 the Xbox One X is an amazing system.
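
For anyone wondering what those DX12-style workarounds actually buy you: mainly, draw submission gets spread over all of the weak cores instead of choking one. Toy sketch only, with made-up types rather than the real D3D12 API:

    #include <cstdio>
    #include <thread>
    #include <vector>

    // Hypothetical stand-ins; a real engine would record D3D12 command lists here.
    struct DrawCall    { int mesh_id = 0; };
    struct CommandList { std::vector<DrawCall> cmds; };

    int main() {
        std::vector<DrawCall> scene(8000);
        const unsigned workers = 4; // e.g. a handful of the console's Jaguar cores

        // Each core records its own command list in parallel, so submission
        // cost is split across cores instead of serialized on one slow core.
        std::vector<CommandList> lists(workers);
        std::vector<std::thread> pool;
        const std::size_t chunk = scene.size() / workers;
        for (unsigned w = 0; w < workers; ++w) {
            pool.emplace_back([&, w] {
                for (std::size_t i = w * chunk; i < (w + 1) * chunk; ++i)
                    lists[w].cmds.push_back(scene[i]);
            });
        }
        for (auto& t : pool) t.join();
        std::printf("recorded %u command lists\n", workers);
        return 0;
    }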

Avatar image for Juub1990
Juub1990

12622

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#448 Juub1990
Member since 2013 • 12622 Posts
@commander said:

It won't bottleneck; there are workarounds for this, namely the DX12 hardware and GPGPU tools.

For $500 the Xbox One X is an amazing system.

It's also incredibly unbalanced, with a good GPU and a shit CPU. Guess that's why it's only $500.

Avatar image for 04dcarraher
04dcarraher

23858

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#449 04dcarraher
Member since 2004 • 23858 Posts

@commander:

Again, you read my post wrong: the 6850K, I know, is a 6-core/12-thread chip vs the 6700, which was a quad. I was comparing the slight differences between those two in games that make use of 8 or more threads, showing that having two extra cores and four extra threads does not translate into a lot of extra performance when comparing same-gen CPUs. I do agree with you that at the same clocks a 6-core i5 will perform better than the i7 7700K, but it's still not a massive difference.

Avatar image for waahahah
waahahah

2462

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 5

#450  Edited By waahahah
Member since 2014 • 2462 Posts

@jahnee said:

+ @waahahah

Imagine the kinds of scenarios that become possible. For example: Skyrim where opening doors or gates into any new area never triggers a single load screen. Some mods already do this with the main city gates. Then imagine seeing creatures walking around the dense forests from a mile away, while the leaves and mist obstruct the view just enough to hinder your aim. I want the game world to become more alive than it currently is; not much has changed in this regard, and most resources in the last 10 years have gone to aesthetics.

Skyrim is probably more memory-bound, in a weird way. You can already see the issues with its design in Fallout 4, where things in the environment spawn in as you approach. It's this oddball where everything has to be persistent, and the draw distance + item streaming + dynamic nature will always cause weird shit to happen.

@04dcarraher said:

@commander:

Again, you read my post wrong: the 6850K, I know, is a 6-core/12-thread chip vs the 6700, which was a quad. I was comparing the slight differences between those two in games that make use of 8 or more threads, showing that having two extra cores and four extra threads does not translate into a lot of extra performance when comparing same-gen CPUs. I do agree with you that at the same clocks a 6-core i5 will perform better than the i7 7700K, but it's still not a massive difference.

You'll probably see more of a difference at the low end. Once the workload actually maxes out the 4 physical cores, hyperthreading will probably come out worse than having 6 real cores.

edit:

It's already pretty clear if you compare an i3 with hyperthreading to an i5 without it.