Those who said the X1X GPU is a match for the 1070 come forward and apologize

#151 Xplode_games
Member since 2011 • 2540 Posts

@Juub1990 said:
@Xplode_games said:

Vega 64 dips to 22 fps at 4K Ultra. However, you get an average of 36 fps, but you can't compare that to a console's locked 30. The console can't go above 30 to raise its average unless you could unlock the framerate to benchmark against a PC GPU. Since we can't do that, we can assume that if there were a console with an i7 CPU and a Vega 64 GPU, it would lock the framerate at 30 fps at 4K and have dips down to 22 fps. That would give it similar if not worse performance than the X1X, because the X only dipped that way in cut scenes, which have zero effect on gameplay. But let's say the performance was exactly the same, OK. It's running at roughly 40% higher resolution and, we think, slightly higher settings. So with an i7 CPU and a 13.7-teraflop Vega 64 GPU you get a 40% boost in resolution over the 6-teraflop X and that's it, and you don't think the X is well optimized? WOW! :o

And considering the X1X saw dips below 30 fps at 1700-1800p, averaging 36 fps for it at 4K would be impossible. They both dropped way below 30 fps, mostly because of cut scenes and minor stuttering issues on PC. 25 fps vs 22 fps. Why the hell are you bringing up a console with a Vega 64 GPU? You said the X1X performed close to a Vega 64. It doesn't. The X1X hits its limit at 1700-1800p (hence why it dips below the target frame rate), has worse AO than on PS4 and lower draw distance than on PC, and also uses dynamic scaling to drop as low as 1564p (or some number like that). The X1X hasn't got any more headroom to spare. As it stands, it gets outperformed by 21% on average while pushing 44-50% fewer pixels. We're probably looking at a 30% difference apples to apples. That's not close. Moving the goalposts again? Went from equal to a 1070, to cannot match a 1070 but close to a Vega 64, and now we're at "has good optimization". The X1X does well for the cost. However, it doesn't tango with high-end GPUs like the 1070.

I have one question for you only. If the Xbox One X hardware had an i7 CPU and a Vega 64 13.7 Teraflop GPU and they were running this game at 4k Ultra settings, what would the target fps be?

If you don't answer this question then you by default have proven your goal post moving bias. That is a simple question and should be very easy to answer for you because we have all of the benchmarks in this thread to help out. As you pointed out, you are providing the latest ones.
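
For anyone trying to follow the numbers being thrown around here, a quick back-of-the-envelope sketch (using only the figures quoted in this thread; the exact horizontal resolutions at "1700p" and "1564p" are assumed 16:9 values, since only the vertical counts were reported):

# Pixel counts for the resolutions mentioned in this thread, compared to native 4K.
# The ~1700p and 1564p widths are assumptions for illustration only.
RESOLUTIONS = {
    "4K (3840x2160)":     3840 * 2160,
    "1800p (3200x1800)":  3200 * 1800,
    "~1700p (3024x1701)": 3024 * 1701,
    "1564p (2780x1564)":  2780 * 1564,
}

native_4k = RESOLUTIONS["4K (3840x2160)"]
for name, pixels in RESOLUTIONS.items():
    fewer = (1 - pixels / native_4k) * 100
    print(f"{name}: {pixels:,} px ({fewer:.0f}% fewer pixels than native 4K)")

# Raw compute ratio quoted above: Vega 64 (13.7 TF) vs Xbox One X (6 TF).
print(f"Vega 64 / X1X teraflop ratio: {13.7 / 6.0:.2f}x")

On those figures, 1800p is about 31% fewer pixels than native 4K, ~1700p about 38% fewer, and 1564p about 48% fewer.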

#152  Edited By QuadKnight
Member since 2015 • 12916 Posts

? Lems are getting savagely rekt in this thread!

#153 Juub1990
Member since 2013 • 12622 Posts
@Xplode_games said:

I have one question for you only. If the Xbox One X hardware had an i7 CPU and a Vega 64 13.7 Teraflop GPU and they were running this game at 4k Ultra settings, what would the target fps be?

If you don't answer this question then you by default have proven your goal post moving bias. That is a simple question and should be very easy to answer for you because we have all of the benchmarks in this thread to help out. As you pointed out, you are providing the latest ones.

There is no goalpost moving except from you. The thread is titled:

Those who said the X1X GPU is a match for the 1070 come forward and apologize

This isn't about optimization or how well it does for its price (which I already admitted is respectable, if not good). This is about the false claims and misinformation you guys spread in the weeks leading up to the release.

The X1X doesn't match a 1070. If it did, Ubisoft wouldn't need to drop the resolution to 1700-1800p with drops to 1564p. They wouldn't need to dial back on shadows, AO and draw distance. Despite this, the X1X still gets outperformed by the 1070, which runs at native 4K at all times, something the X1X NEVER does. So please, apologize. The X1X can't contend with a 1070. You and your posse have been exposed and publicly humiliated.

#155 Juub1990
Member since 2013 • 12622 Posts
@kuu2 said:

Maybe you can make another thread and meltdown in it.

Move on Insecure1990.

Meltdown? I'm currently laying a royal smackdown on lemmings lol.

#156 ConanTheStoner
Member since 2011 • 23838 Posts

Heh, the dream is still going for some I see.

Honestly shouldn't matter. If you bought an X1X, you own the most powerful console on the market. Your existing library gets an upgrade and you'll be playing the best console versions of multiplats until the next gen rolls around.

I'd just be happy with that, no need to continue trying to punch above your weight.

#157 Xplode_games
Member since 2011 • 2540 Posts

@Juub1990 said:
@kuu2 said:

Maybe you can make another thread and meltdown in it.

Move on Insecure1990.

Meltdown? I'm currently laying a royal smackdown on lemmings lol.

You're a legend in your own mind.

#158 QuadKnight
Member since 2015 • 12916 Posts

“? B...bu....bu....but Ark Dev said!!!11”

? Lems were warned about this early this year. They wouldn’t tone down their hype for whatever reason. You clowns brought this ownage on yourselves. Being more powerful than the Pro wasn’t enough, you just had to go after high end PCs. ? Get rekt clowns!

#159 deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

This is what happens when console fanboys try to talk Hardware specs. It seems as if we’ve lost some good Lems these past few days. If only you took some of these Cows with you.

#160 Juub1990
Member since 2013 • 12622 Posts
@Xplode_games said:

You're a legend in your own mind.

Yeah your posse got OWNED. Thanks for playing along.

#161 Zero_epyon
Member since 2004 • 20502 Posts

Apparently pointing out how right you are means you're salty and/or having a meltdown.

#162 Xplode_games
Member since 2011 • 2540 Posts

@Juub1990 said:
@Xplode_games said:

I have one question for you only. If the Xbox One X hardware had an i7 CPU and a Vega 64 13.7 Teraflop GPU and they were running this game at 4k Ultra settings, what would the target fps be?

If you don't answer this question then you by default have proven your goal post moving bias. That is a simple question and should be very easy to answer for you because we have all of the benchmarks in this thread to help out. As you pointed out, you are providing the latest ones.

There is no goalpost moving except from you. The thread is titled:

Those who said the X1X GPU is a match for the 1070 come forward and apologize

This isn't about optimization or how well it does for its price (which I already admitted is respectable, if not good). This is about the false claims and misinformation you guys spread in the weeks leading up to the release.

The X1X doesn't match a 1070. If it did, Ubisoft wouldn't need to drop the resolution to 1700-1800p with drops to 1564p. They wouldn't need to dial back on shadows, AO and draw distance. Despite this, the X1X still gets outperformed by the 1070, which runs at native 4K at all times, something the X1X NEVER does. So please, apologize. The X1X can't contend with a 1070. You and your posse have been exposed and publicly humiliated.

Let me see if I understand what you are saying here. You are telling me that you won't answer my question because this thread isn't about optimization, which is what my question delves into. If that's the case, then why are you clinging to the optimization that allows the 1070 to beat the X? If this were an AMD-optimized game and the 1070 was still beating the X, I would be the first to admit it. That's not what is happening here. Obviously the 1070 is outperforming the X, ***IN THIS ONE GAME***; that is the part you are trying very hard to ignore.

You need to ignore the optimization advantage Nvidia has, the very thing that hands it the win in this game, in order to cling to your failed point. I proved this by showing how badly a 13.7-teraflop PC GPU that is much better than a 1070 performs, but again you deny that this has anything to do with optimization.

Optimization is why the performance of AMD cards in this game isn't great compared to Nvidia's. Nothing you or anyone else in this thread has said proves the 1070 to be vastly superior to the X.

We don't know yet if the 1070 will wind up beating the X; more games have to be examined to determine that. You are acting as if everything is over and you won because of one highly optimized Nvidia game. You're being dishonest just to win an argument and everyone knows it. I guess you feel happy because rabid cows keep gassing you up.

#165  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@kuu2 said:
@Zero_epyon said:

Apparently pointing out how right you are means you're salty and/or having a meltdown.

Nope, making two threads about the same subject and having a mod smack you down is having a meltdown.

Please keep up.

Doesn't mean he's wrong though. Speaking of meltdowns, remember how mad you were about that sticky? Why don't you come on by and chat and quit being so salty?

#166  Edited By Juub1990
Member since 2013 • 12622 Posts
@Xplode_games said:

Let me see if I understand what you are saying here. You are telling me that you won't answer my question because this thread isn't about optimization, which is what my question delves into. If that's the case, then why are you clinging to the optimization that allows the 1070 to beat the X? If this were an AMD-optimized game and the 1070 was still beating the X, I would be the first to admit it. That's not what is happening here. Obviously the 1070 is outperforming the X, ***IN THIS ONE GAME***; that is the part you are trying very hard to ignore.

You need to ignore the optimization advantage Nvidia has, the very thing that hands it the win in this game, in order to cling to your failed point. I proved this by showing how badly a 13.7-teraflop PC GPU that is much better than a 1070 performs, but again you deny that this has anything to do with optimization.

Optimization is why the performance of AMD cards in this game isn't great compared to Nvidia's. Nothing you or anyone else in this thread has said proves the 1070 to be vastly superior to the X.

We don't know yet if the 1070 will wind up beating the X; more games have to be examined to determine that. You are acting as if everything is over and you won because of one highly optimized Nvidia game. You're being dishonest just to win an argument and everyone knows it. I guess you feel happy because rabid cows keep gassing you up.

So you're maintaining the X1X is a match for the 1070? You know that when you're inevitably proven wrong in the future, I'll laugh at you even more, right? You had one chance to make amends and you blew it. Now you'll be laughed at until you leave this place.

#167 Xplode_games
Member since 2011 • 2540 Posts

@Juub1990 said:
@Xplode_games said:

Let me see if I understand what you are saying here. You are telling me that you won't answer my question because this thread isn't about optimization, which is what my question delves into. If that's the case, then why are you clinging to the optimization that allows the 1070 to beat the X? If this were an AMD-optimized game and the 1070 was still beating the X, I would be the first to admit it. That's not what is happening here. Obviously the 1070 is outperforming the X, ***IN THIS ONE GAME***; that is the part you are trying very hard to ignore.

You need to ignore the optimization advantage Nvidia has, the very thing that hands it the win in this game, in order to cling to your failed point. I proved this by showing how badly a 13.7-teraflop PC GPU that is much better than a 1070 performs, but again you deny that this has anything to do with optimization.

Optimization is why the performance of AMD cards in this game isn't great compared to Nvidia's. Nothing you or anyone else in this thread has said proves the 1070 to be vastly superior to the X.

We don't know yet if the 1070 will wind up beating the X; more games have to be examined to determine that. You are acting as if everything is over and you won because of one highly optimized Nvidia game. You're being dishonest just to win an argument and everyone knows it. I guess you feel happy because rabid cows keep gassing you up.

So you're maintaining the X1X is a match for the 1070? You know that when you're inevitably proven wrong in the future, I'll laugh at you even more, right?

I'm saying we don't know yet. This one Nvidia game doesn't determine everything. Let's compare more games and see what happens when we get an AMD optimized game. We'll then be better able to gauge performance. If you really think the X will never beat the 1070 then you're dreaming.

#169 The_Stand_In
Member since 2010 • 1179 Posts

As it currently stands, the GTX 1070 delivers at 4K Ultra settings the level of performance the X1X should be targeting (performance-wise, it would be on par for a console game).

However, the X1X does NOT run it at 4K. It does NOT run it at Ultra settings, either. Even so, it BARELY manages to pull ahead of the GTX 1070, which IS running at higher settings and resolution.

Therefore, its performance does NOT equal or exceed a GTX 1070's... even with magical console "optimization".

Why is this so hard for people to accept? It doesn't make the X1X any less powerful. Doesn't make it any less good of a deal. It just is what it is.

#170 thehig1
Member since 2014 • 7555 Posts

@Juub1990: what GPU is the X1X comparable to?

#171  Edited By Juub1990
Member since 2013 • 12622 Posts

@thehig1 said:

@Juub1990: what GPU is the X1X comparable to?

Somewhere between a 1060 and a 1070 which is what I've been claiming all along. At the time I was saying between a 980 and a 980 Ti. Looks like I was right on it.

#172  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@kuu2 said:

@Zero_epyon: Sorry not going into the biased sticky. I said my piece, a meltdown would be to continue to bitch about it like TC.

I applaud you though for initiating a very good SW tactic.

What tactic?

#173  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@Juub1990 said:
@thehig1 said:

@Juub1990: what GPU is the X1X comparable to?

Somewhere between a 1060 and a 1070 which is what I've been claiming all along. At the time I was saying between a 980 and a 980 Ti. Looks like I was right on it.

DF now believes it's more of an RX 580.
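
For rough context on where these parts sit on paper, here is a minimal sketch of the usual theoretical FP32 throughput math (shader count x 2 ops per clock x boost clock). The clocks below are the commonly published spec-sheet boost figures, and, as this whole thread shows, real-game results vary a lot by title and driver:

# Theoretical single-precision throughput in TFLOPs: shaders * 2 * clock (GHz) / 1000.
# Clocks are reference/spec-sheet boost values (assumption: no factory overclock).
gpus = {
    "Xbox One X (Scorpio Engine)": (2560, 1.172),
    "GTX 1060 6GB":                (1280, 1.708),
    "GTX 1070":                    (1920, 1.683),
    "RX 580":                      (2304, 1.340),
    "RX Vega 64 (air)":            (4096, 1.546),  # the liquid-cooled model boosts ~1.677 GHz, i.e. the 13.7 TF figure quoted earlier
}

for name, (shaders, clock_ghz) in gpus.items():
    tflops = shaders * 2 * clock_ghz / 1000
    print(f"{name}: ~{tflops:.1f} TFLOPs")

On paper that puts the X1X, the RX 580 and the GTX 1070 in the same rough ~6 TFLOP neighbourhood, which is part of why the argument keeps coming back to drivers and per-title optimization rather than raw throughput.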

#174 thehig1
Member since 2014 • 7555 Posts

@Juub1990: it's decent for a console, shame the CPU is still a little slow.

Will be upgrading soon, can't be having a console potentially outperforming my GPU haha

#175  Edited By Xplode_games
Member since 2011 • 2540 Posts

@The_Stand_In said:

As it currently stands, the GTX 1070 delivers at 4K Ultra settings the level of performance the X1X should be targeting (performance-wise, it would be on par for a console game).

However, the X1X does NOT run it at 4K. It does NOT run it at Ultra settings, either. Even so, it BARELY manages to pull ahead of the GTX 1070, which IS running at higher settings and resolution.

Therefore, its performance does NOT equal or exceed a GTX 1070's... even with magical console "optimization".

Why is this so hard for people to accept? It doesn't make the X1X any less powerful. Doesn't make it any less good of a deal. It just is what it is.

This is an Nvidia GameWorks title, highly optimized for Nvidia GPUs. That's why the Vega 64's performance is so bad. The Vega 64 should easily beat a 1070. Heck, the Vega 56 destroys the 1070 to such a degree that Nvidia had to rush out the 1070 Ti in a panic to try to keep up. The 1070 beats a Vega 64, a 13.7-teraflop GPU, in this game. Do you really expect the X1X to beat the 1070 in this game when the X has a 6-teraflop AMD GPU?

Now when we see an AMD-optimized game, things will be very different.

#176 commander
Member since 2010 • 16217 Posts

@Juub1990 said:

And @commander and @ronvalencia just to own you even worse. From PC gamer.

All of my current testing was done with the 1.03 patch, using Nvidia's latest 388.13 drivers and AMD's 17.11.1 drivers—which is an update from the initial testing I did at the time of launch. The 1.03 patch smoothed out some of the minimum fps problems, and Nvidia performance across nearly all GPUs improved substantially.

Source

Ouch...

Who the **** cares about Killer Instinct? So both GPUs can run a 2013 fighting game at 4K/60fps?

So what? It doesn't mean much; the numbers are not convincing. 31 fps with what, an i7-8700K? lmao.

Like I said, this thread doesn't prove anything.

#177 Juub1990
Member since 2013 • 12622 Posts
@Xplode_games said:

I'm saying we don't know yet. This one Nvidia game doesn't determine everything. Let's compare more games and see what happens when we get an AMD optimized game. We'll then be better able to gauge performance. If you really think the X will never beat the 1070 then you're dreaming.

Oh, now we need more games? The console wasn't even released and you guys were already claiming victory. Now that it's performing exactly how the more knowledgeable members predicted it would, we need to wait and see?

The trailer for Assassin's Creed Origins was running at a native 4K, and they needed to cut back to make it playable. @ronvalencia and his army of lembots were using this game as evidence that the X1X would be equal to a 1070, and last I checked you were on board with them. You were the one shilling about console optimization, and in this VERY thread you claimed PC was ALWAYS poorly optimized; now PC is crushing the X1X and suddenly it's loloptimization? Talk about shooting yourself in the foot.

#178  Edited By ronvalencia
Member since 2008 • 29612 Posts

@quadknight said:

? Lems are getting savagely rekt in this thread!

Not all GTX 1070 SKUs are created equally i.e. different GDDR5 quality can yield different latency memory timings, hence different results.

MSI GTX 1070 Quick Silver's result is similar to the following GTX 1070 FE result.

There's a slight silicon lottery with 1070 SKUs.

Try again.

#179 Xplode_games
Member since 2011 • 2540 Posts

@Juub1990 said:
@Xplode_games said:

I'm saying we don't know yet. This one Nvidia game doesn't determine everything. Let's compare more games and see what happens when we get an AMD optimized game. We'll then be better able to gauge performance. If you really think the X will never beat the 1070 then you're dreaming.

Oh, now we need more games? The console wasn't even released and you guys were already claiming victory. Now that it's performing exactly how the more knowledgeable members predicted it would, we need to wait and see?

The trailer for Assassin's Creed Origins was running at a native 4K, and they needed to cut back to make it playable. @ronvalencia and his army of lembots were using this game as evidence that the X1X would be equal to a 1070, and last I checked you were on board with them. You were the one shilling about console optimization, and in this VERY thread you claimed PC was ALWAYS poorly optimized; now PC is crushing the X1X and suddenly it's loloptimization? Talk about shooting yourself in the foot.

Do you not understand what optimization is? It means a game was built from the ground up to take advantage of a particular hardware architecture's strengths. How can the X overcome that with only 6 teraflops? Now when a game is built from the ground up to take advantage of the X's architecture, we'll see how a PC with a 1070 performs against it.

#180  Edited By Juub1990
Member since 2013 • 12622 Posts

@Xplode_games: That’s not what optimization is.

So what happened to PC being poorly optimized?

#181 Xplode_games
Member since 2011 • 2540 Posts

@Juub1990 said:

@Xplode_games: That’s not what optimization is.

So what happened to PC being poorly optimized?

PC games are poorly optimized compared to consoles. Do you dispute that?

#182  Edited By Juub1990
Member since 2013 • 12622 Posts

@Xplode_games: Yes. Feel free to make a thread about it.

#183 Xplode_games
Member since 2011 • 2540 Posts

@Juub1990 said:

@Xplode_games: Yes.

That's too long a discussion to have here in this thread. But I will say you are right about this game: it was optimized for Nvidia cards, and since no console has those, this is like optimizing a game for the PC.

#184 commander
Member since 2010 • 16217 Posts

@Juub1990 said:
@Xplode_games said:

Let me see if I understand what you are saying here. You are telling me that you won't answer my question because this thread isn't about optimization, which is what my question delves into. If that's the case, then why are you clinging to the optimization that allows the 1070 to beat the X? If this were an AMD-optimized game and the 1070 was still beating the X, I would be the first to admit it. That's not what is happening here. Obviously the 1070 is outperforming the X, ***IN THIS ONE GAME***; that is the part you are trying very hard to ignore.

You need to ignore the optimization advantage Nvidia has, the very thing that hands it the win in this game, in order to cling to your failed point. I proved this by showing how badly a 13.7-teraflop PC GPU that is much better than a 1070 performs, but again you deny that this has anything to do with optimization.

Optimization is why the performance of AMD cards in this game isn't great compared to Nvidia's. Nothing you or anyone else in this thread has said proves the 1070 to be vastly superior to the X.

We don't know yet if the 1070 will wind up beating the X; more games have to be examined to determine that. You are acting as if everything is over and you won because of one highly optimized Nvidia game. You're being dishonest just to win an argument and everyone knows it. I guess you feel happy because rabid cows keep gassing you up.

So you're maintaining the X1X is a match for the 1070? You know that when you're inevitably proven wrong in the future, I'll laugh at you even more, right? You had one chance to make amends and you blew it. Now you'll be laughed at until you leave this place.

You're the one being laughed at here; you're comparing benchmarks run on CPUs that are more than twice as fast as the Xbox One X's.

#185 Juub1990
Member since 2013 • 12622 Posts

@commander: Feel free to argue the CPU is the bottleneck at 30fps on the X1X.

#186 QuadKnight
Member since 2015 • 12916 Posts

@ronvalencia said:
@quadknight said:

? Lems are getting savagely rekt in this thread!

Not all GTX 1070 SKUs are created equally i.e. different GDDR5 quality can yield different latency memory timings, hence different results.

MSI GTX 1070 Quick Silver's result is similar to the following GTX 1070 FE result.

Try again.

? This is some of the saddest and lamest damage control I've ever seen. So after you've been proven wrong, you still continue to post outdated benchmarks and double down on your self-ownage? Wow. Even after it has been proven that the XboneX can't hit 4K lol. The 1070 not only runs the game at Ultra PC settings and 4K, it also runs it at a solid 30 fps.

Not all GTX 1070s are the same? No shit, it's the same with every graphics card out there, but they do have a common baseline. This is the first time I've seen anyone use variation between individual GPUs as a form of damage control.

#187  Edited By ronvalencia
Member since 2008 • 29612 Posts

@quadknight said:
@ronvalencia said:
@quadknight said:

? Lems are getting savagely rekt in this thread!

Not all GTX 1070 SKUs are created equally i.e. different GDDR5 quality can yield different latency memory timings, hence different results.

MSI GTX 1070 Quick Silver's result is similar to the following GTX 1070 FE result.

Try again.

? This is some of the saddest and lamest damage control I've ever seen. So after you've been proven wrong, you still continue to post outdated benchmarks and double down on your self-ownage? Wow. Even after it has been proven that the XboneX can't hit 4K lol. The 1070 not only runs the game at Ultra PC settings and 4K, it also runs it at a solid 30 fps.

Not all GTX 1070s are the same? No shit, it's the same with every graphics card out there, but they do have a common baseline. This is the first time I've seen anyone use variation between individual GPUs as a form of damage control.

You tried to debunk my first GTX 1070 benchmark with another GTX 1070 benchmark and I posted another GTX 1070 benchmark from a different source which shows similar results as my first GTX 1070 benchmarks.

Try again.

#188 ConanTheStoner
Member since 2011 • 23838 Posts

Ha, it's funny seeing Ron try to do human things.

#189 QuadKnight
Member since 2015 • 12916 Posts

@ronvalencia said:
@quadknight said:
@ronvalencia said:
@quadknight said:

? Lems are getting savagely rekt in this thread!

Not all GTX 1070 SKUs are created equally i.e. different GDDR5 quality can yield different latency memory timings, hence different results.

MSI GTX 1070 Quick Silver's result is similar to the following GTX 1070 FE result.

Try again.

? This is some of the saddest and lamest damage control I've ever seen. So after you've been proven wrong, you still continue to post outdated benchmarks and double down on your self-ownage? Wow. Even after it has been proven that the XboneX can't hit 4K lol. The 1070 not only runs the game at Ultra PC settings and 4K, it also runs it at a solid 30 fps.

Not all GTX 1070s are the same? No shit, it's the same with every graphics card out there, but they do have a common baseline. This is the first time I've seen anyone use variation between individual GPUs as a form of damage control.

You tried to debunk my first GTX 1070 benchmark with another GTX 1070 benchmark and I posted another GTX 1070 benchmark from a different source which shows similar results as my first GTX 1070 benchmarks.

Try again.

? You failed. Your benchmark is outdated, and even if it wasn't, you're still a failure because the XboneX doesn't perform anywhere close to that graphics card lol.

#190 ronvalencia
Member since 2008 • 29612 Posts

@Juub1990 said:

And @commander and @ronvalencia just to own you even worse. From PC gamer.

All of my current testing was done with the 1.03 patch, using Nvidia's latest 388.13 drivers and AMD's 17.11.1 drivers—which is an update from the initial testing I did at the time of launch. The 1.03 patch smoothed out some of the minimum fps problems, and Nvidia performance across nearly all GPUs improved substantially.

Source

Ouch...

Who the **** cares about Killer Instinct? So both GPUs can run a 2013 fighting game at 4K/60fps?

One can't claim the X1X delivers GTX 1060-class results when it has other results in the GTX 1070/GTX 980 Ti range.

The X1X doesn't have an actual GTX 1070 in it.

#191 DrLostRib
Member since 2017 • 5931 Posts
@ConanTheStoner said:

Ha, it's funny seeing Ron try to do human things.

Ron:

#192 Juub1990
Member since 2013 • 12622 Posts
@ronvalencia said:

You tried to debunk my first GTX 1070 benchmark with another GTX 1070 benchmark and I posted another GTX 1070 benchmark from a different source which shows similar results as my first GTX 1070 benchmarks.

Try again.

The Assassin's Creed: Origins 1.03 patch is now live, and for Nvidia owners the news appears to be good, as our early testing indicates a performance increase in the neighborhood of 10-15 percent over what it offered out of the gate.

Source

The patch came out on November 2nd, which is the exact date you posted your benchmark from that German website, and that benchmark makes no mention of patch 1.03. There is no way in hell these guys finished benchmarking all these cards at all these different resolutions the same day the patch came out, which means you posted ANOTHER outdated benchmark.

And it wouldn't matter anyway, because that's at 4K. The X1X completely fails to achieve 4K in that game and consistently runs at a resolution that is 44-50% lower.

You try again.

#193 ronvalencia
Member since 2008 • 29612 Posts

@Juub1990 said:
@ronvalencia said:

You tried to debunk my first GTX 1070 benchmark with another GTX 1070 benchmark and I posted another GTX 1070 benchmark from a different source which shows similar results as my first GTX 1070 benchmarks.

Try again.

The Assassin's Creed: Origins 1.03 patch is now live, and for Nvidia owners the news appears to be good, as our early testing indicates a performance increase in the neighborhood of 10-15 percent over what it offered out of the gate.

Source

The patch came out on November 2nd, which is the exact date you posted your benchmark from that German website, and that benchmark makes no mention of patch 1.03. There is no way in hell these guys finished benchmarking all these cards at all these different resolutions the same day the patch came out, which means you posted ANOTHER outdated benchmark.

And it wouldn't matter anyway, because that's at 4K. The X1X completely fails to achieve 4K in that game and consistently runs at a resolution that is 44-50% lower.

You try again.

Your "There is no way in hell these guys finished benchmarking ...." is your substitution i.e. fan fiction.

You try again.

#194 Juub1990
Member since 2013 • 12622 Posts
@ronvalencia said:

Your "There is no way in hell these guys finished benchmarking ...." is your substitution i.e. fan fiction.

You try again.

Prove they are using patch 1.03. I'll wait.

#195  Edited By QuadKnight
Member since 2015 • 12916 Posts

Lems....

? The lemming destruction is getting ridiculous in here.

#196 deactivated-5c0b07b32bf03
Member since 2014 • 6005 Posts

@drlostrib said:
@ConanTheStoner said:

Ha, it's funny seeing Ron try to do human things.

Ron:

If I had been drinking milk when I saw this post I would definitely have squirted it through my nose.

#197  Edited By commander
Member since 2010 • 16217 Posts

@Juub1990 said:

@commander: Feel free to argue the CPU is the bottleneck at 30fps on the X1X.

It's still going to make a difference. As long as you don't have any benchmarks from a system with a CPU more comparable to the Xbox One X's, at the same settings, your argument is BS.

And that's exactly the case right now; the numbers are way too close.

Also, where's the HDR on the PC?

#198  Edited By Juub1990
Member since 2013 • 12622 Posts

@ronvalencia Actually, I can prove they weren't using patch 1.03. The patch went live on November 2nd at around 5:30 AM EST. The benchmark you posted was done at 9:15 AM EST. There are less than 4 HOURS separating the release of the patch from your benchmark, and the entire benchmark makes no mention of patch 1.03. So yeah, you're wrong. Outdated benchmark. The German site you linked tested 16 cards at 4 different resolutions. Assuming it only took 5 minutes to test each resolution per card, the tester would need at MINIMUM 5 hours to test them all, and we both know it takes more than 5 minutes to do that.

@commander HDR was released for all platforms on November 7/8th I believe.
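
For what it's worth, the arithmetic in that timing argument checks out; a minimal sketch using only the figures stated in the post above (16 cards, 4 resolutions, ~5 minutes per run, patch live ~5:30 AM EST, benchmark posted ~9:15 AM EST):

# Minimum bench time implied by the post's own assumptions vs. the available window.
cards, resolutions, minutes_per_run = 16, 4, 5

minimum_test_minutes = cards * resolutions * minutes_per_run      # 320 minutes
window_minutes = (9 * 60 + 15) - (5 * 60 + 30)                    # 225 minutes

print(f"Minimum testing time: {minimum_test_minutes} min (~{minimum_test_minutes / 60:.1f} h)")
print(f"Patch-to-post window: {window_minutes} min (~{window_minutes / 60:.2f} h)")
print("Fits in the window?", minimum_test_minutes <= window_minutes)

That works out to roughly 5 hours 20 minutes of minimum bench time against a window of about 3 hours 45 minutes, so under those assumptions the whole card lineup could not have been re-run on patch 1.03 before the article went up.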

#199 commander
Member since 2010 • 16217 Posts

@Juub1990 said:

@ronvalencia Actually, I can prove they weren't using patch 1.03. The patch went live on November 2nd at around 5:30 AM EST. The benchmark you posted was done at 9:15 AM EST. There are less than 4 HOURS separating the release of the patch from your benchmark, and the entire benchmark makes no mention of patch 1.03. So yeah, you're wrong. Outdated benchmark.

Lol, your talk about the patch is horseshit; the Xbox One X will get patches as well.

#200 MooseWayne
Member since 2017 • 361 Posts

??