Xbox One X was never going to be better than a GTX 980 from September 2014.
It isn't near the 1070. How do you come up with this? It's running at 50% less resolution and half the frames. Pushing more frames really adds a lot of burden to that GPU.
Also, everyone has been saying that the CPU will limit the Xbox. You lems didn't believe it and believed Ron's charts. Either way, that GPU still doesn't compare with the 1070.
And given that MS were complete idiots to pair the most powerful GPU in console history with a CPU that is a joke, somehow you are still sticking up for them, saying: yeah, we know the GPU is good, but we're okay with our shitty CPU.
Honestly, I think they should have spent a little more money on the Xbox One X and put a better CPU in there. I would rather have seen a $550 or $600 console, as the Xbox One X wasn't targeted at the casual gamer. They stated it wasn't.
From a GPU standpoint we can see some games are close and some aren't; most of the time those are obvious CPU issues, and other times we don't know if it's AMD's architecture, or GPU compute that was moved off the CPU adding more burden on the GPU compared to PC versions. As I keep saying, it's closer than I think people expected.
Yeah, the CPU was likely a bad call, but Ryzen likely wasn't ready for when they wanted it. They need to have the specs locked down well before release, and we won't see the new... Raven Ridge core? I can't remember and am too lazy to google... the new 15-watt APU that we'll probably see next year.
@commander said:
this is not how this discussion started. I said it would be a cold day in hell before the i7 7700K matches the i5 8600K, and appariti0n said hyperthreading would make up for that, and that's just not the case, even with some older games like Crysis 3 but also newer games like Watch Dogs 2.
Actually THIS is how the discussion started. With you spouting off like you know what you're talking about. Gems such as:
It will be significant enough to put the 8600k in whole different range of performance
and
the two extra cores on the I5 8600k will dwarf the two extra threads on the I7 7700 and blow it out of the water.
and finally
Most people don't go for max overclock and even if the quad core is able to dish out a couple of 100 mhz, it will still get murdered by the hexacore
So you consider an overall performance boost of 5-10% as murdered, blown out of the water, and in a whole different range of performance, then? Boy, you must believe the X1X is 2 generations ahead of its time!
Or only in the specific games you cherry picked that favor 6 physical cores but not 8 threads, while you conveniently ignore examples going the OTHER way from @04dcarraher and myself. Sounds pretty logical to me.
you bet your ass if the XB1X had a beast CPU, and cost over 800 dollars, the cow circle jerks would be glorious.
They "compromised" to keep it a console.
@commander:
Again, you read my post wrong. The 6850K, I know, is a 6-core/12-thread part vs the 6700, which was a quad. I was comparing the slight differences between those two in games that made use of 8 or more threads, showing that having two extra cores and 4 extra threads does not translate into a lot of extra performance when comparing same-gen CPUs. I do agree with you that at the same clocks the i5 8600K will perform better than the i7 7700K, but it's still not a massive difference.
Well, I don't know what games you're using for this, but games that are optimized for more threads/cores will make use of that power.
Coffee Lake and Kaby Lake are the same architecture. Even the overclocked i7 7700K can't compete with the stock-clocked 8700K in multithreaded games.
And the 8600K runs at 3.6 GHz but overclocks to 5 GHz. That will make a big difference, since the i7 7700K isn't better at overclocking and it's already getting beaten by the 8600K when the latter is running 600 MHz slower. What do you think will happen when they both run at faster but similar speeds?
You can debate that it won't be massive, but calling it slight is a bit ridiculous, especially when you look at benchmarks that just target CPU power.
But like I said, you don't have to look at the benchmarks. 50 percent more cores is 50 percent more CPU power. It might not translate directly into real-world applications like games, but sooner or later it will be enough to say: why did I listen to those people on the internet who said it was only a slight difference...
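For what it's worth, the "50% more cores = 50% more CPU power" claim only holds if a game is perfectly parallel. A rough Amdahl's-law sketch (the 60% parallel fraction below is a made-up illustrative number, not a measurement) shows why the observed gap can land in the 5-10% range instead:

```python
# Amdahl's law sketch: why 50% more cores rarely means 50% more fps.
# The parallel fraction p is an assumption for illustration, not a benchmark.

def speedup(cores, p):
    """Relative speedup vs a single core for a workload
    where fraction p is perfectly parallel (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / cores)

p = 0.6  # assume 60% of a game's frame time scales with core count
quad = speedup(4, p)
hexa = speedup(6, p)
gain = (hexa / quad - 1) * 100
print(f"4 cores: {quad:.2f}x, 6 cores: {hexa:.2f}x, gain: {gain:.0f}%")
```

With those assumptions the hexacore comes out about 10% ahead, not 50%; the more serial a game's frame loop is, the smaller the gap gets.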
Not really. Again, there literally wasn't anything else available to them as an APU when they finalized the spec. Considering the PS4 Pro is significantly weaker on both CPU and GPU, they probably were just like... **** it, ship with Jaguar... Ryzen cores aren't exactly super costly at the moment, and integrated into an APU it probably would be fine.
Considering AMD's new Infinity Fabric, I could see their next APUs being more like Intel's new module with a discrete Vega chip.
well, one thing is for sure.. the future looks glorious!
Based on this, you are correct, especially for a game like Gears of War 4 where the framerate is locked at 30 fps.
have fun with the hyperthreading lulz
You can cry all day; 50 percent more cores is 50 percent more CPU power, and 6-core CPUs will become the new standard. There's a reason they're calling quad cores i3 now, and hexacores i5s.
That's the 8700K, not the 8600K which is what we were comparing to the 7700K. Not sure what this has to do with our discussion.
Oh, and btw: owned again, keep them coming.
Keep in mind, this is the exact same site you just tried to use. So I don't wanna hear any "well, you cherry picked the results" bullshit.
so your answer is to post a chart comparing the 7600K now? LOLOL.
Is old age dulling your memory? We were comparing the 7700K to the 8600K. Nothing else.
Here you go. have fun with the hyperthreading LULZ.
And may I also say....
REKT.
whoops, I already saw that btw hehe
oh well let me just put an end to this
@appariti0n:
What in the world is commander on?
Yet based on his image off of DF/Eurogamer, i7 7700K vs 8700K:
ACU: 22% difference
Ashes: 20% difference
Tomb Raider: 10% difference
and other games showing less than 20% gains from the 7700K to the 8700K... I didn't know my same-gen i7 results (6850K vs 6700) confused him when comparing the results.
Then let's look at the i7 7700K vs i5 8600K from the same source as his pic: it shows them within 5% of each other on average. HT does help...
I'll recap:
50% more processing power does not translate to 25+ percent more performance.
I just gave appariti0n a bone to chew on; I once ridiculed the PC in a pic and he hasn't gotten over it yet...
Here you can see a nice overclocking comparison.
I'm genuinely concerned here.
@commander are you feeling pain down your left arm? Seriously dude, you may be having a stroke, don't ignore the signs.
Have you not looked at the video? I wouldn't dare to look either if I were you. This is going to be the second time you've made a fool of yourself, lmao.
@commander: You’re spreading nonsense. You got quoted spreading it and now you’re trying to save face by mitigating the bullshit you said. “The 7700K gets murdered” now it’s “oh well in the future whoever said the difference was slight in the past will be wrong” lol. How is that not spreading nonsense?
You and your posse should just man up and admit you were wrong but you can’t. You even attempted to turn blatant losses into stalemates or moral victories for you. You’ve made an idiot out of yourself in this very thread. Should have guessed that much when you showed you weren’t smart enough to find an up to date benchmark and are still struggling to use google.
Don’t hurt your brain trying to wrap your head around my post now. I know it’s very difficult for you to scroll down and read let alone understand it.
But it does get murdered...
And that's what we see right now; it's only going to get worse in the future.
In this random youtube video that specifically focuses on games where 6 physical cores perform better than 4C/8T. Sure. This is your best case, cherry picked scenario right here, and it's what... 15-20% faster? Good god, no wonder you and Ron are so tight.
I love how eurogamer.net is suddenly not relevant, even though you authoritatively posted benchmarks from there claiming ownage when you erroneously thought it was supporting your case.
Until we pointed out that you weren't even benchmarking the correct CPU, and that the results actually work AGAINST you. Then you giggled like a school girl saying "hehe" when you knew you were caught like a deer in the headlights.
The 8600K being on a whole different level of performance has been proven to be bullshit.
Your claim of the 7700k's 8 virtual cores being dwarfed by 6 physical cores has been chopped off at the knees.
And the only thing being MURDERED around here, is the last shred of credibility you might have had.
Oh, but that's a lot of cherries there, lol. The 7700K gets beaten across the board, and this is only going to get worse in the future when hexacores become mainstream.
How is eurogamer.net relevant for your case? You point out the difference between the stock-clocked 8600K and the stock-clocked 7700K.
And I was being sarcastic when I said have fun with the hyperthreading. I threw you a bone only to slap that video in your face afterwards. You think I didn't type "8600k oc vs 7700k" into google, stumble on that graph, and compare the results of the 7600K with the 7700K?
But to come back to where this initially started: I said "it will be a cold day in hell before the 7700K comes close to 8600K performance"
and you replied with this:
@commander: they'll be comparable. 7700K will likely retain the single core/overclock performance, yet still be ok in multi-thread thanks to HT.
They are not comparable. I may have exaggerated a bit when I said murder, but when it comes down to it, I'm right and you're wrong.
Even with that OC "video", the average framerate difference in each game was 5-20%, and the minimums vary less. That 50% more cores is not translating into 25%+ gains vs a quad with HT, like I stated. Also don't ignore the fact that the i5 8600K has up to a 300 MHz clock advantage over that i7 7700K, another 6% gain, and has a tad more cache. Even the synthetic benchmarks, which use all available cores and threads at full tilt, only average a 19% difference.
Let's look at the same guy on youtube doing the i5 8600K vs i5 7600K, both at stock:
https://www.youtube.com/watch?v=o_Fq9W7NgHU
The difference here is on average 11%-26% in games... no HT and less cache, with only a 100 MHz difference between boost clocks.
Now let's look at the i5 8600K vs i7 7700K at stock:
less than 10% difference, while the 7700K has a 200 MHz upclock on the 8600K.
https://www.youtube.com/watch?v=RehWKg6zELc
Now let's compare Cinebench scores across all three videos.
Stock:
i5 8600K: 1031
i5 7600K: 654
37% difference
Stock:
8600K: 1031
7700K: 971
6% difference
OC:
8600K: 1311
7700K: 1074
18% difference
So we are seeing HT filling in the gap of having fewer real cores, because HT allows the i7 to work on 8 jobs at once, vs both i5s with 4 or 6 cores handling less at one time, making the next task wait.
The i5 8600K will always have an edge on a quad with HT, but the results are not as wide as you would think from having 50% more physical cores, in both real-world and synthetic results. I would recommend an i5 hexacore over an older i7 any day, because you do have more processing power within those 6 threads, it's cheaper, and they overclock well. But you can't cut the quad-core i7s out just yet, because they can handle more work at once.
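The percentage gaps quoted above can be reproduced directly from the Cinebench scores. A quick sketch, taking "difference" as how far the lower score sits below the higher one (which is how the figures above were evidently computed):

```python
# Reproducing the percentage gaps from the quoted Cinebench scores.
# "Difference" = lower score as a percentage below the higher score.

def pct_below(high, low):
    return (high - low) / high * 100

scores = {
    "8600K vs 7600K (stock)": (1031, 654),
    "8600K vs 7700K (stock)": (1031, 971),
    "8600K vs 7700K (OC)":    (1311, 1074),
}
for label, (high, low) in scores.items():
    print(f"{label}: {pct_below(high, low):.0f}% difference")
```

This gives 37%, 6%, and 18%, matching the figures above.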
@04dcarraher: You should recommend Ryzen CPUs instead. The 5820K has been around for almost 3 years now. The 8700K has a few meager IPC improvements compared to the 5820K, but that's it.
Stop promoting that rebranding company until they get their sh*t together, man.
That's not how this discussion started; he said they would be comparable.
The performance difference becomes bigger when both CPUs get overclocked.
When the i5 becomes a mainstream 6-core CPU, the difference will become even bigger, because games will try to maximize performance, and with 2 more cores there's just a lot more CPU power available.
Remember how you called me an idiot because I sold my i5 2500 back in 2014 to buy an i7 3820? Funny how you're now defending hyperthreading, but it's understandable: hyperthreading will increase performance, but back in 2014 it didn't do that much for games. It's going to be a similar process with 6-core CPUs, only today you already see a bigger difference, because 6-core CPUs have more raw CPU power.
They are comparable... as seen above: under 10% difference at stock and around 20% difference when overclocked.
Any game that makes use of 8 threads with balanced loading across all available cores and threads is already maximizing those 6-core i5s, i.e. games like BF1 MP.
If I remember correctly, it was about your claim that your 3820 supported PCIe 3.0 when the CPU itself only supported 2.0... then it was your claim that your 3820 became faster than (I think it was) the 3770K because it had quad-channel memory and supported 40 PCIe lanes, while you were using your AMD 7800 GPU vs a stronger GPU with that 3770K... when in fact the 40 lanes and quad-channel memory would make no difference with GPUs like that.
Too bad GTX 980 couldn't match X1X on FM7 and Killer Instinct S3.
all this lollygagging over a console vs PC.
at the end of the day, console vs console shits all over the cows and their PS4 Pro, take it and like it.
PC gaming still reigns supreme, so this argument was already lopsided.
basically stfu.. and quit damage controlling. Each system has its positives.
Actually no.. I pointed out the TFLOPS because clearly the 1070, regardless of architecture, has higher peak processing power. Then I went on to point out a game that is biased toward a particular architecture... so noting the differences in architecture. If they both had 100% utilization the 1070 would be factually superior.
But thanks for undermining your own argument about games being compared 1:1: if a game has a bias, it will perform better on a particular architecture. That's something you guys seem to be missing with what was presented. But since you couldn't contain yourself and had to jump on me... you should try harder not to own yourself.
Secondly why are you assuming I'm a console gamer? Wouldn't that be like the pot calling the kettle black?
Is that why the X1X gets a heart attack in GeOW4 at 1080p and drops to 44fps while GTX 1070 hits as high as 115fps and doesn't drop below 92fps? This is a first party MS game BTW...
"Teh Nvidia bias!!!1".
? Is that gonna be your excuse every time the GTX 1070 destroys the XboneX in performance? You're gonna find that every game is biased then lol.
X1X was designed for Digital Foundry's resolution game against the PS4, i.e. the "we hear you" statement at the E3 2016 Project Scorpio reveal. X1X was designed to bring 900p/1080p XBO games up to 4K with art asset improvements.
The 1070 reached the 4K ~30 fps target with a 37 fps average, which means it should reach ~120 fps at 1080p, and the actual 1070 result is a 115 fps average, which is a good conversion from 4K to 1080p processing.
Without being GPU-bound, high frame rates need a CPU with powerful sequential processing.
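The 4K-to-1080p conversion above is just pixel-count scaling. A rough sketch (assuming a purely GPU-bound, linearly scaling workload, which real games never quite are):

```python
# Naive GPU-bound scaling: frame rate scales with the inverse of pixel count.
# Real results fall short of this upper bound because CPU cost and
# fixed-function GPU work don't shrink with resolution.

def scaled_fps(fps, src_res, dst_res):
    """Estimate the frame rate at dst_res from fps measured at src_res."""
    src_pixels = src_res[0] * src_res[1]
    dst_pixels = dst_res[0] * dst_res[1]
    return fps * src_pixels / dst_pixels

uhd = (3840, 2160)  # 4K UHD
fhd = (1920, 1080)  # 1080p

# 37 fps at 4K gives a linear upper bound of 148 fps at 1080p;
# the observed 115 fps average sits below that bound, as expected.
print(scaled_fps(37, uhd, fhd))  # → 148.0
```

The observed 115 fps average is about 78% of the 148 fps linear bound, which is why a 115 fps result still counts as a good 4K-to-1080p conversion.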
MS Store sells gaming Windows 10 based PCs for market segments that X1X doesn't cover.
The R9-390X, with 5.9 TFLOPS and hardware design bottlenecks, is already reaching a 30 fps average at 4K resolution, which is 7 fps short of the GTX 1070's 37 fps. MSI's Gaming X R9-390X OC at 6.026 TFLOPS is slightly faster than the stock R9-390X.
For Gears of War 4, the 6 TFLOPS-class GPUs i.e. R9-390X, GTX 980 Ti and GTX 1070 land in a similar performance range.
Fury X is a disappointment since its classic GPU hardware is similar to the R9-390X, i.e. its 8.6 TFLOPS is bottlenecked by R9-390X-level classic GPU hardware.
The GTX 1080 Ti result shows Guru3D's high-end PC CPU was being bottlenecked by the lesser GPUs.
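The TFLOPS numbers traded back and forth in this thread all come from the standard peak-throughput formula: shader count × 2 FLOPs per clock (fused multiply-add) × clock speed. A quick sketch, using the commonly published shader counts and clocks (those specs are assumptions here, not taken from the thread):

```python
# Peak single-precision throughput in TFLOPS:
# shader count * 2 FLOPs per clock (fused multiply-add) * clock in GHz / 1000.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

# Commonly published shader counts and clocks (assumed, not from the thread):
gpus = {
    "R9 390X (2816 SP @ 1050 MHz)":     tflops(2816, 1.050),  # ~5.91
    "MSI 390X OC (2816 SP @ 1070 MHz)": tflops(2816, 1.070),  # ~6.03
    "GTX 1070 (1920 CC @ 1683 MHz)":    tflops(1920, 1.683),  # ~6.46
    "Xbox One X (2560 SP @ 1172 MHz)":  tflops(2560, 1.172),  # ~6.00
}
for name, value in gpus.items():
    print(f"{name}: {value:.2f} TFLOPS")
```

Peak TFLOPS says nothing about utilization, which is exactly why architecture bias shows up in real games even between cards with near-identical theoretical throughput.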
Consoles & PC have their place, and I do understand why this thread was made in the first place, but really, it has run its course.
it won't bottleneck, there are workarounds for this: the DX12 chip and GPGPU tools.
For $500 the Xbox One X is an amazing system.
It's also incredibly unbalanced, with a good GPU and a shit CPU. Guess that's why it's only $500.
It's unbalanced for 1080p high frame rates, but it's close to balanced at 1800p-to-4K resolutions.
The PS3-inspired "cutting edge" optical drive push behind X1X's 4K Blu-ray drive is also a distraction. Kinect was a Nintendo-inspired distraction for the XBO.
The $$$ for X1X's 4K Blu-ray drive, 8 GB flash storage, HDMI input capture feature and Kinect-related DSPs (3 extra DSP IP blocks) could have been reallocated to a larger GPU or an improved CPU. To match the X1X's features, a PC would need an HDMI capture card, a sound card with DSP, 8 GB of flash storage and a 4K Blu-ray drive.
Xbox 360 focused on gaming without distractions.
This discussion isn't about stock clocks; the 8600k doesn't even have a non-K version.
A 20 percent difference is not comparable, and like I said, when i5 hexacores become more mainstream the differences are going to get bigger.
@commander: I can, but I want to hear your definition. As it's clearly different than the dictionary.
5-20% difference is comparable... and the only way i5 hexacores will widen the gap over the i7 7700k is when the next generation comes out and IPC improves again, per Intel's typical tick-tock progression. Because right now the i5 8600k isn't outright slaughtering the 7700k: it's on par at stock and slightly faster when both are overclocked.