@ronvalencia: Don't know how you decided that the Xbox ONE X's CPU is equivalent to an Athlon 5350 (and then multiplying the cores by 2 to get an estimated benchmark score, which is.....an odd thing to do...), but for the sake of argument let's say I believe you. That CPU is still weak, and nobody in their right mind would do 4K gaming with something like a stock i5-2500k (let alone anything weaker); everything would be severely bottlenecked. Give me something like an i5-6600k and you might impress me.
He's used to making shit up by copy-pasting technical stuff that he doesn't really understand while pretending he's technically savvy, but he's a fraud. At one point this guy was claiming that the X1X was going to perform on par with a GTX 1080.
At 4K the GPU is the bottleneck.
The X1X doesn't have a GPU good enough to be bottlenecked by even a potato. The higher the resolution, the more the GPU becomes the bottleneck... It takes a 1080 Ti level of GPU to actually be bottlenecked by an 8-year-old i5 at 4K. An RX 580 level GPU?... There will be no CPU bottleneck with games at 4K, so if the X1X isn't running a game well, or isn't running it at 4K, it's because of the GPU.
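To put rough numbers on that reasoning, here's a minimal sketch in C of a toy frame-time model (the costs are made up purely for illustration), assuming frame time is the max of a resolution-independent CPU cost and a GPU cost that scales with pixel count:

#include <stdio.h>

/* Toy frame-time model with made-up numbers, purely illustrative:
 * frame time = max(CPU time, GPU time); GPU time scales with pixel
 * count, CPU time does not. */
int main(void) {
    const double cpu_ms = 12.0;        /* per-frame CPU cost, resolution-independent */
    const double gpu_ms_1080p = 8.0;   /* per-frame GPU cost at 1920x1080 */
    const double scale[] = { 1.0, 1.78, 4.0 };   /* pixel ratio vs 1080p: 1080p, 1440p, 4K */
    const char *name[]   = { "1080p", "1440p", "4K" };

    for (int i = 0; i < 3; i++) {
        double gpu_ms = gpu_ms_1080p * scale[i];
        double frame_ms = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
        printf("%-6s gpu=%5.1f ms  frame=%5.1f ms  bound by %s\n",
               name[i], gpu_ms, frame_ms, gpu_ms > cpu_ms ? "GPU" : "CPU");
    }
    return 0;
}

With these made-up costs the CPU is the limit at 1080p, but the GPU takes over well before 4K, which is the shape of the argument above.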
From the World of Tanks developer, on the Xbox One X's CPU and GPU balance:
He replied: “We’ve actually found the CPU and GPU improvements to complement each other quite well. Increasing the resolution from 1080p to 4K uses much of the additional power of the GPU but has basically no effect on the CPU.”
From http://gamingbolt.com/xbox-one-xs-4k-resolution-has-no-impact-on-cpu-gpu-allowed-increased-lod-and-more-objects-dev
For THAT specific game and THAT specific engine..... Go and ask DICE if they feel the CPU doesn't restrict their engine and game performance in any way..... I would bet they would say it does ;)
Listen, even if the X1X had the most super ultra turbo nitro champions edition 100-core CPU, I wouldn't give a shit about it, because there are no exclusives on the system.
If MS won't do what I'm telling them, which is to either start providing some exclusives for the system or, if that's not possible, some timed exclusives (first on X1, afterwards on PC), then whatever the CPU... it's the end for this gen.
The PS4/PS4 Pro's CPU already has rapid packed math FP16. https://en.wikipedia.org/wiki/Half-precision_floating-point_format
The SPE's SIMD only supports 8-bit integer, 16-bit integer, 32-bit integer and 32-bit floating-point packed formats.
The PS4/PS4 Pro's CPU SIMD supports 8-bit integer, 16-bit integer, 16-bit floating-point, 32-bit integer and 32-bit floating-point packed formats.
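For illustration, here is a minimal sketch in C, assuming a compiler targeting an x86 CPU with F16C (which the Jaguar cores in the PS4/PS4 Pro support): on the CPU side, FP16 is handled by packed conversion instructions, with the arithmetic itself still done in FP32 after unpacking.

#include <immintrin.h>
#include <stdio.h>

/* Build with e.g.: gcc -mf16c fp16_demo.c
 * F16C provides packed FP16 <-> FP32 conversion; the math runs in FP32. */
int main(void) {
    __m128 a32 = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);

    /* Pack 4 x FP32 down to 4 x FP16 (round to nearest even). */
    __m128i a16 = _mm_cvtps_ph(a32, _MM_FROUND_TO_NEAREST_INT);

    /* Unpack 4 x FP16 back to 4 x FP32 for the actual arithmetic. */
    __m128 b32 = _mm_cvtph_ps(a16);
    __m128 sum = _mm_add_ps(b32, b32);   /* FP32 SIMD add on the unpacked values */

    float out[4];
    _mm_storeu_ps(out, sum);
    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}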
I'm curious, are you capable of reading what other people write? Also do you think people actually care about the irrelevant gibberish you post on a daily basis?
What you posted literally had nothing to do with what the other poster said, and before you say he was off topic: he wasn't. He gave his view on the topic at hand, which is what you do on a forum.
This topic is about the X1X CPU in relation to the PS3 and Xbox 360 CPUs, while cows went off topic with the typical game exclusivity debate. The only irrelevant gibberish comes from you and the poster who introduced the game exclusivity argument into a topic about the X1X CPU versus the PS3 and Xbox 360 CPUs.
I remained on topic while cows introduced the game exclusivity argument.
I understand that the finer points of a discussion are lost on you, but what he said was most certainly within the realm of this topic. Not all points of discussion have to specifically target the OP; they can also target the surrounding topics. If that point goes over your head, let me know and we can spend more time discussing it.
The only one the finer points of a discussion are lost on is you.
It's hypocritical BS when I get called out for an off-topic argument. When I stayed on topic and tried to steer the off-topic argument back to the TC's argument, I still got called out for it.
Battlefield 1 (patch 1.04), 64-player multiplayer, PS4 Pro vs PS4:
The X1X has a faster CPU, a faster GPU and a more efficient API when compared to the PS4 Pro. PC DirectX12 is less efficient when compared to the XBO's API.
Furthermore, the X1X's Battlefront 2 support for HDMI 2.1 VRR/FreeSync 2 reduces the need for a locked 60 Hz.
So in a thread about Xbox-X's CPU you use videos from a PS4?
Right....... I think you would find the Xbox X's CPU would cause CPU limitations even at 4K in multiplayer and physics-heavy scenes.... meaning my point still stands.
And FreeSync on X reduces the need for 60fps? Is that the new salty damage-control argument you're going to use moving forward when the X fails to hit 60fps?
The hypocritical BS comes from the cows.
Being on topic for a thread is one thing; being on topic for a particular discussion is another. You were on topic as far as this thread goes, but your post had nothing to do with the post you quoted. It seems you may require a bit of help to understand this concept.
Why not? If PC examples are being used against the X1X, why not use PS4 Pro examples, when the X1X is just a higher-grade PS4 Pro-like box?
We have EA DICE's claims on API efficiency between the XBO, PS4 and PC DirectX12.
PC's DirectX12 still has inferior API efficiency when compared to the game console APIs.
@scatteh316 said:
Right....... I think you would find the Xbox X's CPU would cause CPU limitations even at 4K in multiplayer and physics-heavy scenes.... meaning my point still stands.
Based on what reasoning? Is it based on PC examples?
@scatteh316 said:
And FreeSync on X reduces the need for 60fps? Is that the new salty damage-control argument you're going to use moving forward when the X fails to hit 60fps?
Lol....
At 4K Ultra, it's GPU-bound with R9 390X to GTX 1070 class GPUs.
PC already has both FreeSync (AMD) and G-Sync (NVIDIA) for situations that miss the 60 Hz target. What's good for the PC is good for the X1X.
Is X-Box a PC? Does it run on Windows?
And didn't you complain about cows going off topic in a thread about the Xbox X's CPU?
Is talking about PC APIs and using charts from PC GRAPHICS cards in a thread about CPUs not going off the topic of the thread? - Hypocrite...
The XBO runs on a Windows 10 variant with better API efficiency when compared to PC Windows 10/DirectX12.
Again, EA DICE's claimed API efficiency comparison.
For example, PC's Windows 10 hasn't moved the DirectX12-API-to-GPU-instruction-set translation stage into a microcode-based hardware translation engine, so the PC wastes CPU power on a CPU-based software translation engine.
@scatteh316 said:
Is talking about PC APIs and using charts from PC GRAPHICS cards in a thread about CPUs not going off the topic of the thread? - Hypocrite...
Graphics API overheads mostly involve the CPU side, stupid...
Decoding graphics API calls into the GPU's instruction set happens before the actual GPU processing, stupid.
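A toy sketch in C of why that overhead lands on the CPU (the per-call costs are hypothetical, not measured from any real driver): every draw call has to be validated and translated into GPU command packets before the GPU runs anything, so the cost scales with draw calls rather than pixels.

#include <stdio.h>

/* Toy model with hypothetical per-call costs (not a real driver):
 * submitting N draw calls costs CPU time per call, paid before the
 * GPU ever sees the work. */
int main(void) {
    const int draws_per_frame = 5000;
    const double us_per_draw_thick = 10.0;  /* hypothetical high-overhead API path */
    const double us_per_draw_thin  = 1.0;   /* hypothetical low-overhead, console-style path */

    printf("CPU submit cost, thick API path: %.1f ms/frame\n",
           draws_per_frame * us_per_draw_thick / 1000.0);
    printf("CPU submit cost, thin API path:  %.1f ms/frame\n",
           draws_per_frame * us_per_draw_thin / 1000.0);
    return 0;
}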
And classic Ronbot, moving the goalposts... Lmao....
Xbox is a PC and does run Windows
Windows 10 on PC still has inferior API subsystem efficiency.
I agree with ron tho.
You need at least 4-way SLI Titans to run Minecraft at parity with the Xbox One X.
The optimisation on PC is absolutely horrible to the point that PC gaming just seems completely useless in 2017.
Here's a demonstration:
[embedded video]
If you look closely, you can see the stutter in the middle of the video, which indicates PC is on its last legs and is probably at around original Xbox 360 levels of performance.
@ronvalencia: @Grey_Eyed_Elf: Looks like there are no bottlenecks and the GPU is what determines 4K performance; lesson learned. Though I still wouldn't gloat over an i5-750: frame timing was definitely an issue with newer games in that video, and that is something the Xbox ONE X will have to deal with (unless they wanna go PS4 and make 90% of their library remasters of old games and then slap "Next Gen" on it). A stock i5-2500k @ 3.3GHz would likely show the same results; it's still not that impressive. Plus the Xbox ONE X GPU isn't nearly powerful enough to render demanding games at native 4K @ 60fps anyway. Sorry, but there are currently far superior CPUs than the Xbox ONE X's CPU on the market; the Xbox ONE X CPU is a "shitty" CPU in comparison to the competition. Hate to break it to you, but the i5-2500k is a 5-year-old CPU and it doesn't stand a chance against what is available right now. The i5-2500k is only decent for gaming and simple tasks; I myself plan on replacing it with an i7 2600k/2700k/3770k.
1. As long as the PS4 Pro version has sufficient frame rates, it should be better on the X1X. PC's large-scale RTS exclusives would be a CPU bottleneck problem on the X1X.
2. The X1X was designed for Digital Foundry's XBO resolution-gate. The retail Xbox One X GPU already has a higher-grade version in the Xbox One X dev kit GPU, with 6.6 TFLOPS and 24 GB RAM. Chip yields weren't perfect, and the retail Xbox One X got the lesser 40 CU version.
The PC market can afford different yields from the same silicon design, e.g. both the GTX 1080 and GTX 1070 GPUs are the same GP104 chip.
3. The X1X was designed to target the XBO's 360 mm² chip size, hence targeting a similar BOM cost. CPUs like 14 nm Skylake/Kaby Lake are large chips, hence a $499 PC's large-CPU:small-GPU ratio problem.
4. MS managed to fit AMD's second-largest-scale GPU, i.e. a 44 CU based GCN design.
Stop clogging up the thread. I remained on topic while you engaged in a poster war.
we're all on topic, ron
Your post is not. Your constant poster attacks are not on topic.
all my posts have been on topic
You supported a poster who introduced the typical cow game-exclusives debate into this CPU-based topic. This topic has been derailed from the CPU discussion.