You're comparing an i7-4770K with the CPU in the PS4/Xbox One. You didn't show me 28 percent in CPU-bound games; I already replied to you on that and explained it as well. Those benchmarks are not CPU bound. The only CPU stress you can see in your benchmarks is overhead that you don't have on a console. An i3 already gets the same frames as an i7 in your benchies.
Your gearbox comparison shows that you don't understand what I'm trying to say. GDDR5 has higher latency, and higher-speed RAM always has higher latency. The higher speed doesn't make up for the latency when there isn't enough CPU cache to bridge the latency gap.
Explain it again. The test used 4-core Jaguar CPUs running in Windows, which means they are going to be a bigger bottleneck than 8 cores (6 for games) running in a console environment because a) the consoles have more CPU grunt, b) the consoles have lower CPU overhead from the OS, and c) the consoles' APIs have lower overhead than DX11 does on PC. Like I said, it would have been interesting to compare DX11 to Mantle benchmarks in the BF4 test to see how much of an improvement a lower-level API gives with such weak CPUs, but they did not. What do i3s have to do with the Athlon 5350 and the 5150?
Just an FYI: being CPU bound due to extra overhead and being CPU bound due to games taking up all the cycles result in the same situation. Either way, you are CPU bound.
It is not that I do not understand what you are saying; it is that what you are saying is WRONG. You are claiming higher latency in faster RAM, but that is not the case. It takes more cycles to perform an action, so the latency timings are higher, but that does not increase overall latency because the clock speed is also higher. Go find some Hynix documentation. By your reasoning, DDR3-800 with 6-6-6 timings would have lower latency than DDR3-2133 with 14-14-14 timings, and that is not the case: the ACT-to-ACT times are 52.5 ns and 46.09 ns respectively.
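To put numbers on that, here is a minimal sketch of the arithmetic (a simple CAS-only comparison I wrote to illustrate the point, not pulled from a Hynix datasheet like the tRC figures above): absolute latency is timing cycles multiplied by the clock period, so a higher cycle count at a higher clock can still come out to a lower latency in nanoseconds.

```python
# Sketch: convert DRAM timing cycles to absolute latency in nanoseconds.
# Higher timing numbers at a higher clock can still mean LOWER real latency.

def latency_ns(cycles, data_rate_mt_s):
    # DDR transfers twice per clock, so the clock runs at half the data rate.
    clock_mhz = data_rate_mt_s / 2
    return cycles * (1000.0 / clock_mhz)  # period in ns = 1000 / MHz

# DDR3-800 at CL6 vs DDR3-2133 at CL14 (CAS latency only, for illustration)
print(latency_ns(6, 800))    # 15.0 ns
print(latency_ns(14, 2133))  # ~13.1 ns -> the "slower" timings win in ns
```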
You keep saying that faster RAM has higher latency, and you keep being wrong, so why not just sit down, listen, and learn instead of spouting shit you obviously know naff all about? You also keep saying that the 5150 and 5350 were not CPU bound in those tests, yet if that were the case their performance would have matched the faster processors, so again you are wrong. You always seem to be wrong, and since you are not learning, I am done here.
To me, choppy framerates don't mean superior.
That is the problem, dude: the Xbox One has even choppier frames, less foliage, and fewer effects... lol
Not when driving it doesn't, and GTA is still about stealing cars, you know: 'GRAND THEFT AUTO'.
The X1 version would have been a lot better if it were at 900p; for the PS4 this wouldn't have mattered, as the CPU would still bottleneck.
Oh, just give it a rest, you little troll. The only time the PS4 drops more frames than the Xbox One is during free driving through heavy intersections, and the difference is at best 2 FPS. That was on a single run-through, though, so unless DF, or someone with both copies, wants to go through the effort of a proper test with multiple runs to account for run-to-run variance, that 2 FPS could be an outlier. During the scripted chase sequences, where traffic is the same, performance is very similar, with the PS4 ahead. Further, those busy intersections work fine when traversed at night, so there is obviously some subtle difference between the versions' lighting models, textures, or draw distance that causes a slight hiccup on PS4.
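For what it's worth, a proper test would look something like this: a minimal sketch (the FPS numbers are made up for illustration, not DF's data) of averaging several runs so a single-run 2 FPS gap can be judged against the run-to-run spread.

```python
import statistics

# Hypothetical per-run average FPS from repeated drives through the same
# intersection -- invented numbers purely to show the method.
ps4_runs = [28.1, 29.4, 27.8, 28.9, 28.5]
xb1_runs = [29.6, 28.8, 30.1, 29.0, 29.7]

for name, runs in (("PS4", ps4_runs), ("XB1", xb1_runs)):
    mean = statistics.mean(runs)
    sd = statistics.stdev(runs)
    print(f"{name}: {mean:.1f} FPS +/- {sd:.1f}")

# If the means differ by less than the spread across runs, a 2 FPS gap
# seen in one run could easily be an outlier rather than a real difference.
```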
When you get to shoot-outs, though, the PS4 steams along at a rock-solid 30 FPS while the Xbox One is stuck around 25 FPS; at least it is a consistently poor framerate, but it is still poor nonetheless. This is by far the biggest difference in the game, and these frame rates persist even in environments where there is more foliage on the PS4 version, so it is keeping a higher frame rate while processing a higher graphical load. That is to be expected when the resolution is a match across versions.