DICE Dev: Xbox One Isn’t As Powerful As PS4, DX12 Won’t Help Reduce Difference


#201 Zero_epyon
Member since 2004 • 20502 Posts

@commander said:
@Zero_epyon said:
@commander said:
@Zero_epyon said:
@commander said:

It's a 10 percent overclock. When CPU power is in short supply that's a lot; why do you think the PS4 struggles with framerates sometimes? It's because the CPU gets taxed. Of course you won't see it in Star Wars Battlefront, that's mostly GPU bound.

That extra 100 MHz on all 8 cores has been noticeable; look at Mad Max, Unity, GTA V, Call of Duty and The Witcher 3. The PS4 has a much better GPU than the Xbox One, but that extra power on the Xbox doesn't come out of thin air.

Besides, MS has freed up an extra core for games as well, which adds roughly an extra 14 percent, something else Sony can't do; they're just not as good at making operating systems. It's probably the reason why Mad Max is better on the Xbox One.

No, it's only about 8% and that's theoretical speed. It does almost nothing in real-world performance. The 7th core is shared with the OS and only a portion of it can be used by the game under certain circumstances, adding complexity to the code and uncertainty about how often the core's resources will be available. The fact that framerate issues were fixed for games running on the PS4, without a sacrifice to quality, through patches should tell you that it's not a CPU thing.

Assassin's Creed was patched and runs smoothly, and so does GTA V. Call of Duty runs at a native 1080p on PS4 while the Xbox One uses a lower resolution. The Witcher 3, you know that one. And in these games the differences were almost negligible, except for The Witcher 3. And now the PS4 outperforms the Xbox in The Witcher. Was the PS4 overclocked?

Lol, it's 9.375 percent to be exact. Theoretical what? Lmao, it's clocked 9.375 percent higher and there is nothing theoretical about it.

The 7th core is 50-80 percent of a core that's also clocked 9.375 percent higher. That's at least 18 percent more CPU resources in total and about 23 percent at most.

The fact that games like The Witcher 3 take 6 months to be patched only shows how much optimization is needed to counter that CPU advantage on the Xbox One, even with a much stronger GPU. Assassin's Creed took a month and a half. Mad Max still runs better on the X1. Call of Duty still runs better on the X1; ok, the resolution is lower, but not so much lower that it would give the X1 a framerate advantage when you look at the GPUs. It's because of the faster CPU.

Like I said, the PS4 will always have the GPU advantage but the X1 will always have the CPU advantage. Better brace yourself for the Assassin's Creed Syndicate face-off.
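For anyone checking the arithmetic being argued over here, a minimal sketch, assuming the commonly reported CPU clocks (1.6 GHz PS4, 1.75 GHz Xbox One) and six game-available cores plus 50-80% of a seventh; none of these figures come from the posts themselves:

```python
# Rough arithmetic behind the "9.375 percent" and "18-23 percent" figures.
# Assumed (commonly reported) values, not taken from this thread:
ps4_clock, xb1_clock = 1.6, 1.75          # GHz
ps4_cores = 6.0                           # cores available to games
xb1_cores_low, xb1_cores_high = 6.5, 6.8  # 6 cores + 50-80% of a 7th

clock_gain = xb1_clock / ps4_clock - 1    # ~0.09375
agg_low = (xb1_cores_low * xb1_clock) / (ps4_cores * ps4_clock) - 1    # ~0.185
agg_high = (xb1_cores_high * xb1_clock) / (ps4_cores * ps4_clock) - 1  # ~0.24

print(f"clock advantage: {clock_gain:.2%}")
print(f"aggregate CPU-resource advantage: {agg_low:.1%} to {agg_high:.1%}")
```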

It's still negligible. And yes, those numbers you see on tech specs are theoretical limits, not real-world limits. How do you not know this? The Witcher 3's patches have actually been see-sawing all throughout on both consoles. Some patches actually hurt the Xbox One and PS4, whereas other patches improved one over the other. The main difference is that the Xbox was able to maintain a marginal lead due to its dynamic resolution capability, not a CPU difference. The PS4 looks the same as it did before, but now has less pop-in, better texture streaming and a more stable framerate, and is still rendering 30% more pixels than the Xbox One.

Assassin's Creed's CPU performance was terrible across the board. Once they fixed their engine, the PS4 came back out on top. What do you expect from Syndicate?

Everyone who knows a thing or two about computer hardware knows that that difference is not negligible. How can you not know this?

Assassin's Creed on the PS4 is not better than on the Xbox One; maybe you should read Digital Foundry's latest words on that latest patch.

From Digital Foundry:

"Overall, the results now more closely resemble the Xbox One game when the engine isn't fully taxed..."

"Compared to the pre-patched game, drops in frame-rate aren't quite so heavy, and the PS4 version now more closely matches the Xbox One release during gameplay."

It resembles Xbox One performance, not beats it. And that's a month and a half after release, at the same resolution, while the PS4 has a stronger GPU. That's because of the CPU. Mad Max runs better on the X1 at the same resolution because of the CPU. GTA V has fewer framerate drops during high-speed driving through busy locations because of the CPU.

Call of Duty, The Witcher 3 and Far Cry 4 could or can have better framerates because of the dynamic resolution and/or lower resolution, but it's highly likely the CPU also has something to do with it; it's too obvious in games like GTA V, Assassin's Creed and Mad Max.

Some games may have been optimized for the PS4 and transferred a lot of CPU work to the GPU, like with The Witcher 3 and Assassin's Creed, but as you can see in Assassin's Creed it isn't enough to match the X1 performance.

So in CPU-intensive games the CPU gap will always be there, and that's exactly what you will (also) see in Syndicate.

You're still avoiding the obvious. Why are those games running the same or better now, after the patches, than before? Do the games overclock the CPU? Or is it a software thing? Why is The Witcher 3 now performing worse on the Xbox than with the previous patches? Did it get underclocked? Or is it a software thing? You also dodged the theoretical limit comment. Even if the Xbox has a ~10% jump, it doesn't translate to a ~10% increase in performance directly. Try overclocking your CPU by 10% and see how many frames you get extra. You will quickly realise that there was no difference. To see differences like you're claiming, you need to overclock by at least 1 GHz. The only reason these games run with a dynamic resolution is simply because the system can't keep up. Look at Transformers.
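Zero_epyon's point that a clock bump only shows up when the CPU is the limiter can be illustrated with a toy frame-time model. The frame times below are invented for illustration, not measurements of any of the games mentioned:

```python
# Toy model: a frame is ready only when both CPU and GPU work are done,
# so FPS is governed by whichever side takes longer.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms, gpu_ms = 20.0, 30.0          # hypothetical per-frame costs (ms)
faster_cpu_ms = cpu_ms / 1.10        # a 10% faster CPU finishes its work sooner

print(fps(cpu_ms, gpu_ms))           # 33.3 fps, GPU-bound
print(fps(faster_cpu_ms, gpu_ms))    # still 33.3 fps: the GPU is the limiter
print(fps(35.0, 30.0), fps(35.0 / 1.10, 30.0))  # CPU-bound case: 28.6 -> 31.4 fps
```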


#202 ToScA-
Member since 2006 • 5783 Posts

@AzatiS:

The Emotion Engine? The PS2 delivering Toy Story graphics? You mustn't have been around back then when Sony and Sega went at each other. Regardless of who started what, all fanboys are equally worthless.


#203 commander
Member since 2010 • 16217 Posts

@Zero_epyon said:
@commander said:

Everyone who knows a thing or two about computer hardware knows that that difference is not negligible. How can you not know this?

Assassin's Creed on the PS4 is not better than on the Xbox One; maybe you should read Digital Foundry's latest words on that latest patch.

From Digital Foundry:

"Overall, the results now more closely resemble the Xbox One game when the engine isn't fully taxed..."

"Compared to the pre-patched game, drops in frame-rate aren't quite so heavy, and the PS4 version now more closely matches the Xbox One release during gameplay."

It resembles Xbox One performance, not beats it. And that's a month and a half after release, at the same resolution, while the PS4 has a stronger GPU. That's because of the CPU. Mad Max runs better on the X1 at the same resolution because of the CPU. GTA V has fewer framerate drops during high-speed driving through busy locations because of the CPU.

Call of Duty, The Witcher 3 and Far Cry 4 could or can have better framerates because of the dynamic resolution and/or lower resolution, but it's highly likely the CPU also has something to do with it; it's too obvious in games like GTA V, Assassin's Creed and Mad Max.

Some games may have been optimized for the PS4 and transferred a lot of CPU work to the GPU, like with The Witcher 3 and Assassin's Creed, but as you can see in Assassin's Creed it isn't enough to match the X1 performance.

So in CPU-intensive games the CPU gap will always be there, and that's exactly what you will (also) see in Syndicate.

You're still avoiding the obvious. Why are those games running the same or better now, after the patches, than before? Do the games overclock the CPU? Or is it a software thing? Why is The Witcher 3 now performing worse on the Xbox than with the previous patches? Did it get underclocked? Or is it a software thing? You also dodged the theoretical limit comment. Even if the Xbox has a ~10% jump, it doesn't translate to a ~10% increase in performance directly. Try overclocking your CPU by 10% and see how many frames you get extra. You will quickly realise that there was no difference. To see differences like you're claiming, you need to overclock by at least 1 GHz. The only reason these games run with a dynamic resolution is simply because the system can't keep up. Look at Transformers.

The CPU in both consoles is so weak that with CPU-intensive games you get CPU bottlenecks, so 10 percent extra performance will translate into real-world numbers. The reason the PS4 gets better with patches is because of GPGPU tools and general optimization.

And yes, some games run with dynamic resolution because either the ESRAM isn't used properly and/or the game is very GPU intensive.
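For context, the "dynamic resolution" both posters keep bringing up is essentially a feedback loop: when recent frames blow the frame-time budget the engine drops the render resolution, and it recovers when there is headroom. A minimal sketch of that idea, with invented thresholds and step sizes:

```python
# Minimal dynamic-resolution controller sketch (illustrative values only).
TARGET_MS = 33.3          # 30 fps budget

def adjust_resolution(last_frame_ms, scale, lo=0.75, hi=1.0, step=0.05):
    """Drop render scale when over budget, recover it when under budget."""
    if last_frame_ms > TARGET_MS and scale > lo:
        return round(scale - step, 2)
    if last_frame_ms < TARGET_MS * 0.9 and scale < hi:
        return round(scale + step, 2)
    return scale

scale = 1.0               # fraction of native resolution (1.0 = full 1080p)
for frame_ms in [30, 36, 38, 35, 31, 28, 27]:   # made-up GPU frame times
    scale = adjust_resolution(frame_ms, scale)
    print(f"{frame_ms} ms -> render scale {scale:.2f}")
```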


#204 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

And still people will be talking about closing the gap the entire generation, so get used to it.


#205  Edited By StormyJoe
Member since 2011 • 7806 Posts

@AzatiS: The reason the PS3 was ridiculed last gen was that while Sony was boasting about how powerful it was, it did not have superior multiplats.


#206  Edited By Daious
Member since 2013 • 2315 Posts
@StormyJoe said:

@AzatiS: the reason the PS3 was ridiculed last gen was that while Sony was boasting how powerful it was, it did not have superior multi plats.

Sony dropped the ball. The issue was that the PS3 was released too early. It wasn't even supposed to have an Nvidia GPU. They weren't done finalizing the PS3 and rush-launched it. The sole reason it was rush-launched was that Sony's priority was winning the format war (beating out the HD DVD format).

On the flip side, MS also rush-launched to get their console out well before Sony's. We saw how that panned out with the RROD design flaw.



#208 Zero_epyon
Member since 2004 • 20502 Posts

@commander said:
@Zero_epyon said:
@commander said:

Everyone who knows a thing or two about computer hardware knows that that difference is not negligible. How can you not know this?

Assassin's Creed on the PS4 is not better than on the Xbox One; maybe you should read Digital Foundry's latest words on that latest patch.

From Digital Foundry:

"Overall, the results now more closely resemble the Xbox One game when the engine isn't fully taxed..."

"Compared to the pre-patched game, drops in frame-rate aren't quite so heavy, and the PS4 version now more closely matches the Xbox One release during gameplay."

It resembles Xbox One performance, not beats it. And that's a month and a half after release, at the same resolution, while the PS4 has a stronger GPU. That's because of the CPU. Mad Max runs better on the X1 at the same resolution because of the CPU. GTA V has fewer framerate drops during high-speed driving through busy locations because of the CPU.

Call of Duty, The Witcher 3 and Far Cry 4 could or can have better framerates because of the dynamic resolution and/or lower resolution, but it's highly likely the CPU also has something to do with it; it's too obvious in games like GTA V, Assassin's Creed and Mad Max.

Some games may have been optimized for the PS4 and transferred a lot of CPU work to the GPU, like with The Witcher 3 and Assassin's Creed, but as you can see in Assassin's Creed it isn't enough to match the X1 performance.

So in CPU-intensive games the CPU gap will always be there, and that's exactly what you will (also) see in Syndicate.

You're still avoiding the obvious. Why are those games running the same or better now, after the patches, than before? Do the games overclock the CPU? Or is it a software thing? Why is The Witcher 3 now performing worse on the Xbox than with the previous patches? Did it get underclocked? Or is it a software thing? You also dodged the theoretical limit comment. Even if the Xbox has a ~10% jump, it doesn't translate to a ~10% increase in performance directly. Try overclocking your CPU by 10% and see how many frames you get extra. You will quickly realise that there was no difference. To see differences like you're claiming, you need to overclock by at least 1 GHz. The only reason these games run with a dynamic resolution is simply because the system can't keep up. Look at Transformers.

The CPU in both consoles is so weak that with CPU-intensive games you get CPU bottlenecks, so 10 percent extra performance will translate into real-world numbers. The reason the PS4 gets better with patches is because of GPGPU tools and general optimization.

And yes, some games run with dynamic resolution because either the ESRAM isn't used properly and/or the game is very GPU intensive.

Lol, no, that's not how it works. If that's the logic you're basing it on, then you have no idea how computer hardware works. 10% of crap is still crap.


#209 deactivated-59d151f079814
Member since 2003 • 47239 Posts

If you're concerned about the trivial graphical/performance difference between the two systems, you might as well go PC if you're in it for the graphics. After all, the PC makes any kind of difference between the two consoles look downright comedic. Unless you're not in it for the graphics, so why in the hell is this important to anyone?


#210 Martin_G_N
Member since 2006 • 2124 Posts

We know this. The difference between the X1 and the PS3 is that the jump in power isn't as big as it should have been going from last gen to this one. While Sony didn't take any risks with the PS4 and set out to make an affordable console while still making money on each one sold, compared to the previous PlayStations, MS screwed up with the X1, making it even less powerful because of the focus on Kinect.

The other difference is the Cell in the PS3. Yeah yeah, laugh, but it was real, compared to the Cloud and DX12. The PS3 actually had the processing power available within its difficult hardware; the X1 doesn't.


#211  Edited By commander
Member since 2010 • 16217 Posts

@Zero_epyon said:
@commander said:
@Zero_epyon said:

You're still avoiding the obvious. Why are those games running the same or better now after the patches than before? Do the games overclock the cpu? Or is it a software thing? Why is witcher 3 now performing worse on the Xbox than the previous patches? Did it get underclocked? Or is it a software thing? You also dodged the theoretical limit comment. Even if the Xbox has a ~10% jump, it doesn't translate to a ~10% increase in performance directly. Try overclocking you cpu by 10% and see how many frames you'll get extra. You will quickly realise that there was no difference. To see differences like you're claiming, you need to overclock by at least 1 GHz. The only reason these games run with a dynamic resolution is simply because the system can't keep up. Look at Transformers.

The cpu in both consoles is so weak that with cpu intensive games you get cpu bottlenecks, so 10 extra performance will translate in real world numbers. The reason why the ps4 gets better with patches is because of gpgpu tools and general optimization.

and yes some games run with dynamic resolution because either the esram isn't used properly and/or the game is very gpu intensive

lol no that's not how it works. If that's the logic you're basing it on, then you have no idea how computer hardware works. 10% of crap is still crap.

That may be true in your fanboy dreams but not in the real world.

The FX-4300 is roughly 6 percent faster than the FX-4100 and the difference in frames is a lot more than 6 percent. You could say the same for the FX-6100 and FX-6300. Even the FX-8150 and FX-8350 show the same trend. Enough said.


#212 Zero_epyon
Member since 2004 • 20502 Posts

@commander said:
@Zero_epyon said:
@commander said:
@Zero_epyon said:

You're still avoiding the obvious. Why are those games running the same or better now after the patches than before? Do the games overclock the cpu? Or is it a software thing? Why is witcher 3 now performing worse on the Xbox than the previous patches? Did it get underclocked? Or is it a software thing? You also dodged the theoretical limit comment. Even if the Xbox has a ~10% jump, it doesn't translate to a ~10% increase in performance directly. Try overclocking you cpu by 10% and see how many frames you'll get extra. You will quickly realise that there was no difference. To see differences like you're claiming, you need to overclock by at least 1 GHz. The only reason these games run with a dynamic resolution is simply because the system can't keep up. Look at Transformers.

The cpu in both consoles is so weak that with cpu intensive games you get cpu bottlenecks, so 10 extra performance will translate in real world numbers. The reason why the ps4 gets better with patches is because of gpgpu tools and general optimization.

and yes some games run with dynamic resolution because either the esram isn't used properly and/or the game is very gpu intensive

lol no that's not how it works. If that's the logic you're basing it on, then you have no idea how computer hardware works. 10% of crap is still crap.

That may be true in your fanboy dreams but not in the real world.

the fx 4300 is roughly 6 percent faster than the fx 4100 and the difference in frames is a lot more than 6 percent. You could say the same for the fx 6100 and fx 6300. Even the fx 8150 and fx 8350 shows the same trend, enough said.

How am I a fanboy? Aren't you the one celebrating a 3-4 fps advantage and dismissing a 20+ fps disadvantage in PS4 vs Xbox One comparisons?

Furthermore, you are looking at a chart of different processors with differing, performance-altering architectures, not a simple overclocking of the same chip. For example, the FX-4100 sucks at multithreaded applications while the FX-4300 does not. It's an apples-to-oranges comparison once you change models. Show me a 4300 vs a 4300 with a 10% higher clock. Otherwise, quit trying to sound like you know what you're talking about.


#213 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts

@commander said:
@Zero_epyon said:
@commander said:
@Zero_epyon said:

You're still avoiding the obvious. Why are those games running the same or better now after the patches than before? Do the games overclock the cpu? Or is it a software thing? Why is witcher 3 now performing worse on the Xbox than the previous patches? Did it get underclocked? Or is it a software thing? You also dodged the theoretical limit comment. Even if the Xbox has a ~10% jump, it doesn't translate to a ~10% increase in performance directly. Try overclocking you cpu by 10% and see how many frames you'll get extra. You will quickly realise that there was no difference. To see differences like you're claiming, you need to overclock by at least 1 GHz. The only reason these games run with a dynamic resolution is simply because the system can't keep up. Look at Transformers.

The cpu in both consoles is so weak that with cpu intensive games you get cpu bottlenecks, so 10 extra performance will translate in real world numbers. The reason why the ps4 gets better with patches is because of gpgpu tools and general optimization.

and yes some games run with dynamic resolution because either the esram isn't used properly and/or the game is very gpu intensive

lol no that's not how it works. If that's the logic you're basing it on, then you have no idea how computer hardware works. 10% of crap is still crap.

That may be true in your fanboy dreams but not in the real world.

the fx 4300 is roughly 6 percent faster than the fx 4100 and the difference in frames is a lot more than 6 percent. You could say the same for the fx 6100 and fx 6300. Even the fx 8150 and fx 8350 shows the same trend, enough said.

Those Piledriver CPUs are more than 6% faster than their Bulldozer counterparts. Remember that it isn't just the architecture improvements; the Piledrivers also run at higher clock rates. For example, the 8150 is 3.6 GHz while the 8350 is 4.0 GHz. If you look here, while the difference varies quite a bit depending on the program, the 8350 is usually 10-15% faster than the 8150, which isn't too far off from the 18% difference in the Assassin's Creed benchmark you posted.
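ferret-gamer's point, in numbers: Piledriver gains from both higher clocks and higher IPC, so the 8150-vs-8350 gap cannot be read as a pure clock difference. A rough check using the published base clocks; the ~7% IPC uplift is an assumed figure for illustration only:

```python
# Performance scales roughly with clock * IPC; Piledriver vs Bulldozer changes both.
fx8150_clock, fx8350_clock = 3.6, 4.0      # GHz, published base clocks
ipc_uplift = 1.07                          # assumed Piledriver IPC gain (illustrative)

clock_only = fx8350_clock / fx8150_clock - 1                    # ~11% from clocks alone
clock_plus_ipc = fx8350_clock * ipc_uplift / fx8150_clock - 1   # ~19% combined

print(f"clock alone: {clock_only:.0%}, clock + IPC: {clock_plus_ipc:.0%}")
```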


#214 Zero_epyon
Member since 2004 • 20502 Posts

@commander:

Here I went ahead and did it for you so you can't dodge it.

Example: the 8350 at stock vs the 8350 clocked to 4.8 GHz is overclocked by 16% but only puts out 7% extra frames on average. The Core i5-2500K at stock, overclocked to a massive 5 GHz, is overclocked by 34% but only gives about 15% more frames on average. So like I said, overclocking doesn't translate to performance directly. Now imagine a 1.6 GHz CPU being overclocked by just under 10%. You seriously expect a massive difference? More like a frame or two, if that.
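A quick check of those ratios. The baselines below are the chips' turbo clocks (4.2 GHz FX-8350, 3.7 GHz i5-2500K), which is roughly where the ~16% and ~34% figures come from; the FPS gains are the percentages quoted in the post, not new measurements:

```python
# "Overclock % vs FPS %" scaling for the two chips discussed.
cases = [
    ("FX-8350 4.2 -> 4.8 GHz", 4.2, 4.8, 0.07),   # ~7% more fps (quoted above)
    ("i5-2500K 3.7 -> 5.0 GHz", 3.7, 5.0, 0.15),  # ~15% more fps (quoted above)
]
for name, base, oc, fps_gain in cases:
    clock_gain = oc / base - 1
    print(f"{name}: clock +{clock_gain:.0%}, fps +{fps_gain:.0%}, "
          f"scaling efficiency {fps_gain / clock_gain:.0%}")
```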


#215 LJS9502_basic
Member since 2003 • 180203 Posts

@Chutebox said:

Not really surprising to most...

Correct.


#216  Edited By commander
Member since 2010 • 16217 Posts

@ferret-gamer said:

Those Piledriver CPUs are more than 6% faster than their Bulldozer counterparts. Remember that it isn't just the architecture improvements; the Piledrivers also run at higher clock rates. For example, the 8150 is 3.6 GHz while the 8350 is 4.0 GHz. If you look here, while the difference varies quite a bit depending on the program, the 8350 is usually 10-15% faster than the 8150, which isn't too far off from the 18% difference in the Assassin's Creed benchmark you posted.

@Zero_epyon said:

How am I a fanboy? Aren't you the one celebrating a 3-4 fps advantage and dismissing a 20+ fps disadvantage in PS4 vs Xbox One comparisons?

Furthermore, you are looking at a chart of different processors with differing, performance-altering architectures, not a simple overclocking of the same chip. For example, the FX-4100 sucks at multithreaded applications while the FX-4300 does not. It's an apples-to-oranges comparison once you change models. Show me a 4300 vs a 4300 with a 10% higher clock. Otherwise, quit trying to sound like you know what you're talking about.

Either way, the trend stays the same, even when you consider that it's a different architecture. Like ferret-gamer says, the performance differences are comparable to the framerate differences. So when you have an absolute CPU bottleneck, the performance difference translates directly into framerates. The performance differences are quite obvious in his link, so your architecture argument doesn't fly. It's true that it's more than 6 percent between the FX series, but the framerate difference is more as well.

As for your Skyrim comparison:

@Zero_epyon said:

@commander:

Here I went ahead and did it for you so you can't dodge it.

Example: the 8350 at stock vs the 8350 clocked to 4.8 GHz is overclocked by 16% but only puts out 7% extra frames on average. The Core i5-2500K at stock, overclocked to a massive 5 GHz, is overclocked by 34% but only gives about 15% more frames on average. So like I said, overclocking doesn't translate to performance directly. Now imagine a 1.6 GHz CPU being overclocked by just under 10%. You seriously expect a massive difference? More like a frame or two, if that.

Skyrim is another game; it doesn't use more than two cores, so that 16 percent overclock doesn't really do much for 75 percent of the CPU, yet it still manages to output 7 percent more frames.

It's quite obvious in the following bench:

The FX-4100 simply matches the FX-8150; it's the same architecture and the same clock speeds. The dual-core E8400 simply murders the FX-8150, and that with a quarter of the cores. I know it's overclocked, but I'm sure you get the point. Skyrim is optimized for two cores, so the whole "16 percent overclock and only 7 percent extra performance" claim is simply incorrect.


#217  Edited By AzatiS
Member since 2004 • 14969 Posts

@ToScA- said:

@AzatiS:

The Emotion engine? PS2 to deliver Toy Story graphics? You mustn't have been around back then when Sony and Sega went at each other. Regardless of who started what, all fanboys are equally worthless

Don't start this shit with me, I know all those generations since day 1... Sony went at who with the PS2? The DC, which was discontinued after a few months? Are you serious now or are you talking out of your butt?

Let me elaborate for you, from all the generations I personally remember pretty well:

SNES/GENESIS = SNES (sheep went crazy about how arcade ports looked better and how the SNES had more colors, etc.)

PS/SATURN/N64 = N64 (sheep went crazy about how anti-pixelated FPS games (all games, actually) like DOOM looked vs the messy PS games)

PS2/DC/XBOX/GC = XBOX (lems went crazy about its epic power and graphics etc. vs the PS2... it was its selling point vs the PS2)

PS3/X360/Wii = X360 (lems all about how crap the PS3 is, how lame the Cell is, laughing in every other post about something PS3 graphics related)

PS4/X1 = PS4 (cows, after so long being poked... talking shit after 25 years?!! = oh how bad the cows are when blablabla)

LoL to all the hypocrites and, last but not least, whoever talks out of his ass without any logic just to win an argument.


#218  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@commander: Wow you really don't have a clue huh? I'm not repeating myself.

What did that overclock on a 4.0 GHz chip take? 800 MHz. You needed 800 MHz to get 7% more frames on average. And you're telling me that ~150 MHz on top of 1.6 GHz is going to provide similar jumps?


#219  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@commander: Last post about this to you. Look at Total War, a CPU-intensive game, since you moved the goalposts a bit.

The 8350 vs the 16% overclock only does about 15% more frames on average. Even though this is close, it still doesn't match 100%. And even if it did match, it still took an 800 MHz jump to achieve it. There's a 23% average frame boost for the Core i5-2500K, but it took a 34% overclock, which means it needed an extra 1.7 GHz to get there. What do you think 150 MHz extra is going to do for an already slow processor?


#220  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@Zero_epyon said:

@commander: Last post about this to you. Look at Total War, a CPU-intensive game, since you moved the goalposts a bit.

The 8350 vs the 16% overclock only does about 15% more frames on average. Even though this is close, it still doesn't match 100%. And even if it did match, it still took an 800 MHz jump to achieve it. There's a 23% average frame boost for the Core i5-2500K, but it took a 34% overclock, which means it needed an extra 1.7 GHz to get there. What do you think 150 MHz extra is going to do for an already slow processor?

I hope you know Shogun 2 was also single threaded for the longest time; the difference between a dual core and a quad from the same architecture meant nothing. Skyrim, when it was first released, only used two threads on PC, and later on, after a few patches, they upped the usage to four threads.

Phenom II X4 vs X6 with a 300 MHz difference, 3 GHz vs 3.3 GHz, which is a 10% increase: with Shogun 2 that showed a single frame of difference, with it being single threaded.

One thing you're also overlooking is that DirectX overhead on PC does not reflect what a slight overclock can do in a low-overhead closed system. With comparable hardware, devs could get nearly 2x the performance vs a PC using DX9. A nearly 10% clock rate increase won't raise the framerate by a whole lot, maybe a couple of frames at best. Now, with the addition of that 7th core, devs are using that extra processor core to smooth out performance and prevent an overtaxed CPU.

A 10% increase on a slow-ass CPU can mean the difference in a resource-limited system.
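04dcarraher's "a couple frames at best" estimate is easy to bound: even in the best case for the faster clock, a frame that is entirely CPU-bound, a ~9.4% clock advantage can only shorten the CPU portion by that same fraction. A sketch with illustrative framerates:

```python
# Upper bound on what a ~9.4% CPU clock bump buys, assuming a fully CPU-bound frame
# (the best case for the faster clock; illustrative numbers, not measurements).
clock_gain = 1.75 / 1.6 - 1            # ~9.4%
for base_fps in (20, 25, 30):
    cpu_ms = 1000.0 / base_fps         # whole frame spent on CPU work (worst case)
    new_fps = 1000.0 / (cpu_ms / (1 + clock_gain))
    print(f"{base_fps} fps -> {new_fps:.1f} fps (+{new_fps - base_fps:.1f})")
```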


#221  Edited By commander
Member since 2010 • 16217 Posts

@Zero_epyon said:

@commander: Wow you really don't have a clue huh? I'm not repeating myself.

What's 16% of 4.0 GHz? 800 MHz. You needed 800 MHz to get 7% more frames on average. And you're telling me that ~150 MHz more than 1.6 is going to provide similar jumps?

16 percent in a game where only 1/4 of the CPU is used is not a good comparison.

@Zero_epyon said:

@commander: Last post about this to you. Look at Total War, A cpu intensive game since you moved the goal post a bit.

the 8350 vs the 16% overclock only does about 15% more frames on average. Even though this is close, it still doesn't match 100%. And even if it did match, it still took an 800 MHz jump to achieve it. 23% average frame boost for the core i5-2500K but it took a 34% overclock which means it needed an extra 1.7 GHz to get there. What do you think 150 MHz extra is going to do for an already slow processor?

It's the same problem as with Shogun: the game only uses four cores, yet you use an 8-core CPU as an example. You want to prove me wrong, use an Assassin's Creed Unity benchmark, because that's the game we're talking about.

Or at least a game that taxes the CPU fully; you're trying to prove to me that an increase in CPU power in an absolute CPU bottleneck scenario doesn't translate directly into framerates.

That's not possible when you have a CPU that's not fully stressed (without disabling any cores).

Besides, there's also a difference between a CPU-intensive game and an absolute CPU bottleneck; the former will induce the latter faster, but it's not the same. I mimicked it with my AC Unity benchmark because I used such low-tier CPUs combined with an overkill of GPU power. Then the demand for CPU power becomes so high that it mimics an absolute CPU bottleneck. That's not the case here either.


#222 04dcarraher
Member since 2004 • 23858 Posts

@commander:

Shogun 2 only used one thread for a long time, especially when all those benchmarks were done, back when it came out.


#223 Zero_epyon
Member since 2004 • 20502 Posts

@commander: Confirmed that you aren't even reading my posts, or are selectively reading them. In both charts I cite the 8350 AND the Core i5-2500K (a 4-core processor). So the game uses all 4 cores, according to you, on a Core i5 and it still needs a whopping 1.7 GHz increase in clock speed just to get a 23% boost in frames. My point, and I've proven this over and over, is that CPU overclocking percentages don't translate directly to additional fps. I've already proved you wrong on two counts. One, a 10% CPU boost does not mean a 10% increase in fps, and two, you can't compare different CPUs just based on their clocks. Every CPU is different and handles certain tasks differently even though they may have the same or close to the same clock speeds. This is basic.


#224  Edited By Zero_epyon
Member since 2004 • 20502 Posts

@04dcarraher said:
@Zero_epyon said:

@commander: Last post about this to you. Look at Total War, a CPU-intensive game, since you moved the goalposts a bit.

The 8350 vs the 16% overclock only does about 15% more frames on average. Even though this is close, it still doesn't match 100%. And even if it did match, it still took an 800 MHz jump to achieve it. There's a 23% average frame boost for the Core i5-2500K, but it took a 34% overclock, which means it needed an extra 1.7 GHz to get there. What do you think 150 MHz extra is going to do for an already slow processor?

I hope you know Shogun 2 was also single threaded; the difference between a dual core and a quad from the same architecture meant nothing. Skyrim, when it was first released, only used two threads on PC, and later on, after a few patches, they upped the usage to four threads.

Phenom II X4 vs X6 with a 300 MHz difference, 3 GHz vs 3.3 GHz, which is a 10% increase: with Shogun 2 that showed a single frame of difference, with it being single threaded.

One thing you're also overlooking is that DirectX overhead on PC does not reflect what a slight overclock can do in a low-overhead closed system. With comparable hardware, devs could get nearly 2x the performance vs a PC using DX9. A nearly 10% clock rate increase won't raise the framerate by a whole lot, maybe a couple of frames at best. Now, with the addition of that 7th core, devs are using that extra processor core to smooth out performance and prevent an overtaxed CPU.

A 10% increase on a slow-ass CPU can mean the difference in a resource-limited system.

https://software.intel.com/en-us/articles/shogun2-total-war-case-study

Multicore from the beginning.

So the PS4, with even lower overhead than DX12, because it's closer to the metal than DX12, will not see such an issue and will mitigate any differences. Unless you're going to tell me otherwise?


#225  Edited By commander
Member since 2010 • 16217 Posts

@Zero_epyon said:

@commander: Confirmed that you aren't even reading my posts, or are selectively reading them. In both charts I cite the 8350 AND the Core i5-2500K (a 4-core processor). So the game uses all 4 cores, according to you, on a Core i5 and it still needs a whopping 1.7 GHz increase in clock speed just to get a 23% boost in frames. My point, and I've proven this over and over, is that CPU overclocking percentages don't translate directly to additional fps. I've already proved you wrong on two counts. One, a 10% CPU boost does not mean a 10% increase in fps, and two, you can't compare different CPUs just based on their clocks. Every CPU is different and handles certain tasks differently even though they may have the same or close to the same clock speeds. This is basic.

Your i5 comparison isn't correct because there is no absolute CPU bottleneck. The game is CPU intensive, ok, but the GPU isn't waiting for the CPU to feed it data, hence the word bottleneck. In an absolute CPU bottleneck scenario the frame output gets nerfed directly because of the CPU; that doesn't happen here. Increasing GPU power would also increase the frames.

Whereas if you were to add another GTX 980 to that FX-4100 in my AC Unity benchmark, the frames would hardly get better, because the CPU is pretty much bottlenecking everything.

And it's (at least partly) true what 04dcarraher says as well: Shogun isn't even optimized for four cores. I thought it was at first, but looking at other benches it's even less. I don't know about one core, but 2 cores and 4 cores don't seem to show much difference (the i3 530 simply matches the Q9550).


#226  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@Zero_epyon:

Wrong. In 2011 when it released, only one core was used. In 2012+, 1-3 threads were used, but 90% of the load was still on the first thread.

In-game benchmark using a quad core:

Summary:

4 cores: load time 74 seconds, benchmark 43.25 fps

2 cores: load time 74 seconds, benchmark 43.25 fps

1 core: load time 78 seconds, benchmark 43.0 fps
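The reason core count barely moves those Shogun 2 numbers is the classic Amdahl's-law situation: if ~90% of the CPU load sits on one thread, extra cores only help the remaining fraction, while a clock bump speeds up everything. A small illustration; the 90% figure is the one quoted above, the rest is hypothetical:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallelizable fraction.
def speedup(p, n_cores):
    return 1.0 / ((1.0 - p) + p / n_cores)

p = 0.10   # only ~10% of the CPU work spreads across extra threads
for cores in (1, 2, 4):
    print(f"{cores} core(s): {speedup(p, cores):.2f}x")   # 1.00x, 1.05x, 1.08x

# A 10% clock increase, by contrast, speeds up the whole workload:
print("10% higher clock: 1.10x")
```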


#227  Edited By AM-Gamer
Member since 2012 • 8116 Posts

@kingtito: No, I said PS+ offered better value than XBL, because it did. I got lots of games with solid discounts, and MS apparently thought it was a threat, as they started doing the same.

Also, if you don't own an XB1, why are you talking about the reliability of the service? It went down 2 days ago, for ****'s sake. It also has obnoxious sign-in bugs and a sluggish OS. That's not even talking about the inferior hardware.


#228 FoxbatAlpha
Member since 2009 • 10669 Posts

@magmadragoonx4: nope. Just did some more research. The Quad is DX6. Sad really.


#229 blackace
Member since 2002 • 23576 Posts

@imt558 said:

Well :

http://gamingbolt.com/dice-dev-xbox-one-isnt-as-powerful-as-ps4-dx12-wont-help-reduce-difference-talks-development

This is pretty funny coming from a developer who can't seem to get an FPS to run at 1080p on the PS4 even though other developers have done it with no problems. DX12 will make some difference. Someone should have asked DICE if they will ever be able to make a console game that runs smoothly at 1080p 60fps.


#230 StormyJoe
Member since 2011 • 7806 Posts

@blackace: Especially since the developer of Caffeine said DX12 improves the XB1's performance, but some people just can't accept it.

http://www.gamezone.com/news/dx12-adding-20-performance-increase-on-pc-even-more-on-xbox-one-3423420


#231 Shewgenja
Member since 2009 • 21456 Posts

I like how power and game performance aren't a big deal until you shut the door on some form of XBone secret sauce. Then brace yourselves for a thread of 6 pages or more.

We're two years into this gen, people. Once upon a time, we'd be seeing this as the halfway point of a gen. No one is buying a secret hidden leap in the XBone's performance except the upper tier of deluded lemmings at this point. Just stop. It's not mysterious or even fun anymore. It wouldn't even matter if DX12 put the Bone on parity with the PS4 at this point, because developers and publishers are not going to double down on it as a result.


#232  Edited By silversix_
Member since 2010 • 26347 Posts

I love seeing lems quoting each other and agreeing with each other. Very cute.


#233  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:
@Zero_epyon said:
@commander said:
@Zero_epyon said:

You're still avoiding the obvious. Why are those games running the same or better now after the patches than before? Do the games overclock the cpu? Or is it a software thing? Why is witcher 3 now performing worse on the Xbox than the previous patches? Did it get underclocked? Or is it a software thing? You also dodged the theoretical limit comment. Even if the Xbox has a ~10% jump, it doesn't translate to a ~10% increase in performance directly. Try overclocking you cpu by 10% and see how many frames you'll get extra. You will quickly realise that there was no difference. To see differences like you're claiming, you need to overclock by at least 1 GHz. The only reason these games run with a dynamic resolution is simply because the system can't keep up. Look at Transformers.

The cpu in both consoles is so weak that with cpu intensive games you get cpu bottlenecks, so 10 extra performance will translate in real world numbers. The reason why the ps4 gets better with patches is because of gpgpu tools and general optimization.

and yes some games run with dynamic resolution because either the esram isn't used properly and/or the game is very gpu intensive

lol no that's not how it works. If that's the logic you're basing it on, then you have no idea how computer hardware works. 10% of crap is still crap.

That may be true in your fanboy dreams but not in the real world.

the fx 4300 is roughly 6 percent faster than the fx 4100 and the difference in frames is a lot more than 6 percent. You could say the same for the fx 6100 and fx 6300. Even the fx 8150 and fx 8350 shows the same trend, enough said.

You're still doing this argument?

The 4100/6100/8150 is not a downclocked 4300/6300/8350.

The 4300/6300/8350 is not an overclocked 4100/6100/8150.

They are made with different architectural iterations: Zambezi versus Vishera. Vishera has a higher IPC than Zambezi. Vishera is Piledriver.

The 4100/6100/8150 is Zambezi and the 4300/6300/8350 is Vishera. Jesus, your arguments are terrible. Actually do research before you post.

But thanks for the lols, though. I see you are still sticking with the classic unoptimized launch mess that was ACU.


#234  Edited By Daious
Member since 2013 • 2315 Posts

@blackace said:
@imt558 said:

Well :

http://gamingbolt.com/dice-dev-xbox-one-isnt-as-powerful-as-ps4-dx12-wont-help-reduce-difference-talks-development

Someone should have asked DICE if they will ever be able to make a console game that runs smoothly at 1080P 60fps.

Is it DICE's fault that this generation's consoles are weak?

I am still waiting on the HUMA information you promised in 2013/2014/2015.


#235 commander
Member since 2010 • 16217 Posts

@daious: It doesn't matter; even if it's a 10 percent difference the argument stays the same, although you probably don't know what the argument is, otherwise you wouldn't be posting facts that are completely beside the point.

Read the thread before you reply and you won't look like... you know what, lmao.


#236  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:

@daious : it doesn't matter , even if it's 10 percent difference the argument stays the same, allthough you probably don't know what the argument is otherwise you wouldn't be posting facts that are completely besides teh point.

read the thread before you reply and you won't look like ... you know what lmao

You're consistently posting things (from the past and just now) that show your glaring lack of knowledge on the subject. You also posted the exact same thing before, with the exact same incorrect 4100/4300 argument. I corrected you then and you still didn't learn.

From the gist of it, the Zero guy doesn't seem to know many technical details either, but he didn't just repeat the same wrong information over and over again in multiple threads.

Edit: grammar


#237  Edited By commander
Member since 2010 • 16217 Posts

@daious said:
@commander said:

@daious : it doesn't matter , even if it's 10 percent difference the argument stays the same, allthough you probably don't know what the argument is otherwise you wouldn't be posting facts that are completely besides teh point.

read the thread before you reply and you won't look like ... you know what lmao

Your consistently posting (from the past and just now) things that show your glaring lack of knowledge on the subject. You also posted the same exact thing before with the exact same incorrect 4100/4300 argument.. I corrected you then and you still didn't learn.

From the gis of it, the zero guy doesn't seem to know any technical details either but he didn't just repeat the same wrong information over and over again.

Edit: grammar

You pick out details and don't even know what the argument is about.

I knew there was something different about the architecture before I posted it, but I didn't bother looking it up because I already knew the outcome. The performance difference is comparable to the framerates because of the CPU bottleneck; the rest is beside the point.

If you had read the discussion before you posted your criticism, you would have noticed that I said 'The CPU in both consoles is so weak that with CPU-intensive games you get CPU bottlenecks, so 10 percent extra performance will translate into real-world numbers.'

The thing is, I know I posted this before in the exact same argument, but it didn't matter then and it doesn't matter now. The other poster said that a 10 percent CPU power difference would never translate directly into game performance; I said it would. Why it was more in that bench I didn't know exactly, but again:

IT IS BESIDE THE POINT


#238  Edited By Daious
Member since 2013 • 2315 Posts
@commander said:
@daious said:
@commander said:

@daious : it doesn't matter , even if it's 10 percent difference the argument stays the same, allthough you probably don't know what the argument is otherwise you wouldn't be posting facts that are completely besides teh point.

read the thread before you reply and you won't look like ... you know what lmao

Your consistently posting (from the past and just now) things that show your glaring lack of knowledge on the subject. You also posted the same exact thing before with the exact same incorrect 4100/4300 argument.. I corrected you then and you still didn't learn.

From the gis of it, the zero guy doesn't seem to know any technical details either but he didn't just repeat the same wrong information over and over again.

Edit: grammar

you pick out details and don't even know what the argument is about.

you and a dozen others come into hardware threads because you found a mistake that doesn't matter.

I knew the architeture was different before i posted it, but it doesn't matter the performance difference is comparable to framerates because of the cpu bottleneck, the rest is besides the point.

One: ACU at launch on PC was a disaster. AMD architecture at launch (GPU/CPU) was a mess. It took future AMD drivers plus patches to fix it.

There were issues with multicore performance all around. Look at 4 cores versus 4 cores + 4 virtual cores versus 8 cores + 8 virtual cores. There were so many issues with it. However, you disregard that. Meanwhile, in other threads you attack any other game for optimization that doesn't fit your view. You still use Assassin's Creed Unity launch benchmarks from before any of those patches and driver fixes were in place.

Two: you are implying that the difference in MHz between the AMD 4100/4300 equated to the difference in performance we saw in those charts (when Piledriver improved over a dozen different things compared to first-gen Bulldozer).


#239 Zero_epyon
Member since 2004 • 20502 Posts

@daious said:

Your consistently posting things (from the past and just now) that show your glaring lack of knowledge on the subject. You also posted the same exact thing before with the exact same incorrect 4100/4300 argument.. I corrected you then and you still didn't learn.

From the gis of it, the zero guy doesn't seem to know any technical details either but he didn't just repeat the same wrong information over and over again in multiple threads..

Edit: grammar

Out of curiosity, what details don't I seem to know? I pretty much said the same thing you did.


#240  Edited By blackace
Member since 2002 • 23576 Posts

@Shewgenja said:

I like how power and game performance isn't a big deal until you shut the door on some form of XBone secret sauce. Then, brace yourselves for a 6 page or more thread.

We're two years into this gen, people. Once upon a time, we'd be seeing this as the halfway point of a gen. No one is buying a secret hidden leap in XBones performance except the upper tier of deluded lemmings at this point. Just stop. It's not mysterious or even fun, anymore. It wouldn't even matter if DX12 put the bone on parity with the ps4 at this point because developers and publishers are not going to double down on it as a result.

One dev shuts the door, while other devs who are saying the complete opposite are ignored? LOL!! Ok. Every PC developer will be using Win10/DX12 in the very near future. It won't matter, until it matters. We'll see what happens in 2016.

**************************************************************************************************************************

@daious said:
@blackace said:
@imt558 said:

Well :

http://gamingbolt.com/dice-dev-xbox-one-isnt-as-powerful-as-ps4-dx12-wont-help-reduce-difference-talks-development

Someone should have asked DICE if they will ever be able to make a console game that runs smoothly at 1080P 60fps.

It is Dice's fault that this generation consoles are weak?

I am still waiting on your HUMA information you promised in 2013/2014/2015.

Other devs seem to have no problems, except for Konami, but at least they can get 1080p on the PS4. The Frostbite and Fox engines are crap anyway. HUMA info?? Don't remember that.


#241  Edited By Daious
Member since 2013 • 2315 Posts
@Zero_epyon said:
@daious said:

Your consistently posting things (from the past and just now) that show your glaring lack of knowledge on the subject. You also posted the same exact thing before with the exact same incorrect 4100/4300 argument.. I corrected you then and you still didn't learn.

From the gis of it, the zero guy doesn't seem to know any technical details either but he didn't just repeat the same wrong information over and over again in multiple threads..

Edit: grammar

Out of curiosity, what details don't I seem to know? I pretty much said the same thing you did.

I may have misread your sarcasm in some of those rhetorical questions. I only skimmed it quickly. I didn't read every post, or even close to the entire thing.

When I came into this thread and saw the ACU benchmark again, it reminded me of the lol-worthy argument that was made last time.

I actually came here to post about the cross-platform development that DX12 can provide and how excited I am for more DX12 games on PC.

The one thing that sucks about PC gaming this year is that all these new features are coming with DX12. However, Pascal and Greenland GPUs (which will be the biggest increase between architectures in the last 5 years) are coming out next year. I don't think the latest GPUs are worth it at this price when next year's will blow them out of the water.


#242 Zero_epyon
Member since 2004 • 20502 Posts

@daious said:
@Zero_epyon said:
@daious said:

Your consistently posting things (from the past and just now) that show your glaring lack of knowledge on the subject. You also posted the same exact thing before with the exact same incorrect 4100/4300 argument.. I corrected you then and you still didn't learn.

From the gis of it, the zero guy doesn't seem to know any technical details either but he didn't just repeat the same wrong information over and over again in multiple threads..

Edit: grammar

Out of curiosity, what details don't I seem to know? I pretty much said the same thing you did.

I may have misread your sarcasm in some of those rhetorical questions. I only rush skimmed it.

When I came in this thread and saw the ACU benchmark again, it reminded me of the lol worthy argument that was made last time.

I actually came here to post about the cross-platform development that DX12 can provide and how excited I am for more DX12 games on PC.

Ah ok. Carry on then!


#243 Daious
Member since 2013 • 2315 Posts
@blackace said:
@Shewgenja said:

I like how power and game performance isn't a big deal until you shut the door on some form of XBone secret sauce. Then, brace yourselves for a 6 page or more thread.

We're two years into this gen, people. Once upon a time, we'd be seeing this as the halfway point of a gen. No one is buying a secret hidden leap in XBones performance except the upper tier of deluded lemmings at this point. Just stop. It's not mysterious or even fun, anymore. It wouldn't even matter if DX12 put the bone on parity with the ps4 at this point because developers and publishers are not going to double down on it as a result.

One dev shuts the door, while other devs who are saying the complete opposite are ignored? LOL!! Ok. Every PC developers will be using Win10/DX12 in the very near future. It won't matter, until it matters. We'll see what happens in 2016.

**************************************************************************************************************************

@daious said:
@blackace said:
@imt558 said:

Well :

http://gamingbolt.com/dice-dev-xbox-one-isnt-as-powerful-as-ps4-dx12-wont-help-reduce-difference-talks-development

Someone should have asked DICE if they will ever be able to make a console game that runs smoothly at 1080P 60fps.

It is Dice's fault that this generation consoles are weak?

I am still waiting on your HUMA information you promised in 2013/2014/2015.

Other devs seem to have no problems, except for Konami, but at least they can get 1080P on the PS4. The Frostbite and Fox engines are crap anyways. HUMA info?? Don't remember that.

You went on about a HUMA/Xbox One embargo that MS was supposed to lift this year, around the launch of the Xbox One.


#244  Edited By commander
Member since 2010 • 16217 Posts

@daious said:
@commander said:
@daious said:
@commander said:

@daious : it doesn't matter , even if it's 10 percent difference the argument stays the same, allthough you probably don't know what the argument is otherwise you wouldn't be posting facts that are completely besides teh point.

read the thread before you reply and you won't look like ... you know what lmao

You're consistently posting things (from the past and just now) that show your glaring lack of knowledge on the subject. You also posted the exact same thing before, with the exact same incorrect 4100/4300 argument. I corrected you then and you still didn't learn.

From the gist of it, the Zero guy doesn't seem to know the technical details either, but at least he didn't repeat the same wrong information over and over again.

Edit: grammar

You pick out details and don't even know what the argument is about.

You and a dozen others come into hardware threads because you found a mistake that doesn't matter.

I knew the architecture was different before I posted it, but it doesn't matter: the CPU performance difference shows up in the framerates because of the CPU bottleneck, and the rest is beside the point.

One: ACU at launch on PC was a disaster. The AMD architecture (GPU and CPU alike) was a mess at launch, and it took later AMD drivers plus game patches to fix it.

There were issues with multicore performance all around. Look at 4 cores versus 4 cores + 4 virtual cores versus 8 cores + 8 virtual cores; there were so many issues with it. However, you disregard all of that. Meanwhile, in other threads you attack any game whose optimization doesn't fit your view, yet you still use Assassin's Creed Unity launch benchmarks from before any of those patches and driver fixes were in place.

Two: you are implying that the difference in MHz between the AMD FX-4100 and FX-4300 equated to the difference in performance we saw in those charts (when Piledriver improved over a dozen different things compared to first-gen Bulldozer).

Again, that has nothing to do with it.

The bench shows what any CPU bottleneck in any other game would show; what does it matter that AC Unity was a mess at launch? It's a CPU bottleneck, end of story.

What does it matter if I implied the difference was 6 percent? Even if it were 10 percent, the result stays the same, but I understand: correct is correct. Just for the sake of other people reading this, it should be mentioned that that 6 percent is actually a bit more.

But this was already pointed out by someone else, so there's no reason to point it out again, especially since it doesn't really matter in this discussion.
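For anyone keeping score on the percentages being thrown around, here is a minimal Python sketch of the clock math, assuming the commonly cited stock clocks of 1.6 GHz (PS4) and 1.75 GHz (Xbox One) for the Jaguar CPUs; nothing here is an in-game measurement.

```python
# Rough clock-speed comparison between the two consoles' Jaguar CPUs.
# The 1600/1750 MHz figures are the commonly cited stock clocks (an assumption
# for this sketch, not an in-game measurement).
PS4_CPU_MHZ = 1600
XB1_CPU_MHZ = 1750

advantage = (XB1_CPU_MHZ - PS4_CPU_MHZ) / PS4_CPU_MHZ
print(f"Xbox One CPU clock advantage: {advantage:.3%}")  # 9.375%
```

That is where the 9.375 percent figure in this exchange comes from; whether it translates into framerate is a separate question, which is what the rest of the thread argues about.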

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#245  Edited By Daious
Member since 2013 • 2315 Posts
@commander said:
@daious said:
@commander said:
@daious said:

You're consistently posting things (from the past and just now) that show your glaring lack of knowledge on the subject. You also posted the exact same thing before, with the exact same incorrect 4100/4300 argument. I corrected you then and you still didn't learn.

From the gist of it, the Zero guy doesn't seem to know the technical details either, but at least he didn't repeat the same wrong information over and over again.

Edit: grammar

You pick out details and don't even know what the argument is about.

You and a dozen others come into hardware threads because you found a mistake that doesn't matter.

I knew the architecture was different before I posted it, but it doesn't matter: the CPU performance difference shows up in the framerates because of the CPU bottleneck, and the rest is beside the point.

One: ACU at launch on PC was a disaster. The AMD architecture (GPU and CPU alike) was a mess at launch, and it took later AMD drivers plus game patches to fix it.

There were issues with multicore performance all around. Look at 4 cores versus 4 cores + 4 virtual cores versus 8 cores + 8 virtual cores; there were so many issues with it. However, you disregard all of that. Meanwhile, in other threads you attack any game whose optimization doesn't fit your view, yet you still use Assassin's Creed Unity launch benchmarks from before any of those patches and driver fixes were in place.

Two: you are implying that the difference in MHz between the AMD FX-4100 and FX-4300 equated to the difference in performance we saw in those charts (when Piledriver improved over a dozen different things compared to first-gen Bulldozer).

Again, that has nothing to do with it.

The bench shows what any CPU bottleneck in any other game would show; what does it matter that AC Unity was a mess at launch? It's a CPU bottleneck, end of story.

What does it matter if I implied the difference was 6 percent? Even if it were 10 percent, the result stays the same, but I understand: correct is correct. Just for the sake of other people reading this, it should be mentioned that that 6 percent is actually a bit more.

But this was already pointed out by someone else, so there's no reason to point it out again, especially since it doesn't really matter in this discussion.

So your argument is that you show a benchmark, pick the weakest CPU (in a game that isn't properly optimized for CPU utilization), and state the obvious: that the weakest ($30) CPU isn't doing much in a build with over a thousand dollars' worth of GPU, without showing what that same CPU does with a slight overclock, which would at least come close to a comparison relevant to your topic? If you are going to make a comparison, show the difference between different clocks. You are picking CPUs with different IPC and pretending to have an argument about the difference the MHz makes. All of this while using a game that wasn't properly coded for CPU utilization, which artificially inflates the CPU bottleneck. A game so horribly coded that it got attacked by gamers everywhere. A game so badly optimized that the developers and Ubisoft started placing the blame on CPU/GPU hardware manufacturers.

Is that a joke?

Setting aside the major issues with that game, couldn't someone cherry-pick random benchmarks, pair a $1000 CPU (an octo-core i7) with a ~$50 GPU, and make the same claim about the GPU when trying to push a game at a certain setting? Anyone can make any point this way unless the comparisons are actually relevant.

Find relevant comparisons: the exact same CPU at slightly different clocks, running a CPU-intensive game that is properly coded for it. Even then it wouldn't be completely relevant, seeing that PS4 and Xbox One versions run at different settings. The PS4 version runs at higher visual settings than most Xbox One versions, which makes the Xbox One version less demanding visually. We don't know how the PS4 would run a game at the same settings as the Xbox One version.

On a side note: you made a huge deal about The Witcher 3's performance across multiple patch threads, yet you seem to be completely disregarding it now. What I'm curious about is your opinion: doesn't that optimization mean the PS4 is capable of providing better framerates (working around the slightly weaker CPU) while offering a massive increase in visual fidelity? Similar framerates and higher visual fidelity, including higher resolution? What is your metric for better hardware?
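To make the "same CPU, different clocks" point concrete, here is a purely illustrative Python sketch of the best case, assuming a game that is 100 percent CPU-bound and scales linearly with clock; real games scale less than this, which is exactly why controlled clock-scaling tests matter. The numbers are hypothetical and not taken from any benchmark in this thread.

```python
# Illustrative upper bound: a perfectly CPU-bound, linearly scaling game run on
# the same CPU at two different clocks. Real games scale less than this.
def projected_fps(measured_fps: float, base_mhz: float, new_mhz: float) -> float:
    """Best-case FPS estimate under the (strong) assumption of a pure CPU limit."""
    return measured_fps * (new_mhz / base_mhz)

# Hypothetical figures, not from any benchmark cited here:
print(round(projected_fps(30.0, 1600, 1750), 1))  # 32.8 -> at most ~2.8 fps from clock alone
```

Anything beyond that ceiling in a real comparison has to come from somewhere other than the clock difference: settings, drivers, IPC, or the game's own optimization.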

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#246 commander
Member since 2010 • 16217 Posts

@daious said:
@commander said:

Again, that has nothing to do with it.

The bench shows what any CPU bottleneck in any other game would show; what does it matter that AC Unity was a mess at launch? It's a CPU bottleneck, end of story.

What does it matter if I implied the difference was 6 percent? Even if it were 10 percent, the result stays the same, but I understand: correct is correct. Just for the sake of other people reading this, it should be mentioned that that 6 percent is actually a bit more.

But this was already pointed out by someone else, so there's no reason to point it out again, especially since it doesn't really matter in this discussion.

So your argument is that you show a benchmark, pick the weakest CPU (in a game that isn't properly optimized for CPU utilization), and state the obvious: that the weakest ($30) CPU isn't doing much in a build with over a thousand dollars' worth of GPU, without showing what that same CPU does with a slight overclock, which would at least come close to a comparison relevant to your topic? If you are going to make a comparison, show the difference between different clocks. You are picking CPUs with different IPC and pretending to have an argument about the difference the MHz makes. All of this while using a game that wasn't properly coded for CPU utilization, which artificially inflates the CPU bottleneck.

Is that a joke?

CPUs are not GPUs, so don't start about different IPCs. The performance difference between the CPUs shows up in the framerates. It shows even with CPUs much stronger than the FX-4100 and the consoles' CPUs, like the FX-8150 and FX-8350. If you compare the FX-8150 and the FX-8350 in AnandTech's bench, there's about a 15 percent performance difference, and the framerates show the same gap. It won't be magically different in AC Unity just because the CPU has a slightly different architecture.

Besides, there aren't many CPU benchmarks for AC Unity, not that I can find anyway, so there isn't much else to go by, but it's not that big a problem. The performance differences between those CPUs are widely known.

As for a CPU bottleneck: artificial or not, a CPU bottleneck is a CPU bottleneck, and the consoles have very weak CPUs in comparison to their GPUs. So when the CPU is taxed, the bottleneck becomes more pronounced, and that's exactly what happens with the PS4 and Xbox One.

I know you're going to say "but the GTX 980 SLI..." OK, but the detail settings are much higher as well, and the bottleneck still shows with CPUs like the FX-8150 and FX-8350, or even with the i5 2500 and i5 4670K for that matter. Those CPUs are multiple times faster than the consoles' CPUs. Which brings me back to that "artificial" CPU bottleneck: if the game really demanded that much excess CPU power, how come most of these CPUs can manage it at the highest settings? If that much excess CPU power were needed, a simple i3 wouldn't dish out 60 fps.

I know Sony fans don't want to hear it because they keep hearing that song "the PS4 is the most powerful console", but that really doesn't tell the whole story now, does it?
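A toy frame-time model may help separate the two claims being argued here. It is a sketch, not a measurement, and the millisecond figures below are invented purely for illustration.

```python
# Toy bottleneck model: whichever stage takes longer per frame sets the frame rate.
# All numbers are invented for illustration; none are console measurements.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=20.0, gpu_ms=33.3))             # GPU-bound scene: ~30 fps, a faster CPU changes nothing
print(fps(cpu_ms=33.3, gpu_ms=20.0))             # CPU-bound scene: ~30 fps
print(fps(cpu_ms=33.3 / 1.09375, gpu_ms=20.0))   # same scene with a ~9% faster CPU: ~32.8 fps
```

Which of the two cases a given game (or scene) falls into is exactly what the benchmark argument above is trying to establish.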

Avatar image for daious
Daious

2315

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#247  Edited By Daious
Member since 2013 • 2315 Posts

@commander said:
@daious said:
@commander said:

Again, that has nothing to do with it.

The bench shows what any CPU bottleneck in any other game would show; what does it matter that AC Unity was a mess at launch? It's a CPU bottleneck, end of story.

What does it matter if I implied the difference was 6 percent? Even if it were 10 percent, the result stays the same, but I understand: correct is correct. Just for the sake of other people reading this, it should be mentioned that that 6 percent is actually a bit more.

But this was already pointed out by someone else, so there's no reason to point it out again, especially since it doesn't really matter in this discussion.

So your argument is that you show a benchmark, pick the weakest CPU (in a game that isn't properly optimized for CPU utilization), and state the obvious: that the weakest ($30) CPU isn't doing much in a build with over a thousand dollars' worth of GPU, without showing what that same CPU does with a slight overclock, which would at least come close to a comparison relevant to your topic? If you are going to make a comparison, show the difference between different clocks. You are picking CPUs with different IPC and pretending to have an argument about the difference the MHz makes. All of this while using a game that wasn't properly coded for CPU utilization, which artificially inflates the CPU bottleneck.

Is that a joke?

CPUs are not GPUs, so don't start about different IPCs. The performance difference between the CPUs shows up in the framerates. It shows even with CPUs much stronger than the FX-4100 and the consoles' CPUs, like the FX-8150 and FX-8350. If you compare the FX-8150 and the FX-8350 in AnandTech's bench, there's about a 15 percent performance difference, and the framerates show the same gap. It won't be magically different in AC Unity just because the CPU has a slightly different architecture.

Besides, there aren't many CPU benchmarks for AC Unity, not that I can find anyway, so there isn't much else to go by, but it's not that big a problem. The performance differences between those CPUs are widely known.

As for a CPU bottleneck: artificial or not, a CPU bottleneck is a CPU bottleneck, and the consoles have very weak CPUs in comparison to their GPUs. So when the CPU is taxed, the bottleneck becomes more pronounced, and that's exactly what happens with the PS4 and Xbox One.

I know you're going to say "but the GTX 980 SLI..." OK, but the detail settings are much higher as well, and the bottleneck still shows with CPUs like the FX-8150 and FX-8350, or even with the i5 2500 and i5 4670K for that matter. Those CPUs are multiple times faster than the consoles' CPUs. Which brings me back to that "artificial" CPU bottleneck: if the game really demanded that much excess CPU power, how come most of these CPUs can manage it at the highest settings? If that much excess CPU power were needed, a simple i3 wouldn't dish out 60 fps.

I know Sony fans don't want to hear it because they keep hearing that song "the PS4 is the most powerful console", but that really doesn't tell the whole story now, does it?

I added things to my last post about more issues with your horrendous argument and with the problems of cherry-picking benchmarks in general.

If you want to know my opinion on consoles, both are complete letdowns. I moved into a new apartment ages ago and my Xbox One and PS4 still aren't even plugged in; they're sitting in a corner in an open box.

I don't make threads to make statements. I don't make threads to damage control. I'm not emotionally invested enough in either console to care. I don't have a burden of proof. When I see something horribly supported (especially about hardware), I call it out.

Avatar image for ToScA-
ToScA-

5783

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#248  Edited By ToScA-
Member since 2006 • 5783 Posts

@AzatiS: Lol plz, the PS2's Emotion Engine was hyped through the roof by cows and Sony alike. The eventual fate of the Dreamcast is hardly relevant to the point I'm making. Of course power was a selling point of the PS2; it was only when its main competitors released about a year later that that selling point was completely shut down by evidently better hardware. But make no mistake, cows rode that power train for as long as they possibly could. They then switched to sales and games, where both the GameCube and Xbox were lagging behind.

What happened the following generation was a total flip-flop. The Cell hype was just on another level, almost scam-worthy when you consider the net result. This time lems were the ones humping sales, whereas cows stuck to teh POWAH. Fanboys cling to whatever bullshit parameter they currently have the upper hand in. To single out one group as some sort of progenitor is ridiculous and hypocritical, as history has shown us the nature of fanboys time and time again.

So please, stop talking out of your butt cheeks.

Avatar image for Douevenlift_bro
Douevenlift_bro

6804

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#249 Douevenlift_bro
Member since 2013 • 6804 Posts

Only lems didn't know this LOL.

Avatar image for imt558
imt558

976

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#250 imt558
Member since 2004 • 976 Posts

Looks like Xboners are in full force here. LOL @commander! Of course he gave the link from Eurogamer. How about this one:

[Embedded video]