DICE Dev: Xbox One Isn’t As Powerful As PS4, DX12 Won’t Help Reduce Difference

#301 tormentos
Member since 2003 • 33793 Posts

@04dcarraher said:

@tormentos:

Why are you ignoring key points

"In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints."

Again, you're ignoring the true multi-threading model that DX11.X lacks and that DX12 is introducing.

Digital Foundry: To what extent will DX12 prove useful on Xbox One? Isn't there already a low CPU overhead there in addressing the GPU?

Oles Shishkovstov: No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.
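To see why multi-threaded command buffer generation matters, here is a schematic C++ sketch (not real D3D code; all names invented) of the DX12-style model: several threads record command lists independently, and the prebuilt lists are then submitted in order. Under the DX11 model, the equivalent recording effectively funnelled through one immediate context.

```cpp
#include <cstdint>
#include <thread>
#include <vector>

// Invented stand-in for a recorded command list.
struct CommandList { std::vector<uint32_t> dwords; };

// Each worker records its slice of the frame independently -- the part
// the DX11 driver model effectively serialized.
static void recordChunk(CommandList& out, uint32_t firstDraw, uint32_t count) {
    for (uint32_t i = 0; i < count; ++i) {
        out.dwords.push_back(0x1000u);        // hypothetical draw opcode
        out.dwords.push_back(firstDraw + i);  // draw parameter
    }
}

int main() {
    constexpr int kWorkers = 4;  // "split-up of up to four cores in parallel"
    CommandList lists[kWorkers];
    std::vector<std::thread> workers;
    for (int w = 0; w < kWorkers; ++w)
        workers.emplace_back(recordChunk, std::ref(lists[w]), w * 256u, 256u);
    for (auto& t : workers) t.join();

    // DX12-style submission: prebuilt lists are handed to the queue in
    // order (cf. ExecuteCommandLists); no driver-side dependency tracking
    // is re-run per draw at this point.
    std::vector<uint32_t> queue;
    for (auto& cl : lists)
        queue.insert(queue.end(), cl.dwords.begin(), cl.dwords.end());
    return queue.size() == 4u * 256u * 2u ? 0 : 1;
}
```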

"“The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel. There are still some bottlenecks to work out with memory flushing to garlic, even after changing to LCUE, the memory copying is still significant.”"

enough said

I think you have a problem.

They added tons of features just to work around limitations of the DX11 API model.

They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.

DX11 had problems, but MS had tools to WORK AROUND those problems. You know what a WORKAROUND is?

A workaround is a method, sometimes used temporarily, for achieving a task or goal when the usual or planned method isn't working. In information technology, a workaround is often used to overcome hardware, programming, or communication problems. Once a problem is fixed, a workaround is usually abandoned.

So they had tools to work around those problems, which is why Project Cars doesn't suffer the same pitfalls the PC version does.

Worse for you, they already had a DX12/GNM-like API before August 2014. The developer of Metro could not use it because of time constraints, which would not affect Project Cars, which came out basically a year later.

It's on PC where DX11 multithreading restricts them the most, not Xbox One, and the improvement on PC is 40% vs 7% on XBO, which shows that yeah, the XBO already has most of the performance gains that patch brings to PC, completely killing your argument.

But then you claim the PS4 uses a different model.

False again. The developer of Project Cars stated that if Sony emulated DX12's main benefits on their machine, their code would also see a good improvement. So the developer himself is telling you that DX12's main benefit, which you claim is multithreaded rendering, is not on PS4 yet, and that the PS4 would improve if the same code were applied.

Worse, the 5 to 7% gain on DX11 also benefits the PS4 a little, and it was already ahead by as much as 14 FPS.

You should stop spinning, dude. You are WRONG.

#302 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:
@04dcarraher said:

@tormentos:

Listen, you rabid cow, Project Cars is not the normal multiplat, you fool; they took the time to bypass DX's limitations.... "It's been challenging splitting the renderer further across threads in an even more fine-grained" . "The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel". In DX you can't run multiple cores in parallel splitting the renderer to the GPU.

“It’s been challenging splitting the renderer further across threads in an even more fine-grained manner – even splitting already-small tasks into 2-3ms chunks. The single-core speed is quite slow compared to a high-end PC though so splitting across cores is essential.

“The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel. There are still some bottlenecks to work out with memory flushing to garlic, even after changing to LCUE, the memory copying is still significant.”

http://www.cinemablend.com/games/Project-CARS-Hits-Some-Hardware-Limits-Xbox-One-PS4-61215.html
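For context on "flushing to garlic": "garlic" is the common name for the PS4's high-bandwidth, GPU-optimized memory bus, and CPU writes to it typically go through write-combined mappings, so the sheer memcpy volume is the cost SMS is describing. A generic, assumption-laden C++ sketch of that per-frame copy pattern (the heap, struct, and counts are invented for illustration):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Invented stand-in: on a real console this would be allocated from the
// GPU-visible ("garlic") heap with a write-combined CPU mapping.
std::vector<uint8_t> garlicHeap(1 << 20);

struct DrawConstants { float mvp[16]; uint32_t materialId; };

// The per-frame cost SMS describes: even with a leaner submission library
// (LCUE), every draw's constants still get copied into GPU-visible memory.
// Write-combined memory is cheap per byte, but the volume adds up.
size_t uploadConstants(size_t offset, const DrawConstants& c) {
    std::memcpy(garlicHeap.data() + offset, &c, sizeof c);
    return offset + sizeof c;   // bump-allocate within the frame
}

int main() {
    DrawConstants c{};
    size_t offset = 0;
    for (int draw = 0; draw < 4096; ++draw)   // a race with many cars
        offset = uploadConstants(offset, c);
    return offset != 0 ? 0 : 1;
}
```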

You are a fool. The interview that was taken from is about both the PS4 and Xbox One, not just the PS4, you silly lemming.

But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table.

They added tons of features just to work around limitations of the DX11 API model.

They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.

http://www.eurogamer.net/articles/digitalfoundry-2014-metro-redux-what-its-really-like-to-make-a-multi-platform-game

The problem with you is simple: you don't know how to read, and you are too slow to understand that DX11.X on Xbox One is not fu**ing DX11 on PC, not even CLOSE. Not only did MS already have tools to work around the LIMITATIONS of DX11 on Xbox One, they even fu**ing have a GNM-style API for do-it-yourself code like the PS4's, which the developer of Metro could not use because of time.

That interview is more than a year old; in fact it's older than the Xbox One holiday update which unlocked the 7th core that Project Cars uses. So I am sure that GNM-style API has been used on Xbox One, because it was there, just like they used the extra core, to deliver the best possible version they could on Xbox One.

So yeah, you are the rabid fanboy, and the one who refuses to see that DX11 on Xbox One has never been like DX11 on PC. Not even close.

@StormyJoe said:

@tormentos: Again, didn't read a single line. I don't care what the biggest cow fanboy in SW says...

Hahahahaaa keep the denial alive bro you were owned hard..lol and by your own link..hahahaaa

@commander said:

It's in other games as well. The Witcher 3 may have been able to surpass the X1's framerates after numerous optimizations, but it took 6 months. Kinda shitty if you want to play the game at release.

It could have taken 12; it was optimized beyond the Xbox One version, and now you must play in slideshow mode on your Xbox One...lol

@commander said:

Cars still runs well above 30 fps lmao. That wasn't the case with The Witcher.

Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaa

Still didn't read a single line. I don't care what the biggest cow fanboy in SW has to say...

#303 commander
Member since 2010 • 16217 Posts

@StormyJoe said:

Still didn't read a single line. I don't care what the biggest cow fanboy in SW has to say...

Well it would be a bit easier if he didn't quote a dozen people and answer to them in the same post.

#304  Edited By deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos said:
Yeah, in a CPU-heavy game, due to draw calls, the PS4's CPU is able to beat the Xbox One's with 7 cores. So much for the Xbox One having an advantage there.
@commander said:

It's a 10 percent overclock. When there is a CPU shortage that's a lot; why do you think the PS4 struggles with framerates sometimes? It's because the CPU gets taxed. Of course you won't see it in Star Wars Battlefront, it's mostly GPU-bound.

That extra 100 MHz on all 8 cores has been noticeable; look at Mad Max, Unity, GTA V, Call of Duty and The Witcher 3. The PS4 has a much better GPU than the Xbox One; that extra power on the Xbox doesn't come out of thin air.

Besides, MS has freed up an extra core for games as well, which adds roughly an extra 14 percent, something else Sony can't do; they're just not as good at making operating systems. It's probably the reason why Mad Max is better on the Xbox One.

It's not 100 MHz, it's 150 MHz.

And Mad Max is basically locked at 30 FPS on both. In fact, a video was shown of the Xbox One dropping to something like 10 FPS; the game slowed down terribly, and it was recorded with the Xbox One's internal recorder. Whether it was a bug or something else, it happened.

Unity = a screwed-up game.

GTA5 drops on Xbox One in shootouts and while driving too, and has less foliage.

COD AW runs online at 60 FPS on PS4 just like on Xbox One, and it's 1080p, unlike the Xbox One's 1360x1080.

And The Witcher 3 is faster on PS4.hahahaahahaaaaaa

You mean the extra core that Project Cars uses, while the Xbox One version is still as much as 14 FPS behind?

Man, 10% CPU is nothing, especially on a crappy Jaguar.

This should shut you up. Look at the difference: 2 frames per second from a 2 GHz drop, from 4.5 GHz to 2.5 GHz..lol

2 freaking GHz of difference and the only gap it causes is a 2-frame drop in The Witcher 3. Those 150 MHz on Xbox One would do nothing, not even a frame. You know how many times 150 fits into 2000, right?
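(Quick arithmetic on that, purely as a linear-scaling illustration: 2000 / 150 ≈ 13.3, so if a 2000 MHz drop costs about 2 fps, 150 MHz would cost about 2 × (150 / 2000) ≈ 0.15 fps, a rounding error.)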

Why do you continue to act like Project Cars is a CPU-heavy game? You have proved yourself wrong on this multiple times. The GPU is the difference maker here, time and time again.

Using your logic, shouldn't we see a bigger difference in the bench below? How would anyone expect a 150 MHz difference to generate a noticeable difference in frame rate? These are 500 MHz jumps with very little difference. A 150 MHz uptick would have no effect.
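(For scale, a back-of-the-envelope figure rather than a measurement: 150 MHz on a 1.6 GHz Jaguar is 0.15 / 1.6 ≈ 9.4%. Even in a scene fully CPU-bound at 30 fps, that caps the gain at roughly 30 × 0.094 ≈ 2.8 fps, and real scenes are rarely fully CPU-bound.)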

#305 StormyJoe
Member since 2011 • 7806 Posts

@commander said:
@StormyJoe said:

Still didn't read a single line. I don't care what the biggest cow fanboy in SW has to say...

Well it would be a bit easier if he didn't quote a dozen people and answer to them in the same post.

No. @tormentos cannot - ever - say a single good thing about Microsoft, even when what MS is talking about has nothing to do with Sony or the PS4. He is a self owning utterly worthless asshat fanboy, and I am tired of wasting time dealing with his stupidity.

#306  Edited By blackace
Member since 2002 • 23576 Posts

Wow.. look at El Tormo have another meltdown. Hilarious. Does anyone actually read all that garbage he continues to write in every thread?

Yes, El Tormo, the new devkits did come out and have made a difference. Why do you think there are more 1080p Xbox One games in 2015 than in 2014? That was without using DX12 at all. Forza Horizon 2, Mad Max and Tomb Raider all prove my point. You act like the new devkits didn't do anything for the games, when in fact they did. Most of the games that are still 720p or 900p are using outdated graphics engines and are unoptimized ports of the PS4 version. DICE is clueless; they can't even get their games to run at 1080p on the PS4, when other developers have no problem with that. Shit, Killzone ran at 1080p (most of the time) on the PS4. Konami ports all their PS4 games to the XB1; they continue to use that outdated Fox Engine that was built for the PS platforms. Maybe if they built a graphics engine to work on the Xbox platforms they could hit 1080p on it. Besides MGS, they don't seem to have much else to offer anymore.

@Chutebox said:

Damn, fucking laughing at clown blackace getting rolled hard haha

I didn't get rolled anywhere. Laughing at El Tormo's meltdowns as usual. The Jester of SW. Also, some games did get patches to increase resolutions on the XB1. Both Killer Instinct and Diablo 3 got resolution increases. LOL!! Poor.. poor El Tormo. What a fool.

#307 blackace
Member since 2002 • 23576 Posts

@commander said:
@StormyJoe said:

Still didn't read a single line. I don't care what the biggest cow fanboy in SW has to say...

Well it would be a bit easier if he didn't quote a dozen people and answer to them in the same post.

That's just one of the hundreds of reasons I stopped reading El Tormo's garbage. He hates to be wrong. Our SW Jester.

#308 delta3074
Member since 2007 • 20003 Posts

@scatteh316 said:

@StormyJoe: Have you personally used PS4 and Xbone's API? And coded for them both?

Have you?

#309 delta3074
Member since 2007 • 20003 Posts
@AM-Gamer said:

@1080pOnly: The difference was Sony started giving free games every Month before they charged for online. MS charged just for playing online. If you can't see the difference in value you are delusional.

'The difference was Sony started giving free games every Month before they charged for online.'

Firstly, those games are not free; you have to pay for a PS+ subscription to get them, and you only 'rent' them for as long as you have PS+.

Secondly, Games with Gold titles you get to keep even if you quit your Gold subscription.

Basically, your post is an out-and-out lie.

https://en.wikipedia.org/wiki/List_of_Games_with_Gold_games

Games with Gold: completely free games, first implemented with Fable 3 on June 10th, 2013.

60 free games on Xbox 360, 24 free games on Xbox One.

You are the one who is delusional, sunshine.

#310 StormyJoe
Member since 2011 • 7806 Posts

@tormentos: Ok, that last line I did read (mainly because of the Chewbacca image).

So, this dev is credible to you, this project cars guy? That's fascinating to me. It's fascinating because he says that DX12 will improve Xb1 by a total of 12%-14%. Yet, you and the other ass-f**k cows in here say it won't do anything.

EPIC SELF OWNAGE. This is why I no longer entertain your posts.

#311 lostrib
Member since 2009 • 49999 Posts

@delta3074: you don't keep the X1 games

#312  Edited By trollhunter2
Member since 2012 • 2054 Posts

It's not surprising though

#313  Edited By Daious
Member since 2013 • 2315 Posts

@napo_sp said:
@commander said:
@daious said:
@commander said:

Oh please, stop embarrassing yourself.

It seems to be the exact opposite, judging by everyone's responses to your posts here. It doesn't help that you are consistently wrong. It also doesn't help that your arguments are completely faulty.

You'd think that someone who spends so much time damage controlling would actually be good at damage controlling.

What is even more ridiculous is that once you are called out for being wrong in one thread, you just keep saying the same thing. Lol.

Continue cherry-picking results and ignoring everything that states otherwise. Your damage control threads are classics. Gotta love fanboys.

Are you now blind as well? Not everyone disagrees with me. The only fanboy here is you.

Only ignorant fools can believe that either DX12 or 150 MHz of extra CPU speed, or both, can somehow make a difference for the Xbone vs the PS4.

That is not what I even tried to say. It is like talking to a brick wall.

It shows that you can't even understand the criticism of your flawed arguments. You can't support your arguments. You draw false conclusions. You don't understand hardware. You don't understand the technical aspects of what you are saying. You make false statements about CPU-intensive games. You also cherry-pick benchmarks while ignoring others. You don't see the blatant hypocrisy in your posting. You're also a hardcore fanboy.

#314 NFJSupreme
Member since 2005 • 6605 Posts

shocking

#315 Vecna
Member since 2002 • 3425 Posts

Nothing shocking. What's really interesting is people are still worried about the consoles power. One is a turd, the other a polished turd.

#316 ronvalencia
Member since 2008 • 29612 Posts

@FoxbatAlpha said:

I'll wait for the DX9 (PS4) and DX12 (Xbox) comparisons.

DICE already stated PS4's APIs are similar to AMD's Mantle, hence Sony doesn't need DX12.

#317  Edited By hansbeej
Member since 2014 • 320 Posts

Sure, but this is all obvious. Even the CPU benchmarks better on the PS4, even though the clock speed is lower. Not to mention the "to the metal" libraries have been present on both consoles from the start, with the PS4's already being far more robust (even allowing offloading of compute to the GPU thanks to the architecture/unified memory). On paper the Xbox One is a hunk of junk with its 768-or-so GPU cores and DDR3 memory. Bleh.

#318 ermacness
Member since 2005 • 10956 Posts

@1080pOnly said:
@Chutebox said:
@1080pOnly said:

Last gen Sony fans: Raw power isn't everything! Framerate and resolution are not important!

This gen Sony fans: Power is everything! Framerate and resolution are important!

...and the world wonders how corporations get rich selling shit to idiots.

And lems were the ones talking about framerate and resolution last gen. Thanks for the breaking news.

Never said the hypocrisy wasn't coming from both sides, I just remember more of it from the cows. Remember the endless hate threads towards lems about paying for online? Remember Sony announcing pay for online and all of the cows saying 'Yeah, it's cool that money can make the online better' and 'I don't mind paying cause of the value'. Bwahahaha.

I also remember "Video game consoles are gaming machines, not for watching movies, surfing the net, etc. That's what we got Blu-ray players and PCs for" from the lems' side. The reason you remember it so much from cows is that lems made comparison threads back to back. Now the tides have turned and it's the lems on the short end of the stick this generation. The only difference is that it wasn't so senseless when cows were waiting for PS3 multiplats to be equal to or better than their 360 counterparts, because the PS3 was IN FACT more powerful (although not by much) than the 360. Now the PS4 is more powerful than the X1 on a bigger scale than the PS3 was over the 360, and lems still cling to the Cloud, DX12, and other things.

#319 tormentos
Member since 2003 • 33793 Posts

@StormyJoe said:

Still didn't read a single line. I don't care what the biggest cow fanboy in SW has to say...

Hahaha, you keep pretending not to read, it's a beauty..hahhaahahaaa

Your own link owned you...lol The PS4's API will always be ahead of DX12.lol

@commander said:

Well it would be a bit easier if he didn't quote a dozen people and answer to them in the same post.

That's my posting style.

But you don't have to quote it all; you can clean everything out and pretty much reply to just what I say.

@GoldenElementXL said:

Why do you continue to act like Project Cars is a CPU-heavy game? You have proved yourself wrong on this multiple times. The GPU is the difference maker here, time and time again.

Using your logic, shouldn't we see a bigger difference in the bench below? How would anyone expect a 150 MHz difference to generate a noticeable difference in frame rate? These are 500 MHz jumps with very little difference. A 150 MHz uptick would have no effect.

You know you just proved my point, right? Project Cars drops 9 FPS when you compare 4.5 GHz vs 2.5 GHz on an i7, which is much more efficient than any AMD CPU, yet The Witcher 3 drops just 2 frames from that same 2 GHz gap. So which is more CPU-demanding, and which suffers more from the difference in clock speed?

Funny, because in The Witcher 3 the i7 doesn't drop a single frame when you compare 4.5 GHz vs 2.5 GHz...

Congratulations, you just proved that Project Cars suffers more than The Witcher 3 from a 2.0 GHz difference in clock speed.

Again, I advise you to stay the fu** out of these topics. Owning a good rig doesn't require skills, it requires money. You may have a better PC than mine, but you are a fool if you think there is no CPU bottleneck in Project Cars when DF's test clearly shows there is, and even says so.

There's plenty of scope for scalability in Project Cars on PC, but while Nvidia's stalwart GTX 750 Ti makes a good fist of matching PS4 visuals, a quad-core processor is required to maintain frame-rate in vehicle-heavy races. It's rare that we become CPU-bound with an entry-level enthusiast GPU like the 750 Ti, but the results here speak for themselves.

Digital Foundry.

And we know the Jaguar in the PS4 is not even close to that i3 in the Project Cars benchmark.

#320 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos: You are using a 2 GHz difference on PC and comparing it to a 150 MHz difference on consoles? Overclock or down clock your CPU 150 MHz and post the difference in frames in a game you own here. If you need help doing this let me know. I'm more than happy to help.

#321 tormentos
Member since 2003 • 33793 Posts

@GoldenElementXL:

150 MHz is total shit on PC; you will not get 2 frames, let alone the 6 in The Witcher 3 where you people claim the PS4 was CPU-bound vs the Xbox One.lol

The fact is that you people lack the knowledge, or simply choose to ignore reality. There is not one scenario where the Xbox One will beat the PS4, because it is proven that with a 1.6 GHz CPU the PS4 can overshadow the Xbox One's peak performance.

Project Cars on a 750 Ti was being bottlenecked by an i3; this is indisputable, I linked it, and even showed the screen comparing both. The i5 was able to get 15 FPS more out of the same GPU; yeah, that is officially the GPU being bottlenecked by the CPU, and DF called it exactly that.

#322 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos: I could try to respond to you with logic and reason, but we all know that will get us nowhere. So I'm gonna break this down really simple for you.

Question 1

Which console does Project Cars run better on?

Question 2

Which console has the better GPU?

Question 3

Which console has the faster CPU?

If you answered "PS4" to question 1, "PS4" to question 2 and "Xbox One" to question 3, that gives you the answer. Project Cars runs better on PS4 because of the GPU. That makes the GPU more important to running the game well than the CPU. And this is ON CONSOLES, so you can't compare an i3 and a 750 Ti (potato PC) to console performance. But you will, because it "helps" your argument.

The DF analysis shows the PS4 version 3-7 frames higher than the Xbox One version. It takes a 1.5 GHz difference in CPU clock speed to match that difference. Everything is pointing at the GPU advantage in the PS4 being the reason for the better performance. I don't understand why you aren't getting this.......
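(Ballpark math with the published specs, taken at face value: PS4 ≈ 1.84 TFLOPS vs Xbox One ≈ 1.31 TFLOPS, so 1.84 / 1.31 ≈ 1.40, a roughly 40% GPU compute advantage. A 150 MHz bump on a 1.6 GHz Jaguar is about 9%. In a GPU-bound game, the 40% edge dominates.)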

Also don't lump me in with "those people" because I didn't say the things they did.

#323  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@GoldenElementXL

Don't even bother with the rabid cow.

The Witcher 3 "TechSpot" benchmark is not a real benchmark... The test lap they used is the griffin attack, which means there is no stress on anything. Their test shows an i3 performing virtually the same as an i5 or an FX 8..... but a real stress test, done correctly across multiple areas, shows these minimum/average fps, all on the same GPU (Titan X):

Core i5 4690K – 52.0/79.2

Core i3 4130 – 38.0/68.9

FX 8350 – 56.0/75.2

FX 6300 – 45.0/69.6

Tomato needs to stop being an idiot. The PC version of Project Cars throws excessive work onto a single thread to handle the vast majority of the draw calls, which means performance drops when that single thread/core is over-taxed. This is why an i3 performs worse than an i5 in Cars....an i3 has half the per-thread performance here, and it's why an i3 performs better than an FX 8, since the FX's performance per thread is worse than the i3's.

What is funny is that he thinks that because an i3 can't get the same performance as an i5 the game is "demanding", when in fact it's just poorly coded, and the i3 has half the processing power of the i5.

Here is a CPU usage comparison of The Witcher 3 and Project Cars, so which one is actually more CPU-demanding? lol

#324  Edited By scatteh316
Member since 2004 • 10273 Posts

@04dcarraher said:

@GoldenElementXL

[...] This is why an i3 performs worse than an i5 in Cars....an i3 has half the per-thread performance here, and it's why an i3 performs better than an FX 8, since the FX's performance per thread is worse than the i3's. [...]

An i3 does not have half the per-thread performance of an i5.....

#325 AM-Gamer
Member since 2012 • 8116 Posts

@delta3074: First of all, stop calling people sunshine like a pederast. Second of all, Games with Gold didn't exist until PS+. Third, Xbox Live is completely worthless without Gold. You can blabber your bullshit all you want; I've been a Live member since 2005.

#326 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@BassMan said:

As soon as the specs were revealed, everybody knew the PS4 was more powerful and that both consoles would be shit.

This is what has me scratching my head. We have fanboys trying to make a mountain out of the trivial difference in performance between the two consoles. Meanwhile there is a massive difference in graphics and performance on the PC, which is so far ahead that entry-level PC gaming machines (which cost around the same as the consoles) outdo both systems. Then it suddenly turns into the cry of "it isn't all about graphics! Consoles have the exclusives I want!". Well then, why the hell are you guys making such a massive deal over such a trivial difference in performance? Hell, the PS4 isn't even capitalizing on it, settling for a minimal resolution bump (1080p instead of 900p) instead of aiming for 60 fps in games.

#327 tormentos
Member since 2003 • 33793 Posts
@GoldenElementXL said:

@tormentos: I could try to respond to you with logic and reason, but we all know that will get us nowhere. So I'm gonna break this down really simple for you.

Question 1

Which console does Project Cars run better on?

Question 2

Which console has the better GPU?

Question 3

Which console has the faster CPU?

If you answered "PS4" to question 1, "PS4" to question 2 and "Xbox One" to question 3, that gives you the answer. Project Cars runs better on PS4 because of the GPU. That makes the GPU more important to running the game well than the CPU. And this is ON CONSOLES, so you can't compare an i3 and a 750 Ti (potato PC) to console performance. But you will, because it "helps" your argument.

The DF analysis shows the PS4 version 3-7 frames higher than the Xbox One version. It takes a 1.5 GHz difference in CPU clock speed to match that difference. Everything is pointing at the GPU advantage in the PS4 being the reason for the better performance. I don't understand why you aren't getting this.......

Also don't lump me in with "those people" because I didn't say the things they did.

1-PS4

2-PS4

3-XBO

Obviously Project Cars will run better because of the GPU. Fact is, it's already doing 1080p on PS4 vs 900p on Xbox One, so you already have a 44% pixel disparity; then the PS4 version has extra temporal AA, which the Xbox One version also lacks, which makes the gap even bigger, and it's up to 14 FPS faster.

The logic behind people like you, 04dcarraher, Ronvalencia and others is that if the game is CPU-intensive the XBO will have faster frames. You people have argued that many times; you used ACU and The Witcher 3 as ammo, trying to prove the Xbox One had faster frames because the PS4's CPU was holding it back. In reality that wasn't the case; it was a problem of bad optimization.

Dude, you get a 15 FPS hit just from having an i3. Imagine how much bigger the impact would be on a 750 Ti with a 4-core FX; the i3 tends to beat those 4-core CPUs, hell, even six-core ones like my FX-6350 get beaten by the i3.

So in this game the CPU matters too; you can't use a low-budget one and expect to get nice results. I just showed you how the 750 Ti was bottlenecked.

You are on another channel. My argument has always been that the CPU would not hold the PS4 back from beating the Xbox One; it was you people claiming otherwise. The PS4 beats the Xbox One because it has a stronger GPU and its CPU isn't holding it back, which is my point. The game is CPU-intensive and I proved it; the problem is that the speed difference between the two consoles is nothing, but lemmings act like a 150 MHz gap will do anything, when in reality it will not.

Project Cars isn't the most CPU-intensive game, but it's fairly demanding because it has tons of draw calls and effects at once, with a very high number of cars that most games, if any, don't push.

Forza is not running even close to 45 cars at once, and neither is Driveclub.

@04dcarraher said:

@GoldenElementXL

[...] The PC version of Project Cars throws excessive work onto a single thread to handle the vast majority of the draw calls, which means performance drops when that single thread/core is over-taxed. [...] Here is a CPU usage comparison of The Witcher 3 and Project Cars, so which one is actually more CPU-demanding? lol

Keep the denial alive, I owned your ass all over..lol

But but but the PS4 uses special coding.hahahaaaaaa

Straight from the developer: it doesn't, and furthermore, applying DX12-style code on PS4 gets a boost in performance.lol

The game is multithreaded on PC, which you claimed it was not, liar, and an i3 bottlenecks a 750 Ti. A CPU bottleneck is when your GPU's performance suffers because the CPU can't feed it. The i3 has 4 threads, but 2 come from hyperthreading; the i5 has 4 hardware cores, which means it HAS MORE RESOURCES, so it issues more draws and gets more frames, actually 15 FPS more, which is considerable. That is what a CPU bottleneck is.

You are a buffoon. Being badly coded applies to the i5 as well, you joke of a poster; the game is badly coded for all CPUs, not just one model. So your pathetic excuses are just that, butthurt excuses. The i5 delivers better results because it is more powerful, fool; yeah, that PROVES the CPU was the issue.

And it's why Digital Foundry calls it exactly that: CPU-bound. lol

YOU CLAIMED THE CPU WAS THE ISSUE IN THE PS4 VERSION OF THE WITCHER.

You were WRONG, and now pathetically you run to hide behind ACU, which was a broken-ass game even on PC. You lost, and you don't have the guts to admit being wrong..lol

#328 tormentos
Member since 2003 • 33793 Posts

@blackace said:

Wow.. look at El Tormo have another meltdown. Hilarious. Does anyone actually read all that garbage he continues to write in every thread?

Yes, El Tormo, the new devkits did come out and have made a difference. Why do you think there are more 1080p Xbox One games in 2015 than in 2014? That was without using DX12 at all. Forza Horizon 2, Mad Max and Tomb Raider all prove my point. You act like the new devkits didn't do anything for the games, when in fact they did. Most of the games that are still 720p or 900p are using outdated graphics engines and are unoptimized ports of the PS4 version. DICE is clueless; they can't even get their games to run at 1080p on the PS4, when other developers have no problem with that. Shit, Killzone ran at 1080p (most of the time) on the PS4. Konami ports all their PS4 games to the XB1; they continue to use that outdated Fox Engine that was built for the PS platforms. Maybe if they built a graphics engine to work on the Xbox platforms they could hit 1080p on it. Besides MGS, they don't seem to have much else to offer anymore.

I didn't get rolled anywhere. Laughing at El Tormo's meltdowns as usual. The Jester of SW. Also, some games did get patches to increase resolutions on the XB1. Both Killer Instinct and Diablo 3 got resolution increases. LOL!! Poor.. poor El Tormo. What a fool.

Please show me a list of 1080p games on Xbox One in 2014 vs 2015.

None of those games prove your point. Forza has always been 1080p on Xbox One, so was Tomb Raider, and Mad Max is a pretty unimpressive game; hitting 1080p in those games means nothing. The Xbox One can't hit 1080p in demanding games, and that is a FACT.

Batman, The Witcher, COD, SWBF, BFH... not even Halo 5, which looks average, gets 1080p, dude.

DICE knows more about DX12-like coding than any other developer out there; they were the first, you silly lemming. They have 3 async shader games; MS has none, Sony has 1 and 1 in the making. So stop crying, inventing things and downplaying DICE because your Xbox One could not reach even 900p. The PS4 doesn't reach 1080p either, and you don't see me downplaying DICE; the reality is that to achieve 60 FPS, or close to it, at the best possible image quality, the PS4 has to sacrifice resolution. I know that, but you don't.

It's funny that you think that because an engine is old, that must be the Xbox One's problem, when in reality it is so pathetic it can't even run an old engine well.

KI should never have been 720p, and it's 900p now, which is still a joke; it's a 2D fighting game.

And Diablo runs at 1080p on a 7770, so why should it not run at 1080p on Xbox One? It's a pretty ugly game; saying the Xbox One runs Diablo at 1080p is not some great achievement, it would be pathetic if it didn't. Oh, and its frame times were still all over the place.

But don't worry, if 2016 doesn't deliver there is always 2017, and 2018, and 2019 and so on, so you have many years to play the "let's wait for next year" game..lol

@StormyJoe said:

@tormentos: Ok, that last line I did read (mainly because of the Chewbacca image).

So, this dev is credible to you, this project cars guy? That's fascinating to me. It's fascinating because he says that DX12 will improve Xb1 by a total of 12%-14%. Yet, you and the other ass-f**k cows in here say it won't do anything.

EPIC SELF OWNAGE. This is why I no longer entertain your posts.

No, you claim to have read that one because you think it favors you. It's not 12 to 14% from DX12; it's 5 to 7% from the same code being optimized, in other words DX11.X, and 7% from DX12, which amounts to like 2 frames more..lol

Funny enough, I am sure you missed the line where he says the same DX12 benefits, if applied to the PS4, would also bring a good performance boost.... Hahahaaa. Yeah, you were just denied your shot at closing the gap; the same code that helps the Xbox One also helps the PS4, so the gap is going nowhere, and that came from a developer's mouth..lol

Now you can go back to claiming you don't read my posts..lol

#329  Edited By 04dcarraher
Member since 2004 • 23858 Posts

Poor tomato, swimming in his delusions, without a clue about the facts.

1. The PS4 has a better API, bypassing DX11-based limits..... again, you're ignoring the facts.

2. I couldn't give two shits about either console; it's you and your twisted ideas, half-truths and lies about the facts.

3. You are making assumptions based on a biased outlook of bashing anything MS. DX12 is not a game changer allowing the X1 to be all-powerful; it fixes and updates the flawed current API and adds new tools. This won't let the X1's GPU gain enough performance to close the large gap. Fact is, you have been proven wrong about DX12 not doing anything, while almost all devs say it will help to some degree.

4. You keep on dreaming with the i3 vs i5 delusion, in a game where draw-call submission is single-thread oriented. Again, the fact that an i3 outpaces FX 8s or FX 6s means it's not about CPU power, you fool, but about poor allocation of CPU resources. In a properly coded game the FX 8 trumps the i3. The fact that you still ignore CPU usage charts just hardens the fact that you're a troll.

5. You have no idea..... the CPU is the root of most of the PS4's framerate drops, and the reason why PS4 games are limited to fps caps and still drop below them. The patch that introduced the framerate issues and stuttering did so in areas where CPU usage is higher....

6. lol, it's just sad you lump everyone into one group. ACU's issues were from excessive networking code eating CPU cycles, which means the PS4 does not have CPU resources to leave unallocated, or you run into issues feeding the GPU. Having that extra CPU core and slight upclock can make a difference in places, but because of the PS4's large GPU advantage, getting that extra couple of frames means squat when the stronger GPU can push another 10+ fps on average along with a resolution increase.

#330 Spitfire-Six
Member since 2014 • 1378 Posts

wow what a meltdown lol.

#331  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

@GoldenElementXL:

150 MHz is total shit on PC; you will not get 2 frames, let alone the 6 in The Witcher 3 where you people claim the PS4 was CPU-bound vs the Xbox One.lol

[...]

Project Cars on a 750 Ti was being bottlenecked by an i3; this is indisputable, I linked it, and even showed the screen comparing both. The i5 was able to get 15 FPS more out of the same GPU; yeah, that is officially the GPU being bottlenecked by the CPU, and DF called it exactly that.

Your citing of the PC's i3 vs i5 for Project Cars is pointless, since in quad-thread mode the i3 halves its single-thread performance, and your Project Cars DX11 PC example will be made obsolete by a confirmed DX12 build.

The Project Cars DX11 PC build runs from a single rendering thread.

According to SMS, a single thread from an Intel Core i3/i5/i7 at 3+ GHz ~= four AMD Jaguar cores at 1.6 GHz. Dual-thread-per-core mode on an Intel Core i3 halves its single-thread performance, hence the bottleneck. DX12 removes this bottleneck.

A gaming PC is NOT limited to a Core i3, i.e. the user has the option to upgrade; a PC is not a static-hardware games console.

Six AMD Jaguar cores at 1.6 GHz don't equal an Intel Core i5, period!
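(Taking SMS's equivalence at face value, as an illustration only: one ~3 GHz Intel thread ≈ four 1.6 GHz Jaguar cores, so six Jaguar cores ≈ 1.5 Intel-thread-equivalents of throughput, while an i5 supplies four full-speed hardware threads. Roughly 1.5 vs 4 is the gap behind that last sentence.)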

#332  Edited By StormyJoe
Member since 2011 • 7806 Posts

@tormentos: in one post, you boast about Sony's proprietary APIs, the next you say devs will use DX12.

Sigh... Back to ignoring the fanboy...

#333  Edited By deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos said:
@GoldenElementXL said: [...]

The logic behind people like you, 04dcarraher, Ronvalencia and others is that if the game is CPU-intensive the XBO will have faster frames. You people have argued that many times; you used ACU and The Witcher 3 as ammo, trying to prove the Xbox One had faster frames because the PS4's CPU was holding it back. In reality that wasn't the case; it was a problem of bad optimization.


But I never argued that.......... You are lumping everyone who engages you in discussion into one group. I have said on multiple occasions that all games running better on PS4 do so because of the GPU. I also showed a benchmark describing how small the gains from increased CPU clock speeds are in games. I am AGREEING with you on this, but your reading comprehension is keeping you from seeing that.

Wait, wait wait wait wait....... I get it now. Because I am saying Project Cars isn't CPU-intensive, I am getting grouped into your "Dirty Lem" category. Why is Project Cars being CPU-intensive such a sticking point with you? Does it make you feel like the 150 MHz difference doesn't matter? Does that 150 MHz difference bother you that much? The game RUNS BETTER ON PS4!!!!!!!!!!!!!!!!!! What the hell is your problem? You are just arguing to argue. Like when you tried to argue that "savor" and "enjoy" weren't synonyms.

Project Cars performance is more tied to GPU power than CPU power, even on PC. Notice in that bench I provided a few posts ago, the clock speed only changes a few frames after a 1.5 GHz uptick. How much does performance change with better GPUs??? Let's take a look......

17 frames between a 980 and a 780. 30 frames between a 980 and a 680. Is it safe to say the generation gap between the PS4's and Xbox One's GPUs is the cause of the difference in the consoles' performance???

#334 hansbeej
Member since 2014 • 320 Posts

@ronvalencia: you're approaching this from the wrong angle. I do some development for work, and also some Android development on the side. None of these comparisons or metrics can be considered on a strictly linear basis.

Sure, a modern i3 is more powerful than what is in an X1 or PS4. But developing for a PC, an Android phone or a console are all different things. The many cores in the consoles' APUs can be leveraged so that meager processing power still outputs something that looks and runs great, more so than a dev could squeeze out of one or two threads on a PC. A dev can have one core do this function, another that, use several for something else, and offload the rest to the GPU. That isn't really feasible when developing for a PC. This is one of the reasons so many PC ports run like garbage compared to the console edition.

It's an exact science in some respects, and in others it's not. I am not some hot-shot top-tier dev, or else I would be earning my living programming and not in high-level general IT (where development only accounts for about 25% of my duties). But I realize they're different environments with their own strengths and weaknesses. Just look at some of the games on the PS4: they look almost as good as PC versions running on far superior hardware, at half the hardware cost.

#335  Edited By ronvalencia
Member since 2008 • 29612 Posts
@hansbeej said:

[...] A dev can have one core do this function, another that, use several for something else, and offload the rest to the GPU. That isn't really feasible when developing for a PC. [...]

You don't have a monopoly on software development for work (I program in C++), and you are ignoring SMS's statements.

The developers of Project Cars (SMS) have already stated that a single thread from Intel's high-end CPUs equals four AMD Jaguar cores at 1.6 GHz.

Read http://gamingbolt.com/project-cars-dev-ps4-single-core-speed-slower-than-high-end-pc-splitting-renderer-across-threads-challenging

Running 4 general threads on an Intel Core i3 reduces its single-thread performance by about half, hence the result. The PC version of Project Cars runs from a single rendering thread.

Your comments are already obsolete with Windows 10 and DirectX12.

@GoldenElementXL said:

[...] Project Cars performance is more tied to GPU power than CPU power, even on PC. [...] Is it safe to say the generation gap between the PS4's and Xbox One's GPUs is the cause of the difference in the consoles' performance???

FYI, Digital Foundry's Project Cars i3 vs i5 PC test was run with console settings.

#337 hansbeej
Member since 2014 • 320 Posts

@ronvalencia: Again, it doesn't come down to a linear comparison. Oh well, I guess we'll just have to disagree here.

#338  Edited By ronvalencia
Member since 2008 • 29612 Posts

@hansbeej said:

@ronvalencia: Again, it doesn't come down to a linear comparison. Oh well, I guess we'll just have to disagree here.

It's pretty simple.

An Intel Core i5 runs four threads at full speed each; one of the four threads is the rendering thread.

An Intel Core i3 runs four threads with compromised single-thread performance, i.e. it departs from the high single-thread performance the rendering thread requires.

The big benefits of DirectX12 for PC are reduced CPU overheads and proper MT render scaling.

SMS already confirmed a DirectX12 build is in development.

#339  Edited By hansbeej
Member since 2014 • 320 Posts

@ronvalencia said:
@hansbeej said:

@ronvalencia: Again, it doesn't come down to a linear comparison. Oh well, I guess we'll just have to disagree here.

It's pretty simple. [...] The big benefits of DirectX12 for PC are reduced CPU overheads and proper MT render scaling.

Reduced overhead isn't new. DirectX 12 isn't offering anything new. The X1 won't even get DX12 for a long time, and it will only get a subset of it. The PS4's libraries already offer better performance. And it will take years for developers to take advantage of DX12 on the PC. You're arguing for hypotheticals.

#340  Edited By ronvalencia
Member since 2008 • 29612 Posts

@hansbeej said:
@ronvalencia said:
@hansbeej said:

@ronvalencia: Again, it doesn't come down to a linear comparison. Oh well, I guess we'll just have to disagree here.

It's pretty simple. [...] The big benefits of DirectX12 for PC are reduced CPU overheads and proper MT render scaling.

Reduced overhead isn't new. DirectX 12 isn't offering anything new. The X1 won't even get DX12 for a long time, and it will only get a subset of it. The PS4's libraries already offer better performance. And it will take years for developers to take advantage of DX12 on the PC. You're arguing for hypotheticals.

There are two main issues with DX11:

1. Serialisation of deferred-context threads into a single immediate thread, hence the bottleneck.

2. CPU overhead. XBO's DX11.X solves this issue, but not point one. On PC, DX12 solves it.

There are multiple incoming PC games with DX12 support, i.e. most have replaced their AMD Mantle "Gaming Evolved" path with a DX12 build.

Most AMD Mantle developers have shifted to DirectX12. Major 3D game engines already have DX12 builds. You're the one arguing hypotheticals.
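A minimal sketch of point one, with invented names, modelling the two submission models rather than calling the real D3D APIs (cf. DX11's FinishCommandList/ExecuteCommandList versus DX12's ExecuteCommandLists):

```cpp
#include <mutex>
#include <thread>
#include <vector>

std::vector<int> gpuStream;   // stand-in for the hardware queue
std::mutex immediateCtx;      // DX11: ONE immediate context

// DX11 model: recording can happen on N threads, but every deferred
// command list is replayed, command by command, through the single
// immediate context -- the serialization described in point 1.
void submitDX11Style(const std::vector<int>& deferredList) {
    std::lock_guard<std::mutex> lock(immediateCtx);  // the choke point
    for (int cmd : deferredList) gpuStream.push_back(cmd);  // per-command replay
}

// DX12 model: command lists are already in hardware-consumable form, so
// submission is a cheap ordered append with no per-command replay work.
void submitDX12Style(std::vector<int> builtList) {
    static std::mutex queueMutex;  // queue submission is still ordered
    std::lock_guard<std::mutex> lock(queueMutex);
    gpuStream.insert(gpuStream.end(), builtList.begin(), builtList.end());
}

int main() {
    std::thread a(submitDX11Style, std::vector<int>{1, 2});
    std::thread b(submitDX12Style, std::vector<int>{3, 4});
    a.join(); b.join();
    return gpuStream.size() == 4u ? 0 : 1;
}
```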

#341 tormentos
Member since 2003 • 33793 Posts

@04dcarraher said:

Poor tomato, swimming in his delusions, without a clue about the facts.

1. The PS4 has a better API, bypassing DX11-based limits..... again, you're ignoring the facts.

2. I couldn't give two shits about either console; it's you and your twisted ideas, half-truths and lies about the facts.

3. You are making assumptions based on a biased outlook of bashing anything MS. DX12 is not a game changer allowing the X1 to be all-powerful; it fixes and updates the flawed current API and adds new tools. This won't let the X1's GPU gain enough performance to close the large gap. Fact is, you have been proven wrong about DX12 not doing anything, while almost all devs say it will help to some degree.

4. You keep on dreaming with the i3 vs i5 delusion, in a game where draw-call submission is single-thread oriented. Again, the fact that an i3 outpaces FX 8s or FX 6s means it's not about CPU power, you fool, but about poor allocation of CPU resources. In a properly coded game the FX 8 trumps the i3. The fact that you still ignore CPU usage charts just hardens the fact that you're a troll.

5. You have no idea..... the CPU is the root of most of the PS4's framerate drops, and the reason why PS4 games are limited to fps caps and still drop below them. The patch that introduced the framerate issues and stuttering did so in areas where CPU usage is higher....

6. lol, it's just sad you lump everyone into one group. ACU's issues were from excessive networking code eating CPU cycles, which means the PS4 does not have CPU resources to leave unallocated, or you run into issues feeding the GPU. Having that extra CPU core and slight upclock can make a difference in places, but because of the PS4's large GPU advantage, getting that extra couple of frames means squat when the stronger GPU can push another 10+ fps on average along with a resolution increase.

1- I do know that, and I have said DX12 will never be as clean as Sony's API, thanks to DX12 being made to work with more than one set of hardware. It's not an excuse you can hang onto, pathetic fanboy, because you CLAIMED the CPU was the problem in The Witcher, not the API. In fact you've now moved to ACU, a screwed-up game, to try to prove your weak point. If the API is more efficient for Project Cars, it's also more efficient for Assassin's Creed Unity and The Witcher 3, and in fact Ubisoft already had two games running better on PS4; ACU was screwed up and held back on purpose.

2- Let's see, how many times have I seen you defending the PS4 the way you defend the Xbox One daily? Yeah, few or none.

3- All the arguments about DX12 come from Brad Wardell; the argument you use about multithreaded rendering comes from him. He claimed double performance, not 7%; he claimed the Xbox One would be the biggest beneficiary. It isn't, we all know it, and you know it; it's the PC. So you are running on information from someone who was clearly doing PR for DX12. The reality is the gains are bullshit, and you want to pretend DX12 will fix something on Xbox One: 7%, about 2 frames, in a game where the PS4 is as much as 14 FPS faster.

4- You apparently don't know what a CPU-bound scenario is. I fu**ing know the i3 is weaker than an i5, but that is the point: being CPU-bound means that your CPU, FOR WHATEVER REASON, CAN'T FEED YOUR GPU. The reason can be a weaker architecture, a lower clock speed, the number of physical cores, you name it; the end result doesn't change. One gets more out of the GPU than the other, which means the GPU was being held back by the CPU. It's irrefutable, and DF calls it just that, a CPU-bound scenario, because that is what it is regardless of your excuses. The i5 has more juice per core, so it delivers better performance; so yeah, a CPU problem fixed by an i5.

5- No, that is what you assume, fool. Quote developers of the games that drop frames stating that. You are assuming that is the reason, like you assumed it was the reason for the drops in The Witcher 3, and you were WRONG. You were WRONG, you were owned; the game runs faster now regardless of having less CPU speed than the Xbox One. The code was bad and it was optimized..lol

6- Bullshit, that wasn't the only problem, you blind lemming. ACU was held back on purpose on PS4; it was screwed up because MS freaking had a deal for millions of copies of that game. Can't have the PS4 version running better when MS will buy 3 or 4 million, and who knows how many more, of those games, and also buy the same quantity of ACBF as well.

You are pathetic. Since you were owned, you now go running to hide behind ACU, which was a screwed-up game, whose developer freaking focused more on fixing the microtransaction part of the game than the gameplay itself. You are doing your argument no favors by using that game. ACBF was 1080p on PS4; there was no reason for ACU to be 900p. If they were CPU-bound at 900p, they would be exactly the same at 1080p; resolution is a job for the GPU, not the CPU.

The reality is you claimed The Witcher 3 was CPU-bound, trying to prove that Project Cars didn't matter, and you got owned by CD Projekt..lol

@ronvalencia said:

Your citing of PC's i3 vs i5 for Project Cars is pointless, since in quad-thread mode the i3 halves its single-thread performance, and your Project Cars DX11 PC example will be made obsolete by the confirmed DX12 build.

The Project Cars DX11 PC build runs on a single rendering thread.

According to SMS, a single thread from an Intel Core i3/i5/i7 at 3+ GHz ~= four AMD Jaguar cores at 1.6 GHz. Dual-thread mode on an Intel Core i3 would halve its single-thread performance, hence the bottleneck. DX12 removes this bottleneck.

A gaming PC is NOT limited to a Core i3, i.e. the user has the option to upgrade; a PC is not a static-hardware games console.

Six AMD Jaguar cores at 1.6 GHz do not equal an Intel Core i5, period!

Now this is gold...hahahaaa

1- Pointless my ass; I have a source backing my statement with a f**king benchmark showing how the 750 Ti was bottlenecked by an i3 in this game. A CPU-bound scenario is when your CPU, for whatever reason, CAN'T FEED YOUR GPU FAST ENOUGH. The i5 got more frames because it has more per-core performance than an i3; that is a FACT, so yeah, the CPU-bound scenario was fixed by the i5.

Let's wait and see, because what you showed me was a damn screen with dual GTX 980s in SLI, which require far more CPU, driven by an i7. In no way does that setup represent an i3 with a 750 Ti.

Project Cars on PC is multithreaded; it was just disabled, probably because of the problems SMS faced.

Steam startup parameters (right-click Project CARS → Properties → Set Launch Options; multiple strings separated by a single space — a combined example follows the list):

Multi-threaded rendering: -dx11mt

Cap framerate: -fpscap xx (e.g. -fpscap 60)

Launch 32-bit: -x86

DirectX 9: -dx9

Disable crowds: -skipcrowds

Disable VR headset support: -novr

Disable join-in-progress (multiplayer): -disablejip

Developer cameras: -devcameras

Number of physics threads: -pthreads 1 / -pthreads 2 / -pthreads 3 / -pthreads 4 (default = 2)
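For example (a hypothetical combination of the flags listed above), the Set Launch Options box could contain:

-dx11mt -pthreads 4 -fpscap 60

which would enable the DX11 multi-threaded renderer, four physics threads, and a 60 FPS cap.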

You can enable it...lol

And Project Cars was announced as using DX11 MT, dude, while it was in development.

And? My argument was never that the Jaguar is stronger than an i3; my argument was that the game demands more CPU than most games out there and was faster on PS4 than on Xbox One.

No one is talking about gaming being limited by an i3, you walking salami; get your facts straight and stop talking irrelevant crap. You also claimed the PS4 was CPU bound in The Witcher 3, and you are officially owned there too..hahahahaa

How many times have I told you these games are screwed up, not CPU bound? Hahahaaa

Quote me saying a 6-core Jaguar = an i5, buffoon...lol

@StormyJoe said:

@tormentos: in one post, you boast about Sony's proprietary APIs, the next you say devs will use DX12.

Sigh... Back to ignoring the fanboy...

Yeah, go ahead and spin what I say. DX12-LIKE code is not the same as using DX12, buffoon; DX12-like code = Mantle, Vulkan, GNM.

Yeah, you can pretend to ignore me again after you falsely claimed DX12 would bring 12 to 14% more performance, because you can't read for sh*t..lol

You were owned by your own link..lol

#342 deactivated-578f2053b4a13
Member since 2004 • 1671 Posts

@imt558: it's impossible to get on the Gaming Age forums unless you have an email address from a paid subscription service.

#343 tormentos
Member since 2003 • 33793 Posts

@GoldenElementXL said:

But I never argued that.......... You are lumping everyone who engages you in discussion into one group. I have said on multiple occasions that the games running better on PS4 do so because of the GPU. I also showed a benchmark describing how small the gains from increased CPU clock speeds are in games. I am AGREEING with you on this, but your reading comprehension is keeping you from seeing that.

Wait, wait wait wait wait....... I get it now. Because I am saying Project Cars isn't CPU intensive, I am getting grouped into your "Dirty Lem" category. Why is Project Cars being CPU intensive such a sticking point with you? Does it make you feel like the 150 MHz difference doesn't matter? Does that 150 MHz difference bother you that much? The game RUNS BETTER ON PS4!!!!!!!!!!!!!!!!!! What the hell is your problem? You are just arguing to argue. Like when you tried to argue that "savor" and "enjoy" weren't synonyms.

Project Cars' performance is tied more to GPU power than to CPU power, even on PC. Notice in the bench I provided a few posts ago that the clock speed only changes a few frames after a 1.5 GHz uptick. How much does performance change with better GPUs??? Let's take a look......

17 frames between a 980 and a 780. 30 frames between a 980 and a 680. Is it safe to say the generation gap between the PS4 and Xbox One GPUs is the cause of the difference in the consoles' performance???

1- You falsely claimed the i3 didn't meet the minimum requirements; it does.

2- You falsely claimed the races were locked to 20 cars, when there can be as many as 45 cars, far more than in any other racer.

3- A CPU-bound scenario is whenever your CPU can't keep up with your GPU and, as a result, your GPU gets held back. That is a CPU-bound scenario; it is irrelevant how GPU-demanding the game is, because the CPU is affecting your performance. In DF's test it was a 750 Ti, which isn't that powerful, but it was nevertheless held back 16 FPS by an i3.

This benchmark proves many people wrong, including yourself.

You claimed the i3 didn't meet the minimum requirements. Project Cars' minimum requirements ask for a Phenom II X4 940; in this benchmark you can see the 965 doing 30 FPS while the i3 4150 does 46 FPS, a 16 FPS gap over the 965, which I am sure is faster than the 940.

Not only that, the i3 4150 also beats my FX-6350, both stock and overclocked to 4.7 GHz.

And for those like 04dcarraher, ronvalencia, and many other lemmings who claim the CPU was the problem in ACU and The Witcher 3: look at the FX-6350 stock vs the FX-6350 OC'd to 4.7 GHz. The difference is 3 frames per second, and that is an 800 MHz overclock, half the PS4's CPU clock, on a stronger line of CPU with more efficient cores.

So if 800 MHz = 3 more frames, imagine what 150 MHz would amount to; not even a frame of difference. People holding tight to the whole CPU argument are wasting their time: the Xbox One will become GPU bound sooner than the PS4 will become CPU bound in every scenario, I am 100% sure.
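As a rough back-of-the-envelope check (assuming frame rate scales linearly with clock speed, which if anything flatters the CPU side): 3 fps × (150 MHz / 800 MHz) ≈ 0.6 fps, i.e. well under one frame.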

Even more so once the PS4 pushes compute harder and the GPU offloads some CPU tasks.

#344  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said: [full text of post #341, quoted above, omitted here]

LOL. You don't know shit, i.e. physics threads are different from the rendering thread, you stupid fool. Physics threads deal with object placement (dot products and final locations after interaction calculations), while the rendering thread deals with rendering.

Any bottleneck in the pipeline will gimp performance, i.e. the slowest pipeline stage caps the frame rate, e.g. the physics threads' calculations are fine until they hit draw-call limitations in the rendering thread. For PC it's a double hit, since:

1. CPU overhead. Physics-thread calculations run outside the Direct3D API and hence are not subject to Direct3D's CPU overhead issues. Windows 10 has slightly lower CPU overhead compared to Windows 7/8, hence an instant frame-rate gain.

[Embedded video]

2. There is only a single rendering thread, i.e. the immediate-context thread. DirectX 12 removes this limitation by allowing multiple threads to act as immediate-context-type threads, hence removing the bottleneck.

[Diagram: DX11 MT rendering model]

DirectX 12 removes the serialisation of deferred-context threads into the immediate-context thread. Your command-line switch merely enables the DX11 MT model shown above, and AMD's PC DX11 drivers weren't optimised for DX11 MT, hence zero gain.

Running additional multi-threads (more physics threads + more deferred-context threads) beyond 4 hardware threads introduces CPU context-switching overhead. Your command-line settings override the default behavior.

From http://www.dsogaming.com/pc-performance-analyses/project-cars-pc-performance-analysis/ : the Project Cars PC build automatically enables MT usage. One of those multi-threads is the single immediate-context rendering thread.

Your Project Cars PC build's command-line switches are red herrings with respect to Digital Foundry's i3 vs i5 comparison.

By default, the Project Cars PC build uses MT mode, which doesn't overcome DX11's MT-model limitation (i.e. the serialisation of deferred-context threads into a single immediate-context thread) or the CPU API overhead, hence the confirmed DX12 build.

When the game wasn't GPU bound, the developers had already pointed to draw-call limitations.
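For contrast with the DX11 sketch earlier in the thread, here is a minimal C++ sketch of the D3D12 model being described (again with hypothetical names and worker count; allocator setup, synchronisation, and error handling omitted). Every thread records into its own command list, and submission goes straight to the queue with no single-context replay stage:

#include <d3d12.h>

// One graphics command list per worker, each backed by its own allocator.
static const int kWorkers = 4;
ID3D12GraphicsCommandList* g_cmdLists[kWorkers];

// Each worker records directly into its own list in parallel; there is
// no deferred-to-immediate replay step as in DX11.
void RecordWorker(int i, UINT vertexCount)
{
    g_cmdLists[i]->DrawInstanced(vertexCount, 1, 0, 0);
    g_cmdLists[i]->Close();
}

// All lists are handed to the GPU queue in a single batched call.
void SubmitAll(ID3D12CommandQueue* queue)
{
    ID3D12CommandList* lists[kWorkers];
    for (int i = 0; i < kWorkers; ++i) lists[i] = g_cmdLists[i];
    queue->ExecuteCommandLists(kWorkers, lists);
}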

#345 04dcarraher
Member since 2004 • 23858 Posts

lol tomato acting like a noob

#346 tormentos
Member since 2003 • 33793 Posts

@ronvalencia said: [full text of post #344's reply, quoted above, omitted here]

What the FU**, hahahahaaaaaaaaaaaaaaa.. you keep using that video as some kind of proof?

So does that video have a 750 Ti with an i3? No?

But wait: if the i3 improves with DX12, doesn't the i5 also improve?

You are a fool.

i3 + 750 Ti = 34 FPS; add DX12's 35 to 40% and it will run at roughly 46 to 48 FPS. But that also means the i5, which was running at 50, would with that same 35 to 40% from DX12 run at 67 to 70 FPS, again showing a gap compared to the i3, you freaking fanboy. No matter what, DX12's improvement is not just for the i3; it is for all CPUs, which means the i5 also sees a spike and again keeps a nice lead.

So basically you showed a video in which the i5 gets a huge gain, so the gap between the i3 and i5 will remain, which means the game in the end is being affected by the CPU.

Which means total shit, since the argument is that the 750 Ti was bottlenecked by an i3, which it still would be, since DX12 only delivers a 35 to 40% gain and in both cases the i3 still falls behind the i5; worse, if we apply the same gain to the i5 the gap grows again and shows a CPU disparity even with DX12.
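Spelling out that arithmetic (using DF's 34 fps i3 / 50 fps i5 figures and assuming a uniform 35 to 40% API gain on both): i3: 34 × 1.35 ≈ 46 fps; i5: 50 × 1.35 ≈ 68 fps. The ratio, 50/34 ≈ 1.47x, is unchanged, so the relative CPU gap survives the API change.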

You claimed the PS4 was CPU bound in The Witcher 3; it wasn't, it was badly optimized, like Alien: Isolation and Assassin's Creed Unity..lol

@04dcarraher said:

lol tomato acting like a noob

It's not me denying the 750 Ti being bottlenecked, it's you, dude, so who is the noob? Oh wait, didn't you claim the CPU was the problem for The Witcher 3's performance on PS4? hahahaahaa

#347 StormyJoe
Member since 2011 • 7806 Posts

@04dcarraher said:

lol tomato acting like a noob

Has he resorted to Sniper Elite 3, or is he still trying to say he knows what he is talking about because of Project Cars?

#348 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos said: [full text of post #343, quoted above, omitted here]
I don't care what that benchmark shows. If the developer tells me the minimum CPU requirement for a game is a quad-core CPU, a dual-core does not meet the requirement. The game may not be properly coded to run on a CPU with only 2 cores on PC, so using this CPU as your evidence is deeply flawed.

1 - You left out the Intel half of the requirement in your post. You cite the Phenom II X4 940; the Intel minimum is an Intel Core 2 Quad Q8400.
2 - Yes, in that benchmark you posted, the i3 4150 beat your FX-6350. Don't you think it's cause for concern when an i5 3350P beats EVERY AMD CPU? Maybe this game shouldn't be used to illustrate CPU performance?
3 - The CPU was definitely a problem in Assassin's Creed Unity. We will never know just how badly the consoles were gimped by their CPUs in ACU, because the game was such a mess.
4 - You are correct: the Xbox One GPU will be an issue before the PS4 CPU. In fact it happened at launch, with games like COD and Battlefield running at lower resolutions. Both consoles' CPUs will hold back developers this entire gen, but the Xbox One is the main culprit.

#349  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@tormentos said:
@04dcarraher said:

lol tomato acting like a noob

It's not me denying the 750 Ti being bottlenecked, it's you, dude, so who is the noob? Oh wait, didn't you claim the CPU was the problem for The Witcher 3's performance on PS4? hahahaahaa

Oh yes, you are a noob, since you have no idea what you're talking about. The 750 Ti being bottlenecked is because:

A. the game throws all GPU work onto a single thread.

B. a thread on an i3 is half the performance of an i5 of the same architecture, which does not mean the game is demanding on the CPU.

One could say Shogun 2 is very demanding, since you need the best CPUs to get good, stable performance; but the fact is that game only used two threads to any real degree. That is the type of crap you are trying to prove: poor utilization/allocation of CPU resources, comparing a weaker CPU against a stronger one and calling the game "demanding" because the weaker CPU gets lower performance.

lol, oh wait, didn't you claim that the PS4 would have 7 GB of memory for games and that the GPU would use more than 4 GB?

Fact is you twist crap in your delusional mind all the time. My attributing The Witcher 3's post-patch performance issues to the CPU is based on what is seen, not on some fanboy attacking your precious console. All the examples showing the issues came from the areas where they appeared most, i.e. the areas where CPU load is at its highest.

#350 ronvalencia
Member since 2008 • 29612 Posts

@tormentos said: [full text of post #346, quoted above, omitted here]

Have you added the other PCARS DX11 patches on top of DX12's 40 percent improvement?

"40 percent" is just an estimate, and they are still building the DX12 version.

Notice that SMS couldn't do a "simple" port of the XBO DX11.X build to DX12, i.e. the difference between the APIs is big enough that it isn't a "simple" port.