OK, so the PS4 is stronger than the XBOX ONE...

This topic is locked from further discussion.


#251 Sagemode87
Member since 2013 • 3437 Posts

[QUOTE="casharmy"]

[QUOTE="marklarmer"]

Ugh, why do you people keep feeding this idiot? He's obviously desperate for the PS4 to be more powerful, which it is, so he's going to spend the next couple of years posting the exact same flop numbers and celebrating every time there's any indication the Xbone is less powerful. There's nothing you can do about it, so just let him indulge in his pointless obsession on his own. He really doesn't need any help.

StormyJoe

If he is right about the PS4 being more powerful than the X1, which you say it is, then aren't the people arguing against him the idiots?

lol, it takes a good bit of bias to say he is obsessed when the people arguing seem to be the ones who can't accept facts and are admittedly wrong.

The fact of the matter is, no developer who has worked on both has said anything other than "the differences are minor". I don't think anyone is arguing that the PS4 has a better GPU and RAM that is designed for graphics. What I have been saying since day 1 is that the hardware advantages of the PS4 really aren't going to amount to much at the end of the day.

What about all those PS3/360 multiplats Lems like yourself like to boast about? The devs would say they're the same when in some cases (however small the differences may be) they're not. You won't believe PR bull then, but you'll believe it now? What a hypocrite :lol: Near double the flops won't amount to anything? Really? I bet you'd be singing a different tune had the PS4 and X1 specs been reversed.


#252 marklarmer
Member since 2004 • 3883 Posts

I bet you'd be singing a different tune had the PS4 and X1 specs been reversed.

Sagemode87

And so would cows; I can't imagine there'll be many cows hiding behind their gaming PCs this generation.


#253 Sagemode87
Member since 2013 • 3437 Posts

[QUOTE="Sagemode87"]

I bet you'd be singing a different tune had the PS4 and X1 specs been reversed.

marklarmer

And so would cows; I can't imagine there'll be many cows hiding behind their gaming PCs this generation.

True. Fanboys from any faction will play the cards they're dealt; nothing new there. Lems are delusional, though, trying to downplay clear hardware advantages. I never saw Cows say the hardware advantages of the Xbox over the PS2 didn't matter. There's more to boast about with the PlayStation brand than graphics, regardless.


#254 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="marklarmer"]

[QUOTE="Sagemode87"]

I bet you'd be singing a different tune had the PS4 and X1 specs been reversed.

Sagemode87

And so would cows; I can't imagine there'll be many cows hiding behind their gaming PCs this generation.

True. Fanboys from any faction will play the cards they're dealt; nothing new there. Lems are delusional, though, trying to downplay clear hardware advantages. I never saw Cows say the hardware advantages of the Xbox over the PS2 didn't matter. There's more to boast about with the PlayStation brand than graphics, regardless.


#255 darkangel115
Member since 2013 • 4562 Posts

[QUOTE="tormentos"]

[QUOTE="darkangel115"]

 

 

Have you seen Titanfall's graphics? It's average at best: bland buildings, static trees. It shows why it's 60FPS, and once again DF said the game wasn't confirmed to be 1080p, and that it was obvious from what they were seeing that it wasn't.

 

The demo shows a largely static level by design, with no procedural damage to the environment in the manner of Battlefield 4 or The Division, nor rag-doll physics - everything is set in stone, and not even the grass or trees animate.

xboxiphoneps3

 

http://www.eurogamer.net/articles/digitalfoundry-titanfall-tech-analysis

 

Yeah, everything set in stone... :lol:

That way every damn game can hit 60FPS... My god, some of you are pathetic: there was no animation, no rag-doll physics, nothing...

 

In the pursuit of 60fps, something has to give way though - and it usually comes to light when looking too closely. It's the uncanny facial animations during NPC briefings, the rough-looking textures on Titan interiors and billboards, and what seem like scaling artifacts that betray either Titanfall's early development status, or its commitment to being about gameplay first, appearances second. The jaggies are a curious point in particular; even judged by the high quality feed we have available direct from the Microsoft E3 conference there's more sub-pixel shimmering and rough edges than any other game on show, which suggests this may not be a full-blown 1080p title in its current state. From what Respawn has announced so far, the 60fps bullet-point is proudly announced, but it remains tight-lipped on what native resolution is intended for the final game.

 

It's sad...

 

lol, so what you're saying is a moving tree is better than super smooth gameplay in 1080p at 60FPS. lol, also Titanfall is said to use the cloud for physics offloading, so no cloud at the demo would mean no physics for moving trees and such. Wait until the final version ;)

Titanfall wasn't running in 1080p at the E3 event, and DF can't even tell if it was running on an Xbox One or a PC; it's unconfirmed... man, that sucks =/ At least Sony showed off all their games on their ACTUAL PS4 hardware.

 

And it sucks that, in order for the Xbox One to push 1080p at 30 fps like Crimson Dragon does, it ends up looking like an upscaled Xbox One game according to Digital Foundry... Are we already seeing the limits of the Xbox One?!

You do know they were not running on PCs; they were running on Xbox One dev kits ;)

#256 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="StormyJoe"]

[QUOTE="casharmy"]

If he is right about the PS4 being more powerful than the X1, which you say it is, then aren't the people arguing against him the idiots?

lol, it takes a good bit of bias to say he is obsessed when the people arguing seem to be the ones who can't accept facts and are admittedly wrong.

xboxiphoneps3

The fact of the matter is, no developer who has worked on both has said anything other than "the differences are minor". I don't think anyone is arguing that the PS4 has a better GPU and RAM that is designed for graphics. What I have been saying since day 1 is that the hardware advantages of the PS4 really aren't going to amount to much at the end of the day.

So what are developers going to say? "Oh yeah, I'm making my game multiplatform, but I'm gonna go say that the PS4 has an advantage in graphics, so buy my games on PS4 AND KILL MY SALES ON OTHER CONSOLES, YEAHHH"? smh at people's logic on System Wars these days... I really think there are too many rampant 15-year-olds posting on these forums now.

They don't have any issues bashing the WiiU. Some third-party devs have said in the past that games they made just for the PS3 "can't be done on the 360". Console companies kiss devs' asses, not vice versa.


#257 SKaREO
Member since 2006 • 3161 Posts
PS4 is stronger than 95% of the PCs that gamers use.

#258 Ricardomz
Member since 2012 • 2715 Posts

When we get to play those consoles.


#259 tormentos
Member since 2003 • 33793 Posts

 

This is why I think you are full of sh*t, Tormentos. Basing things off demos? Tell me, why is it that NO PS4 game ran over 30 FPS at E3 (most struggled to maintain 30 FPS)? None of them did. http://www.eurogamer.net/articles/digitalfoundry-hands-on-with-playstation-4

From the Article: "When it comes to the state of software development on PS4, the situation as it stands is surprising. On the one hand, freely playable first-party titles such as Knack and DriveClub suffer from noticeable frame-rate stutters down from 30fps, while on the other, "hands off" demos for the new Infamous and Assassin's Creed games appear to run without a perceptible hitch. This is in stark contrast to the playable software confirmed to be running direct from Xbox One hardware, such as Forza Motorsport 5 and Killer Instinct, which benefit to no end for targeting the 1080p60 gold standard, and largely succeed in doing so."

The answer, of course, is that they were not completed games. See? I don't even like Sony, and I can say that without issue. You cows are pathetic, fanboy troglodytes.

StormyJoe

 

Because games on PS4 are not targeting 60 FPS.

They are targeting this gen's graphics...

Funny, the games in question that target 60FPS on Xbox One are:

Forza 5 = everything baked and faked; this game is built like a last-gen game.

Titanfall was demoed at 60 FPS; DF claims it is not 1080p, and Respawn had not confirmed the resolution. No physics, no animation, no damage... A game like that on PS4 would probably run at 90FPS, if not more.

Killer Instinct = a fighting game, enough said.

So yeah, don't get mad if that sh** doesn't impress me; at least PS4 games are shooting for great effects and great visuals.

Most games on PC will pass 60FPS without a problem if you turn everything off and lower the quality, even Crysis 3; that is a fact.


#260 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="StormyJoe"]

This is why I think you are full of sh*t, Tormentos. Basing things off demos? Tell me, why is it that NO PS4 game ran over 30 FPS at E3 (most struggled to maintain 30 FPS)? None of them did. http://www.eurogamer.net/articles/digitalfoundry-hands-on-with-playstation-4

From the Article: "When it comes to the state of software development on PS4, the situation as it stands is surprising. On the one hand, freely playable first-party titles such as Knack and DriveClub suffer from noticeable frame-rate stutters down from 30fps, while on the other, "hands off" demos for the new Infamous and Assassin's Creed games appear to run without a perceptible hitch. This is in stark contrast to the playable software confirmed to be running direct from Xbox One hardware, such as Forza Motorsport 5 and Killer Instinct, which benefit to no end for targeting the 1080p60 gold standard, and largely succeed in doing so."

The answer, of course, is that they were not completed games. See? I don't even like Sony, and I can say that without issue. You cows are pathetic, fanboy troglodytes.

tormentos

Because games on PS4 are not targeting 60 FPS.

They are targeting this gen's graphics...

Funny, the games in question that target 60FPS on Xbox One are:

Forza 5 = everything baked and faked; this game is built like a last-gen game.

Titanfall was demoed at 60 FPS; DF claims it is not 1080p, and Respawn had not confirmed the resolution. No physics, no animation, no damage... A game like that on PS4 would probably run at 90FPS, if not more.

Killer Instinct = a fighting game, enough said.

So yeah, don't get mad if that sh** doesn't impress me; at least PS4 games are shooting for great effects and great visuals.

Most games on PC will pass 60FPS without a problem if you turn everything off and lower the quality, even Crysis 3; that is a fact.

But, none of them are even running at 30 FPS with any consistency. I see you glossed over that point.


#261 tormentos
Member since 2003 • 33793 Posts

 

The fact of the matter is, no developer who has worked on both has said anything other than "the differences are minor". I don't think anyone is arguing that the PS4 has a better GPU and RAM that is designed for graphics. What I have been saying since day 1 is that the hardware advantages of the PS4 really aren't going to amount to much at the end of the day.

StormyJoe

 

That is a stupid excuse; hold on to it as long as you can. Developers staying neutral will not stop the PS4 from overpowering the Xbox One.

700 GFLOPS will not vanish just because developers won't risk getting in trouble with MS or Sony by saying which console is more powerful. Ubisoft's remark is basically the closest you will see, and it's clear to anyone but people in denial like you.

I already posted the link to what a 480 GFLOPS difference can do; 600 to 700 GFLOPS will deliver a bigger lead.


#262 Gooeykat
Member since 2006 • 3412 Posts
I'd say the results could show up early in multiplatform games. Devs don't have to unlock the power of the PS4 like they did with the PS3; the architectures of both consoles are very similar to a PC's. So it would just be a matter of the devs going in and adjusting the graphical settings based on the power of each console. The PS4, being more powerful, would get the most eye candy of the two (or three if you include the WiiU), with PC still looking the best overall.

#263 tormentos
Member since 2003 • 33793 Posts

 

Um, because Tormentos is an idiot of gpuking proportions. There's a difference between being more powerful and whatever it is cows are trying to claim the PS4 is compared to the Xbone. The point is, cows already know PS4 games will probably look and/or run superior in at least some way, so there will be little concrete evidence to contradict whatever chasm of difference they claim there is; hence the hyperbolic reactions to everything shown at E3, the whole Driveclub/Forza thing being the most obvious, as well as jumping on any Xbone game not running 1080p/60fps and either ignoring the PS4 ones in the same category or claiming they just "aren't finished yet".

The thing people seem to be overlooking is the fact that MS have screwed up the Xbone launch so badly that its sales will be significantly lower than the PS4's. They also now have no established fan base, meaning it's less likely devs will prioritize their console as the lead platform. That will probably have more of an impact on the quality of what will most likely be many PS4 ports than merely having slightly weaker hardware.

marklarmer

 

Actually, you are the idiot troll.

If you don't like my posts, no one forces you to read them.

Second, quote me saying the PS4 will melt the Xbox One or some sh** like that. Both consoles will run games at the same quality; the PS4 will run them faster, period, because it has more resources.

So if a PS4 game runs at 45 FPS, the Xbox One would probably run it at 25 FPS; but if the PS4 game is targeted to run at 30 FPS with incredible graphics, the Xbox One will run it at 10 or 12 FPS. So in order to boost those frames the Xbox One must drop quality, AA, resolution, or a combination. There is no escaping this; it happens on PC all the time, and that is how it works: the GPU with more power gets to run things with better detail, faster.

The whole Forza/Driveclub crap is stupid. Driveclub has everything dynamic; it doesn't fake effects like Forza 5 does. It's not even an enclosed racer like Forza 5; it's more open-world while doing everything dynamically. Even car damage is real: no two bumps are alike, unlike Forza's baked car damage. So yeah, an Xbox One game may be pushing 60 FPS, but what they are doing isn't anything great to begin with.

So who is more stupid, you or me? The PS4 is stronger and will yield better results, yet you fight me because you can't handle the fact that I defend the PS4's power against those who think there will be no difference.


#264 tormentos
Member since 2003 • 33793 Posts

 

But, none of them are even running at 30 FPS with any consistency. I see you glossed over that point.

StormyJoe

 

Funny thing is, compare Infamous to Dead Rising 3, both open-world games: Infamous looks incredible while performing great; Dead Rising 3 doesn't look at all like Infamous and has dips as low as 15 FPS. The performance that game has is horrible and DF says so, but when they talk about Infamous they call it jaw-dropping and CGI-like.

So yeah, even though both games are aiming for 30FPS, one looks and runs consistently better, and it's on PS4.


#265 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="StormyJoe"]

The fact of the matter is, no developer who has worked on both has said anything other than "the differences are minor". I don't think anyone is arguing that the PS4 has a better GPU and RAM that is designed for graphics. What I have been saying since day 1 is that the hardware advantages of the PS4 really aren't going to amount to much at the end of the day.

tormentos

That is a stupid excuse; hold on to it as long as you can. Developers staying neutral will not stop the PS4 from overpowering the Xbox One.

700 GFLOPS will not vanish just because developers won't risk getting in trouble with MS or Sony by saying which console is more powerful. Ubisoft's remark is basically the closest you will see, and it's clear to anyone but people in denial like you.

I already posted the link to what a 480 GFLOPS difference can do; 600 to 700 GFLOPS will deliver a bigger lead.

Sigh...

  • First off, that 700 GFLOP GPU advantage isn't all that. There is more to making software, including games, than that.
  • Secondly, third-party devs don't seem to have any issue bashing the WiiU.
  • Thirdly, console manufacturers kiss devs' asses, not vice versa - you don't see EA paying MS to put Call of Duty on the 360.

All your "But, look at the extra FLOPS!" doesn't amount to a hill of beans, because that is just one aspect of the platform. You'd know that if you worked in software.

But I tell you what: let's see what happens, shall we? If you are still here in 2015, I'll make it a point to send you several messages and links back to these threads asking you "where are the PS4's graphic advantages?"


#266 tormentos
Member since 2003 • 33793 Posts

 

Sigh...

  • First off, that 700 GFLOP GPU advantage isn't all that. There is more to making software, including games, than that.
  • Secondly, third-party devs don't seem to have any issue bashing the WiiU.
  • Thirdly, console manufacturers kiss devs' asses, not vice versa - you don't see EA paying MS to put Call of Duty on the 360.

All your "But, look at the extra FLOPS!" doesn't amount to a hill of beans, because that is just one aspect of the platform. You'd know that if you worked in software.

But I tell you what: let's see what happens, shall we? If you are still here in 2015, I'll make it a point to send you several messages and links back to these threads asking you "where are the PS4's graphic advantages?"

StormyJoe

 

Really? 480 GFLOPS is enough to make a 20 FPS difference.

A 28 FPS difference on Dirt.

As high as 26 FPS on Metro.

15 FPS in BF3, as much as 18 FPS.

http://www.anandtech.com/bench/product/549?vs=536

All this with just 480 GFLOPS.

So yeah, go elsewhere and pretend that 700 GFLOPS is no difference.

That is because most of those who bash the Wii U have cancelled games on it.

Console makers decide if your game gets a go or not; they can decide to block DLC too, like MS did even to some big 3rd parties. Wait, didn't they stop Epic from releasing free content? How about Valve?

Hahaha...

http://gamer.blorge.com/2011/08/24/indie-devs-sick-of-xboxs-heavy-handed-policies-sony-pub-fund-the-answer/

How quickly they forget how bad MS was to developers.


#267 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="StormyJoe"]

Sigh...

  • First off, that 700 GFLOP GPU advantage isn't all that. There is more to making software, including games, than that.
  • Secondly, third-party devs don't seem to have any issue bashing the WiiU.
  • Thirdly, console manufacturers kiss devs' asses, not vice versa - you don't see EA paying MS to put Call of Duty on the 360.

All your "But, look at the extra FLOPS!" doesn't amount to a hill of beans, because that is just one aspect of the platform. You'd know that if you worked in software.

But I tell you what: let's see what happens, shall we? If you are still here in 2015, I'll make it a point to send you several messages and links back to these threads asking you "where are the PS4's graphic advantages?"

tormentos

Really? 480 GFLOPS is enough to make a 20 FPS difference.

A 28 FPS difference on Dirt.

As high as 26 FPS on Metro.

15 FPS in BF3, as much as 18 FPS.

http://www.anandtech.com/bench/product/549?vs=536

All this with just 480 GFLOPS.

So yeah, go elsewhere and pretend that 700 GFLOPS is no difference.

That is because most of those who bash the Wii U have cancelled games on it.

Console makers decide if your game gets a go or not; they can decide to block DLC too, like MS did even to some big 3rd parties. Wait, didn't they stop Epic from releasing free content? How about Valve?

Hahaha...

http://gamer.blorge.com/2011/08/24/indie-devs-sick-of-xboxs-heavy-handed-policies-sony-pub-fund-the-answer/

How quickly they forget how bad MS was to developers.

See, there you go again. You point to raw GPU numbers, and don't take into account anything else. I reiterate my full of sh*t comment.


#268 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="tormentos"]

[QUOTE="StormyJoe"]

The fact of the matter is, no developer who has worked on both has said anything other than "the differences are minor". I don't think anyone is arguing that the PS4 has a better GPU and RAM that is designed for graphics. What I have been saying since day 1 is that the hardware advantages of the PS4 really aren't going to amount to much at the end of the day.

StormyJoe

That is a stupid excuse; hold on to it as long as you can. Developers staying neutral will not stop the PS4 from overpowering the Xbox One.

700 GFLOPS will not vanish just because developers won't risk getting in trouble with MS or Sony by saying which console is more powerful. Ubisoft's remark is basically the closest you will see, and it's clear to anyone but people in denial like you.

I already posted the link to what a 480 GFLOPS difference can do; 600 to 700 GFLOPS will deliver a bigger lead.

Sigh...

  • First off, that 700 GFLOP GPU advantage isn't all that. There is more to making software, including games, than that.
  • Secondly, third-party devs don't seem to have any issue bashing the WiiU.
  • Thirdly, console manufacturers kiss devs' asses, not vice versa - you don't see EA paying MS to put Call of Duty on the 360.

All your "But, look at the extra FLOPS!" doesn't amount to a hill of beans, because that is just one aspect of the platform. You'd know that if you worked in software.

But I tell you what: let's see what happens, shall we? If you are still here in 2015, I'll make it a point to send you several messages and links back to these threads asking you "where are the PS4's graphic advantages?"

For X1,

1.187 TFLOPS / 30 fps, you have ~39.57 GFLOPS per frame.

1.310 TFLOPS / 30 fps, you have ~43.66 GFLOPS per frame.

---

For PS4

1.84 TFLOPS /30 fps, you have 61.3 GFLOPS per frame. Unknown allocation for shared.

---

At best case for PS4 vs worst case for X1, the PS4 has about 21.8 GFLOPS of extra power per frame for a 30 fps target.

At best case for PS4 vs best case for X1, the PS4 has about 17.7 GFLOPS of extra power per frame for a 30 fps target.

---

For 7970 Ghz Edition

4.3 TFLOPS / 30 fps, you have ~143.3 GFLOPS per frame.

Against the PS4, the 7970 GE has a large gap of ~82 GFLOPS extra per frame for a 30 fps target.
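For anyone who wants to re-run that arithmetic, here is a minimal Python sketch of the per-frame budget split being done above. The TFLOPS figures come straight from this post; dividing throughput evenly across frames is just the post's simplifying assumption, not a real GPU scheduling model, and the helper name is made up for illustration.

def gflops_per_frame(tflops: float, fps: int = 30) -> float:
    """Split a GPU's total throughput evenly across each frame (simplifying assumption)."""
    return tflops * 1000.0 / fps  # TFLOPS -> GFLOPS, per frame

budgets = {
    "X1 (1.187 TFLOPS, 10% reserved)": gflops_per_frame(1.187),  # ~39.6
    "X1 (1.310 TFLOPS, full GPU)": gflops_per_frame(1.310),      # ~43.7
    "PS4 (1.84 TFLOPS)": gflops_per_frame(1.84),                 # ~61.3
    "7970 GHz Edition (4.3 TFLOPS)": gflops_per_frame(4.3),      # ~143.3
}
for name, budget in budgets.items():
    print(f"{name}: ~{budget:.2f} GFLOPS per frame @ 30 fps")

# The gaps quoted in the post:
print(gflops_per_frame(1.84) - gflops_per_frame(1.187))  # ~21.8, PS4 vs X1 worst case
print(gflops_per_frame(1.84) - gflops_per_frame(1.310))  # ~17.7, PS4 vs X1 best case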


#269 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="btk2k2"]

And here we go again.

Wait, weren't you the one who claimed the 7850 @ 900 = the PS4's 18 CU?

But now you claim that 10 CU at 1GHz doesn't = 12 CU at 853MHz.

But wait, are you again forgetting the 10% GPU reserve? Oh yeah, you are... So even if the 12 CU at 853MHz are a tad higher (I say a tad because the 7770 is 1280 GFLOPS and the Xbox One 1310 GFLOPS, so what are we talking about here, a 30 GFLOPS difference, which is basically nothing), the Xbox One can only use 1180 GFLOPS, and the 7770 actually has more FLOPS performance in the end.

The Xbox One doesn't have a Pitcairn, it has a Bonaire, and every site knows it. The fact that you refuse to admit it says it all, and you're still holding on tight to Pitcairn inside the Xbox One. You are assuming, and like always I want a link to where it is confirmed that the Xbox One has a Pitcairn GPU inside.

xboxiphoneps3

Most PC benchmarks still have multi-tasking enabled, hence why you run a PC benchmark multiple times and take the average.

X1 doesn't have a direct Bonaire, since X1 has Pitcairn's 256-bit memory controllers and the related L2 cache/internal I/O crossbar setup.

X1 doesn't have a direct Cape Verde either, since X1 has greater internal SRAM storage (e.g. register file, L1, LDS, scalar cache, etc.) and triangle rate.

As for "7850 @ 900 = the PS4 18 CU": you missed the ~ tilde character, which means approximate. http://en.wikipedia.org/wiki/Tilde

Regardless, the PS4 is still stronger than the Xbox One, absolutely no denying that. Not stronger by a landslide, but it has an advantage over the XB1.

I'm not denying PS4 > X1. The issue here is the gap between the two boxes. My post shows a 7850/7830 prototype with 12 CU @ 860MHz is under the reference 7850 with 16 CU @ 860MHz.


#270 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="StormyJoe"]

Sigh...

  • First off, that 700 GFLOP GPU advantage isn't all that. There is more to making software, including games, than that.
  • Secondly, third-party devs don't seem to have any issue bashing the WiiU.
  • Thirdly, console manufacturers kiss devs' asses, not vice versa - you don't see EA paying MS to put Call of Duty on the 360.

All your "But, look at the extra FLOPS!" doesn't amount to a hill of beans, because that is just one aspect of the platform. You'd know that if you worked in software.

But I tell you what: let's see what happens, shall we? If you are still here in 2015, I'll make it a point to send you several messages and links back to these threads asking you "where are the PS4's graphic advantages?"

tormentos

Really? 480 GFLOPS is enough to make a 20 FPS difference.

A 28 FPS difference on Dirt.

As high as 26 FPS on Metro.

15 FPS in BF3, as much as 18 FPS.

http://www.anandtech.com/bench/product/549?vs=536

All this with just 480 GFLOPS.

So yeah, go elsewhere and pretend that 700 GFLOPS is no difference.

That is because most of those who bash the Wii U have cancelled games on it.

Console makers decide if your game gets a go or not; they can decide to block DLC too, like MS did even to some big 3rd parties. Wait, didn't they stop Epic from releasing free content? How about Valve?

Hahaha...

http://gamer.blorge.com/2011/08/24/indie-devs-sick-of-xboxs-heavy-handed-policies-sony-pub-fund-the-answer/

How quickly they forget how bad MS was to developers.

Metro 2033 is a heavily tessellated game, and you used a tessellation-challenged 7770 SKU.


#271 ronvalencia
Member since 2008 • 29612 Posts
PS4 is stronger than 95% of the PCs that gamers use.

SKaREO
Atm, PS4 has zero user install base.

#272 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]DF has attempted to match the consoles' shader power via a 600MHz 7870 XT and a 7850. DF has negated the triangle rate difference between the two boxes.

The 7770's 1.28 TFLOPS was based on 10 CUs @ 1GHz, which doesn't match X1's 12 CU @ 853MHz anyway.

From www.tomshardware.com/reviews/768-shader-pitcairn-review,3196-5.html

The 12 CU @ 860MHz 7850/7830 prototype is the closest to X1's 12 CU @ 853MHz.

[Image: Crysis 2 DX11 benchmark chart from the Tom's Hardware review]

My posts are framed with the following dev quotes.

http://www.videogamer.com/news/xbox_one_and_ps4_have_no_advantage_over_the_other_says_redlynx.html

Speaking to VideoGamer.com at E3, Ilvessuo said: " Obviously we have been developing this game for a while and you can see the comparisons. I would say if you know how to use the platform they are both very powerful. I don't see a benefit over the other with any of the consoles."

----

http://www.videogamer.com/xboxone/metal_gear_solid_5_the_phantom_pain/news/ps4_and_xbox_one_power_difference_is_minimal_says_kojima.html

"The difference is small, and I don't really need to worry about it," he said, suggesting versions for Xbox One and PS4 won't be dramatically different.

----

http://gamingbolt.com/ubisoft-explains-the-difference-between-ps4-and-xbox-one-versions-of-watch_dogs

"Of course, the Xbox One isn't to be counted out. We asked Guay how the Xbox One version of Watch_Dogs would be different compared to the PC and PS4 versions of the game, to which he replied that "The Xbox One is a powerful platform; as of now we do not foresee a major difference in on-screen result between the PS4 and the Xbox One. Obviously, since we are still working on pushing the game on these new consoles, we are still doing R&D.""

----

link

"We're still very much in the R&D period, that's what I call it, because the hardware is still new," Guay answered. "It's obvious to us that it's going to take a little while before we can get to the full power of those machines and harness everything. But, even now we realise that both of them have comparable power, and for us that's good, but every day it changes, almost. We're pushing it and we're going to continue doing that until [Watch Dogs'] ship date."

http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis

"Other information has also come to light offering up a further Orbis advantage: the Sony hardware has a surprisingly large 32 ROPs (Render Output units) up against 16 on Durango. ROPs translate pixel and texel values into the final image sent to the display: on a very rough level, the more ROPs you have, the higher the resolution you can address (hardware anti-aliasing capability is also tied into the ROPs). 16 ROPs is sufficient to maintain 1080p, 32 comes across as overkill, but it could be useful for addressing stereoscopic 1080p for instance, or even 4K. However, our sources suggest that Orbis is designed principally for displays with a maximum 1080p resolution."

http://www.polygon.com/2013/8/1/4580380/carmack-on-next-gen-console-hardware-very-close-very-good

Carmack on next-gen console hardware: 'very close,' 'very good'

btk2k2

There is also a pixel fillrate difference to take into account, so if you shave off some performance from the card in that article and add a bit to the 7850, you are basically there: -5% to the 768-shader Pitcairn sample and +5% to the 7850, and you get 33 FPS vs 43 FPS in Crysis 2 and 44 FPS vs 57 FPS in BF3. A bit less than my predicted difference, but close enough considering the number of known unknowns we have regarding important performance factors on the X1. In the end, though, it means that games that push the X1 will be prettier and/or smoother on PS4.

Fillrate would be a very minor issue for X1 vs PS4, since the 7870 GE's 32 ROPs @ 1GHz weren't able to match the 7970's 32 ROPs @ 925MHz.

[Image: 3DMark color fill benchmark chart]

Fillrate requires memory writes, and ROPs don't operate in isolation. http://en.wikipedia.org/wiki/Fillrate


PS:

The 6970 has 176 GB/s memory bandwidth, which is similar to the PS4's 176 GB/s; notice the near-lack of improvement between the 5870's result and the 7870's results.


#273 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="StormyJoe"]

This is why I think you are full of sh*t, Tormentos. Basing things off demos? Tell me, why is it that NO PS4 game ran over 30 FPS at E3 (most struggled to maintain 30 FPS)? None of them did. http://www.eurogamer.net/articles/digitalfoundry-hands-on-with-playstation-4

From the Article: "When it comes to the state of software development on PS4, the situation as it stands is surprising. On the one hand, freely playable first-party titles such as Knack and DriveClub suffer from noticeable frame-rate stutters down from 30fps, while on the other, "hands off" demos for the new Infamous and Assassin's Creed games appear to run without a perceptible hitch. This is in stark contrast to the playable software confirmed to be running direct from Xbox One hardware, such as Forza Motorsport 5 and Killer Instinct, which benefit to no end for targeting the 1080p60 gold standard, and largely succeed in doing so."

The answer, of course, is that they were not completed games. See? I don't even like Sony, and I can say that without issue. You cows are pathetic, fanboy troglodytes.

tormentos

Because games on PS4 are not targeting 60 FPS.

They are targeting this gen's graphics...

Funny, the games in question that target 60FPS on Xbox One are:

Forza 5 = everything baked and faked; this game is built like a last-gen game.

Titanfall was demoed at 60 FPS; DF claims it is not 1080p, and Respawn had not confirmed the resolution. No physics, no animation, no damage... A game like that on PS4 would probably run at 90FPS, if not more.

Killer Instinct = a fighting game, enough said.

So yeah, don't get mad if that sh** doesn't impress me; at least PS4 games are shooting for great effects and great visuals.

Most games on PC will pass 60FPS without a problem if you turn everything off and lower the quality, even Crysis 3; that is a fact.

Forza 4 has real-time HDR, and Forza 5 wasn't in a completed state, i.e. this was before the GPU uplift and the "mono" driver update.


#274 btk2k2
Member since 2003 • 440 Posts
Fillrate would be a very minor issue for X1 vs PS4, since the 7870 GE's 32 ROPs @ 1GHz weren't able to match the 7970's 32 ROPs @ 925MHz.

[Image: 3DMark color fill benchmark chart]

Fillrate requires memory writes, and ROPs don't operate in isolation. http://en.wikipedia.org/wiki/Fillrate

ronvalencia
I like how you ignored my post. Here, I will copy it for you. Let's just work this out once and for all and get some hard numbers.

(Actual performance of GCN cards; source is the AnandTech Bench Vantage Pixel Fillrate test)

7770 with 16 ROPs @ 72 GB/s = 3.7 Gpixels/s
7790 with 16 ROPs @ 96 GB/s = 5 Gpixels/s
7850 with 32 ROPs @ 153.6 GB/s = 7.9 Gpixels/s
7970 GHz with 32 ROPs @ 288 GB/s = 13.9 Gpixels/s

(Formulas used)

Fillrate formula: Fillrate A = (Bandwidth A / Bandwidth B) * Fillrate B. This works until you become ROP limited instead of bandwidth limited.
Min bandwidth formula: Bandwidth A = (Fillrate A / Fillrate B) * Bandwidth B. This only works if Fillrate A is ROP limited rather than bandwidth limited.
Clock-scaling fillrate formula: Fillrate B = Fillrate A * (Clockspeed B / Clockspeed A). This only works if Fillrate A is ROP limited.
Clock-scaling bandwidth formula: Bandwidth B = Bandwidth A * (Clockspeed B / Clockspeed A). This only works if Bandwidth A is the minimum required to fully utilise the ROPs @ Clockspeed A.

(Performance scaling based on bandwidth scaling alone)

Starting with the 7770's 3.7 Gpixels/s, let's multiply up to see how close bandwidth scaling gets to the actual numbers.
7770 scaled to 7790: 96/72 * 3.7 = 4.9 Gpixels/s (very close to the 7790's 5).
7790 scaled to 7850: 153.6/96 * 4.9 = 7.9 Gpixels/s (exactly the 7850's figure).
7850 scaled to 7970 GHz: 288/153.6 * 7.9 = 14.85 Gpixels/s (more than the 7970 GHz actually gets; it looks like we have found a card where the ROPs, not the bandwidth, are the limiting factor).

(Working out the minimum bandwidth to fully utilise 32 ROPs and 16 ROPs)

From the 7770 up to the 7850, the percentage bandwidth increase matches the percentage fillrate increase. From the 7850 to the 7970 GHz, however, the fillrate increase is lower than the bandwidth increase. From this we can deduce that 32 ROPs @ 1.05GHz are fully utilised by 288 GB/s of bandwidth. To get the minimum required bandwidth for 32 ROPs we can do 13.9/7.9 * 153.6 = 270 GB/s. To get the minimum bandwidth for 16 ROPs @ 1.05GHz we can do 270/2 = 135 GB/s.

(Working out the maximum performance of 32 ROPs and 16 ROPs running at 1.05GHz)

We already know the maximum performance of 32 ROPs @ 1.05GHz is 13.9 Gpixels/s, so 16 ROPs will be 13.9/2 = 6.95 Gpixels/s. We could also work it out by doing 135/96 * 5 = 7 Gpixels/s, which helps confirm that the bandwidth scaling method is valid.

(Working out the X1 max and min fillrate)

Now, the X1 GPU only runs at 853MHz, so to get the maximum pixel fillrate we do 7 * 853/1050 = 5.7 Gpixels/s, with a minimum bandwidth requirement of 135 * 853/1050 = 110 GB/s. To confirm the accuracy, let's work it out from bandwidth scaling as well: 110/96 * 5 = 5.7 Gpixels/s. Additional bandwidth above 110 GB/s does not add any fillrate performance. If the Xbox is instead limited by the 68 GB/s DDR3 bandwidth, this maximum could drop to a minimum of 5.7 * 68/110 = 3.5 Gpixels/s; scaling from the 7770 gives the same figure, 68/72 * 3.7 = 3.5 Gpixels/s.

(Working out the PS4 fillrate)

The PS4 runs at 800MHz, so the maximum pixel fillrate is 13.9 * 800/1050 = 10.6 Gpixels/s, with a minimum bandwidth requirement of 270 * 800/1050 = 206 GB/s. Since the PS4 has less bandwidth than that minimum, we work out the maximum performance from its actual bandwidth: 176/206 * 10.6 = 9.05 Gpixels/s. Since we know the fillrate on PS4 is bandwidth limited, we could also scale from the 7850: 176/153.6 * 7.9 = 9.05 Gpixels/s. If the CPU is using all of its 20 GB/s allocation, this could fall to 9.05 * 156/176 = 8 Gpixels/s; scaling up from the 7850 gives 156/153.6 * 7.9 = 8 Gpixels/s.

(Conclusion)

Xbox 1 best case = 5.7 Gpixels/s
Xbox 1 worst case = 3.5 Gpixels/s
PS4 best case = 9.05 Gpixels/s
PS4 worst case = 8 Gpixels/s

There is no case where the Xbox 1 can match the PS4 in pixel fillrate. At the Xbox 1's very best, the PS4 still has a 40% advantage (X1 best case vs PS4 worst case). At the Xbox 1's very worst, the PS4 has a 158% advantage (PS4 best case vs X1 worst case).
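For anyone who wants to sanity-check those numbers, here is a small Python restatement of the model above, under the same assumptions. The two constants are fitted to the AnandTech figures quoted at the top (the 7850 point fixes the bandwidth ceiling, the 7970 GHz point fixes the ROP ceiling); expressing the two cases as a min() is just a compact paraphrase of the "ROP limited vs bandwidth limited" argument, not an official AMD formula.

K_BW = 7.9 / 153.6           # Gpixels/s per GB/s (fitted to the 7850: bandwidth limited)
K_ROP = 13.9 / (32 * 1.05)   # Gpixels/s per ROP-GHz (fitted to the 7970 GHz: ROP limited)

def est_fillrate(rops: int, clk_ghz: float, bw_gbs: float) -> float:
    """Estimated pixel fillrate: whichever ceiling binds first wins."""
    return min(K_ROP * rops * clk_ghz, K_BW * bw_gbs)

cases = {
    "X1 best case (16 ROPs @ 0.853 GHz, 110 GB/s)": est_fillrate(16, 0.853, 110.0),
    "X1 worst case (16 ROPs @ 0.853 GHz, 68 GB/s)": est_fillrate(16, 0.853, 68.0),
    "PS4 best case (32 ROPs @ 0.800 GHz, 176 GB/s)": est_fillrate(32, 0.800, 176.0),
    "PS4 worst case (32 ROPs @ 0.800 GHz, 156 GB/s)": est_fillrate(32, 0.800, 156.0),
}
for name, gpix in cases.items():
    print(f"{name}: ~{gpix:.2f} Gpixels/s")

# Prints ~5.65 / ~3.50 / ~9.05 / ~8.02 Gpixels/s, within rounding of the
# post's 5.7 / 3.5 / 9.05 / 8.0 conclusion.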

#275 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]Fillrate would be a very minor issue for X1 vs PS4, since the 7870 GE's 32 ROPs @ 1GHz weren't able to match the 7970's 32 ROPs @ 925MHz.

[Image: 3DMark color fill benchmark chart]

Fillrate requires memory writes, and ROPs don't operate in isolation. http://en.wikipedia.org/wiki/Fillrate

btk2k2

I like how you ignored my post. Here, I will copy it for you. Let's just work this out once and for all and get some hard numbers.

(Actual performance of GCN cards; source is the AnandTech Bench Vantage Pixel Fillrate test)

7770 with 16 ROPs @ 72 GB/s = 3.7 Gpixels/s
7790 with 16 ROPs @ 96 GB/s = 5 Gpixels/s
7850 with 32 ROPs @ 153.6 GB/s = 7.9 Gpixels/s
7970 GHz with 32 ROPs @ 288 GB/s = 13.9 Gpixels/s

(Formulas used)

Fillrate formula: Fillrate A = (Bandwidth A / Bandwidth B) * Fillrate B. This works until you become ROP limited instead of bandwidth limited.
Min bandwidth formula: Bandwidth A = (Fillrate A / Fillrate B) * Bandwidth B. This only works if Fillrate A is ROP limited rather than bandwidth limited.
Clock-scaling fillrate formula: Fillrate B = Fillrate A * (Clockspeed B / Clockspeed A). This only works if Fillrate A is ROP limited.
Clock-scaling bandwidth formula: Bandwidth B = Bandwidth A * (Clockspeed B / Clockspeed A). This only works if Bandwidth A is the minimum required to fully utilise the ROPs @ Clockspeed A.

(Performance scaling based on bandwidth scaling alone)

Starting with the 7770's 3.7 Gpixels/s, let's multiply up to see how close bandwidth scaling gets to the actual numbers.
7770 scaled to 7790: 96/72 * 3.7 = 4.9 Gpixels/s (very close to the 7790's 5).
7790 scaled to 7850: 153.6/96 * 4.9 = 7.9 Gpixels/s (exactly the 7850's figure).
7850 scaled to 7970 GHz: 288/153.6 * 7.9 = 14.85 Gpixels/s (more than the 7970 GHz actually gets; it looks like we have found a card where the ROPs, not the bandwidth, are the limiting factor).

(Working out the minimum bandwidth to fully utilise 32 ROPs and 16 ROPs)

From the 7770 up to the 7850, the percentage bandwidth increase matches the percentage fillrate increase. From the 7850 to the 7970 GHz, however, the fillrate increase is lower than the bandwidth increase. From this we can deduce that 32 ROPs @ 1.05GHz are fully utilised by 288 GB/s of bandwidth. To get the minimum required bandwidth for 32 ROPs we can do 13.9/7.9 * 153.6 = 270 GB/s. To get the minimum bandwidth for 16 ROPs @ 1.05GHz we can do 270/2 = 135 GB/s.

Note that you haven't factored in the compression/decompression issue.

[Image: ROP throughput/alpha-blend chart]

X1 was claimed to have 133 GB/s for the alpha blend.


#276 btk2k2
Member since 2003 • 440 Posts
Note that you haven't factored in the compression/decompression issue.

[Image: ROP throughput/alpha-blend chart]

X1 was claimed to have 133 GB/s for the alpha blend.

ronvalencia
I do not need to. 16 ROPs @ 853MHz require 110 GB/s of bandwidth to max out, achieving a maximum of 5.7 Gpixels/s; any further increase in bandwidth will not result in a fillrate performance increase. If I had a 7790 I could clock it to 853MHz, set the bandwidth to 110 GB/s, run the test, then run it again with the bandwidth at 133 GB/s (or as close as the overclock would allow) to prove that an increase in bandwidth would not produce a fillrate increase.

Also, if you look at my theoretical max for the PS4, it is 9.05 Gpixels/s; compared to the 6970, which gets 8.8 Gpixels/s, that is well within the margin of error. It also does not factor in any potential improvements made by AMD between the 6xxx and 7xxx series. Scaling from the 6850's 134.4 GB/s @ 6.62 Gpixels/s to the 7850 using the same formula as in my previous posts gives 153.6/134.4 * 6.62 = 7.57 Gpixels/s. This is a bit less than the 7.9 Gpixels/s the 7850 actually gets, so it does suggest that the 7xxx series ROPs are more bandwidth efficient than those of the 6xxx series.
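A quick way to see the "extra bandwidth stops helping" point in code, using the same fitted constants as the sketch earlier in the thread (again an informal model fitted to the quoted AnandTech numbers, not vendor data): once the bandwidth ceiling clears the ROP ceiling, the estimate goes flat.

K_BW = 7.9 / 153.6           # Gpixels/s per GB/s (fitted constant, see earlier sketch)
K_ROP = 13.9 / (32 * 1.05)   # Gpixels/s per ROP-GHz (fitted constant)

rops, clk = 16, 0.853        # X1 GPU configuration discussed in the thread
rop_ceiling = K_ROP * rops * clk      # ~5.65 Gpixels/s
bw_to_saturate = rop_ceiling / K_BW   # ~110 GB/s needed to feed 16 ROPs @ 853MHz

for bw in (68.0, 110.0, 133.0, 176.0):
    fill = min(rop_ceiling, K_BW * bw)
    print(f"{bw:5.0f} GB/s -> ~{fill:.2f} Gpixels/s")

# 110, 133 and 176 GB/s all land on the same ~5.65 Gpixels/s ceiling, which
# is the argument being made here against the 133 GB/s alpha-blend figure.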

#277 nameless12345
Member since 2010 • 15125 Posts

I think price, exclusives, support and "user-friendly" policies will sell the consoles, not the power itself.

The visual difference won't be a drastic one and the amount of "Giga FLOPS" won't result in more sales. ;)


#278 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]Note that you haven't factored in the compression/decompression issue.

[Image: ROP throughput/alpha-blend chart]

X1 was claimed to have 133 GB/s for the alpha blend.

btk2k2

I do not need to. 16 ROPs @ 853MHz require 110 GB/s of bandwidth to max out, achieving a maximum of 5.7 Gpixels/s; any further increase in bandwidth will not result in a fillrate performance increase. If I had a 7790 I could clock it to 853MHz, set the bandwidth to 110 GB/s, run the test, then run it again with the bandwidth at 133 GB/s (or as close as the overclock would allow) to prove that an increase in bandwidth would not produce a fillrate increase.

Also, if you look at my theoretical max for the PS4, it is 9.05 Gpixels/s; compared to the 6970, which gets 8.8 Gpixels/s, that is well within the margin of error. It also does not factor in any potential improvements made by AMD between the 6xxx and 7xxx series. Scaling from the 6850's 134.4 GB/s @ 6.62 Gpixels/s to the 7850 using the same formula as in my previous posts gives 153.6/134.4 * 6.62 = 7.57 Gpixels/s. This is a bit less than the 7.9 Gpixels/s the 7850 actually gets, so it does suggest that the 7xxx series ROPs are more bandwidth efficient than those of the 6xxx series.

Your X1 result doesn't cover the claimed 133 GB/s alpha blending for the X1.

You haven't factored in the effective memory bandwidth.

[Image: GPU memory-copy bandwidth chart]

The AMD Radeon HD 5870 has a theoretical 153 GB/s with 108 GB/s practical, i.e. 70.5 percent efficient. And this is just a basic memory copy.

Are you assuming the external memory stack has zero overheads/zero latency?
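A tiny sketch of that efficiency objection (the ~70.5% factor is the 5870 memory-copy example cited above; applying the same factor to other memory systems is purely illustrative, since each memory controller will differ):

COPY_EFFICIENCY = 108.0 / 153.0   # ~0.705, measured vs theoretical on the 5870

theoretical_gbs = {
    "PS4 GDDR5": 176.0,  # GB/s, theoretical peak from the thread
    "X1 DDR3": 68.0,
}
for name, bw in theoretical_gbs.items():
    print(f"{name}: {bw:.0f} GB/s theoretical -> ~{bw * COPY_EFFICIENCY:.0f} GB/s effective")

# PS4 GDDR5: 176 GB/s theoretical -> ~124 GB/s effective (illustrative only)
# X1 DDR3: 68 GB/s theoretical -> ~48 GB/s effective (illustrative only)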


#279 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]Note that you haven't factored in the compression/decompression issue.

[Image: ROP throughput/alpha-blend chart]

X1 was claimed to have 133 GB/s for the alpha blend.

btk2k2

I do not need to. 16 ROPs @ 853MHz require 110 GB/s of bandwidth to max out, achieving a maximum of 5.7 Gpixels/s; any further increase in bandwidth will not result in a fillrate performance increase. If I had a 7790 I could clock it to 853MHz, set the bandwidth to 110 GB/s, run the test, then run it again with the bandwidth at 133 GB/s (or as close as the overclock would allow) to prove that an increase in bandwidth would not produce a fillrate increase.

Also, if you look at my theoretical max for the PS4, it is 9.05 Gpixels/s; compared to the 6970, which gets 8.8 Gpixels/s, that is well within the margin of error. It also does not factor in any potential improvements made by AMD between the 6xxx and 7xxx series. Scaling from the 6850's 134.4 GB/s @ 6.62 Gpixels/s to the 7850 using the same formula as in my previous posts gives 153.6/134.4 * 6.62 = 7.57 Gpixels/s. This is a bit less than the 7.9 Gpixels/s the 7850 actually gets, so it does suggest that the 7xxx series ROPs are more bandwidth efficient than those of the 6xxx series.

At 1080p, and minus heavy MSAA, modern games are not ROP bound.

http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis

"Other information has also come to light offering up a further Orbis advantage: the Sony hardware has a surprisingly large 32 ROPs (Render Output units) up against 16 on Durango. ROPs translate pixel and texel values into the final image sent to the display: on a very rough level, the more ROPs you have, the higher the resolution you can address (hardware anti-aliasing capability is also tied into the ROPs). 16 ROPs is sufficient to maintain 1080p, 32 comes across as overkill, but it could be useful for addressing stereoscopic 1080p for instance, or even 4K. However, our sources suggest that Orbis is designed principally for displays with a maximum 1080p resolution."

Are you claiming 115GB/s memory will halve my 7950-950Mhz's frame rate?

7950-950Mhz with GDDR5 @ 5000Mhz effective (240 GB/s). Minor fps hit with reduced memory speed.

Tomb Raider at Ultimate settings and 1080p shared.

[Image: Tomb Raider benchmark, 7950 @ 950MHz with GDDR5 @ 5000MHz effective]

--------------

7950-950Mhz with GDDR5 @ 2400Mhz effective (115.2 GB/s)

Tomb Raider at Ultimate settings and 1080p shared.

[Image: Tomb Raider benchmark, 7950 @ 950MHz with GDDR5 @ 2400MHz effective]


#280 Krelian-co
Member since 2006 • 13274 Posts

Is ron still arguing that faster RAM doesn't matter and that a gimped 7770 is close to a 7850? lol.


#281 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]Fillrate would be a very minor issue for X1 vs PS4, since the 7870 GE's 32 ROPs @ 1GHz weren't able to match the 7970's 32 ROPs @ 925MHz.

[Image: 3DMark color fill benchmark chart]

Fillrate requires memory writes, and ROPs don't operate in isolation. http://en.wikipedia.org/wiki/Fillrate

btk2k2

I like how you ignored my post. Here, I will copy it for you. Let's just work this out once and for all and get some hard numbers.

(Actual performance of GCN cards; source is the AnandTech Bench Vantage Pixel Fillrate test)

7770 with 16 ROPs @ 72 GB/s = 3.7 Gpixels/s

....

(Working out the maximum performance of 32 ROPs and 16 ROPs running at 1.05GHz) We already know the maximum performance of 32 ROPs @ 1.05GHz is 13.9 Gpixels/s, so 16 ROPs will be 13.9/2 = 6.95 Gpixels/s. We could also work it out by doing 135/96 * 5 = 7 Gpixels/s, which helps confirm that the bandwidth scaling method is valid.

....

(Working out the X1 max and min fillrate) Now, the X1 GPU only runs at 853MHz, so to get the maximum pixel fillrate we do 7 * 853/1050 = 5.7 Gpixels/s, with a minimum bandwidth requirement of 135 * 853/1050 = 110 GB/s.

One problem: my 8870M with 8 ROPs** @ 72 GB/s gets 3.55 Gpixels/s.

To test the 1.05GHz "13.9/2 = 6.95 Gpixels/s" scaling against the 8870M's 8 ROPs @ 775MHz:

(Working out the maximum performance of 8 ROPs running at 1.05GHz)

We already know the maximum performance of 32 ROPs @ 1.05GHz is 13.9 Gpixels/s, so 8 ROPs will be 13.9/4 = 3.47 Gpixels/s.

---

So to get the maximum pixel fillrate we do 3.47 x 775/1050 = 2.56 Gpixels/s.

The scaling breaks with the 8870M's 8 ROPs: the 8870M already reaches 3.55 Gpixels/s at a 725MHz/775MHz clock speed.

Your scaling breaks with the GCN 8000M series generation.

-------------------

**8 colour ROP units + 32 Z/stencil ROP units. The 8870M ("Venus XT") has a base clock speed of 725MHz with a 775MHz boost @ 32 watts.

The 8870M (Venus XT) replaces the 7870M (Heathrow XT/Cape Verde).
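To make the counter-example concrete, here is the same prediction in Python: btk2k2's ROP/clock scaling applied to the 8870M, against the measured figure quoted above (the 3.55 Gpixels/s number is from this post; the constant comes from the 7970 GHz data point):

K_ROP = 13.9 / (32 * 1.05)     # Gpixels/s per ROP-GHz, fitted to the 7970 GHz

predicted = K_ROP * 8 * 0.775  # 8 colour ROPs @ 775MHz boost -> ~2.56 Gpixels/s
measured = 3.55                # 8870M result at 72 GB/s, as reported above

print(f"predicted ~{predicted:.2f} vs measured {measured} Gpixels/s")
# The measured value is ~38% above the scaled prediction, which is the basis
# of the claim that the per-ROP scaling breaks on the GCN 8000M parts.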


#282 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="StormyJoe"]

[QUOTE="tormentos"]

That is a stupid excuse; hold on to it as long as you can. Developers staying neutral will not stop the PS4 from overpowering the Xbox One.

700 GFLOPS will not vanish just because developers won't risk getting in trouble with MS or Sony by saying which console is more powerful. Ubisoft's remark is basically the closest you will see, and it's clear to anyone but people in denial like you.

I already posted the link to what a 480 GFLOPS difference can do; 600 to 700 GFLOPS will deliver a bigger lead.

ronvalencia

Sigh...

  • First off, that 700 GFLOP GPU advantage isn't all that. There is more to making software, including games, than that.
  • Secondly, third-party devs don't seem to have any issue bashing the WiiU.
  • Thirdly, console manufacturers kiss devs' asses, not vice versa - you don't see EA paying MS to put Call of Duty on the 360.

All your "But, look at the extra FLOPS!" doesn't amount to a hill of beans, because that is just one aspect of the platform. You'd know that if you worked in software.

But I tell you what: let's see what happens, shall we? If you are still here in 2015, I'll make it a point to send you several messages and links back to these threads asking you "where are the PS4's graphic advantages?"

For X1,

1.187 TFLOPS / 30 fps, you have ~39.57 GFLOPS per frame.

1.310 TFLOPS / 30 fps, you have ~43.66 GFLOPS per frame.

---

For PS4

1.84 TFLOPS /30 fps, you have 61.3 GFLOPS per frame. Unknown allocation for shared.

---

At best case for PS4 vs worst case for X1, the PS4 has about 21.8 GFLOPS of extra power per frame for a 30 fps target.

At best case for PS4 vs best case for X1, the PS4 has about 17.7 GFLOPS of extra power per frame for a 30 fps target.

---

For 7970 Ghz Edition

4.3 TFLOPS / 30 fps, you have ~143.3 GFLOPS per frame.

Against the PS4, the 7970 GE has a large gap of ~82 GFLOPS extra per frame for a 30 fps target.

Oh look, another "teh raw numbers" person enters the fray. The fact is, I don't care what numbers you spew, for two reasons:

  1. There is more to a platform than raw numbers, especially when the numbers you cows are so hung up on make up only part of the hardware.
  2. Developers who work on both say the difference is minimal.


#283 tormentos
Member since 2003 • 33793 Posts

 

For X1,

1.187 TFLOPS / 30 fps, you have ~39.57 GFLOPS per frame.

1.310 TFLOPS / 30 fps, you have ~43.66 GFLOPS per frame.

---

For PS4

1.84 TFLOPS /30 fps, you have 61.3 GFLOPS per frame. Unknown allocation for shared.

---

At best case for PS4 vs worst case for X1, the PS4 has about 21.8 GFLOPS of extra power per frame for a 30 fps target.

At best case for PS4 vs best case for X1, the PS4 has about 17.7 GFLOPS of extra power per frame for a 30 fps target.

---

For 7970 Ghz Edition

4.3 TFLOPS / 30 fps, you have ~143.3 GFLOPS per frame.

Against the PS4, the 7970 GE has a large gap of ~82 GFLOPS extra per frame for a 30 fps target.

ronvalencia

 

 

This reminds me of how Xbox fans used to break the Xbox Live price into days and weeks to make it seem like it was ultra cheap.

Well, at least you are starting to see the 1.18 as a possibility on Xbox One. :lol:

 

 

Metro 2033 is a heavily tessellated game, and you used a tessellation-challenged 7770 SKU.

ronvalencia

 

Yeah, but Dirt isn't, and the gap there is even bigger.


#285 tormentos
Member since 2003 • 33793 Posts

 

[Image: ROP throughput/alpha-blend chart]

 

X1 was claimed to have 133 GB/s for the alpha blend.

 

 

ronvalencia

 

Yeah, MS claimed 278 GB/s for the Xbox 360 too... I am tired of MS lies and aggregated-bandwidth tactics that don't represent real-time scenarios.

The Xbox One doesn't need 133 GB/s, because the crippled GPU it has doesn't have the power to make use of it.


#286 tormentos
Member since 2003 • 33793 Posts

[QUOTE="btk2k2"][QUOTE="ronvalencia"]Fillrate would be a very minor issue for X1 vs PS4, since the 7870 GE's 32 ROPs @ 1GHz weren't able to match the 7970's 32 ROPs @ 925MHz.

 

 

[Image: 3DMark color fill benchmark chart]

 

Fillrate requires memory writes, and ROPs don't operate in isolation. http://en.wikipedia.org/wiki/Fillrate

 

 

ronvalencia

I like how you ignored my post. Here, I will copy it for you. Let's just work this out once and for all and get some hard numbers.

(Actual performance of GCN cards; source is the AnandTech Bench Vantage Pixel Fillrate test)

7770 with 16 ROPs @ 72 GB/s = 3.7 Gpixels/s

One problem: my 8870M with 8 ROPs** @ 72 GB/s gets 3.55 Gpixels/s.

 

 

**8 colour ROP units + 32 Z/stencil ROP units. The 8870M ("Venus XT") has a clock speed of 725MHz with a 775MHz boost @ 32 watts.

 

 

Do you have a GPU factory? Because every time you speak about a GPU you say MY: my 7970, my 7950, my 8870...

Come on dude, it's becoming lame. Fact is, the Xbox One has a gimped 7790 with more bandwidth than it needs; you can either accept it or live in denial. The PS4 has 176 GB/s, which is also faster than the 7870's, so yeah, it has the same advantage the Xbox One has: more bandwidth than the original GPU series had. So the same advantage applies to the PS4, and the gap should be even bigger, since you love to compare everything to the 7850 knowing it doesn't have the CU/TFLOPS performance or even the bandwidth of the PS4.

In the end the PS4 has faster memory, more GPU power and even more customization GPU-wise, hence 8 ACEs with 64 command queues.

 


#287 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

ROPs

GPUs

FLOPS

IDIOTs





Rename the thread to monkey fighting.

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#288 tormentos
Member since 2003 • 33793 Posts

 

Oh look, another "teh raw numbers" person enters the fray. The fact is, I don't care what numbers you spew, for 2 reasons:

  1. There is more to a platform than raw numbers, especially when the numbers you cows are so hung up on make up only "part" of the hardware.
  2. Developers who work on both say the difference is minimal.

 

StormyJoe

 

You are an idiot. That was Ronvalencia; he is basically now one of the biggest lemmings this site has. All he does is defend the Xbox One and ignore the PS4's advantages. Do you know how many months I have spent arguing with him just to get him to even consider using the Xbox One's 1.18 TF figure in a post as a possibility, due to the 10% GPU reservation?

 

You just did not understand his post. You are so sad that you actually confused him with a cow. :lol:

 

1. These two pieces of hardware are so alike they could switch names; from a vendor point of view this is not PS3 vs 360. :lol:

2. Developers will not say sh** and risk getting into trouble; that is a fact.

 

The difference is there and will be there all generation long.

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#289 tormentos
Member since 2003 • 33793 Posts

ROPs

GPUs

FLOPS

IDIOTs





Rename the thread to monkey fighting.

AMD655

 

You don't like it? Get the fu** out; no one forced you to read it. ;)

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#290 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="btk2k2"] I like how you ignored my post. Here I will copy it for you. Lets just work this out once and for all and get some hard numbers.

(Actual performance of GCN cards, source is AnandTech Bench Vantage Pixel Fillrate test)

7770 with 16 ROPS @ 72 GB/s = 3.7 Gpixels/s

tormentos

One problem, my 8870M with 8 ROPS** @ 72 GB/s has 3.55 Gpixels/s.

**8 Color ROP Units + 32 Z/Stencil ROP Units. The 8870M ("Venus XT") has a clock speed of 725MHz with a 775MHz boost @ 32 watts.

Do you have a GPU factory because every time you speak about a GPU you say MY and put a GPU,my 7970,my 7950,my 8870..(1)

Come on dude it becoming lame,fact is the xbox one has a gimped 7790 with more bandwidth that it needs you can either accept it or live on denial(2),the PS4 has 176Gb's is also faster than the 7870 so yeah it has the same advantage the xbox one has,more bandwidth than the orginal GPU series had,so the same advantage apply to the PS4 so the distant should be even higher,since you love to compare everythig to the 7850 knowing it doesn't have the CU TF performance or even the banwidth of the PS4.

In the end the PS4 has faster memory,more GPU power and even more customization GPU wise hance 8 ACES 64 command(3).

1. This is not the issue. The mentioned GCN models are in my sig.

2. btk2k2's scaling breaks with the GCN 8000M generation, e.g. the 8870M.

3. 8 ACEs with 64 command queues don't change the wavefront processing rate.

On existing AMD GCN, each AMD ACE unit can handle a parallel stream of commands.

[image: GCN_ACE_Parallel_Stream_zps14797320.png]

[image: computeunit_zpsa9e97df2.jpg]
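(A toy model of the ACE point above, in Python: more ACEs mean more independent command queues in flight, but the shared CU array still caps wavefront throughput, so total completion time doesn't improve. This is only a sketch of the argument, not GPU code; all names and numbers in it are invented for the illustration:)

```python
# Toy model: more ACEs = more independent command queues,
# but the shared CU array caps wavefronts issued per cycle (point 3 above).
from collections import deque

def simulate(num_aces: int, commands_per_queue: int, cu_wavefronts_per_cycle: int) -> int:
    queues = [deque(range(commands_per_queue)) for _ in range(num_aces)]
    cycles = 0
    while any(queues):
        issued = 0
        for q in queues:  # each ACE front-end may issue one command per cycle...
            if q and issued < cu_wavefronts_per_cycle:  # ...until the CU array is saturated
                q.popleft()
                issued += 1
        cycles += 1
    return cycles

# Same total work (800 commands), same CU cap (4 wavefronts/cycle):
print(simulate(num_aces=4, commands_per_queue=200, cu_wavefronts_per_cycle=4))  # 200 cycles
print(simulate(num_aces=8, commands_per_queue=100, cu_wavefronts_per_cycle=4))  # 200 cycles
```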

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#291 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[image: rop.jpg]

X1 was claimed to have 133 GB/s for the alpha blend.

tormentos

Yeah, MS claimed 278GB/s for the Xbox 360 too. I am tired of MS's lies and aggregated-bandwidth tactics that don't represent real-world scenarios.

The Xbox One doesn't need 133GB/s, because the crippled GPU it has doesn't have the power to make use of it.

The 278 GB/s wasn't the bandwidth of the connection between the EDRAM and the GPU, unlike X1's 133 GB/s alpha-blend figure via ESRAM.

An 8870M scaled to 16 ROPs would reach X1's 133GB/s target.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#292 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

For X1,

1.187 TFLOPS / 30 fps, you have ~39.57 GFLOPS per frame.

1.310 TFLOPS / 30 fps, you have ~43.66 GFLOPS per frame.

---

For PS4

1.84 TFLOPS / 30 fps, you have ~61.3 GFLOPS per frame. (Any allocation shared with the OS is unknown.)

---

At best case for PS4 vs worst case for X1, the PS4 has about 21.7 GFLOPS of extra power per frame for a 30 fps target.

At best case for PS4 vs best case for X1, the PS4 has about 17.64 GFLOPS of extra power per frame for a 30 fps target.

---

For 7970 Ghz Edition

4.3 TFLOPS / 30 fps gives the 7970 GE ~143.33 GFLOPS per frame for a 30 fps target.

Against the PS4, the 7970 GE has a large gap of ~82 GFLOPS per frame for a 30 fps target.

tormentos

This reminds me of how Xbox fans used to break the Xbox Live price into days and weeks to make it seem ultra cheap.

Well, at least you are starting to see 1.18 TF as a possibility on the Xbox One. :lol:

Metro 2033 is a heavily tessellated game and you used a tessellation-challenged 7770 SKU.

ronvalencia

Yeah, but Dirt isn't, and the gap is even bigger there.

The 10 percent allocation is still not confirmed by MS, and my posted PC benchmarks have multitasking enabled.

The 7770 still doesn't reflect X1's memory subsystem.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#293 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="StormyJoe"]

Sigh...

  • First off, that 700 GFLOP GPU advantage isn't all that. There is more to making software, including games, than that.
  • Secondly, third-party devs don't seem to have any issue bashing the Wii U.
  • Thirdly, console manufacturers kiss devs' asses, not vice versa - you don't see EA paying MS to put Call of Duty on the 360.

All your "But, look at the extra FLOPS!" doesn't amount to a hill of beans, because that is just one aspect of the platform. You'd know that if you worked in software.

But I tell you what: let's see what happens, shall we? If you are still here in 2015, I'll make it a point to send you several messages and links back to these threads asking you "where are the PS4's graphics advantages?".

StormyJoe

For X1,

1.187 TFLOPS / 30 fps, you have ~39.57 GFLOPS per frame.

1.310 TFLOPS / 30 fps, you have ~43.66 GFLOPS per frame.

---

For PS4

1.84 TFLOPS / 30 fps, you have ~61.3 GFLOPS per frame. (Any allocation shared with the OS is unknown.)

---

At best case for PS4 vs worst case for X1, the PS4 has about 21.7 GFLOPS of extra power per frame for a 30 fps target.

At best case for PS4 vs best case for X1, the PS4 has about 17.64 GFLOPS of extra power per frame for a 30 fps target.

---

For 7970 Ghz Edition

4.3 TFLOPS / 30 fps gives the 7970 GE ~143.33 GFLOPS per frame for a 30 fps target.

Against the PS4, the 7970 GE has a large gap of ~82 GFLOPS per frame for a 30 fps target.

Oh look, another "teh raw numbers" person enters the fray. The fact is, I don't care what numbers you spew, for 2 reasons:

  1. There is more to a platform than raw numbers, especially when the numbers you cows are so hung up on make up only "part" of the hardware.
  2. Developers who work on both say the difference is minimal.

I'm showing it's minimal from a PC point of view.
Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#294 tormentos
Member since 2003 • 33793 Posts

[QUOTE="tormentos"]

[QUOTE="ronvalencia"]

One problem, my 8870M with 8 ROPS** @ 72 GB/s has 3.55 Gpixels/s.

 

 

**8 Color ROP Units + 32 Z/Stencil ROP Units. The 8870M ("Venus XT") has a clock speed of 725MHz with a 775MHz boost @ 32 watts.

 

ronvalencia

 

Do you have a GPU factory? Every time you speak about a GPU you say "my" and name a GPU: my 7970, my 7950, my 8870... (1)

Come on, dude, it's becoming lame. Fact is, the Xbox One has a gimped 7790 with more bandwidth than it needs; you can either accept it or live in denial (2). The PS4 has 176GB/s and is also faster than the 7870, so it has the same advantage the Xbox One has: more bandwidth than the original GPU series had. The same advantage applies to the PS4, so the distance should be even greater, since you love to compare everything to the 7850 knowing it doesn't have the CU/TF performance or even the bandwidth of the PS4.

In the end the PS4 has faster memory, more GPU power and even more GPU-side customization, hence 8 ACEs with 64 command queues (3).

 

1. This is not the issue. The mentioned GCN models are in my sig.

2. btk2k2's scaling breaks with the GCN 8000M generation, e.g. the 8870M.

3. 8 ACEs with 64 command queues don't change the wavefront processing rate.

 

On existing AMD GCN, each AMD ACE unit can handle a parallel stream of commands.

 

 

The GCN in the Xbox One is not 8000M; it's a gimped Bonaire 7790.

It doesn't matter: the PS4 can issue more out-of-order work than the Xbox One, or even your 7790, due to having more ACEs. The GPU in the PS4 is customized to take advantage of the huge compute power AMD GPUs have; the Xbox One's wasn't.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#295 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="tormentos"]

Do you have a GPU factory? Every time you speak about a GPU you say "my" and name a GPU: my 7970, my 7950, my 8870... (1)

Come on, dude, it's becoming lame. Fact is, the Xbox One has a gimped 7790 with more bandwidth than it needs; you can either accept it or live in denial (2). The PS4 has 176GB/s and is also faster than the 7870, so it has the same advantage the Xbox One has: more bandwidth than the original GPU series had. The same advantage applies to the PS4, so the distance should be even greater, since you love to compare everything to the 7850 knowing it doesn't have the CU/TF performance or even the bandwidth of the PS4.

In the end the PS4 has faster memory, more GPU power and even more GPU-side customization, hence 8 ACEs with 64 command queues (3).

tormentos

1. This is not the issue. The mentioned GCN models are in my sig.

2. btk2k2's scaling breaks with the GCN 8000M generation, e.g. the 8870M.

3. 8 ACEs with 64 command queues don't change the wavefront processing rate.

On existing AMD GCN, each AMD ACE unit can handle a parallel stream of commands.

The GCN in the Xbox One is not 8000M; it's a gimped Bonaire 7790.

It doesn't matter: the PS4 can issue more out-of-order work than the Xbox One, or even your 7790, due to having more ACEs. The GPU in the PS4 is customized to take advantage of the huge compute power AMD GPUs have; the Xbox One's wasn't.

Bonaire doesn't have X1's 256-bit memory controllers and side-port ESRAM connection.

You are forgetting:

1. The 7970's code stream is optimised by the CPU.

2. The 7970's ACE units handle a larger wavefront pool compared to lesser GCNs, i.e. the comparison is like a quad-core Intel Core i5 Ivy Bridge (fewer front-ends but larger ones) vs an 8-core AMD Jaguar (more front-ends but smaller ones). ACE unit size scales with CU count.

Avatar image for btk2k2
btk2k2

440

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#296 btk2k2
Member since 2003 • 440 Posts
One problem, my 8870M with 8 ROPS** @ 72 GB/s has 3.55 Gpixels/s.

To test the 1.05GHz "13.9/2 = 6.95 Gpixels/s" scaling against the 8870M's 8 ROPs @ 775MHz, i.e.:

(Working out the maximum performance of 8 ROPS running at 1.05Ghz)

We already know the maximum performance of 32 ROPs @ 1.05GHz is 13.9 Gpixels/s, so 8 ROPs will be 13.9/4 ≈ 3.47 Gpixels/s.

---

So to get the maximum pixel fillrate we do 3.47 × 775/1050 ≈ 2.56 Gpixels/s.

Scaling breaks with the 8870M's 8 ROPs, i.e. the 8870M's 8 ROPs already reach 3.55 Gpixels/s at a 725MHz/775MHz clock speed.


Your scaling breaks with the GCN 8000M-series generation.

-------------------

**8 Color ROP Units + 32 Z/Stencil ROP Units. The 8870M ("Venus XT") has a base clock speed of 725MHz with a 775MHz boost @ 32 watts.

8870M (Venus XT) replaces 7870M (Heathrow XT/Cape Verde).

ronvalencia
According to this AMD document, the 8870M has 16 colour ROPs and 64 Z/stencil ROPs, not 8 like you have suggested. (It does say 8 on their website, but other places have it at 16, which makes sense considering it is an upgrade for the 7870M, which had 16 ROPs.) Using my scaling from the 7790, the 8870M with 16 ROPs @ 775MHz coupled with 72 GB/s of bandwidth gives 72/96 * 5 = 3.75 GPixels/s, which is the same as the 7770, as you would expect for a 16-ROP part with 72GB/s of bandwidth. Your claimed 3.55 GPixels/s seems a little low, but it is within the standard variation for the 3DMark Vantage pixel fill test.
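(btk2k2's scaling rule is simple enough to put in code. A minimal sketch, assuming his 7790 reference point of 96 GB/s ↔ ~5 GPixels/s in the Vantage pixel-fill test; these are the thread's figures, not official ones:)

```python
# btk2k2's rule: Vantage pixel fill scales linearly with memory bandwidth,
# anchored to his 7790 reference point (96 GB/s <-> ~5 GPixels/s).
REF_BW_GBS = 96.0     # 7790 bandwidth (reference)
REF_FILL_GPIX = 5.0   # 7790 measured fill (reference)

def scaled_fill(bandwidth_gbs: float) -> float:
    return bandwidth_gbs / REF_BW_GBS * REF_FILL_GPIX

print(scaled_fill(72.0))   # ~3.75 GPixels/s -- the 8870M/7770 @ 72 GB/s figure above
print(scaled_fill(110.0))  # ~5.73 GPixels/s -- where 16 ROPs @ 853MHz max out, per the thread
```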
Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#297 tormentos
Member since 2003 • 33793 Posts

 

The 278 GB/s wasn't the bandwidth of the connection between the EDRAM and the GPU, unlike X1's 133 GB/s alpha-blend figure via ESRAM.

An 8870M scaled to 16 ROPs would reach X1's 133GB/s target.

ronvalencia

 

MS claimed 278GB/s, period. They claimed that; stop your damage control, dude. There is nothing you can say to fix or change that. MS openly lied about the Xbox 360's bandwidth, adding bandwidths together as if they were counting apples.

The alpha-blend claim means sh**; it's more than the Xbox One will need. But since you hold on to it so tightly, read this:

 

Theoretical peak performance is one thing, but in real-life scenarios it's believed that 133GB/s throughput has been achieved with alpha transparency blending operations (FP16 x4).

 

http://www.eurogamer.net/articles/digitalfoundry-xbox-one-memory-better-in-production-hardware

 

They don't even state that as fact; they say "it's believed", which is different from "it does 133GB/s with alpha blending". It's laughable, because that article was pinned on some supposed developer leak that turned out to be MS itself. So yeah, anything coming from MS bandwidth-wise should be taken as a joke, especially after they already tried to claim the Xbox One had 200GB/s across the whole system and tried to use it as an advantage vs the PS4; we all know it doesn't work that way.

 

Oh, and that was Richard Leadbetter, who, like you, never had anything good to say about Cell, but praised the EDRAM on the Xbox 360 as endless bandwidth.

 

"The thing is, running in that single tile of eDRAM, Square-Enix has almost limitless bandwidth and enormous levels of fill-rate at its disposal. So it is extremely disappointing to note that the alpha-to-coverage interlace-style effect on the characters' hair remains in the Xbox 360 game."

 

Hahaha... Your article's writer is on MS's payroll... :lol:

 

Avatar image for btk2k2
btk2k2

440

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#298 btk2k2
Member since 2003 • 440 Posts

[QUOTE="btk2k2"][QUOTE="ronvalencia"]Note that you haven't factored in decompression/compression issue.

[image: rop.jpg]

X1 was claimed to have 133 GB/s for the alpha blend.

ronvalencia

I do not need to. 16 ROPs @ 853MHz require 110GB/s of bandwidth to max out, achieving a maximum of 5.7 Gpixels/s; any further increase in bandwidth will not result in a fillrate increase. If I had a 7790 I could clock it to 853MHz, set the bandwidth to 110GB/s, run the test, then run it again with the bandwidth at 133GB/s (or as close as the overclock would allow) to prove that an increase in bandwidth would not result in a fillrate increase.

Also, if you look at my theoretical max for the PS4, it is 9.05 GPixels/s; compared to the 6970, which gets 8.8 GPixels/s, it is well within the margin of error. It also does not factor in any potential improvements made by AMD between the 6xxx series and the 7xxx series. Scaling from the 6850's 134.4 GB/s @ 6.62 GPixels/s to the 7850 using the same formula I used in the previous posts gets 153.6/134.4 * 6.62 = 7.57 GPixels/s. This is a bit less than the 7.9 GPixels/s the 7850 actually gets, so it does suggest that the 7xxx-series ROPs are more bandwidth-efficient than those of the 6xxx series.

Your X1 result doesn't cover the claimed 133 GB/s alpha blending for the X1.

You haven't factored in the effective memory bandwidth.

[image: gpu-zero-all.png]

The AMD Radeon HD 5870 has a theoretical 153 GB/s with ~108 GB/s practical, i.e. ~70.5 percent efficiency, and that is just a basic memory copy.

Are you assuming the external memory stack has zero overheads/zero latency?

It makes no difference: 110GB/s is the required 'on paper' figure to max out 16 ROPs @ 853MHz. If GDDR5 is 70% efficient, then it only really needs ~77GB/s of bandwidth in the real world. Anything more than that will not improve pixel fillrate, no matter how efficient it is.
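(The same model with the ROP-bound ceiling added. The 5.7 and 9.05 GPixels/s ceilings and the ~70% efficiency figure are the numbers claimed in the posts above, not independent measurements:)

```python
# Fill is bandwidth-limited up to a ROP-bound ceiling; past that ceiling,
# extra bandwidth buys nothing (btk2k2's argument above).
GB_PER_GPIXEL = 96.0 / 5.0  # ~19.2 GB/s of paper bandwidth per GPixel/s (7790 reference)

def fill(bandwidth_gbs: float, rop_ceiling_gpix: float) -> float:
    return min(bandwidth_gbs / GB_PER_GPIXEL, rop_ceiling_gpix)

# X1: 16 ROPs @ 853MHz, ~5.7 GPixels/s ceiling per the post.
print(fill(110.0, 5.7))   # ~5.7  -> 110 GB/s 'on paper' saturates the ROPs
print(fill(133.0, 5.7))   # ~5.7  -> the extra alpha-blend bandwidth goes unused
# PS4: 32 ROPs with 176 GB/s, ~9.05 GPixels/s theoretical ceiling per the post.
print(fill(176.0, 9.05))  # ~9.05
# At ~70% memory efficiency (the 5870 memory-copy figure above), the 110 GB/s
# paper requirement corresponds to roughly 77 GB/s actually moved:
print(110.0 * 0.70)       # ~77 GB/s
```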
Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#299 tormentos
Member since 2003 • 33793 Posts

 

The 10 percent allocation is still not confirmed by MS, and my posted PC benchmarks have multitasking enabled.

The 7770 still doesn't reflect X1's memory subsystem.

ronvalencia

 

It wasn't denied, and Kotaku reported that it was given out by MS itself. Stop it; you are embarrassing yourself making excuses for MS.

And the Xbox One doesn't reflect the 7770's full TF count either; it's 1.18 TF while the 7770 has 1.28... hahaha

 

Funny how you conveniently forget only what you like.

Avatar image for btk2k2
btk2k2

440

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#300 btk2k2
Member since 2003 • 440 Posts
At 1080p, and without heavy MSAA, modern games are not ROP-bound.

http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis

"Other information has also come to light offering up a further Orbis advantage: the Sony hardware has a surprisingly large 32 ROPs (Render Output units) up against 16 on Durango. ROPs translate pixel and texel values into the final image sent to the display: on a very rough level, the more ROPs you have, the higher the resolution you can address (hardware anti-aliasing capability is also tied into the ROPs). 16 ROPs is sufficient to maintain 1080p, 32 comes across as overkill, but it could be useful for addressing stereoscopic 1080p for instance, or even 4K. However, our sources suggest that Orbis is designed principally for displays with a maximum 1080p resolution."

Are you claiming 115GB/s memory will halve my 7950-950Mhz's frame rate?

7950-950Mhz with GDDR5 @ 5000Mhz effective (240 GB/s). Minor fps hit with reduced memory speed.

Tomb Raider at Ultimate settings and 1080p shared.

--------------

7950-950Mhz with GDDR5 @ 2400Mhz effective (115.2 GB/s)

Tomb Raider at Ultimate settings and 1080p shared.

ronvalencia
I am isolating pixel fillrate performance, nothing more, nothing less. You keep saying that the PS4's 32 ROPs will be under-utilised vs the 7970 due to bandwidth, which is true; I am just putting in actual numbers to show where the fillrate differences between the X1 and the PS4 will be. Your test is flawed because memory bandwidth impacts more than just fillrate, so that performance deficit is not isolated to fillrate alone. I have never claimed that pixel fillrate is the be-all and end-all of GFX performance.