Xbox One's architecture: power gap vs PS4 possibly not as big


This topic is locked from further discussion.

#1  Edited By Areez
Member since 2002 • 6278 Posts

Just recently read two articles pertaining to the Xbox One's internal design. After reading them, I was left with the impression that the power gap between the One and the PS4 may not be as big as we thought.

http://www.ubergizmo.com/2013/08/xbox-one-processor-hot-chips-conference/

http://techreport.com/news/25288/xbox-one-soc-rivals-amd-tahiti-for-size-complexity

#2 Randolph
Member since 2002 • 10542 Posts

I'm pretty far past the point of caring anymore. Xbone has Killer Instinct, therefore I will have an Xbone. :)

#3  Edited By Areez
Member since 2002 • 6278 Posts

@Randolph:

I am almost sure that this thread will receive little attention as it does not contain negative news about the XBX 1.

I hear you on KI... I just thought I'd post this as it really gives an in-depth perspective on how the architecture works, as opposed to just looking at numbers and making broad assumptions.

#4 Randolph
Member since 2002 • 10542 Posts

@Areez:

You need a more catchy thread title, bait and switch!

"XBONE MORE POWERFUL THAN PS4 LAMLAMLAM!!!1!!1"

#5 c_rakestraw  Moderator
Member since 2007 • 14627 Posts
@Randolph said:

I'm pretty far past the point of caring anymore. Xbone has Killer Instinct, therefore I will have an Xbone. :)

Likewise. I don't care about spec threads anymore. Not interested in talking tech.

#6 Areez
Member since 2002 • 6278 Posts

@Randolph:

LMAO! You are probably right....

#8 Randolph
Member since 2002 • 10542 Posts

Xbox One hates baseball, apple pie, and Jesus! BOO!

#9 ZZoMBiE13
Member since 2002 • 22935 Posts

@Randolph said:

I'm pretty far past the point of caring anymore. Xbone has Killer Instinct, therefore I will have an Xbone. :)

Replace "Killer Instinct" with "Dead Rising" and you pretty much sum up my feelings.

I'm excited for the PS4 as well though. I want to play Destiny on PS4 and InFamous looks really good. In short, I want ALL the toys. Even a WiiU... eventually. "Eventually" meaning once they have a Kirby game or a sequel to Elite Beat Agents.

#10 Randolph
Member since 2002 • 10542 Posts

I honestly forgot about Dead Rising 3, likely for the best. It is, after all, a Capcom game. I just hate Capcom at this point, worse than Activision or EA in fact. Especially given that they said more DLC is their ticket to avoiding bankruptcy, it makes me question just how much stuff will be locked behind a paywall on the disc and accessed only when I cough up a few extra dollars for an unlock key. I hope I'm wrong, but with Capcom, I can never be too sure anymore. They really are just awful.

#11 S0lidSnake
Member since 2002 • 29001 Posts

@Randolph said:

@Areez:

You need a more catchy thread title, bait and switch!

"XBONE MORE POWERFUL THAN PS4 LAMLAMLAM!!!1!!1"

This IS the more bait-ish title. The original thread title was simply 'Details on the Xbox One architecture' or something innocuous to that effect.

It was changed to add the comparison with the PS4 after it received no replies and very few views.

As for the articles, Digital Foundry did a pretty comprehensive Q&A with the Xbone architects and as it turns out, it's actually weaker than we originally thought since 10% of the GPU is reserved for Kinect and the Snapshot/Picture-in-Picture OS features.

#12  Edited By Randolph
Member since 2002 • 10542 Posts
@S0lidSnake said:

As for the articles, Digital Foundry did a pretty comprehensive Q&A with the Xbone architects and as it turns out, it's actually weaker than we originally thought since 10% of the GPU is reserved for Kinect and the Snapshot/Picture-in-Picture OS features.

Killer Instinct looks plenty good by itself for me, as is, without that extra 10% of the GPU. Or the EDU or RAM and EDRAM and blah blah technical stuff. I mean, at this point, all these numbers and technical blah bloo's become superfluous to the actual games. The games on these systems will look pretty. That's been settled. I don't think it makes a huge difference at this point if one set of pretty games is marginally more pretty than the other.

I think we'll have a similar situation to the current gen: most games will be about equally good-looking on both, and the exclusive games on the Sony one will have an extra level of sheen and polish over what's possible on the MS box.

#13 S0lidSnake
Member since 2002 • 29001 Posts

@Randolph said:
@S0lidSnake said:

As for the articles, Digital Foundry did a pretty comprehensive Q&A with the Xbone architects and as it turns out, it's actually weaker than we originally thought since 10% of the GPU is reserved for Kinect and the Snapshot/Picture-in-Picture OS features.

Killer Instinct looks plenty good by itself for me, as is, without that extra 10% of the GPU. Or the EDU or RAM and EDRAM and blah blah technical stuff. I mean, at this point, all these numbers and technical blah bloo's become superfluous to the actual games. The games on these systems will look pretty. That's been settled. I don't think it makes a huge difference at this point if one set of pretty games is marginally more pretty than the other.

I think we'll have a similar situation to the current gen: most games will be about equally good-looking on both, and the exclusive games on the Sony one will have an extra level of sheen and polish over what's possible on the MS box.

Killer Instinct runs at 720p. It looks like a current gen game with some neat particle effects.

I agree with the rest.

#14  Edited By Areez
Member since 2002 • 6278 Posts

@S0lidSnake:

Did you read the two articles? Probably not... The disparity in power is not going to be as wide as you think... MS will have fewer issues with things such as latency thanks to the architecture... We'll probably see more of the same next gen as we did this gen... In the beginning it was the power of the PS3 over the 360... And how did that turn out?

The 10% you mention can be worked around by offloading AI and some of the in-game physics to the cloud, which the console is optimized to do. The ability to do this was highlighted in an interview with Titanfall developer Respawn.

#15 Pedro
Member since 2002 • 73962 Posts

@S0lidSnake said:

As for the articles, Digital Foundry did a pretty comprehensive Q&A with the Xbone architects and as it turns out, it's actually weaker than we originally thought since 10% of the GPU is reserved for Kinect and the Snapshot/Picture-in-Picture OS features.

The PS4 would share a similar reservation of resources. This should not come as a surprise to anyone. In fact, this reservation of resources is absolutely necessary if they want a smooth user experience. Look at the PS3: using the XMB while playing a game is horrendous. So all non-gaming features need to have their own resources outside of gaming to run smoothly and seamlessly. After all, the PS4, like the Xbox One, has gigs of RAM allocated for this.

#16 CarnageHeart
Member since 2002 • 18316 Posts

@Areez:

If you want to trust the claims of the two articles more than people actually working on the hardware in question, suit yourself.

http://www.extremetech.com/gaming/166649-ps4-is-50-faster-than-xbox-one-report-game-developers

http://vr-zone.com/articles/indie-devs-call-sonys-ps4-powerful-console-world-due-gpu/59605.html?utm_source=rss&utm_medium=rss&utm_campaign=indie-devs-call-sonys-ps4-powerful-console-world-due-gpu

As I've noted before, it's impressive that Sony made a system which is not only powerful, but easy to program for. Due to the quality and quantity of stuff one needs to put into a game, game development is doomed to be ever more time-consuming, but it's nice that Sony is doing what it can to minimize the negative trend.

http://www.joystiq.com/2013/06/28/cerny-ps4s-time-to-triangle-to-rival-ps1/

#17 S0lidSnake
Member since 2002 • 29001 Posts

@Pedro said:

@S0lidSnake said:

As for the articles, Digital Foundry did a pretty comprehensive Q&A with the Xbone architects and as it turns out, it's actually weaker than we originally thought since 10% of the GPU is reserved for Kinect and the Snapshot/Picture-in-Picture OS features.

The PS4 would share a similar reservation of resources. This should not come as a surprise to anyone. In fact, this reservation of resources is absolutely necessary if they want a smooth user experience. Look at the PS3: using the XMB while playing a game is horrendous. So all non-gaming features need to have their own resources outside of gaming to run smoothly and seamlessly. After all, the PS4, like the Xbox One, has gigs of RAM allocated for this.

Again, based on what evidence? We haven't heard anything from the developers regarding this. Guerrilla Games had a pretty extensive tech slide up, and while it mentions that they are using only 6 of the 8 CPU cores, there was no mention of any GPU resources being reserved away from games.

Yes, the non-gaming features DO have their own resources: they have access to 2 CPU cores. Why would they need an extra 184 Gflops of GPU power? Again, they do not have picture-in-picture and Kinect overheads to worry about. How much GPU power does your PC use to run non-gaming features when playing games? 1%? 2%? The 25% CPU allocation along with the 2 GB of RAM should be more than enough to run non-gaming apps in the background without taking 5-10% of the GPU resources. After all, it's the CPU's job to run apps.

#18  Edited By Areez
Member since 2002 • 6278 Posts

@S0lidSnake:

And you believe that Sony has not allocated resources internally in anticipation of PS Eye use?

#19  Edited By Areez
Member since 2002 • 6278 Posts

@CarnageHeart:

Carnage, these articles were authored by architects who have a better understanding of the internal workings of the chipset design and overall system architecture. The information included in these articles is from the recent Hot Chips conference. I'll take the information and knowledge of hardware/chip architects over an opinion formulated from "anonymous" sources.

From what I have read regarding architecture, it is similar to tuning a race car. You can have an engine with 700 hp, but if the build of the car is not optimized, as in poor aerodynamics, poor suspension or the engine running hot, its peak performance could suffer and it could do worse than a car with a 500 hp engine but better aerodynamics, suspension and so forth.

The point is that many have probably exaggerated the significance of the numbers without understanding how the actual architecture of both systems works.

As Randolph said earlier, we will probably not see much of a difference visually, just as we didn't this generation, except in the case of first-party titles. So in the end the debate over power will mean little.

#20 CarnageHeart
Member since 2002 • 18316 Posts

@Areez:

MS spouts nonsense about balance while they announce a change every other week and while the development tools are famously (and uncharacteristically for MS) hard to use, due in part to them being ever-changing. A few articles written by nongamers who are optimistic about it based on their reading of the spec sheet/what MS told them at a conference mean less to me than the opinions of game developers who are actually working with the hardware.

Sony meanwhile started off talking about ease of development and hardware power. There have been no reversals or tweaks. Developers on and off the record sing the praises of the hardware. Both big publishers and indies are flocking to it, with game announcements being made almost daily.

Going with your car analogy, do you trust the opinion of mechanics who make a living working on a car and understand its real world performance or guys who merely studied the schematics of a car?

#21 Randolph
Member since 2002 • 10542 Posts
@S0lidSnake said:

Killer Instinct runs at 720p. It looks like a current gen game with some neat particle effects.

*shrugs* At least we're not still running 640p upscaled, eh? ;)

#22  Edited By Areez
Member since 2002 • 6278 Posts

@CarnageHeart:

Carnage... unlike many here, I do not get lost in all the hype and talking points about the tweaks MS has made... And since when was improving upon something a bad thing anyway?

What I do trust is the knowledge of hardware architects who study and build these systems. The architects are the mechanics here, not the other way around. You mention Sony and its messaging of ease of development... In comparison to the PS3, what I have read is that it will be easier... However, GDDR5 and all that horsepower does not come without its own potential roadblocks, such as latency and potential bottlenecks which devs may have to figure out workarounds for...

The bigger point here is that both systems are much closer than people realize... And if you take the time to research the architecture of both systems you will see this, as opposed to just reading generic talking points and getting caught up in the "cool" trend of bashing everything MS...

#23  Edited By CarnageHeart
Member since 2002 • 18316 Posts

@Areez:

As for your claim that you trust the 'designers': the designers of the Xbone and the PS4 have differing opinions of the relative power of their systems, and both are on the payroll of the companies in question, so even putting aside ego, they aren't going to damn their own work and praise the other guy's.

Of course, you aren't taking the word of both sets of architects (which is impossible); you're taking the word of the guys who made the Xbone as gospel and ignoring people who 'merely' work on it. Suit yourself.

#24  Edited By Areez
Member since 2002 • 6278 Posts

@CarnageHeart:

No, I am taking the words of architects who do not work for MS but who have detailed the features of said architecture and what the numbers and tech mean... big difference.

#25 Pedro
Member since 2002 • 73962 Posts

@S0lidSnake said:

Again, based on what evidence? We haven't heard anything from the developers regarding this. Guerrilla Games had a pretty extensive tech slide up, and while it mentions that they are using only 6 of the 8 CPU cores, there was no mention of any GPU resources being reserved away from games.

Yes, the non-gaming features DO have their own resources: they have access to 2 CPU cores. Why would they need an extra 184 Gflops of GPU power? Again, they do not have picture-in-picture and Kinect overheads to worry about. How much GPU power does your PC use to run non-gaming features when playing games? 1%? 2%? The 25% CPU allocation along with the 2 GB of RAM should be more than enough to run non-gaming apps in the background without taking 5-10% of the GPU resources. After all, it's the CPU's job to run apps.

I am not sure what evidence you are requesting. Developers did not mention anything about being limited to 5 gigs of RAM, but it was eventually disclosed that they were.

You argue that 2GB of RAM should be more than enough, but at least 3GB is allocated. The Kinect is optional for operation, making this overhead also optional or conditional. The PS4 would have more or less the same amount of non-gaming functions as the Xbox One, so I am not sure why you think that the lack of certain Xbox One features on the PS4 is validation that there would not be a need for reserved GPU resources. You try to justify the absence of this inevitable "possibility" by asking why they would need the 184 Gflops of GPU power, but a similar question can be thrown back at you. Why do they need 3GB of RAM? The DVR functionality for game recording is NOT going to be done on the CPU, so what do you think is going to process all of that beautiful 720p or more of pixel data at 24FPS while playing the game? And that is only one non-gaming application, and a processor-intensive one at that. Add in the user interface and you would have a healthy use of the 184 Gflops you are referencing.

#26 Areez
Member since 2002 • 6278 Posts

@Pedro:

Great post... these are examples of things that a lot of people do not understand... They read articles by beat reporters, see "50% more powerful" and blah blah blah, and take it as gospel, without researching what the architectures of both systems are capable of...

#27 CarnageHeart
Member since 2002 • 18316 Posts

@Areez said:

@CarnageHeart:

No, I am taking the words of architects who do not work for MS but who have detailed the features of said architecture and what the numbers and tech mean... big difference.

Reading schematics (as opposed to drawing them) does not an architect make. Like I said, real-world experience with hardware means more than what a guy divines by reading a spec sheet or watching a PowerPoint presentation.

#28 Areez
Member since 2002 • 6278 Posts

@CarnageHeart:

Again, it's deeper than reading a spec sheet... It is the understanding of how it works... and can be manipulated. Remember that the PS3 was touted as a much superior system, and by all accounts is, compared to the 360. And how did that turn out? The point is, many have placed way too much emphasis on the power without understanding the architecture behind the power...

#29  Edited By CarnageHeart
Member since 2002 • 18316 Posts

@Areez said:

@CarnageHeart:

Again, it's deeper than reading a spec sheet... It is the understanding of how it works... and can be manipulated. Remember that the PS3 was touted as a much superior system, and by all accounts is, compared to the 360. And how did that turn out? The point is, many have placed way too much emphasis on the power without understanding the architecture behind the power...

The PS3 was too hard to develop for and too expensive out of the gate (due to the inclusion of expensive technology with dubious gaming benefits). Things improved dramatically once Sony slashed the price/improved sales and improved the devkits. It's striking that MS is repeating Sony's biggest mistakes.

Expecting developers to learn arcana in order to squeeze the most out of a system isn't realistic in the world of multiplatform development.

#30  Edited By firefox59
Member since 2005 • 4530 Posts

@CarnageHeart said:

@Areez said:

@CarnageHeart:

Again, it's deeper than reading a spec sheet... It is the understanding of how it works... and can be manipulated. Remember that the PS3 was touted as a much superior system, and by all accounts is, compared to the 360. And how did that turn out? The point is, many have placed way too much emphasis on the power without understanding the architecture behind the power...

The PS3 was too hard to develop for and too expensive out of the gate (due to the inclusion of expensive technology with dubious gaming benefits). Things improved dramatically once Sony slashed the price/improved sales and improved the devkits. It's striking that MS is repeating Sony's biggest mistakes.

Expecting developers to learn arcana in order to squeeze the most out of a system isn't realistic in the world of multiplatform development.

That last sentence is one of the main 'negatives' that doesn't make sense. People have been using embedded RAM for years. This isn't some new tech or architecture that MS created, like the Cell processor. It won't be as difficult to develop for as everyone is making it out to be.

The on paper numbers are what they are but in terms of optimization and ease of development, that is all just pointless speculation.

#32 S0lidSnake
Member since 2002 • 29001 Posts

@dvader654 said:

I remember last gen: taking sides before a new gen started led to epic forum battles, fun ones where it wasn't about a bunch of specs. Back in the day it was about who got what exclusive (I remember the day VF went to Xbox, that was huge; one by one PS exclusives were jumping ship), who had the better features, Xbox Live charging you for something that should be free, how $600 was insane. Now it's all about some weird numbers language I don't understand.

I guess it has mostly to do with the fact that the age of the third party exclusive is over.

Last gen had plenty of spec talk. The Cell was touted as a supercomputer that could run microwaves and dishwashers and cure cancer. Sony lied about the PS3 being capable of 2.0 Tflops (the PS4 is barely even 2.0 Tflops). Microsoft inflated their 360 RAM bandwidth to ridiculous numbers (276 GB/s, when the Xbone's RAM bandwidth is around 100 GB/s) and talked about their console having 1.0 Tflops of power (it ended up somewhere around 0.35 Tflops). Then you had years of multiplatform wars where the 360 versions were declared winners because the games looked better.

The point is they talked about this shit last time. But this time around gamers have grown up and are better at reading specs. It also helps that both consoles are using AMD cards which PC gamers have been familiar with for a good 2-3 years now. Now when Microsoft tries to bullshit, we can easily call them out on it. When they say shit like having 6 extra GPU cores will not give PS4 an advantage, we can simply look at PC benchmarks and tell them to go f*ck their mothers because the benchmarks show having more power translates into having a better framerate or higher res or better effects.

And that's the reason MS keeps getting bad press. They keep going on the offensive on GAF, Twitter, and Digital Foundry, downplaying the PS4 advantage at every turn, but this time around we are educated enough to know their claims are absolutely false.

It is really not that complicated if you have ever shopped for a GPU/CPU. The first thing people look at is the Tflops number. Then the benchmarks.

#34  Edited By S0lidSnake
Member since 2002 • 29001 Posts

@Pedro said:


I am not sure what evidence you are requesting. Developers did not mention anything about being limited to 5 gigs of RAM, but it was eventually disclosed that they were. You argue that 2GB of RAM should be more than enough, but at least 3GB is allocated. The Kinect is optional for operation, making this overhead also optional or conditional. The PS4 would have more or less the same amount of non-gaming functions as the Xbox One, so I am not sure why you think that the lack of certain Xbox One features on the PS4 is validation that there would not be a need for reserved GPU resources. You try to justify the absence of this inevitable "possibility" by asking why they would need the 184 Gflops of GPU power, but a similar question can be thrown back at you. Why do they need 3GB of RAM? The DVR functionality for game recording is NOT going to be done on the CPU, so what do you think is going to process all of that beautiful 720p or more of pixel data at 24FPS while playing the game? And that is only one non-gaming application, and a processor-intensive one at that. Add in the user interface and you would have a healthy use of the 184 Gflops you are referencing.

I am talking about actual Sony documents distributed to Developers that clearly state 18 CUs are available for games, and MS leaked documents that clearly state 10% of the GPU is reserved for the OS. Then we have Xbone architects confirming this 10% allocation verbally. That's the evidence I am requesting. You have none. You are just going by your gut feeling.

The RAM example does not apply here. The PS4 got a RAM increase fairly late. Hell, even their first-party devs did not know about the RAM increase until the PS4 reveal. The point is that is something they had wiggle room on. Remember, the devs were promised 3.5 out of the 4GB they had originally. If you are saying that they have since looked at the Xbone and decided to reserve some of the GPU power, then I cannot agree, because by now it is too late for them to go back to the KZ devs and say 'oh btw, you only have 90% of the total GPU power'. This simply won't happen.

Kinect's OS reserve is NOT optional or conditional. That 10% of the GPU is simply not available to Crytek just because their game does not use Kinect or because some of the gamers have turned off their Kinect. It simply does not work like that. If it was truly optional, MS would not have reserved it. Developers do NOT have access to that 10% regardless of whether or not Kinect is optional until MS unlocks it for everyone.

Both PS4 and Xbox One have dedicated video processors that handle the DVR functionality. It was confirmed at the reveal that the streaming and recording features will have no effect on the game's resources.

And the reason I believe the PS4's lack of Kinect and Snapshot features means it doesn't need up to 10% of the GPU resources is that the Xbone architects themselves said the 10% of the GPU is being used for the GPGPU calculations of the Kinect and Snapshot features. These are their words. Their examples. Their reasoning. Why in the world would Sony decide to copy MS's Snapshot and Kinect features this late in the dev cycle after seeing the response MS got? Why would they spend 5 years designing something for gamers and then f*ck over their devs five months before their games launch?

#35 Pedro
Member since 2002 • 73962 Posts

@S0lidSnake said:

I am talking about actual Sony documents distributed to Developers that clearly state 18 CUs are available for games, and MS leaked documents that clearly state 10% of the GPU is reserved for the OS. Then we have Xbone architects confirming this 10% allocation verbally. That's the evidence I am requesting. You have none. You are just going by your gut feeling.

The RAM example does not apply here. The PS4 got a RAM increase fairly late. Hell, even their first-party devs did not know about the RAM increase until the PS4 reveal. The point is that is something they had wiggle room on. Remember, the devs were promised 3.5 out of the 4GB they had originally. If you are saying that they have since looked at the Xbone and decided to reserve some of the GPU power, then I cannot agree, because by now it is too late for them to go back to the KZ devs and say 'oh btw, you only have 90% of the total GPU power'. This simply won't happen.

Kinect's OS reserve is NOT optional or conditional. That 10% of the GPU is simply not available to Crytek just because their game does not use Kinect or because some of the gamers have turned off their Kinect. It simply does not work like that. If it was truly optional, MS would not have reserved it. Developers do NOT have access to that 10% regardless of whether or not Kinect is optional until MS unlocks it for everyone.

Both PS4 and Xbox One have dedicated video processors that handle the DVR functionality. It was confirmed at the reveal that the streaming and recording features will have no effect on the game's resources.

And the reason I believe the PS4's lack of Kinect and Snapshot features means it doesn't need up to 10% of the GPU resources is that the Xbone architects themselves said the 10% of the GPU is being used for the GPGPU calculations of the Kinect and Snapshot features. These are their words. Their examples. Their reasoning. Why in the world would Sony decide to copy MS's Snapshot and Kinect features this late in the dev cycle after seeing the response MS got? Why would they spend 5 years designing something for gamers and then f*ck over their devs five months before their games launch?

If you allocate a specified amount of resources for non-gaming applications, it would be factual to state that DVR functionality has no effect on gaming resources; thus the reason for this allocation. These systems are designed to be cost-effective, and it is quite logical that they would utilize all-in-one solutions rather than rely on dedicated processors. It is the reason both companies moved to APUs. I have not seen any documentation indicating that the PS4 would be sporting a dedicated video encoder. I could be wrong or have missed published documentation clearly stating the use of a dedicated encoder. However, if the system lacks a dedicated encoder, then encoding would have to be processed on the GPU without cutting into game performance, and in order to do that you would need to reserve GPU resources. The same would apply to the user interface.

I am not claiming or hinting that Sony is making moves based on Microsoft. A large chunk of their decisions with regard to the system would have already been made prior to MS's announcements. These are all computers and they all have to abide by the rules of computing. If Sony wants a responsive user interface and seamless DVR functionality, GPU reservation is required (at the very minimum for the user interface).

It is my understanding that the Kinect is optional for Xbox One functionality. Are you saying that the system cannot function without the Kinect? Or are you saying that the resources for the Kinect, even if it's not being used, are inaccessible? If the latter is the case, I would like to see where this has been stated. Again, I could be very wrong, but it would be illogical for MS to make the resources reserved for the Kinect inaccessible to developers who would not be using the Kinect functionality. Prior to Sony revealing their memory allocation, you strongly believed that they would not be reserving as much memory as Microsoft, only to be surprised that they were doing exactly the same. The differences between the systems are trivial. They are both going to operate in the same manner. Sony has just been more tight-lipped than MS, but the inner workings are going to reflect those of the Xbox One. Also, the last time I checked, the PS4 was initially planned to include the Eyetoy, and it was a last-minute decision to pull this inclusion. Since this was originally part of their plan, all of this would not be a last-minute inclusion but an inclusion from the start.

You say gamers have gotten smarter with regard to specs, and I beg to differ. They are still falling for the smoke-and-mirrors trick. Both systems are OK spec-wise. Neither one outshines the other. The only really tangible difference is the price; beyond that is the insignificant graphics power difference.

#36  Edited By S0lidSnake
Member since 2002 • 29001 Posts

@Pedro, Both consoles have dedicated video and audio processors. You keep talking about what is logical instead of simply looking at the design of the PS4. Here:

"For example, by having the hardware dedicated unit for audio, that means we can support audio chat without the games needing to dedicate any significant resources to them. The same thing for compression and decompression of video." The audio unit also handles decompression of "a very large number" of MP3 streams for in-game audio, Cerny added.

OK, so now that that is settled, can we at least agree that the PS4 would not require GPU resources for the encoding of recorded videos?

I am saying that the 10% of reserved resources for the Kinect is indeed inaccessible to developers for graphics. Why do you think I have been bitching about MS's design choices for the past six or so months? They designed it as an entertainment unit first, at the expense of games. Here is the Kotaku article which first leaked this:

1) Running: The game is loaded in memory and is fully running. The game has full access to the reserved system resources, which are six CPU cores, 90 percent of GPU processing power, and 5 GB of memory. The game is rendering full-screen and the user can interact with it.

http://kotaku.com/the-five-possible-states-of-xbox-one-games-are-strangel-509597078

It was then confirmed by Xbone Architects who said they 'plan' to make 'some' of this reserved space available to developers 'in the future'. I posted this thread and you got into an argument with me. What else did you think it meant?

Xbox One reserves 10 per cent of graphics resources for Kinect and apps functionality, Digital Foundry can confirm, with Microsoft planning to open up this additional GPU power for game development in the future. This, and further graphics and performance-based information was revealed during our lengthy discussions with two of the architects behind the Xbox One silicon.

http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

Again, can we agree that this 10% of GPU processing power is inaccessible to developers as of today? And just because MS made Kinect optional does not mean they also went back and removed this reservation, because they did not. They 'plan to in the future'. I also can't believe you made the Eyetoy comparison. But let's talk about it some other time.

Lastly, I think the only ones falling for smoke and mirrors are people who believe the convoluted Xbone architecture will somehow help bridge the 600 Gflop gap between the PS4 and Xbox One. People who look at GDDR5 and say ESRAM has lower latency and imply it's better for video games than motherf*cking GDDR5. People who hilariously make sports car analogies implying that the Xbox is more efficient while completely ignoring that the PS4 also has dedicated audio and video processors, direct access to RAM and, this is a big one, 64 compute queues vs 16 for the Xbox One, making the PS4 a much more efficient processor by default.

I find it hilarious that you disregard PC benchmarks. Here are the benchmarks of two of the best matched AMD GPUs we have and they show a massive framerate advantage. The AMD 7850 (1.76 Tflops) vs the 7770 (1.28 Tflops). (PS4 1.84 Tflops vs Xbone 1.31 Tflops or rather 1.18 Tflops since 10% of it is reserved by the OS)

http://www.anandtech.com/bench/product/549?vs=536
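Purely as back-of-the-envelope arithmetic, here is where that 1.18 Tflops figure comes from, taking the 1.31 Tflops theoretical peak and the 10% reservation quoted in this thread at face value (a sketch, not an official breakdown):

    # Effective GPU throughput left for games after the reported 10% OS/Kinect reservation.
    # Both numbers are the ones quoted in this thread, not official figures.
    raw_tflops = 1.31        # Xbox One theoretical peak
    os_reservation = 0.10    # share reportedly held back for Kinect/Snapshot

    available_for_games = raw_tflops * (1 - os_reservation)
    print(f"{available_for_games:.2f} Tflops available for games")  # ~1.18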

#37  Edited By S0lidSnake
Member since 2002 • 29001 Posts

Here is the GPU comparison. Please tell me this makes both consoles the same.

PS4 - Xbox One - Advantage

  • 18 Compute Units - 12 Compute Units - 50%
  • 72 Texture Units - 48 Texture Units - 50%
  • 1152 Shader Cores - 768 Shader Cores - 50%
  • 32 ROPs - 16 ROPs - 100%
  • 1.84 Tflops - 1.18 Tflops (available for Games) - 55%
  • 64 Compute Queues - 16 Compute Queues - 400%

Look at this massive 50% framerate advantage in the PC comparison of GPUs similar in power to the Xbox and PS4 GPUs.

7850 - 7770 - Advantage

  • 64 Texture Units - 40 Texture Units - 60% - 10% more than PS4/Xbone
  • 1024 Shader Cores - 640 Shader Cores - 60% - 10% more than PS4/Xbone advantage
  • 32 ROPs - 16 ROPs - 100% Same as PS4/Xbone.
  • 1.76 Tflops - 1.28 Tflops - 37% - 18% Less than the PS4/Xbone advantage.

http://www.anandtech.com/bench/product/549?vs=536
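For what it's worth, those advantage percentages fall straight out of the raw figures. A minimal sketch of the arithmetic, using the numbers quoted in this thread rather than official spec sheets (note that 64 vs 16 compute queues is a 4x ratio, i.e. 300% more; the "400%" above reads the multiple as a percentage):

    # Relative advantage from the spec pairs quoted above: ratio = ps4 / xbone.
    specs = {
        "Compute Units":  (18, 12),
        "Texture Units":  (72, 48),
        "Shader Cores":   (1152, 768),
        "ROPs":           (32, 16),
        "Tflops (games)": (1.84, 1.18),   # works out to ~56%, rounded to 55% above
        "Compute Queues": (64, 16),
    }

    for name, (ps4, xbone) in specs.items():
        ratio = ps4 / xbone
        print(f"{name}: {ratio:.2f}x ({(ratio - 1) * 100:.0f}% more)")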

#38  Edited By Teuf_
Member since 2004 • 30805 Posts

Just an FYI, all GCN-based GPUs have a hardware H.264 encoder that can handle 1080p at 60fps.

#39 GodModeEnabled
Member since 2005 • 15314 Posts

Are games going to be 1080p as standard next gen instead of 720p? -- random question.

#40  Edited By Shame-usBlackley
Member since 2002 • 18266 Posts

@S0lidSnake said:

Here is the GPU comparison. Please tell me this makes both consoles the same.

PS4 - Xbox One - Advantage

18 Compute Units - 12 Compute Units - 50%

72 Texture Units - 48 Texture Units - 50%

1152 Shader Cores - 768 Shader Cores - 50%

32 ROPs - 16 ROPs - 100%

1.84 Tflops - 1.18 Tflops (available for Games) - 55%

64 Compute Queues - 16 Compute Queues - 400%

Look at this massive 50% framerate advantage in the PC comparison of GPUs similar in power to the Xbox and PS4 GPUs.

7850 - 7770 - Advantage

64 Texture Units - 40 Texture Units - 60% - 10% more than PS4/Xbone

1024 Shader Cores - 640 Shader Cores - 60% - 10% more than PS4/Xbone advantage

32 ROPs - 16 ROPs - 100% Same as PS4/Xbone.

1.76 Tflops - 1.28 Tflops - 37% - 18% Less than the PS4/Xbone advantage.

http://www.anandtech.com/bench/product/549?vs=536

That looks pretty self-explanatory to me.

I still like Pedro's argument that they're both the same, though -- it's kind of like watching someone try to build a sand castle in a tornado. He builds up an argument that sounds nice, and then the specs just blow it all away. :)

Don't get me wrong, I like seeing the minority view advocated, but come on, you have to willfully distrust all those specs to believe it.

#41  Edited By S0lidSnake
Member since 2002 • 29001 Posts

@GodModeEnabled said:

Are games going to be 1080p as standard next gen instead of 720p? -- random question.

Nope. While both consoles can do 1080p, developers may choose to add more detail and more effects to make their games look pretty rather than output them at 1080p. You are already seeing that with Ryse, which is now 900p instead of 1080p. Killer Instinct is 720p, but that's probably more of a testament to the team that's making it than to the hardware.

I think all the PS4 exclusives are 1080p except for The Order, which is doing this weird letterbox thing that some movies do. I have a feeling it was mostly due to them just wanting to use the GPU resources to put in more detail and more effects. I suspect as the gen goes on, we will see more and more devs do 900p.

They will lose a lot of detail at 720p, so I highly doubt we'll see 720p be the standard. Most games will be around the 900p mark.

#42 Pedro
Member since 2002 • 73962 Posts

@S0lidSnake:

I know you fancy these long, drawn-out arguments. The reality is that you are hell-bent on looking at things from a specific angle with almost no regard for other elements in the equation. The GCN architecture, according to AMD, features hardware encoding which utilizes the CUs (Compute Units) for video encoding. It is also mentioned here on page 16, where it shows a diagram of the hybrid encoding of video content. This technology is reliant on the GPU because of the programmable nature of the GCN architecture. Feel free to argue against this. One of the main selling points over the last two or three years for high-end video cards has been hardware encoding using the raw parallel processing of the GPU.

Since AMD docs indicate that the GPU would be handling video encoding and decoding, it is expected that, since both systems are based on the same architecture, the same reservations would exist for the PS4. Fortunately the PS4 has more CUs, but that does not change the need for this reservation. Again, feel free to link to docs indicating that AMD's encoding solution is NOT GPU-based.

My smoke-and-mirrors comment directly relates to both the PS4 and the Xbox One. I know for a fact that there are many who think the differences are going to be profound or noticeable, and in reality they will not be. The Xbox One has 12 CUs, supposedly 14 during manufacturing with 2 locked/disabled. It's also interesting that this was mentioned:

"...this may explain why Sony considered the optimal balance 14 CU with a lower clock." in this article.

Posting comparable PC cards is a good reference but not a true indicator of console performance, because there are other factors to consider. The Xbox One sports 12 CUs and not 10 like the 7770 card you are referencing. The 7850 card referenced for the PS4 has 14 CUs while the PS4 supposedly is using 18, but the DailyTech article indicates optimal performance at 14 CUs at a lower clock speed. Power consumption and heat are other factors that would affect the performance of both systems. In the end the PS4 is still going to be the faster system, but not remotely close to your ludicrous 50%. You don't look at raw theoretical numbers and subtract to get an idea of real-world performance. It does not work that way. The point I am simply making is that in the end the performance difference is NOT going to be 50% and is not going to be anything worth writing about.

#43  Edited By S0lidSnake
Member since 2002 • 29001 Posts

@Pedro said:

I know you fancy these long drawn arguments.

I do not. I do find them frustrating though. Especially when people fail to understand simple concepts like additional processors and GPU reservations for the OS.

What difference does it make if GCN GPUs have support for video encoding? Both the Xbox One and PS4 have DEDICATED processors for video encoding that will either do all the work themselves or help take most of the load off of the GPU. There is a reason why both Sony and MS chose to add an extra video processor on the mobo, and you are having a tough time understanding why.

And again, that link you posted references the original Digital Foundry interview with the Xbox One architects, who specifically state that the 10% of the GPU was reserved for Kinect and Snapshot features. They make no mention of reserving it for video encoding, so why in the world do you keep lumping it in with the GPU reservation? Even if they are using the GPU to help with the video encoding, the hit on the GPU is so minor that neither MS nor Sony would comment on it.

So you know for a fact that the 50-100% advantage I showed in the benchmark above is not profound? I find it hilarious that you take a comparison that FAVORS the Xbox One and use it to downplay the PS4 advantage. There is absolutely NO evidence that performance somehow hits a bottleneck at 14 CUs. AMD's entire GCN lineup is based around increasing the CU count to improve performance. I can post benchmark after benchmark proving this, but facts and evidence are not your forte.

That Sony line actually comes from the Digital Foundry article where the Xbox One architects, in a lousy attempt to downplay the PS4 CU advantage, reference a leaked Sony document from 2012 which encouraged developers to take advantage of the GPGPU compute functions instead of simply using the GPU for rendering. Again, there is no evidence whatsoever that performance does not scale in a linear fashion after 14 CUs. MS, however, conveniently forgot to mention they only have 16 Render Output Units compared to the PS4's and 7850's 32 ROPs. Hmm... you have to wonder if that 100% advantage had anything to do with them not seeing the results they wanted to see.

It seems you are not going to accept any of this, so I may be barking up the wrong tree here. I suppose there is nothing for us to do other than wait another month and see the difference ourselves. Here's what I am predicting: the PS4 will see an advantage in either resolution OR framerate or better effects/AA. Given that most developers will likely lock the framerate at either 30 or 60 fps, I'd predict that if the PS4 multiplat is running at 1080p then the Xbox One version will run at 900p. Or it will have better effects/AA and whatnot.

BTW, that comes out to a 44% increase in pixels. Feel free to downplay it.
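For reference, the 44% figure checks out if "900p" means a 1600x900 framebuffer (a quick sketch with those assumed dimensions):

    # Pixel-count comparison: 1920x1080 vs an assumed 1600x900 ("900p") target.
    full_hd = 1920 * 1080        # 2,073,600 pixels
    nine_hundred_p = 1600 * 900  # 1,440,000 pixels

    increase = (full_hd / nine_hundred_p - 1) * 100
    print(f"1080p has {increase:.0f}% more pixels than 900p")  # ~44%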

#44 Pedro
Member since 2002 • 73962 Posts

@S0lidSnake said:

I do not. I do find them frustrating though. Especially when people fail to understand simple concepts like additional processors and GPU reservations for the OS.

What difference does it make if GCN GPUs have support for video encoding? Both the Xbox One and PS4 have DEDICATED processors for video encoding that will either do all the work themselves or help take most of the load off of the GPU. There is a reason why both Sony and MS chose to add an extra video processor on the mobo, and you are having a tough time understanding why.

And again, that link you posted references the original Digital Foundry interview with the Xbox One architects, who specifically state that the 10% of the GPU was reserved for Kinect and Snapshot features. They make no mention of reserving it for video encoding, so why in the world do you keep lumping it in with the GPU reservation? Even if they are using the GPU to help with the video encoding, the hit on the GPU is so minor that neither MS nor Sony would comment on it.

So you know for a fact that the 50-100% advantage I showed in the benchmark above is not profound? I find it hilarious that you take a comparison that FAVORS the Xbox One and use it to downplay the PS4 advantage. There is absolutely NO evidence that performance somehow hits a bottleneck at 14 CUs. AMD's entire GCN lineup is based around increasing the CU count to improve performance. I can post benchmark after benchmark proving this, but facts and evidence are not your forte.

That Sony line actually comes from the Digital Foundry article where the Xbox One architects, in a lousy attempt to downplay the PS4 CU advantage, reference a leaked Sony document from 2012 which encouraged developers to take advantage of the GPGPU compute functions instead of simply using the GPU for rendering. Again, there is no evidence whatsoever that performance does not scale in a linear fashion after 14 CUs. MS, however, conveniently forgot to mention they only have 16 Render Output Units compared to the PS4's and 7850's 32 ROPs. Hmm... you have to wonder if that 100% advantage had anything to do with them not seeing the results they wanted to see.

It seems you are not going to accept any of this, so I may be barking up the wrong tree here. I suppose there is nothing for us to do other than wait another month and see the difference ourselves. Here's what I am predicting: the PS4 will see an advantage in either resolution OR framerate or better effects/AA. Given that most developers will likely lock the framerate at either 30 or 60 fps, I'd predict that if the PS4 multiplat is running at 1080p then the Xbox One version will run at 900p. That comes out to a 44% increase in pixels. Feel free to downplay it.

No! You have demonstrated that you fail to understand simple concepts like additional processors and GPU reservation because you still argue that MS and Sony DO NOT KNOW what they are doing with the system THEY design and that their current reservation is ILLOGICAL because you know best. :)

It seems like you are more hell-bent on arguing the problems with the Xbox One than the meaningful advantage the PS4 would have. The mere fact you were playing around with raw numbers to draw a conclusion is simply absurd. It's akin to saying 2 CPUs == 200% performance.

In the end I really don't care what either company is claiming about their relatively underwhelming hardware. These debates occur each time a new system is released, and it is proven time and time again that the claims are always overstated. So I leave you to liberally get bent. :)

#45  Edited By S0lidSnake
Member since 2002 • 29001 Posts

@Pedro said:

No! You have demonstrated that you fail to understand simple concepts like additional processors and GPU reservation because you still argue that MS and Sony DO NOT KNOW what they are doing with the system THEY design and that their current reservation is ILLOGICAL because you know best. :)

What? How did you get that from what I wrote? Everything I have written in reply to you is to show you that you don't understand how much is reserved and for what purpose, and what a 50% spec advantage translates to in similar PC cards that use the same damn GCN architecture these two GPUs are using. How in the world did you get the idea that I am arguing that Sony and MS do not know what they are doing and that the 10% reservation is illogical?

No wonder this went on for this long. You seem to have created this alternate S0lidSnake, and this figment of your imagination is arguing the complete opposite of what I have been talking about.

#46  Edited By Areez
Member since 2002 • 6278 Posts

@Pedro:

You make a great point: much of this is overstated, as it was last gen. In terms of visuals, we will probably see more of the same as this console generation, where the games between the two consoles will not be as dramatically different as many think. Which in the end is going to make Solid's head pop off when the PS4 doesn't look 50% better. :-)

#47 ronvalencia
Member since 2008 • 29612 Posts

@Areez said:

Just recently read two articles pertaining to the Xbox Ones internal design. After reading this, I was left with the impression that the power gap between the One and PS4 may not be as big as we thought.

http://www.ubergizmo.com/2013/08/xbox-one-processor-hot-chips-conference/

http://techreport.com/news/25288/xbox-one-soc-rivals-amd-tahiti-for-size-complexity

The Radeon HD R9 280X (Tahiti XT2) has both memory bandwidth and a CU count that murder the X1's GCN solution.

#48  Edited By ronvalencia
Member since 2008 • 29612 Posts
@S0lidSnake said:

Here is the GPU comparison. Please tell me this makes both consoles the same.

PS4 - Xbox One - Advantage

  • 18 Compute Units - 12 Compute Units - 50%
  • 72 Texture Units - 48 Texture Units - 50%
  • 1152 Shader Cores - 768 Shader Cores - 50%
  • 32 ROPs - 16 ROPs - 100%
  • 1.84 Tflops - 1.18 Tflops (available for Games) - 55%
  • 64 Compute Queues - 16 Compute Queues - 400%

Look at this massive 50% framerate advantage in the PC comparison of GPUs similar in power to the Xbox and PS4 GPUs.

7850 - 7770 - Advantage

  • 64 Texture Units - 40 Texture Units - 60% - 10% more than PS4/Xbone
  • 1024 Shader Cores - 640 Shader Cores - 60% - 10% more than PS4/Xbone advantage
  • 32 ROPs - 16 ROPs - 100% Same as PS4/Xbone.
  • 1.76 Tflops - 1.28 Tflops - 37% - 18% Less than the PS4/Xbone advantage.

http://www.anandtech.com/bench/product/549?vs=536

7770's result doesn't reflect X1's GCN

1. 7770's ~1.0 billion triangles per second vs X1's 1.7 billion triangles per second.

2. 7770's 72GB/s memory bandwidth vs X1's 68 GB/s memory bandwidth + 204 GB/s ESRAM (~165 GB/s for read and write from 16 ROPs).

---

PS4's 32 ROPs wouldn't be fully utilized, i.e. they're gimped by its 176 GB/s (theoretical) memory bandwidth.

The 7950 has the same 32 ROPs as the PS4, but the 7950 BE has a higher 240 GB/s (theoretical) memory bandwidth to support its 32 ROPs' read and write memory operations.

---

http://www.tomshardware.com/reviews/768-shader-pitcairn-review,3196-5.html

Prototype Radeon HD 7850 with 12 CUs or 768 stream processors and 153.6 GB/s (theoretical) memory bandwidth:

  • 12 Compute Units (CU) at 860 MHz
  • 48 Texture Units (TMU)
  • 768 Shader Cores

#49 firefox59
Member since 2005 • 4530 Posts

Solid, read what this last dude just posted. What Pedro was saying is that you have to understand how those numbers work together. The system will only be as fast as its worst bottleneck. Some of those numbers, like the cores and the speeds, are affected by each other and multiplied. Others are independent but still have to function within the framework of the architecture. It isn't a 1:1 comparison. I'm sure you understand this and will refrain from posting easy-to-understand comparisons. Idk why you're arguing about it.

#50 S0lidSnake
Member since 2002 • 29001 Posts

@ronvalencia said:

1. 7770's ~1.0 billion triangle per second vs X1's 1.7 billion triangle per second.

2. 7770's 72GB/s memory bandwidth vs X1's 68 GB/s memory bandwidth + 204 GB/s ESRAM (~165 GB/s for read and write from 16 ROPs).

---

PS4's 32 ROPs wouldn't be fully utilized i.e. it's gimped by it's 176 GB/s (theoretical) memory bandwidth.

7950 has the same 32 ROPs as PS4, but the 7950 BE has higher 240 GB/s (theoretical) memory bandwidth to support it's 32 ROP's read and write memory operations.

---

http://www.tomshardware.com/reviews/768-shader-pitcairn-review,3196-5.html

Prototype Radeon HD 7850 with 12 CU or 768 stream processors with 153.6GB/s (theoretical) memory bandwidth.

•12 Compute Units (CU) at 860Mhz

•48 Texture Units (TMU)

•768 Shader Cores

lol No. It does NOT have 48 Texture units. It has 64 compared to the X1's 48. Check out the comparison sheet on Page 2 of the article you linked.

I also like how you conveniently forgot to mention that this 7850 they are using to test has 32 ROPs instead of 16 that the X1 has.

If anything this proves my point that having more Texture Units (PS4 actually has 72), more Shader units (1152) and more ROPS (32) will result in a pretty substantial performance difference.

You do bring up a good point about the 7770 having only 72 GB/s memory bandwidth. That's something I completely overlooked. However, I'd strongly disagree that the X1 has 270 GB/s memory bandwidth like the X1 architects have been talking about. The ESRAM can do 109 GB/s, but you simply cannot add up the DDR3 bandwidth and the ESRAM bandwidth. Quite a few tech websites like AnandTech and Ars Technica have questioned MS's claims. MS did walk it back in the DF interview and said it's close to 134 GB/s in their 'tests'.

The fact is that devs HAVE been bitching about the ESRAM, and you can ask anyone who is not an MS employee: they would pick GDDR5 over ESRAM any day of the week and twice on Sunday.

Lastly, my comparison favored the X1 by a good 200 Gflops and it still blew it away. The Tflops number on the 7770 is 1.28 while the X1 only has 1.18 available for gaming. The Tflops number on the 7850 is 1.76 while the PS4 has 1.84 Tflops. So this comparison was skewed more than 200 Gflops in X1's favor.

And before you say Tflops number does not matter, it takes the stream processors and the clock speed into account.

853*768*2=1.31 Tflops for the X1.

800*1152*2=1.84 Tflops for the PS4.

My comparison had the 7770 running at 1000 MHz and the 7850 running at 860 MHz. If you use MS's own claims that increasing clock speed gives a bigger advantage than adding CUs (more stream processors, more texture units and potentially more ROPs), then you shouldn't have a problem with that comparison.

1000*640*2= 1.28Tflops for the 7770.

860*1024*2= 1.76Tflops for the 7850.
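Those numbers line up with the usual theoretical-peak formula for GCN parts, FLOPS = clock x shader cores x 2 (one fused multiply-add per core per cycle). A small sketch using the clocks and shader counts quoted above:

    # Theoretical single-precision peak: clock (MHz) * shader cores * 2 ops per cycle.
    def tflops(clock_mhz: float, shaders: int) -> float:
        return clock_mhz * shaders * 2 / 1_000_000

    print(f"Xbox One: {tflops(853, 768):.2f} Tflops")   # ~1.31
    print(f"PS4:      {tflops(800, 1152):.2f} Tflops")  # ~1.84
    print(f"HD 7770:  {tflops(1000, 640):.2f} Tflops")  # ~1.28
    print(f"HD 7850:  {tflops(860, 1024):.2f} Tflops")  # ~1.76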