Not Expecting Xbox One eSRAM Bottleneck To Be Patched


#1  Edited By scatteh316
Member since 2004 • 10273 Posts

Having developed Ryse: Son of Rome for the Xbox One and achieved an excellent visual standard despite working at a restricted resolution, Crytek obviously knows more than a thing or two about properly taking advantage of the console's power. But as more developers speak out about the limitations that eSRAM poses, what are Crytek's thoughts on the matter? How does it plan to circumvent them?

“I can’t speak to whether this will be patched or improved in the future, but I wouldn’t expect it as a developer,” he said to GamingBolt. “Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.”

LINK

So Crytek confirms that eSRAM is a bottleneck and that it's highly likely Microsoft can't fix it with a patch or clever software, something anyone with common sense has known all along anyway.

That is fail.


#2  Edited By -CC-
Member since 2006 • 2048 Posts

Xbox, turn off. RIP 2013-2014, you won't be missed.


#3  Edited By MK-Professor
Member since 2009 • 4218 Posts

It has an ultra low-end GPU (HD7790); that can't be fixed with a patch or clever software.


#4  Edited By GotNugz
Member since 2010 • 681 Posts

I think we all knew that you can't patch the eSRAM. I'll say it again: MS never had the eSRAM as part of the original design and only added it after Sony dropped 8GB of GDDR5 on their head. No amount of trickery will turn that Bonaire POS GPU into anything special; what were the engineers thinking?


#5 jhcho2
Member since 2004 • 5103 Posts

A bottleneck isn't a bug that can be resolved just by patches. Bottlenecks in this context are process-oriented, not really software.


#6 deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

Well all but the hardcore lems could see this a mile off really, quite funny after they tried to spin it as a benefit.


#7 Mr-Kutaragi
Member since 2013 • 2466 Posts

How can you patch a hardware chip? They could only change the architecture in a future revision, but that would anger early adopters, be costly, and force devs to develop games for both versions. Very messy to do.

Games will be optimized in the future and will look good. I don't know why there are still rumors over this issue.


#8 Spitfire-Six
Member since 2014 • 1378 Posts

Gaming news is not slow enough that we need to revisit eSRAM. So what's really going on?


#9  Edited By SolidTy
Member since 2005 • 49991 Posts

I learned in SW that you could magically upgrade physical hardware, add RAM and GPU chips and stuff, if you bought Xbox Live Gold annually and got cloud powerz.

DX12 is due by Xmas 2015; maybe DX12 can add new bottleneck chipz that Xbone needs. I think Xbone needs more bottleneck chips, whatever those are. Maybe none of these fairy powers are enough, if this dev speaks the truth. I always imagined that the Xbone was an Xman mutant, and its mutant power was going to allow it to turn into an Xbox Two in the days of future past.

inb4Nintendo&SonypaidRysedeveloperoff


#10  Edited By scatteh316
Member since 2004 • 10273 Posts

@GotNugz said:

I think we all knew that you can't patch the eSRAM. I'll say it again: MS never had the eSRAM as part of the original design and only added it after Sony dropped 8GB of GDDR5 on their head. No amount of trickery will turn that Bonaire POS GPU into anything special; what were the engineers thinking?

No dude, eSRAM was always there; you can't re-design a whole GPU core to add eSRAM in the little time they had.


#11  Edited By Sollet
Member since 2003 • 8288 Posts

Guys... Guys... There's a hidden second GPU in there. Somewhere.


#13 NathanDrakeSwag
Member since 2013 • 17392 Posts

Lems have been Xboned again.


#14  Edited By deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@FastRobby said:

The developer isn't saying that it is a bottleneck... He is just saying that it probably isn't going to be improved, but I don't see him telling that this is a problem. After all that great Xbox One news, cows must be searching for something to bitch about

"Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.”

He describes the eSRAM as a hardware limitation and an area that requires a workaround. There's no searching needed when the evidence is in the second quote.

Sure, he doesn't describe it as a bottleneck, but that could imply the other parts are just as flawed, so pick your problem: shitty hardware in general or an eSRAM bottleneck?


#15  Edited By HitmanActual
Member since 2013 • 1351 Posts

So there really is no secret sauce?


#17 silversix_
Member since 2010 • 26347 Posts

But the real question is, will they be able to patch the POTATO7770? The answer is: you don't patch a potato and expect anything in return.


#18  Edited By killzowned24
Member since 2007 • 7345 Posts

Limited memory and GPU power = weak sauce Xbone games, just as we have been saying and as every single game on it has shown so far, including ones not even released yet like Sunset Overdrive: http://www.videogamer.com/xboxone/sunset_overdrive/news/sunset_overdrive_is_currently_sub-1080p.html


#21  Edited By Krelian-co
Member since 2006 • 13274 Posts

@Salt_The_Fries said:

Before you jerk off to death and choke on your own cum in a fit of blind joy, ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best. I'd be happy if Ryse-quality graphics became standard this gen.

lol why so salty? and double post, calm down take five, breathe in breathe out, make more excuses.


#22 tdkmillsy
Member since 2003 • 6617 Posts

Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.

Early SDKs were used to create Ryse and it looks ace. Microsoft has produced and will keep producing better drivers and SDKs. Ryse 2 will look better (and hopefully play better).

Moving forward, developers will get more from the hardware; how do you think games always end up looking better? eSRAM used well can have a big effect, just like using the CPU well. Of course it would be better to have more of it, but to say it's the end is just lame.

Games are already getting better and closer to the PS4. Will they always be behind? Yes. By a lot? Nope.


#24  Edited By Scipio8
Member since 2013 • 937 Posts

The bottleneck is the software, not the hardware, so yes, it can be improved. The next SDK will do that and allow overflow from eSRAM to DDR3 RAM while keeping only the assets that require high bandwidth in the eSRAM, swapping as needed. But cows gonna cow.
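For what it's worth, the "keep the high-bandwidth targets in eSRAM, spill the rest to DDR3" idea boils down to a budgeting pass over render targets. A minimal sketch of such a heuristic (plain Python, hypothetical names and sizes, not any actual Xbox One SDK API) could look like this:

# Toy placement heuristic: greedily keep the most bandwidth-hungry render
# targets inside the 32 MiB eSRAM budget and spill the rest to DDR3.
# Hypothetical data and names -- not an actual SDK interface.
from dataclasses import dataclass

ESRAM_BUDGET = 32 * 1024 * 1024  # bytes

@dataclass
class RenderTarget:
    name: str
    size_bytes: int
    bandwidth_score: float  # estimated reads+writes per frame (higher = hotter)

def place_targets(targets):
    """Return (in_esram, in_ddr3), placing the hottest targets into eSRAM first."""
    in_esram, in_ddr3, used = [], [], 0
    for rt in sorted(targets, key=lambda t: t.bandwidth_score, reverse=True):
        if used + rt.size_bytes <= ESRAM_BUDGET:
            in_esram.append(rt)
            used += rt.size_bytes
        else:
            in_ddr3.append(rt)  # overflow: this target lives in main memory
    return in_esram, in_ddr3

# Example frame at 900p (sizes and scores are made up for illustration)
frame = [
    RenderTarget("depth",      1600 * 900 * 4,  9.0),
    RenderTarget("gbuffer0",   1600 * 900 * 4,  8.0),
    RenderTarget("gbuffer1",   1600 * 900 * 4,  7.5),
    RenderTarget("hdr_colour", 1600 * 900 * 8,  6.0),
    RenderTarget("shadow_map", 2048 * 2048 * 4, 3.0),
    RenderTarget("ssao",       800 * 450 * 4,   2.0),
]
esram, ddr3 = place_targets(frame)
print("eSRAM:", [rt.name for rt in esram])
print("DDR3: ", [rt.name for rt in ddr3])

Real engines would split or tile individual targets rather than treat them as all-or-nothing, but the core idea (prioritise by bandwidth, let the overflow live in DDR3) is the same one being described above.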


#25 rrjim1
Member since 2005 • 1983 Posts

Just to let all you Sony fangirls know, the PS4 also has hardware limitations.


#26  Edited By MK-Professor
Member since 2009 • 4218 Posts
@Scipio8 said:

The bottleneck is the software, not the hardware, so yes, it can be improved. The next SDK will do that and allow overflow from eSRAM to DDR3 RAM while keeping only the assets that require high bandwidth in the eSRAM, swapping as needed. But cows gonna cow.

LOL

xbox1 fans are true dreamers


#27  Edited By Krelian-co
Member since 2006 • 13274 Posts

@FastRobby said:

@Krelian-co said:

@Salt_The_Fries said:

Before you jerk off to death and choke on your own cum in a fit of blind joy, ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best. I'd be happy if Ryse-quality graphics became standard this gen.

lol why so salty? and double post, calm down take five, breathe in breathe out, make more excuses.

Excuses?

"but teh ryse" doesn't sound as an excuse to you? a corridor game at 900p?


#28 Epic-gamerz
Member since 2014 • 222 Posts

Thank you M$. I, as a proud Lemming, have my spirit lifted up each time I hear the words DX12 and driver update. Thank you, my Lord, my God Microsoft.


#29 FoxbatAlpha
Member since 2009 • 10669 Posts

Lol, cows agreeing about Crytek experiencing an eSRAM bottleneck while they overlook the fact that Ryse was a beautiful game. That doesn't look like a bottleneck to me. It looks like cow butthurt.


#30  Edited By mbrockway
Member since 2007 • 3560 Posts

The eSRAM isn't big enough to hold a full set of 1080p render targets. It was never big enough, and it can't be fixed or patched. The only thing that can be done is reducing the resolution to 900p (since that fits) or implementing some kind of tiling system. It's kind of stupid; MS was like 20 extra MB of RAM away from having a pretty good system, but decided a camera and HDMI pass-through were more important.
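As a rough sanity check on that claim, here's the back-of-the-envelope arithmetic (purely illustrative Python; it assumes uncompressed 4-byte-per-pixel colour and depth targets and ignores the compression, alignment and tiling tricks real hardware uses):

# Back-of-the-envelope render target sizes vs. a 32 MiB eSRAM budget.
# Illustrative only: uncompressed 4-byte-per-pixel targets, no MSAA,
# no compression or alignment overhead.
ESRAM_BYTES = 32 * 1024 * 1024

def target_bytes(width, height, bytes_per_pixel=4):
    """Size of one full-screen render target, uncompressed."""
    return width * height * bytes_per_pixel

for label, (w, h) in {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}.items():
    colour = target_bytes(w, h)      # one 32-bit colour buffer
    depth = target_bytes(w, h)       # one 32-bit depth/stencil buffer
    gbuffer = 4 * colour + depth     # a typical deferred setup: 4 MRTs + depth
    print(f"{label}: colour+depth = {(colour + depth) / 2**20:.1f} MiB, "
          f"4-target G-buffer = {gbuffer / 2**20:.1f} MiB, "
          f"fits in 32 MiB eSRAM: {gbuffer <= ESRAM_BYTES}")

Under those (simplified) assumptions a plain 1080p colour-plus-depth pair does fit, but a typical deferred G-buffer at 1080p comes out to roughly 40 MiB and doesn't, while the same setup at 900p squeezes in at about 27 MiB; that is roughly the trade-off being described above.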


#31  Edited By StrongBlackVine
Member since 2012 • 13262 Posts

Another sub-1080p Xflop exclusive. It will be a miracle if 900p becomes standard.


#32  Edited By deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@FoxbatAlpha said:

Lol, cows agreeing about Crytek experiencing an eSRAM bottleneck while they overlook the fact that Ryse was a beautiful game. That doesn't look like a bottleneck to me. It looks like cow butthurt.

And you look like a butthurt lemming who's just lost his excuse for the Xbone performing worse in every multiplat, so he has to use Ryse as justification.

Perhaps it's time to admit to the main point cows have been pushing and you've been continually avoiding: that the PS4 has superior hardware the Xbox One will not match, be it because of the flawed eSRAM design, the weaker GPU, or the inferior memory.

Lemmings have consistently spoken of how eSRAM was a benefit to the Xbox One and not a compromise, be it with claims that it was a deliberate design decision, or tiled resources, and so on. It's time to fess up and dig into some tasty crow.


#33 FoxbatAlpha
Member since 2009 • 10669 Posts

@hoosier7: I give you credit for still living that dream. A dream it is.


#35 deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@FoxbatAlpha: What dream? I don't get it; the evidence is all there: weaker GPU, worse architecture and bottlenecking, worse RAM, and seemingly a worse CPU.


#36 FoxbatAlpha
Member since 2009 • 10669 Posts

@hoosier7: we can argue it day and night. The weaker CPU is a new one though. Now you are making shit up.


#37  Edited By deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@FoxbatAlpha:


#38  Edited By FoxbatAlpha
Member since 2009 • 10669 Posts

@hoosier7 said:

@FoxbatAlpha:

Dude, you are not going to win with me no matter what you show. The PS4 is based off horsepower. What the ONE is based off of hasn't even been used yet and requires less powerful hardware. In a year or two, if the Xbox is still doing the same thing at sub-1080p and hasn't benefited from tiled resources, cloud compute and DX12, then I will hand you this victory.

Until then, 1080p hasn't been a game changer for the PS4 like cows claim. The differences are very minor, and in some instances I still think the ONE has better detail. Don't forget about the other multiplats that are better on the ONE also. The tide is already turning and the Full POWA of the ONE hasn't even started.


#39 deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@FoxbatAlpha: There are like two multiplats that are better on the Xbox One; that's an insignificant number, and it sure wasn't because of the hardware, whereas inferior hardware sure is a factor in all the others.

Your tune has certainly changed since last gen if the differences are "minor" now, when they're in fact many times greater.

How is the tide turning? The PS4 is continually racking up superior multiplats and embarrassing the X1 graphically in every comparable game: inFamous vs. DR3 for example, Driveclub seems to be pushing harder than Forza (though I will wait for its launch), and so on.

You do realise the PS4 has PRT, which does exactly the same thing as tiled resources? So how will that make up the gap?

Sony are working on the same optimisations DX12 implements, so how will that close the gap? Even if DX12 does what you hope, it's surely only temporary.

You can cling to the excuses if you like, but they're being struck off one by one. You're backing this imaginary hidden power while the PS4 is showing the real-world results.

The X1 also has no answer to the PS4's GPU compute advantage, which gives the ability to use as much of the GPU's power as possible all the time and to take load off the CPU.

I will admit, though, that I won't win with you; it's impossible to argue with those who ignore the information placed directly in front of them.


#40  Edited By FoxbatAlpha
Member since 2009 • 10669 Posts

@hoosier7 said:

@FoxbatAlpha: There are like two multiplats that are better on the Xbox One; that's an insignificant number, and it sure wasn't because of the hardware, whereas inferior hardware sure is a factor in all the others.

Your tune has certainly changed since last gen if the differences are "minor" now, when they're in fact many times greater.

How is the tide turning? The PS4 is continually racking up superior multiplats and embarrassing the X1 graphically in every comparable game: inFamous vs. DR3 for example, Driveclub seems to be pushing harder than Forza (though I will wait for its launch), and so on.

You do realise the PS4 has PRT, which does exactly the same thing as tiled resources? So how will that make up the gap?

Sony are working on the same optimisations DX12 implements, so how will that close the gap? Even if DX12 does what you hope, it's surely only temporary.

You can cling to the excuses if you like, but they're being struck off one by one. You're backing this imaginary hidden power while the PS4 is showing the real-world results.

The X1 also has no answer to the PS4's GPU compute advantage, which gives the ability to use as much of the GPU's power as possible all the time and to take load off the CPU.

I will admit, though, that I won't win with you; it's impossible to argue with those who ignore the information placed directly in front of them.

You should feel honored that I took the time to read this. Main issue: DX12 and the Cloud have been demonstrated in real time. Don't downplay the fact that the hammer is coming down. You speak of the same kind of hoped-for things helping the PS4. I can easily say you are dreaming of something imaginary.


#41  Edited By misterpmedia
Member since 2013 • 6209 Posts

@hoosier7 said:

Well all but the hardcore lems could see this a mile off really, quite funny after they tried to spin it as a benefit.

lol don't expose them, let them wallow in their DX12 a bit longer.


#42  Edited By misterpmedia
Member since 2013 • 6209 Posts

@Sollet said:

Guys... Guys... There's a hidden second GPU in there. Somewhere.

where?


#43 deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@FoxbatAlpha: The cloud hasn't been demonstrated on a real-world connection and is utterly useless if you game offline; all we've had is some pretty crappy AI. You got a link on DX12? I haven't seen any X1 demos of it yet.

Also, the difference is that the PS4 doesn't need the hope that the X1 does; it's got the raw hardware to compensate, lol, and there's a good reason why the R9 290X launched with the same number of ACE units the PS4 has.

I'm going to ignore your comments about dreaming since you couldn't back them up last time.

Well, at least you've learnt something, given you've not been able to refute any of my points.


#44 Gue1
Member since 2004 • 12171 Posts

Kind of funny seeing a bunch of no-name devs that still haven't even released a game on the Xbone saying that the eSRAM is no bottleneck, but now here comes Crytek saying that it is.


#45 GravityX
Member since 2013 • 865 Posts

Facts:

Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM.

Of course with Xbox One we're going with a design where ESRAM has the same natural extension that we had with eDRAM on Xbox 360, to have both going concurrently. It's a nice evolution of the Xbox 360 in that we could clean up a lot of the limitations that we had with the eDRAM. The Xbox 360 was the easiest console platform to develop for, it wasn't that hard for our developers to adapt to eDRAM, but there were a number of places where we said, "Gosh, it would sure be nice if an entire render target didn't have to live in eDRAM," and so we fixed that on Xbox One where we have the ability to overflow from ESRAM into DDR3 so the ESRAM is fully integrated into our page tables and so you can kind of mix and match the ESRAM and the DDR memory as you go.

Sometimes you want to get the GPU texture out of memory and on Xbox 360 that required what's called a "resolve pass" where you had to do a copy into DDR to get the texture out - that was another limitation we removed in ESRAM, as you can now texture out of ESRAM if you want to. From my perspective it's very much an evolution and improvement - a big improvement - over the design we had with the Xbox 360. I'm kind of surprised by all this, quite frankly.

The future is bright for Xbox One.


#46 deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@GravityX said:

Facts:

Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM.

Of course with Xbox One we're going with a design where ESRAM has the same natural extension that we had with eDRAM on Xbox 360, to have both going concurrently. It's a nice evolution of the Xbox 360 in that we could clean up a lot of the limitations that we had with the eDRAM. The Xbox 360 was the easiest console platform to develop for, it wasn't that hard for our developers to adapt to eDRAM, but there were a number of places where we said, "Gosh, it would sure be nice if an entire render target didn't have to live in eDRAM," and so we fixed that on Xbox One where we have the ability to overflow from ESRAM into DDR3 so the ESRAM is fully integrated into our page tables and so you can kind of mix and match the ESRAM and the DDR memory as you go.

Sometimes you want to get the GPU texture out of memory and on Xbox 360 that required what's called a "resolve pass" where you had to do a copy into DDR to get the texture out - that was another limitation we removed in ESRAM, as you can now texture out of ESRAM if you want to. From my perspective it's very much an evolution and improvement - a big improvement - over the design we had with the Xbox 360. I'm kind of surprised by all this, quite frankly.

The future is bright for Xbox One.

Goossen works for MS, so he's not exactly reliable when a third-party dev says this:

"If we do port to the Xbox One, it will be unfortunate that we still have to deal with things like the eSRAM in this generation of consoles."


#47 misterpmedia
Member since 2013 • 6209 Posts

lol the argument in here is epic because @FoxbatAlpha is Sisyphus and the esram is the boulder. Day and night he'll fight the good fight, in vain.


#48 ZombeGoast
Member since 2010 • 437 Posts

Patching RAM is like downloading more RAM; it's dumb. The problem is that these guys are terrible at balancing the available memory in the eSRAM, the same as with the Wii U's eDRAM. If you want 1080p on the Wii U and Xbox One you will have to sacrifice 16MB to achieve that with no AA. It's better to drop to 720p and add more effects than to try making something like Forza 5.

You guys are talking about how the eSRAM is an evolution of the 360's eDRAM, while the Wii U is using eDRAM, which is much more of an evolution of the 360's. Both CPUs can access them, but the eSRAM is said to be very slow for the CPU to access, while the opposite is said of the Wii U's.

Also, the PS4 is no better; it can't even render 1080p at a locked 60fps.


#49  Edited By GravityX
Member since 2013 • 865 Posts

@hoosier7 said:

@GravityX said:

Facts:

Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM.

Of course with Xbox One we're going with a design where ESRAM has the same natural extension that we had with eDRAM on Xbox 360, to have both going concurrently. It's a nice evolution of the Xbox 360 in that we could clean up a lot of the limitations that we had with the eDRAM. The Xbox 360 was the easiest console platform to develop for, it wasn't that hard for our developers to adapt to eDRAM, but there were a number of places where we said, "Gosh, it would sure be nice if an entire render target didn't have to live in eDRAM," and so we fixed that on Xbox One where we have the ability to overflow from ESRAM into DDR3 so the ESRAM is fully integrated into our page tables and so you can kind of mix and match the ESRAM and the DDR memory as you go.

Sometimes you want to get the GPU texture out of memory and on Xbox 360 that required what's called a "resolve pass" where you had to do a copy into DDR to get the texture out - that was another limitation we removed in ESRAM, as you can now texture out of ESRAM if you want to. From my perspective it's very much an evolution and improvement - a big improvement - over the design we had with the Xbox 360. I'm kind of surprised by all this, quite frankly.

The future is bright for Xbox One.

Goossen works for MS, so he's not exactly reliable when a third-party dev says this:

"If we do port to the Xbox One, it will be unfortunate that we still have to deal with things like the eSRAM in this generation of consoles."

A lot of devs can't afford, in time and money, to use eSRAM effectively, or aren't even willing to, if they are porting over a game that wasn't built with eSRAM in mind.

So currently, in their minds, it's more of a hurdle they won't or can't afford to jump.


#50 tormentos
Member since 2003 • 33798 Posts

Bububut tiling tricks and tiled resources. Hahaha.

Man, I have been so on the spot about the Xbox One hardware problems it's scary.

It is a limitation today and it will be in 5 years as well.