Not Expecting Xbox One eSRAM Bottleneck To Be Patched


#51 GravityX
Member since 2013 • 865 Posts

@tormentos said:

Bububut tiling tricks and tiled resources, hahaha.

Man, I have been so on the spot about the Xbox One hardware problems it's scary.

It is a limitation today and it will be in 5 years as well.

You know that's not true. If anything the X1 has more room for optimization and growth, unlike the straightforward PS4. It's going to be the lazy devs' console.

And when are you buying a console, Torms? You and Thuway don't even own consoles and talk a lot of smack. lol


#52  Edited By Heil68
Member since 2004 • 60833 Posts

Hard to compete with the PS4 being the world's most powerful video game console ever created.


#53 sam890
Member since 2005 • 1124 Posts

Lems can't catch a break ;'(


#54 LadyBlue
Member since 2012 • 4943 Posts

@Salt_The_Fries said:

Before you jerk off to death and choke on your own cum in a fit of blind joy, ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best. I'd be happy if Ryse-quality graphics became standard this gen.

Would be a treat, too bad it isn't happening.


#55 B4X
Member since 2014 • 5660 Posts

Both of these consoles are weak sauce. They both have huge bottlenecks. Xbox One: home of the non-P's. PS4: home of the 30 fps.

Carry on.


#57  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

DX12 will save the Xbone.


#58  Edited By Epic-gamerz
Member since 2014 • 222 Posts

@ZombeGoast: Resogun is 1080p 60fps locked. Eat your words


#59  Edited By SambaLele
Member since 2004 • 5552 Posts

@FastRobby said:

@Krelian-co said:

@Salt_The_Fries said:

Before you jerk off to death and choke on your own cum in a fit of blind joy, ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best. I'd be happy if Ryse-quality graphics became standard this gen.

lol why so salty? and double post, calm down take five, breathe in breathe out, make more excuses.

Excuses?

Will Ryse's graphical level be enough in a few years? How will that compare to what's on the competitor's system? I don't know, even if it's not an excuse, it certainly isn't a favorable argument.


#61 Tighaman
Member since 2006 • 1038 Posts

READ THE ARTICLE. THE GUY DID NOT MAKE RYSE; he's never worked on the X1. ESRAM PATCHED? Sounds like babble.


#62  Edited By NathanDrakeSwag
Member since 2013 • 17392 Posts

Who needs games when you have TV and Kinect?


#63  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

Well duh, you can't "patch" a hardware bottleneck. Those thinking some SDK changes would overcome that were just fooling themselves.

You can compensate for it using software but you'll never completely patch it out and it will always be something devs have to work around.

A hardware bottleneck can only be removed with new hardware or by bypassing the bottleneck altogether. Considering the ESRAM serves as low-latency RAM to compensate for the lower throughput of DDR3, it's not something you can just avoid using.


#65 Spitfire-Six
Member since 2014 • 1378 Posts

@Wasdie: Correct me if I'm wrong, as I've said I'm still learning this stuff. It sounds like, theoretically, you can write to both sets of RAM; they are not independent of each other. Does this mean that you could write certain aspects of the frame buffer into ESRAM (things that need to stream faster) and things that can be read more slowly through the DDR3?


#66 KiZZo1
Member since 2007 • 3989 Posts

bu bu but t3h powerz 0f the cloodze :(


#67  Edited By SambaLele
Member since 2004 • 5552 Posts

@FastRobby said:

@SambaLele said:

@FastRobby said:

@Krelian-co said:

@Salt_The_Fries said:

Before you jerk off to death and choke on your own cum in a fit of blind joy, ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best. I'd be happy if Ryse-quality graphics became standard this gen.

lol why so salty? and double post, calm down take five, breathe in breathe out, make more excuses.

Excuses?

Will Ryse's graphical level be enough in a few years? How will that compare to what's on the competitor's system? I don't know, even if it's not an excuse, it certainly isn't a favorable argument.

Do you think Ryse already maxed out the console? :s

Exactly, no, I don't, but Salt_The_Fries implied that it's "enough" for the system: "ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best."

Are you really in this conversation?


#68 Wasdie  Moderator
Member since 2003 • 53622 Posts

@spitfire-six said:

@Wasdie: Correct me if I'm wrong, as I've said I'm still learning this stuff. It sounds like, theoretically, you can write to both sets of RAM; they are not independent of each other. Does this mean that you could write certain aspects of the frame buffer into ESRAM (things that need to stream faster) and things that can be read more slowly through the DDR3?

Well yeah, that's exactly why they have that 32 MB of fast RAM. The problem is that 32 MB isn't enough for a 1080p image, especially when you need 60 frames per second. It's a straight-up hardware bottleneck. They should have gone with GDDR5 system RAM like Sony did if they wanted one pool of unified memory. They could have done a more PC-like architecture and gone with something like 2 GB of GDDR5 VRAM (more than enough for 1080p) and then something like 6 GB of DDR3 system RAM.
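For a rough sense of scale, here is a small back-of-the-envelope sketch in Python; the formats below are illustrative assumptions, not any particular engine's actual layout:

# Rough sizes of common full-resolution render targets at 1080p.
# Formats are illustrative assumptions, not a real engine's layout.
WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT
MIB = 1024 * 1024

targets = {
    "color (RGBA8, 4 bytes/px)":          4,
    "depth/stencil (D24S8, 4 bytes/px)":  4,
    "HDR color (RGBA16F, 8 bytes/px)":    8,
    "g-buffer RT (64 bpp, 8 bytes/px)":   8,
}

total = 0
for name, bytes_per_px in targets.items():
    size = PIXELS * bytes_per_px
    total += size
    print(f"{name:36s} {size / MIB:5.1f} MiB")
print(f"{'total':36s} {total / MIB:5.1f} MiB  (ESRAM budget: 32 MiB)")

Even a modest set of full-resolution buffers overshoots 32 MiB, which is why devs end up either dropping resolution or shuffling buffers between ESRAM and DDR3.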


#70  Edited By Spitfire-Six
Member since 2014 • 1378 Posts

@Wasdie said:

@spitfire-six said:

@Wasdie: Correct me if I'm wrong, as I've said I'm still learning this stuff. It sounds like, theoretically, you can write to both sets of RAM; they are not independent of each other. Does this mean that you could write certain aspects of the frame buffer into ESRAM (things that need to stream faster) and things that can be read more slowly through the DDR3?

Well yeah, that's exactly why they have that 32 MB of fast RAM. The problem is that 32 MB isn't enough for a 1080p image, especially when you need 60 frames per second. It's a straight-up hardware bottleneck. They should have gone with GDDR5 system RAM like Sony did if they wanted one pool of unified memory. They could have done a more PC-like architecture and gone with something like 2 GB of GDDR5 VRAM (more than enough for 1080p) and then something like 6 GB of DDR3 system RAM.

OK, I was under the impression that you don't have to write everything to just the 32 MB. The frame-buffer example I saw had specific things targeted at specific RAM addresses. I figured you should be able to write certain objects, such as background textures, skies, etc., to the DDR3 RAM and save the 32 MB for the things in the foreground as well as updating the camera position. I'll also find the post where a developer from Trials gave an example of writing a frame buffer that would fit in ESRAM.

edit: posted by Sebbi

"MJPs g-buffer layout is actually only two RTs in the g-buffer rendering stage and one RT in the lighting stage. And a depth buffer of course. Quite normal stuff.

On GCN you want to pack your data to 64 bpp (4 x 16 bit integer) render targets because that doubles your fill rate compared to using more traditional 32 bpp RTs (GCN can do 64 bit filling at same ROP rate as 32 bit filling).

I assume that the packing is like this:

Gbuffer1 = normals + tangents (64 bit)

Gbuffer2 = diffuse + brdf + specular + roughness (64 bits)

Depth buffer (32 bits)

Without any modifications this takes 40 megabytes of memory (1080p).

The lighting step doesn't need an extra 8 MB for the 4x16f RT, because a compute shader can simultaneously read and write to the same resource, allowing you to do lighting "in-place", writing the output over the existing g-buffer. This is also very cache friendly since the read pulls the cache lines into L1 and the write thus never misses L1 (GCN has fully featured read & write caches).

It's also trivial to get this layout down to 32 MB from the 40 MB. Replace gbuffer1 with a 32 bit RT (32 MB target reached at 1080p). Store normal as 11+11 bit using lambert azimuth equal area projection. You can't see any quality difference. 5+5 bits for tangents is enough (4 bits for exponent = mip level + 1 bit mantissa). 11+11+5+5=32. Also if you only use the tangents for shadow mapping / other planar projections, you don't need them at all, since you can analytically calculate the derivatives from the stored normal vector.

This layout is highly efficient for both g-buffer rendering and lighting. And of course also for post processing, since all your heavy data fits in the fast memory. Shadow maps obviously need to be sampled from main memory during the lighting, but this is actually a great idea since the lighting pass wouldn't otherwise use any main memory BW at all (it would be completely unused = wasted)."

Posted in this thread: http://forum.beyond3d.com/showthread.php?p=1830129#post1830129
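For anyone who wants to check the arithmetic in the quoted post, here is a tiny sketch that just reproduces sebbbi's numbers; nothing here goes beyond what the quote itself states:

# Reproduce the memory figures from the quoted g-buffer layout at 1080p.
PIXELS = 1920 * 1080
MIB = 1024 * 1024

def rt_bytes(bits_per_pixel):
    return PIXELS * bits_per_pixel // 8

# Original layout: two 64 bpp g-buffer RTs plus a 32 bpp depth buffer.
original = rt_bytes(64) + rt_bytes(64) + rt_bytes(32)
print(f"unpacked layout: {original / MIB:.1f} MiB")   # ~39.6 MiB ("40 megabytes" in the post)

# Packed variant: g-buffer1 squeezed to 32 bpp (11+11 bit normal, 5+5 bit tangent).
packed = rt_bytes(32) + rt_bytes(64) + rt_bytes(32)
print(f"packed layout:   {packed / MIB:.1f} MiB")     # ~31.6 MiB, inside the 32 MB target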


#71  Edited By ziggyww
Member since 2012 • 907 Posts

I feel developers would be better off not even looking to do anything with the ESRAM.

It was only added at the last minute to bring up the console's bandwidth, stats-wise, to try to shift more consoles and not give Sony a reason to advertise how their console has multiple times the bandwidth of its physical memory.


#72 Spitfire-Six
Member since 2014 • 1378 Posts

@ziggyww said:

I feel developers would be better off not even looking to do anything with the ESRAM.

It was only added at the last minute to bring up the console's bandwidth, stats-wise, to try to shift more consoles and not give Sony a reason to advertise how their console has multiple times the bandwidth of its physical memory.

that makes no sense.


#73  Edited By Daious
Member since 2013 • 2315 Posts

@FoxbatAlpha said:

@hoosier7 said:

@FoxbatAlpha:

Dude, you are not going to win with me no matter what you show. The PS4 is based off of horsepower. What the ONE is based off of hasn't even been used yet and requires less powerful hardware. In a year or two, if the Xbox is still doing the same thing with sub-1080p and hasn't benefited from tiled resources, cloud compute and DX12, then I will hand you this victory.

Until then, 1080p hasn't been a game changer for the PS4 like cows claim. The differences are very minor and in some instances I still think the ONE has better detail. Don't forget about the other multiplats that are better on the ONE also. The tide is already turning and the full POWA of the ONE hasn't even started.

Fairly roundabout way of dodging the weaker CPU statement.


#74  Edited By SambaLele
Member since 2004 • 5552 Posts

@FastRobby said:

@SambaLele said:

@FastRobby said:

@SambaLele said:

@FastRobby said:

@Krelian-co said:

@Salt_The_Fries said:

Before you jerk off to death and choke on your own cum in a fit of blind joy, ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best. I'd be happy if Ryse-quality graphics became standard this gen.

lol why so salty? and double post, calm down take five, breathe in breathe out, make more excuses.

Excuses?

Will Ryse's graphical level be enough in a few years? How will that compare to what's on the competitor's system? I don't know, even if it's not an excuse, it certainly isn't a favorable argument.

Do you think Ryse already maxed out the console? :s

Exactly, no, I don't, but Salt_The_Fries implied that it's "enough" for the system: "ask yourself one question: does Ryse in its current shape and form need optimization? If it does, it's minuscule at best."

Are you really in this conversation?

I'm drinking at the moment, so no?

Does Ryse need optimization? Graphic wise, not really, gameplay wise, yes :)

Deviating much? "I'd be happy if Ryse-quality graphics became standard this gen."

So... "excuses?"

Yes.

Simple objective text interpretation man...


#76  Edited By ziggyww
Member since 2012 • 907 Posts

@spitfire-six said:

@ziggyww said:

I feel developers would be better off not even looking to do anything with the ESRAM.

It was only added at the last minute to bring up the console's bandwidth, stats-wise, to try to shift more consoles and not give Sony a reason to advertise how their console has multiple times the bandwidth of its physical memory.

that makes no sense.

Umm...yeah it does!

8 GB of DDR3 is a lot slower than 8 GB of GDDR5; Microsoft knew that, Sony knew that... everybody knows that.

Microsoft then added 32 MB of ESRAM to bring up the bandwidth, as ESRAM is expensive but insanely quick. This brings the bandwidth gap between the consoles' physical memory down, but not by much, leaving Sony less able to capitalise on it and use it to their advantage as a selling point for how much quicker their console's bandwidth is than the X1's.

The truth of it is still that 32 MB isn't really that much and there isn't a lot of stuff you can logically fill it up with. The new Batmobile model takes up 160 MB by itself, which clearly means that 99.9% of the game goes through the much slower 8 GB of DDR3 memory, whereas 100% of a PS4 game goes through the much quicker 8 GB of GDDR5 memory.

Hope that clears it up for you.
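For context on the raw numbers being argued about, here is a quick sketch using the commonly cited clocks and bus widths; these are theoretical peaks only, and real-world throughput is lower and depends on access patterns:

# Theoretical peak bandwidth = effective transfer rate (MT/s) x bus width (bytes).
def peak_gb_s(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits // 8) / 1000   # GB/s (decimal)

print(f"XB1 DDR3-2133, 256-bit bus     : {peak_gb_s(2133, 256):6.1f} GB/s")   # ~68 GB/s
print(f"PS4 GDDR5 5.5 GT/s, 256-bit bus: {peak_gb_s(5500, 256):6.1f} GB/s")   # ~176 GB/s
# The ESRAM runs at 853 MHz on a very wide on-die path (~1024 bits per direction).
print(f"XB1 ESRAM, per direction       : {peak_gb_s(853, 1024):6.1f} GB/s")   # ~109 GB/s
# Microsoft has quoted higher combined read+write peaks for the ESRAM,
# but those only apply to favourable mixed read/write patterns.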


#77 blackace
Member since 2002 • 23576 Posts

@scatteh316 said:

Having developed Ryse: Son of Rome for the Xbox One and managed an excellent visual standard despite working in a restricted resolution, Crytek would obviously know more than a thing or two about properly taking advantage of the console's power. But as more developers speak out about the limitations that eSRAM is posing, what are Crytek's thoughts on the same? How does it plan to circumvent them?

“I can’t speak to whether this will be patched or improved in the future, but I wouldn’t expect it as a developer,” he said to GamingBolt. “Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.”

LINK

So Crytek confirm that ESRAM is a bottleneck and it's highly likely that Microsoft can't fix it with a patch or clever software, something anyone with common sense has known all along anyway.

That is fail,

That won't be a problem in the near future. Especially with DX12, tile resources and new graphic engines.


#78  Edited By no-scope-AK47
Member since 2012 • 3755 Posts

This problem will only get worse vs. the PS4; there is no secret sauce. MS is trying everything they can to minimize the power gap vs. the PS4.

1. The cloud will increase the xflops powah

2. There is really no noticeable difference between 720p vs 1080p cause we upconvert to 1080p

3. The dx12 will save us

4. We will use eSRAM so there is no bottle neck vs the ps4

5. We got new rendering and stacked secret chips

Did I miss any other damage control ??


#79 blackace
Member since 2002 • 23576 Posts

@Scipio8 said:

The bottleneck is the software not the hardware so yes it can be improved. The next SDK will do that and allow overflow from ESRAM to DDR3 RAM while only keeping assets requiring high bandwidth in the ESRAM and swap as needed. But cows gonna cow.

It's ok. Let them believe what they want. They think they know everything. Another year or so from now they will be like. "WTF. How is the XB1 doing that!?!?" lmao!!


#80 Spitfire-Six
Member since 2014 • 1378 Posts

@ziggyww said:

@spitfire-six said:

@ziggyww said:

I feel developers would be better off not even looking to do anything with the ESRAM.

It was only added at the last minute to bring up the console's bandwidth, stats-wise, to try to shift more consoles and not give Sony a reason to advertise how their console has multiple times the bandwidth of its physical memory.

that makes no sense.

Umm...yeah it does!

8 GB of DDR3 is a lot slower than 8 GB of GDDR5; Microsoft knew that, Sony knew that... everybody knows that.

Microsoft then added 32 MB of ESRAM to bring up the bandwidth, as ESRAM is expensive but insanely quick. This brings the bandwidth gap between the consoles' physical memory down, but not by much, leaving Sony less able to capitalise on it and use it to their advantage as a selling point for how much quicker their console's bandwidth is than the X1's.

The truth of it is still that 32 MB isn't really that much and there isn't a lot of stuff you can logically fill it up with. The new Batmobile model takes up 160 MB by itself, which clearly means that 99.9% of the game goes through the much slower 8 GB of DDR3 memory, whereas 100% of a PS4 game goes through the much quicker 8 GB of GDDR5 memory.

Hope that clears it up for you.

It doesn't explain why your initial post calls for devs to ignore 32 MB of the fastest RAM on the system just to keep Sony from advertising the hardware differences. Hardware is nice, but games sell consoles, not hardware, and ignoring that 32 MB of fast RAM for marketing purposes is ridiculous.

32 MB is not a lot of RAM, but if I am correct (I'm currently looking for the developer's guide or PowerPoint for the X1), the 32 MB of ESRAM and the 8 GB of DDR3 are viewed as the same pool of memory. You do not have to pass between them in a linear fashion; you can write to ESRAM and DDR3 for the same frame buffer. Furthermore, I posted an example of a frame-buffer layout that could make it feasible to fit a 1080p frame buffer into 32 MB of memory. I personally cannot test it, but the members of that forum are active developers. So while I thank you for regurgitating the same talking points, it is not needed, as it has been beaten to death on this forum. Does that clear it up for you?
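As a purely hypothetical illustration of that idea (this is not an actual XDK API, just a greedy "hottest surfaces go in the fast pool" sketch with made-up sizes and scores):

# Hypothetical sketch: assign render targets to a 32 MiB fast pool (ESRAM) by
# how often they are touched per frame, spilling everything else to DDR3.
MIB = 1024 * 1024
ESRAM_BUDGET = 32 * MIB

# (name, size in bytes, rough accesses-per-frame "hotness" score - made-up numbers)
surfaces = [
    ("depth buffer",       1920 * 1080 * 4, 10),
    ("g-buffer RT0",       1920 * 1080 * 8,  6),
    ("g-buffer RT1",       1920 * 1080 * 8,  6),
    ("2k shadow map",      2048 * 2048 * 4,  2),
    ("static sky texture", 4096 * 2048 * 4,  1),
]

esram, ddr3, used = [], [], 0
for name, size, hotness in sorted(surfaces, key=lambda s: s[2], reverse=True):
    if used + size <= ESRAM_BUDGET:
        esram.append(name)
        used += size
    else:
        ddr3.append(name)

print("ESRAM:", esram, f"({used / MIB:.1f} MiB used)")
print("DDR3 :", ddr3)

In this toy run the depth buffer and one g-buffer target fit in the fast pool while the second g-buffer target, the shadow map and the sky spill to DDR3 - roughly the kind of split discussed elsewhere in this thread.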


#81 CDWJUSTIN
Member since 2003 • 2078 Posts

xboys think a software upgrade will physically upgrade the hardware


#82  Edited By no-scope-AK47
Member since 2012 • 3755 Posts

The eSRAM is only 32 MB vs. the 8 GB of DDR3. On PC, when your GPU runs out of memory and has to go to system memory, your games take a massive performance hit. Sure, exclusives can try to work around this problem, but that will take time and money and still won't address the gap in GPUs (X1 vs. PS4). MS tried to be slick and overclock to try to distract xbots yet again.

Yet multiplats show that the PS4 has a major advantage and costs 100 bucks less.


#83 leandrro
Member since 2007 • 1644 Posts

@MK-Professor said:

It has an ULTRA low-end GPU (HD 7790); this can't be fixed with a patch or clever software.

This is not true.

The 7790 has 14 CUs.

The Xbox One has only 12 working CUs, and 10% of that is reserved for Kinect, so it's effectively 10.8 CUs,

about 30% lower-end than the 7790,

not to mention the lower clocks and DDR3 memory.

The X1 is a joke.
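Plugging the commonly quoted specs into the standard GCN throughput formula gives a rough comparison; this is just a sketch, and the 10% reservation figure is the poster's claim about GPU time reserved for Kinect/system functions, not disabled CUs:

# Theoretical FP32 throughput for GCN: CUs x 64 lanes x 2 FLOPs/cycle x clock.
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

hd7790 = tflops(14, 1.000)   # ~1.79 TFLOPS
xb1    = tflops(12, 0.853)   # ~1.31 TFLOPS
ps4    = tflops(18, 0.800)   # ~1.84 TFLOPS

print(f"HD 7790 : {hd7790:.2f} TFLOPS")
print(f"Xbox One: {xb1:.2f} TFLOPS ({xb1 / hd7790:.0%} of the 7790)")
print(f"PS4     : {ps4:.2f} TFLOPS")
# Applying the claimed 10% reservation: 1.31 * 0.9 is roughly 1.18 TFLOPS left for games.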


#84 Animal-Mother
Member since 2003 • 27362 Posts

@FoxbatAlpha said:

Lol, cows agreeing about Crytek experiencing an eSRAM bottleneck when they overlook the fact that Ryse was a beautiful game. That doesn't look like a bottleneck to me. It looks like cow butthurt.

Yes, let's not take one of the world's best engine creators seriously.


#85  Edited By SonySoldier-_-
Member since 2012 • 1186 Posts

XBottleneckOne.


#86 Indicud
Member since 2013 • 745 Posts

@FoxbatAlpha said:

@hoosier7 said:

@FoxbatAlpha: There are like two multiplats that are better on the Xbox One; that's an insignificant number, and it sure wasn't because of inferior hardware, which sure is a factor in all the others.

Your tune has certainly changed since last gen if the differences are minor now, when they're in fact many times greater.

How is the tide turning? The PS4 is continually racking up superior multiplats and embarrassing the X1 graphically in every comparable game, InFamous vs. DR3 for example, and Driveclub seems to be pushing harder than Forza (though I will wait for its launch), and so on.

You do realise the PS4 has PRT, which does exactly the same thing as tiled resources? So how will that make up the gap?

Sony are working on the same optimisations DX12 implements, so how will that close the gap? Even if DX12 does what you hope, it's surely temporary.

You can grasp at the excuses if you like, but they are being struck off one by one. You're backing this imaginary hidden power when the PS4 is showing the real-world applications.

The X1 also has no answer to GPU compute, which gives the ability to use as much of the GPU's power as possible all the time and to take load off the CPU.

I will admit though that I won't win with you; it's impossible to argue with those who neglect the information that's placed directly in front of them.

You should feel honored that I took the time to read this. Main issue, DX12 and the Cloud have been demonstrated in real time. Don't downplay the fact that the hammer is coming down. You speak of the same hope of things to help the PS4. I can easily say you are dreaming of something imaginary.

Why are you so willfully ignorant?


#87 FoxbatAlpha
Member since 2009 • 10669 Posts

@Indicud: I can't know.


#88  Edited By Indicud
Member since 2013 • 745 Posts

@FoxbatAlpha said:

@Indicud: I can't know.


#89  Edited By Krelian-co
Member since 2006 • 13274 Posts

@blackace said:

@scatteh316 said:

Having developed Ryse: Son of Rome for the Xbox One and managed an excellent visual standard despite working in a restricted resolution, Crytek would obviously know more than a thing or two about properly taking advantage of the console's power. But as more developers speak out about the limitations that eSRAM is posing, what are Crytek's thoughts on the same? How does it plan to circumvent them?

“I can’t speak to whether this will be patched or improved in the future, but I wouldn’t expect it as a developer,” he said to GamingBolt. “Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.”

LINK

So Crytek confirm that ESRAM is a bottleneck and it's highly likely that Microsoft can't fix it with a patch or clever software, something anyone with common sense has known all along anyway.

That is fail,

That won't be a problem in the near future. Especially with DX12, tile resources and new graphic engines.

and teh cloud, keep on hoping!

i can already see it, in 3 years: "guys microsoft said teh directx 13!!!!"


#90  Edited By naz99
Member since 2002 • 2941 Posts

Patching hardware???

Lol


#91 ZombeGoast
Member since 2010 • 437 Posts

@epic-gamerz said:

@ZombeGoast: Resogun is 1080p 60fps locked. Eat your words

What a compelling argument by bringing a list of games.


#92 tormentos
Member since 2003 • 33798 Posts

@GravityX said:

You know that's not true. If anything the X1 has more room for optimization and growth, unlike the straightforward PS4. It's going to be the lazy devs' console.

And when are you buying a console, Torms? You and Thuway don't even own consoles and talk a lot of smack. lol

Total buffoonery. The Xbox One can't optimize past 1.31 TF, hell, not past 1.28 TF, which is its limit, period. The PS4 has a stronger GPU that doesn't require any mystical coding or some obscure technique.

There is a difference between optimization and breaking your back to try to make the code run optimally on the Xbox One. There is not much room for improvement on the Xbox One, and you will see that as time moves on; in the Xbox 360's case, the more it got pushed, the lower the resolutions went. I expect the same with the Xbox One.

But if you have any other secret sauce, please share it with us.. lol

@spitfire-six said:

@Wasdie: Correct me if I'm wrong as Ive said Im still learning this stuff. It sounds like theoretically you can write to both sets of ram. They are not independent of each other. Does this mean that you could write certain aspects of the frame buffer into esram things that need to stream faster, and things that need to read slower through the ddr3?

You can. Turn 10 already did it on Forza 5: the sky, since it doesn't change or do anything, was allocated in the main pool, and everything else was moved to ESRAM. But if you watched Forza 5's early showings, you can see that the spectators were 3D models, whereas in the retail copy they were replaced with flat images of people to cut resource consumption. You can do it, but in the end it basically yields the same result: you can't have a dynamic sky or weather if you're keeping it in the main pool of RAM; it is too slow.

I am sure that is the same reason why MGS5 has a dynamic sky while the Xbox One version lacks it completely.

@blackace said:

That won't be a problem in the near future. Especially with DX12, tile resources and new graphic engines.

Hahahaaaaaaaaaaaaaaaaaaaaaaaa......

@blackace said:

It's ok. Let them believe what they want. They think they know everything. Another year or so from now they will be like. "WTF. How is the XB1 doing that!?!?" lmao!!

Man, you will be so owned next year.. hahaha

Didn't you say MS would deliver at this year's E3?


#94 Phazevariance
Member since 2003 • 12356 Posts

@scatteh316 said:

Having developed Ryse: Son of Rome for the Xbox One and managed an excellent visual standard despite working in a restricted resolution, Crytek would obviously know more than a thing or two about properly taking advantage of the console's power. But as more developers speak out about the limitations that eSRAM is posing, what are Crytek's thoughts on the same? How does it plan to circumvent them?

“I can’t speak to whether this will be patched or improved in the future, but I wouldn’t expect it as a developer,” he said to GamingBolt. “Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.”

LINK

So Crytek confirm that ESRAM is a bottleneck and it's highly likely that Microsoft can't fix it with a patch or clever software, something anyone with common sense has known all along anyway.

That is fail,

That's not what he says.

He says that developers can't rely on the hardware manufacturer to make the patch, so they, already being experienced programmers, make the workarounds to the hardware limitations for their games when they can.


#95 tormentos
Member since 2003 • 33798 Posts

@Phazevariance said:

@scatteh316 said:

Having developed Ryse: Son of Rome for the Xbox One and managed an excellent visual standard despite working in a restricted resolution, Crytek would obviously know more than a thing or two about properly taking advantage of the console's power. But as more developers speak out about the limitations that eSRAM is posing, what are Crytek's thoughts on the same? How does it plan to circumvent them?

“I can’t speak to whether this will be patched or improved in the future, but I wouldn’t expect it as a developer,” he said to GamingBolt. “Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.”

LINK

So Crytek confirm that ESRAM is a bottleneck and it's highly likely that Microsoft can't fix it with a patch or clever software, something anyone with common sense has known all along anyway.

That is fail,

That's not what he says.

He says that developers can't rely on the hardware manufacturer to make the patch so they, already being experienced programmers, make the works-arounds to the hardware limitations for their games when they can.

It is a technical limitation dude.

I can’t speak to whether this will be patched or improved in the future, but I wouldn’t expect it as a developer,” he said to GamingBolt. “Game developers are masters at engineering workarounds to hardware limitations

You don't need a PhD to understand what he is saying.

Ryse looks great, but it is a 180-degree turn from what they do on PC, which is more open games; hell, Crysis on PC is endlessly open compared to Ryse, where you can't move more than 6 feet from side to side most of the time and you basically can't explore anything. Not to mention that Ryse was said to be 1080p with 150K-polygon characters, only to be downgraded to 900p with 85K-polygon characters, and Ryse still isn't a locked 30 FPS game; it has drops into the teens and runs mostly at 26 FPS.

This comes from Crytek, far from your average weak developer.


#96  Edited By blackace
Member since 2002 • 23576 Posts

@Krelian-co said:

@blackace said:

@scatteh316 said:

Having developed Ryse: Son of Rome for the Xbox One and managed an excellent visual standard despite working in a restricted resolution, Crytek would obviously know more than a thing or two about properly taking advantage of the console's power. But as more developers speak out about the limitations that eSRAM is posing, what are Crytek's thoughts on the same? How does it plan to circumvent them?

“I can’t speak to whether this will be patched or improved in the future, but I wouldn’t expect it as a developer,” he said to GamingBolt. “Game developers are masters at engineering workarounds to hardware limitations and I think you’ll see unique and novel technology developed just for such a purpose.”

LINK

So Crytek confirm that ESRAM is a bottleneck and it's highly likely that Microsoft can't fix it with a patch or clever software, something anyone with common sense has known all along anyway.

That is fail,

That won't be a problem in the near future. Especially with DX12, tile resources and new graphic engines.

and teh cloud, keep on hoping!

i can already see it, in 3 years: "guys microsoft said teh directx 13!!!!"

http://www.ensenandoacomeramihijo.com/wp-content/uploads/2011/11/crying_cow-150x150.jpg

http://static.gamespot.com/uploads/original/1042/10429256/2526943-5119614969-PoshG.gif


#97  Edited By Idontremember
Member since 2003 • 965 Posts

Seriously, if instead of using the expensive and problematic eSRAM, MS had just done something easy like throwing an extra 2 GB of GDDR5 on the side, we wouldn't be having this discussion, except for the weak GPU.
Only an idiot would have thought that DDR3 would be fast enough for a GPU...
They are supposed to be the experts, but a decision like that, even if it was made years ago, shows a complete lack of intelligence from the decision makers at MS. I pity those engineers who knew how stupid the top was but could only stay there and watch the fail unfold, knowing they would be blamed for it years later. Stupid Mattrick and Ballmer and their egos.


#98 tormentos
Member since 2003 • 33798 Posts

@Idontremember said:

Seriously, if instead of using the expensive and problematic Esram, MS had just done something easy like throwing an extra 2 GB of GDDR5 on the side, we woudn't be having this discussion, except for the weak GPU.

Only an idiot would have thought that DDR3 would be fast enough for a GPU...

They are supposed to be the experts, but a decision like that, even if it was taken years ago shows a complete lack of intelligence from the decision makers at MS. I pity those engineers who knew how stupid the top was but could just stay there and watch the fail unfold knowing they would be blamed for it years later. Stupid Mattrick and Balmer and their egos.

Yep, and lemmings actually believe that the ESRAM inclusion is somehow something to boost graphics performance, when in reality it is something to fix MS choosing DDR3 over GDDR5. If the Xbox One had GDDR5, ESRAM would be out of the picture, and the Xbox One would perform better, obviously to the point its GPU allows.

The Xbox One actually cost Don Mattrick his job in the end, just like the PS3 cost Kutaragi his. It was a blunder which cost them the little good faith some had in MS and the little goodwill they won with the Xbox 360.


#99  Edited By Idontremember
Member since 2003 • 965 Posts

@tormentos said:

@Idontremember said:

Seriously, if instead of using the expensive and problematic Esram, MS had just done something easy like throwing an extra 2 GB of GDDR5 on the side, we woudn't be having this discussion, except for the weak GPU.

Only an idiot would have thought that DDR3 would be fast enough for a GPU...

They are supposed to be the experts, but a decision like that, even if it was taken years ago shows a complete lack of intelligence from the decision makers at MS. I pity those engineers who knew how stupid the top was but could just stay there and watch the fail unfold knowing they would be blamed for it years later. Stupid Mattrick and Balmer and their egos.

Yep and lemmings actually believe that he ESRAM inclusion is some how something to boost performance of graphics,when in reality is something to fix MS choosing DDR3 over GDDR5,if the xbox one had GDDR5 ESRAM would be out of the picture,and the xbox one would perform better,obviously to the point were its GPU allows it.

The xbox one actually cost Dom Mattrick his job in the end just like the PS3 cost Kuturagi his,it was a blonder which cost them the little good faith some had in MS and little good will they won with the xbox 360.

Well, with the 360, the eDRAM was a nice addition, but the system memory was GDDR3 and not DDR2, so it was fine.

What I can agree with MS on is that for multitasking and multiple OSes, DDR3 is the best choice. Maybe even 8 GB of GDDR5 and 2 GB of slow DDR3-1066 would have been enough. The thing is, with a single SoC, would it even have been possible to have two main pools of system RAM accessible? Maybe with one SoC that would have been impossible, making the full GDDR5 memory the only truly viable option, just like GDDR3 was for the 360.

As for the PS3, well, Sony got a bit overambitious with the Cell and it didn't live up to its expectations. The Blu-ray drive was a cost killer, but in the end it helped destroy HD DVD, so the Cell was the only true mistake (or more of a failed prototype than a real mistake), along with the low amount of RAM (just like the 360). Kutaragi's mistake is nothing compared to Mattrick's complete stupidity.


#100  Edited By mbrockway
Member since 2007 • 3560 Posts

@Idontremember said:

@tormentos said:

@Idontremember said:

Seriously, if instead of using the expensive and problematic Esram, MS had just done something easy like throwing an extra 2 GB of GDDR5 on the side, we woudn't be having this discussion, except for the weak GPU.

Only an idiot would have thought that DDR3 would be fast enough for a GPU...

They are supposed to be the experts, but a decision like that, even if it was taken years ago shows a complete lack of intelligence from the decision makers at MS. I pity those engineers who knew how stupid the top was but could just stay there and watch the fail unfold knowing they would be blamed for it years later. Stupid Mattrick and Balmer and their egos.

Yep and lemmings actually believe that he ESRAM inclusion is some how something to boost performance of graphics,when in reality is something to fix MS choosing DDR3 over GDDR5,if the xbox one had GDDR5 ESRAM would be out of the picture,and the xbox one would perform better,obviously to the point were its GPU allows it.

The xbox one actually cost Dom Mattrick his job in the end just like the PS3 cost Kuturagi his,it was a blonder which cost them the little good faith some had in MS and little good will they won with the xbox 360.

Well, with the 360, the Edram was a nice addition, but the system memory was GDDR3 and not DDR2, so it was fine.

What I can agree with MS is that for multitasking and multiple OS, DDR3 is the best choice. Mabey even 8Gb of GDDR5 and 2GB of slow DDR3 1066Mhz would have been enough. The thing is, with a single SOC, would it even have been possible to have two main pool of system ram accessible? mabey with one SOC that would have been imposible, making the full GDDR5 memory the only true viable option, just like how GDDR3 was for the 360.

As for the PS3, well, Sony got a bit over ambitious with the Cell and it didn't live up to it's expectation. The blu ray was a cost killer, but in the end, it helped destroy HD DVD, so only the Cell was the only true mistake (or more of a failed prototype than a real mistake), with the low amount of Ram (just like the 360). Kuturagi's mistake is nothing compared to Mattrick's complete stupidity.

It would have been a boost if MS had put in 48 MB so they could fit a 1080p frame buffer. But they cheaped out. DDR3? 32 MB of ESRAM? A slower GPU? Proprietary hard drives?

The PS3, despite a wonky architecture, was on paper more powerful than the 360. The Xbone is not in any way more powerful; it's both weaker and harder to dev for. It's weaker than the PS4, and even getting to that level of weakness is more difficult. No one is gonna bother; they're just gonna stick with 900p for this gen.
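For what it's worth, the resolution trade-off being predicted here is easy to quantify; a quick sketch of pixel counts and per-target memory:

# Pixel counts and per-render-target memory at common console resolutions.
MIB = 1024 * 1024
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>5}: {px:>9,} px, "
          f"32 bpp target = {px * 4 / MIB:4.1f} MiB, "
          f"64 bpp target = {px * 8 / MIB:4.1f} MiB")

# 900p has ~69% of the pixels of 1080p, which is roughly the cut that brings an
# unpacked ~40 MB g-buffer like the one quoted earlier under the 32 MiB ESRAM budget.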