X1 eSRAM explanation


This topic is locked from further discussion.


#1 vpacalypse
Member since 2005 • 589 Posts

I'm no PC genius, so can someone explain why the X1 had to go with eSRAM... why not just add more normal RAM?

I'm not having a go at the X1 (because from my understanding real-world performance will be almost identical... or otherwise time will tell).

Why did they take the complicated route? What was the thinking here? Am I missing something... does eSRAM have some additional qualities to it or something?


#2 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts
Adding more RAM wouldn't have changed anything; DDR3 is slow as crap for graphics workloads, so they put the eSRAM in to help speed things up.

#3 Applesexual1
Member since 2013 • 54 Posts
There was no guarantee that Samsung and Hynix would have high-density GDDR5 modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they added 32 MB of eSRAM. It helps a bit, but it's not as good as having the entire 8 GB at 176 GB/s. The effective performance of the eSRAM is around 130 GB/s, while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.
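
For context on the bandwidth figures being thrown around in this thread, here is a rough sketch of where the headline DDR3 and GDDR5 numbers come from. The clock and bus-width values are the widely reported launch-era specs, used here as assumptions rather than official measurements.

```python
# Rough peak-bandwidth arithmetic behind the figures quoted in this thread.
# Clock/bus numbers are the widely reported launch-era specs (assumptions,
# not official measurements).

def peak_gb_per_s(bus_bits, transfers_per_s):
    """Peak bandwidth = bus width in bytes * transfer rate."""
    return (bus_bits / 8) * transfers_per_s / 1e9

xb1_ddr3  = peak_gb_per_s(256, 2133e6)   # DDR3-2133 on a 256-bit bus
ps4_gddr5 = peak_gb_per_s(256, 5500e6)   # GDDR5 at 5.5 GT/s on a 256-bit bus

print(f"XB1 DDR3 : ~{xb1_ddr3:.0f} GB/s")    # ~68 GB/s
print(f"PS4 GDDR5: ~{ps4_gddr5:.0f} GB/s")   # ~176 GB/s
```

The gap between those two numbers is what the 32 MB of eSRAM is meant to paper over.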

#4 Cali3350
Member since 2003 • 16134 Posts

The eSRAM has a couple of applications.

1.  Very low latency.  Floating-point ops that don't require much data (like physics calculations and the like) benefit more from low latency than from extra memory bandwidth.

2.  Helping with slow system bandwidth.  The Xbox One is bandwidth-starved.  It's not as starved as it seems, since it doesn't have to feed as powerful a GPU (specifically, half the ROPs), but it's still lacking.  The eSRAM lets you store framebuffer data at a higher bandwidth to help alleviate this (see the sketch below).

3.  Microsoft knew they needed 8 GB of RAM and weren't ready to gamble on GDDR5 being available in that quantity in time.  Sony was originally targeting 4 GB of RAM and simply doubled it when RAM prices came down by a large enough amount.  MS was never willing to make that gamble and therefore went the safer route to get to 8 GB.
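
To make point 2 concrete, here is a back-of-the-envelope check of what actually fits in 32 MB. The render-target formats below are illustrative assumptions, not any particular engine's real configuration.

```python
# What fits in 32 MB of eSRAM? Render-target formats are illustrative
# assumptions, not any specific game's actual setup.

ESRAM_BYTES = 32 * 1024 * 1024

def rt_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

w, h = 1920, 1080
color = rt_bytes(w, h, 4)              # RGBA8 colour buffer, ~7.9 MB
depth = rt_bytes(w, h, 4)              # 32-bit depth/stencil, ~7.9 MB
gbuffer_extra = 3 * rt_bytes(w, h, 4)  # three extra deferred targets, ~23.7 MB

for name, size in [("colour only", color),
                   ("colour + depth", color + depth),
                   ("deferred G-buffer", color + depth + gbuffer_extra)]:
    verdict = "fits" if size <= ESRAM_BYTES else "does NOT fit"
    print(f"{name:18s}: {size / 2**20:5.1f} MB -> {verdict} in 32 MB")
```

A 1080p colour + depth pair squeezes in, but a fat deferred G-buffer doesn't, which is part of why juggling which targets live in the eSRAM (or dropping resolution) came up so often in early Xbox One development discussions.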


#5 vpacalypse
Member since 2005 • 589 Posts

Yeah, these comments all make sense to me now.

1. MS didn't want to gamble on GDDR5 being available by late 2013... so they went for normal RAM.

2. They added eSRAM to try to speed things up.

OK.


#6 wolverine4262
Member since 2004 • 20832 Posts
Just to add on, they did the same thing with the 360 and its 10 MB of eDRAM.

#7 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts
Just to add on, they did the same thing with the 360 and 10mb of edram.wolverine4262
eDRAM is a little different than eSRAM.

#8 wolverine4262
Member since 2004 • 20832 Posts
[QUOTE="wolverine4262"]Just to add on, they did the same thing with the 360 and 10mb of edram.xboxiphoneps3
edram a little different then esram

I didn't say eSRAM and eDRAM were the same thing, just that they were implemented for similar reasons. If I remember correctly, the eDRAM was meant to be for AA. lol at how that turned out.

#9 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts
[QUOTE="xboxiphoneps3"][QUOTE="wolverine4262"]Just to add on, they did the same thing with the 360 and 10mb of edram.wolverine4262
edram a little different then esram

I didnt say esram and edram were the same thing, just that they were implemented for similar reasons. If I remember correctly, the edram was meant to be for AA. lol at how that turned out.

The eDRAM in the 360 actually came into play in a few titles... it did basically "apply" free AA, but it wasn't anything really special.

#10 darkangel115
Member since 2013 • 4562 Posts
[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.

lol, so wrong. MS went with DDR3 because that is what every gaming PC runs. GDDR5 never became available for PCs due to its high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 have pinned it at about 10x that of DDR3. Where GDDR5 is used is in graphics cards only, because GPUs are very "forgiving" of RAM latency and it gives the GPU more bandwidth than DDR3. The best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU; that's an "ideal" build. With the consoles, however, it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the latency issue, which will bite them in the butt. With all the multitasking, snap, and app switching the X1 has, GDDR5 RAM wouldn't even work. The PS4 will have hampered CPU performance due to its lack of DDR3 RAM. So what MS did to boost bandwidth was add in the eSRAM, and with it they have obtained over 200 GB/s of bandwidth to the GPU, which is higher than what the PS4 gets with its GDDR5 (176 GB/s), and done with lower latency. So MS got the best of both worlds out of it, and Sony got a RAM that's not made to work well with GPUs and still less bandwidth to their GPU.
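
Setting aside the conclusions above, the latency-vs-bandwidth distinction the post leans on can be illustrated with simple arithmetic. The latency figure below is an assumed round number, not a measured value for DDR3 or GDDR5 in either console.

```python
# Why serialized, dependent accesses care about latency while streaming
# workloads care about bandwidth. The 80 ns latency is an assumed round
# number, not a measurement of either console's memory.

dram_latency_ns = 80          # assumed round-trip cost of one dependent miss
cacheline_bytes = 64
streaming_gb_per_s = 176      # headline GDDR5 figure quoted in this thread

# One thread chasing pointers can only have one dependent miss in flight:
chase_gb_per_s = (1e9 / dram_latency_ns) * cacheline_bytes / 1e9
print(f"latency-bound pointer chase: ~{chase_gb_per_s:.1f} GB/s effective")

# Thousands of independent GPU accesses can keep the whole pipe busy:
print(f"bandwidth-bound streaming  : up to ~{streaming_gb_per_s} GB/s")
```

That is the general reason CPUs are described as latency-sensitive and GPUs as bandwidth-hungry; it does not by itself say how much the DDR3 vs GDDR5 choice matters in practice on either console.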

#11 TheKingIAm
Member since 2013 • 1531 Posts
Because MS is becoming the next Nintendo

#12 ZoomZoom2490
Member since 2008 • 3943 Posts

You really want to know why? Bigger profit, plain and simple.

If MS had gone with GDDR5, the X1 would still be priced at $499, but they would have made less profit on each unit.


#13 ZoomZoom2490
Member since 2008 • 3943 Posts

[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.darkangel115
lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

You need help if you believe that. Gaming PCs use GDDR5 for graphics, unless you are buying the X1 to watch TV and use Netflix all day.

Have a nice day, troll of trollers.


#14 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts
[QUOTE="darkangel115"][QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.

lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

GDDR5 as system memory in a traditional PC setup obviously wouldn't work, but this is a customized APU, not a traditional PC: all the parts are much closer together and transmission latency is severely reduced. The GDDR5 "latency" argument is complete nonsense; bandwidth is one of the most important factors in a system. And 200 GB/s is completely theoretical; they only achieved 133 GB/s in a real-life alpha blend... and also, at the end of the day, it is only 32 MB..... post backfire lul

#15 ZoomZoom2490
Member since 2008 • 3943 Posts
Because MS is becoming the next Nintendo TheKingIAm
You are technically right: the Wii U uses the exact same RAM architecture as the X1. MS went the cheapest way possible, just like Nintendo did with the biggest failure they ever made.

#16 MonsieurX
Member since 2008 • 39858 Posts
Because MS is becoming the next Nintendo TheKingIAm
Because fanboy

#17 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="darkangel115"][QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.ZoomZoom2490

lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

you need help if you believe that. Gaming PC's use GDDR5 for graphics, unless you are buying the X1 for watching TV and use Netflix all day.

have a nice day troll of trollers.

rofl

#18 Sagemode87
Member since 2013 • 3439 Posts

[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.darkangel115
lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

This is bullsh*t


#19 ZoomZoom2490
Member since 2008 • 3943 Posts
[QUOTE="xboxiphoneps3"][QUOTE="darkangel115"][QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.

lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

GDDR5 in a traditional PC setup as system memory wouldnt work obviously. but this is a customized APU... not a traditional PC, all the parts are much closer and transmission latency is severely reduced, the "latency" GDDR5 is complete nonsense bandwidth is one of the most important factors in a system... and 200 GB/s is completely theoretical and they only achieved 133 GB/S real life alpha blend... and also at the end of the day it is only 32 MB..... post backfire lul

Something tells me that he will come back with Cloud Processing next.

#20 High_Fly
Member since 2003 • 2846 Posts
What I don't get is why MS is only using DDR memory. Where is the GDDR?

#21 04dcarraher
Member since 2004 • 23859 Posts

[QUOTE="darkangel115"][QUOTE="Applesexual1"] lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.xboxiphoneps3
GDDR5 in a traditional PC setup as system memory wouldnt work obviously. but this is a customized APU... not a traditional PC, all the parts are much closer and transmission latency is severely reduced, the "latency" GDDR5 is complete nonsense bandwidth is one of the most important factors in a system... and 200 GB/s is completely theoretical and they only achieved 133 GB/S real life alpha blend... and also at the end of the day it is only 32 MB..... post backfire lul

 
What I don't get is why is MS only using DDR memory? Where is the GDDR?High_Fly
GDDR5 latency isn't nonsense. Intel has servers that use integrated GDDR5 as memory, and when tests were done against servers with DDR3, the DDR3 server was faster on normal CPU workloads, while GDDR5 was faster when handling large data workloads.


#22 psymon100
Member since 2012 • 6835 Posts

[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.darkangel115
lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

Actually, he is right. Sony did take a gamble on the availability of higher-capacity GDDR5 modules, and their gamble paid off.


#23 xhawk27
Member since 2010 • 12194 Posts

I still don't see the reason MS chose a weaker GPU for the Xbox One.


#24 darkangel115
Member since 2013 • 4562 Posts

[QUOTE="darkangel115"][QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.ZoomZoom2490

lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

you need help if you believe that. Gaming PC's use GDDR5 for graphics, unless you are buying the X1 for watching TV and use Netflix all day.

have a nice day troll of trollers.

So you didn't read the whole thing? Because that is what I said: they use graphics cards that have GDDR5 RAM built in for the GPU, and the CPU still uses DDR3 RAM because of its lower latency.

#25 darkangel115
Member since 2013 • 4562 Posts
[QUOTE="ZoomZoom2490"][QUOTE="xboxiphoneps3"][QUOTE="darkangel115"] lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

GDDR5 in a traditional PC setup as system memory wouldnt work obviously. but this is a customized APU... not a traditional PC, all the parts are much closer and transmission latency is severely reduced, the "latency" GDDR5 is complete nonsense bandwidth is one of the most important factors in a system... and 200 GB/s is completely theoretical and they only achieved 133 GB/S real life alpha blend... and also at the end of the day it is only 32 MB..... post backfire lul

Something tells me that he will come back with Cloud Processing next.

lol, I'm not defending anything, just setting the facts straight. I mean, it's a game forum; I know 99% of people here don't understand computers. Everything I said is 100% correct. I should also challenge you: show me one PC that uses GDDR5 RAM for its CPU. Go for it, link me to one of them ;) And the custom APU means nothing: they have CPUs, and all CPUs are latency sensitive. Computers 101 ;) That's why they don't use GDDR5 RAM for computers, only in graphics cards.

#26 darkangel115
Member since 2013 • 4562 Posts

[QUOTE="darkangel115"][QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.psymon100

lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

Actually he is right. Sony did take a gamble on the availability of higher capacity GDDR5 modules, their gamble paid off. 

Did it pay off? Because until we see how badly the latency affects the CPU, we won't know. Even Cerny was noticeably flustered when asked about it and dodged the question.

#27 navyguy21
Member since 2003 • 17957 Posts

There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.Applesexual1

 

192 GB/s for the ESRAM


#28 Tighaman
Member since 2006 • 1038 Posts
What I don't get is why is MS only using DDR memory? Where is the GDDR?High_Fly
It's a balance thing: why would you want to put a high-powered motor in a car when you have a regular transmission? I'm going to tell y'all again, Sony is not part of the tree; Sony buys from AMD, NVIDIA, Intel, and IBM, while Microsoft makes and sells things to those companies. NVIDIA doesn't use GDDR5; I'm sure they shared some solutions for bandwidth.

#30 stereointegrity
Member since 2007 • 12151 Posts

[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.navyguy21

 

192 GB/s for the ESRAM

You don't just add the two bandwidths together :|

#31 navyguy21
Member since 2003 • 17957 Posts
[QUOTE="navyguy21"]

[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.stereointegrity

 

192 GB/s for the ESRAM

u dont just add the two bandwidths together :|

I didn't :|

#32 04dcarraher
Member since 2004 • 23859 Posts
[QUOTE="navyguy21"]

[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.stereointegrity

 

192 GB/s for the ESRAM

u dont just add the two bandwidths together :|

If the memory and eSRAM are working together, that is the system's total bandwidth. Sustained, it's 133 GB/s; the theoretical bandwidth is just that, a speed it could reach in bursts. The eSRAM is on a 1024-bit bus, so it's quite possible for it to hit bursts that fast. As long as the memory is feeding the GPU the data it needs, you will not see noticeable issues from the lower bandwidth.
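
For reference, here is where the competing eSRAM numbers in this thread (roughly 109, 192-204, and 133 GB/s) come from. The clock and bus width are the widely reported Xbox One specs, and treating simultaneous read/write as a simple doubling is a simplification, so take this as a sketch rather than a spec sheet.

```python
# Where the various eSRAM bandwidth figures come from. Clock and bus width
# are the widely reported Xbox One specs; a straight 2x for concurrent
# read+write is a simplification.

gpu_clock_hz   = 853e6
esram_bus_bits = 1024

one_way = esram_bus_bits / 8 * gpu_clock_hz / 1e9   # ~109 GB/s read OR write
read_plus_write_ceiling = 2 * one_way                # ~218 GB/s if both every cycle

print(f"one-way peak       : ~{one_way:.0f} GB/s")
print(f"read+write ceiling : ~{read_plus_write_ceiling:.0f} GB/s "
      f"(the quoted 192-204 GB/s peaks sit just below this)")
print( "measured blend     : ~133 GB/s, per the alpha-blend figure cited in this thread")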

#33 danjammer69
Member since 2004 • 4331 Posts

[QUOTE="darkangel115"][QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.Sagemode87

lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

This is bullsh*t

Correct. Some of the most ill-informed crap I have read in a while.

#34 Scipio8
Member since 2013 • 937 Posts

Contrary to what PS fanboys believe, you do not need all the bandwidth afforded by GDDR5 for every part of a game. Microsoft went with a solution that is well balanced between a gaming machine and a multimedia hub. The Xbox One has fast DDR3 RAM, which is sufficient for the majority of gaming, with 32 MB of eSRAM to boost bandwidth in the areas where it's needed. It's not a complicated setup, and it is well known to developers, as they have been coding against the Xbox 360's GDDR3 + eDRAM architecture.


#35 btk2k2
Member since 2003 • 440 Posts

I'm no pc genius so can someone can explain Why X1 had to go for esram... why not just add more normal ram?

I'm not having a go at X1 (bcos from my understanding real world performance will be almost identical...or otherwise time will tell at best)

Why did they take complicated route. What was the thinking here. Am I missing something... does esram have some additional qualities to it or something.

vpacalypse
When they were designing the X1 they wanted 8 GB of main memory. At that time 8 GB of fast GDDR5 was not on the horizon (or betting on it was a very risky move), so they went with the safe option of slower but available DDR3. To help make up for the lower speed they added 32 MB of eSRAM that could be used to compensate for the slower DDR3. Sony, however, decided they wanted a fast pool of unified RAM; when they first designed the console they went with 2 GB of RAM, and later decided to upgrade that to 4 GB to give devs more freedom. As it happened, GDDR5 density increased enough that Sony could upgrade to 8 GB of RAM by changing only the chips and nothing else; they did this so they could be at parity with the X1 in quantity, and it put them ahead in speed. If MS had known back when they designed the X1 that 8 GB of GDDR5 would be available, they would have gone that way.

#36 btk2k2
Member since 2003 • 440 Posts
GDDR5 latency isnt nonsense Intel have servers that use intergrated GDDR5 as memory and when tests were done vs servers with DDR3, With normal cpu workloads the server that used DDR3 was faster , while when handling large data workloads GDDR5 was faster.04dcarraher
Citation Needed.

#37 Scipio8
Member since 2013 • 937 Posts

[QUOTE="vpacalypse"]

I'm no pc genius so can someone can explain Why X1 had to go for esram... why not just add more normal ram?

I'm not having a go at X1 (bcos from my understanding real world performance will be almost identical...or otherwise time will tell at best)

Why did they take complicated route. What was the thinking here. Am I missing something... does esram have some additional qualities to it or something.

btk2k2

When they were designing the X1 they wanted 8GB of main memory. At that time 8GB of fast GDDR5 was not on the horizon (or it was a very risky move to make) so they went with the safe option of slower, but available DDR3. To help make up for the slower speed they added 32 MB of ESRAM that could be used to make up for the reduced speed DDR3. Sony however decided they wanted a fast pool of unified ram, at the time they designed the Console they originally went with 2GB of RAM, later they decided to upgrade that to 4GB as that would give devs more freedom. As it so happens the GDDR5 density was able to be increased enabling Sony to upgrade to 8GB of RAM by only changing the chips and nothing else, they decided to do this so they could be on parity with X1 in terms of quantity but it put them ahead in terms of speed. If MS knew back when they designed the X1 that 8GB of GDDR5 would be available they would have gone that way.

The intention was never GDDR5 but rather DDR4. Since DDR4 wasn't ready for mass market production they had to go with DDR3.


#38 btk2k2
Member since 2003 • 440 Posts
The intention was never GDDR5 but rather DDR4. Since DDR4 wasn't ready for mass market production they had to go with DDR3.Scipio8
DDR4 was never an option. All that was available at design time was DDR3 and GDDR5. GDDR5 was not dense enough to support 8 GB of RAM, so they went with DDR3 + eSRAM. It was their only choice to guarantee 8 GB of RAM.

#39 ManatuBeard
Member since 2012 • 1121 Posts

[QUOTE="Applesexual1"]There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.darkangel115
lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.

You do realize that the "G" in "GDDR" means "Graphics"?

and

"GDDR5 never came available for PCs"

is probably the MOST STUPID THING I have read on this forum...


#40 inggrish
Member since 2005 • 10503 Posts

I still don't see the reason MS did choose a lower GPU for the Xbox One. 

xhawk27

 

Well it's just the GPU they ended up settling on. Obviously they had no competition to compare at that time except Nintendo, so they had to set their own benchmark of where they wanted to be. It just so happened that Sony did exactly the same but chose a slightly more capable GPU.


#41 Tighaman
Member since 2006 • 1038 Posts
[QUOTE="stereointegrity"][QUOTE="navyguy21"]

 

192 GB/s for the ESRAM

04dcarraher
u dont just add the two bandwidths together :|

If the memory and ESram are working along its total bandwidth. sustained its 133GB/s theoretical bandwidth is just that it could reach that speed in bursts. the ESram is a 1024bit bus so its quite possible for it to get bursts that fast. As long as the memory is feeding the gpu with the data it needs you will not see noticeable issues with the lower bandwidth.

No, 133 GB/s was the actual bandwidth in alpha blends for the eSRAM; 192 GB/s was theoretical.

#42 darkangel115
Member since 2013 • 4562 Posts

[QUOTE="Sagemode87"]

[QUOTE="darkangel115"] lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.danjammer69

This is bullsh*t

Correct. Some of the most ill-informed crap i have read in a while.

 

Care to elaborate? Because everything I said is 100% true. It's not my fault most "gamers" have no clue about computers and spout buzzwords they know nothing about, like GDDR5 RAM and hUMA and TFLOPs.


#43 darkangel115
Member since 2013 • 4562 Posts

[QUOTE="vpacalypse"]

I'm no pc genius so can someone can explain Why X1 had to go for esram... why not just add more normal ram?

I'm not having a go at X1 (bcos from my understanding real world performance will be almost identical...or otherwise time will tell at best)

Why did they take complicated route. What was the thinking here. Am I missing something... does esram have some additional qualities to it or something.

btk2k2

When they were designing the X1 they wanted 8GB of main memory. At that time 8GB of fast GDDR5 was not on the horizon (or it was a very risky move to make) so they went with the safe option of slower, but available DDR3. To help make up for the slower speed they added 32 MB of ESRAM that could be used to make up for the reduced speed DDR3. Sony however decided they wanted a fast pool of unified ram, at the time they designed the Console they originally went with 2GB of RAM, later they decided to upgrade that to 4GB as that would give devs more freedom. As it so happens the GDDR5 density was able to be increased enabling Sony to upgrade to 8GB of RAM by only changing the chips and nothing else, they decided to do this so they could be on parity with X1 in terms of quantity but it put them ahead in terms of speed. If MS knew back when they designed the X1 that 8GB of GDDR5 would be available they would have gone that way.

 

lol, that's way wrong. MS chose DDR3 because that's what PCs run on; it's the universally best RAM for running a PC. It has nothing to do with density, and 8 GB of GDDR5 was more than available. They have 16 512 MB chips; are you saying they had 16 256 MB chips years ago? Why do you all spam misinformation? You aren't computer experts and don't know what you're talking about, but feel the need to make crap up?


#44 darkangel115
Member since 2013 • 4562 Posts

[QUOTE="darkangel115"][QUOTE="Applesexual1"] lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.ManatuBeard

You do realize that the "G" in "GDDR" means "Graphics"?

and

"GDDR5 never came available for PCs"

is probably the MOST STUPID THING i have read in this forum...

 

Yes, I do, and it didn't. Show me one PC that runs exclusively off GDDR5 RAM. You can't. They never even put it on the market, because the high latency isn't good for running a PC; instead it's only available on GPUs, built into the video card.


#45 ronvalencia
Member since 2008 • 29612 Posts

There was no guarantee that Samsung and Hynix would get high density GDDR5 RAM modules by the end of 2013. Most people assumed that only 4 GB would be realistic today. Microsoft needed 8 GB, so they went with DDR3. To patch up its low bandwidth, they've added 32 mb of eSRAM. It helps a bit, but it's not as good as having an entire 8 GB with 176 GB/s. The effective performance of eSRAM is around 130 GB/s while GDDR5 is ~150 GB/s. Yeah, Microsoft is kinda screwed.Applesexual1

Intel's Xeon Phi has been using PS4-level GDDR5 density modules since Q2 2013, i.e. 512-bit = 16 GB, hence at 256-bit = 8 GB.

Intel's Xeon Phi is not a GPU; it's 61 mini x86 CPU cores, each equipped with a 512-bit-wide SIMD unit.
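
The density arithmetic behind the "512-bit = 16 GB, hence 256-bit = 8 GB" point is straightforward. The chip parameters below are the commonly cited 2013-era GDDR5 values, used as assumptions rather than a board teardown.

```python
# Capacity arithmetic for GDDR5 at a given bus width. Chip parameters are
# the commonly cited 2013-era values (assumptions, not a board teardown).

def gddr5_capacity_gb(bus_bits, chip_gigabits, chip_io_bits=32, clamshell=True):
    """Each GDDR5 chip exposes a 32-bit interface; clamshell mode pairs two
    chips per 32-bit channel."""
    chips = (bus_bits // chip_io_bits) * (2 if clamshell else 1)
    return chips * chip_gigabits / 8, chips

cap, chips = gddr5_capacity_gb(256, 4)   # 4 Gb (512 MB) parts
print(f"256-bit bus, {chips} x 4 Gb chips -> {cap:.0f} GB")   # 8 GB

cap, chips = gddr5_capacity_gb(256, 2)   # the older 2 Gb parts
print(f"256-bit bus, {chips} x 2 Gb chips -> {cap:.0f} GB")   # 4 GB
```

With only 2 Gb parts guaranteed, a 256-bit GDDR5 setup tops out at 4 GB, which is exactly the capacity risk described earlier in the thread; 4 Gb parts arriving in volume is what made 8 GB of GDDR5 possible.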


#46 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="darkangel115"][QUOTE="Applesexual1"] lol so wrong. MS went with DDR3 becuase that is what every gaming PC runs. GDDR5 never came available for PCs due to the high latency and CPUs being very dependent on low latency. Programs measuring the latency of GDDR5 ram have pinned it at about 10X more then that of DDDR3. Where the GDDR5 RAM is used is in graphics cards only becuase GPUs are very "forgiving" with latency from RAM and it gives the GPU more bandwidth then DDR3. the best gaming PCs use large pools of DDR3 RAM and have a graphics card with GDDR5 RAM for the GPU. that's an "ideal" build. With teh consoles however it would have been too costly (my guess at least) to do so. Sony went high on graphics but ignored the issue with the latency which will bite them in the butt. with all the multitasking and snap and app switching the X1 has, GDDR5 RAM wouldn't even work. PS4 will have a hampered performance to the CPU due to its lack of DDR3 RAM. so what MS did to boost the bandwidth was add in the eSRAM and have obtained over 200GB/s bandwidth with it to the GPU which is higher then the PS4 gets with its GDDR5 (176GB/s) and done with lower latency. so MS got the best of both worlds out of it and sony got a RAM thats not made to work well with GPUs and still less bandwidth to their GPU.ManatuBeard

You do realize that the "G" in "GDDR" means "Graphics"?

and

"GDDR5 never came available for PCs"

is probably the MOST STUPID THING i have read in this forum...

You realize Intel Xeon Phi is not a GPU.

#47 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ZoomZoom2490"][QUOTE="xboxiphoneps3"] GDDR5 in a traditional PC setup as system memory wouldnt work obviously. but this is a customized APU... not a traditional PC, all the parts are much closer and transmission latency is severely reduced, the "latency" GDDR5 is complete nonsense bandwidth is one of the most important factors in a system... and 200 GB/s is completely theoretical and they only achieved 133 GB/S real life alpha blend... and also at the end of the day it is only 32 MB..... post backfire lul darkangel115
Something tells me that he will come back with Cloud Processing next.

lol don't defending anything. just setting the facts straight. i mean its a game forums. i know 99% of people here don't understand computers. everything i said is 100% correct. I should also challenge you. show me one PC that uses GDDR5 RAM for its CPU. go for it. link me to 1 of them? ;) and the custom SPU means nothing. they have CPUs and all CPUs are latency sensitive. computers 101 ;) thats why they don't use GDDR5 RAM for computers, only in graphics cards

http://www.jedec.org/standards-documents/docs/so-018a

"DDR4 and GDDR5M Small Outline Dual Inline Memory Module (SODIMM), 256 pin, 0.50mm pitch Socket Outline"

DDR4 and GDDR5M will be sharing the same SODIMM standards.


#48 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts
Games need massive bandwidth, not "latency-sensitive" memory; the GDDR5 in the PS4 is superior to the memory setup in the Xbox One... you don't do compute and other stuff with "latency-sensitive memory"... there is a reason why they left DDR3 behind for GPUs and went to GDDR5... and at the end of the day it's only 32 MB... that's not much at all; it's pretty bad how more than half your system bandwidth is locked up in only 32 MB...
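
The "more than half your system bandwidth is locked up in 32 MB" point can be sanity-checked with the same launch-era figures used above; which eSRAM number you pick (one-way vs read+write peak) changes the exact share, so treat this as a rough estimate.

```python
# Share of aggregate bandwidth vs share of capacity for the eSRAM, using
# the launch-era figures quoted in this thread (assumed, not measured).

ddr3_gb_per_s  = 68
esram_gb_per_s = 109         # one-way figure; use ~204 for the read+write peak
esram_mb, ddr3_mb = 32, 8 * 1024

bw_share  = esram_gb_per_s / (esram_gb_per_s + ddr3_gb_per_s)
cap_share = esram_mb / (esram_mb + ddr3_mb)

print(f"eSRAM share of aggregate bandwidth: {bw_share:.0%}")    # ~62%
print(f"eSRAM share of total capacity     : {cap_share:.2%}")   # ~0.39%
```

On those numbers, well over half of the aggregate bandwidth is attached to well under one percent of the memory, which is why scheduling data in and out of the 32 MB matters so much.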

#49 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="ManatuBeard"]

[QUOTE="darkangel115"]

You do realize that the "G" in "GDDR" means "Graphics"?

and

"GDDR5 never came available for PCs" darkangel115

is probably the MOST STUPID THING i have read in this forum...

 

yes i do. and it didn't. show me one PC that runs exclsuively off GDDR5 RAM. you can't. they never even put it on the market becuase the high latency isn't good to run a PC. instead its only aviable on GPUs butil into the video card

You're not certified to talk about anything at all... you don't know anything... shut up, fool. These are gaming consoles; gaming consoles need raw bandwidth over this supposed "latency-sensitive memory," which really makes no sense anyway, because these are APUs and they are totally different from a traditional PC setup...

#50 Cali3350
Member since 2003 • 16134 Posts

[QUOTE="darkangel115"]

[QUOTE="ManatuBeard"] is probably the MOST STUPID THING i have read in this forum...

xboxiphoneps3

 

yes i do. and it didn't. show me one PC that runs exclsuively off GDDR5 RAM. you can't. they never even put it on the market becuase the high latency isn't good to run a PC. instead its only aviable on GPUs butil into the video card

your not certified to talk about anything at all.. you dont know anything.. shut up fool, these are gaming consoles, gaming consoles need raw bandwidth over this supposed "latency sensitive memory" which really makes no sense because these are APU's and they are totally different from a traditional PC setup...

 

Graphics processing needs raw bandwidth. GPU compute actually does need latency-sensitive memory for maximum throughput.