X1 ESRAM explanation


This topic is locked from further discussion.


#51 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="darkangel115"]
Yes, I do. And it didn't. Show me one PC that runs exclusively off GDDR5 RAM. You can't. They never even put it on the market, because the high latency isn't good for running a PC; instead it's only available on GPUs, built into the video card.

[QUOTE="xboxiphoneps3"]
You're not certified to talk about anything at all... you don't know anything... shut up, fool. These are gaming consoles, and gaming consoles need raw bandwidth over this supposed "latency-sensitive memory," which really makes no sense, because these are APUs and they are totally different from a traditional PC setup...

[QUOTE="Cali3350"]
Graphics processing needs raw bandwidth. GPU compute actually does need latency-sensitive memory for max throughput.

This talk about latency-sensitive memory is nonsense on an APU design. With the Onion and Garlic buses, everything on one die is much closer, and transmission latency is severely reduced...

And the PS4 rips the Xbox One apart in compute, and spits it back out.


#52 darkangel115
Member since 2013 • 4562 Posts
[QUOTE="ManatuBeard"]
You do realize that the "G" in "GDDR" means "Graphics"?

And "GDDR5 never came available for PCs" (ronvalencia) is probably the MOST STUPID THING I have read in this forum...

[QUOTE="ronvalencia"]
You realize Intel Xeon Phi is not a GPU.

I looked it up, and you're wrong: it's not a traditional CPU, it's a "coprocessor," as Intel calls it, and more importantly it's specially designed to deal with the latency issues of GDDR5 RAM. The PS4's CPU is not. So it's irrelevant that somewhere out there somebody built a special device that allows CPUs to work well with GDDR5 RAM, because the PS4 doesn't have said device.

#53 darkangel115
Member since 2013 • 4562 Posts

[QUOTE="xboxiphoneps3"]
You're not certified to talk about anything at all... you don't know anything... shut up, fool. These are gaming consoles, and gaming consoles need raw bandwidth over this supposed "latency-sensitive memory," which really makes no sense, because these are APUs and they are totally different from a traditional PC setup...

[QUOTE="Cali3350"]
Graphics processing needs raw bandwidth. GPU compute actually does need latency-sensitive memory for max throughput.

[QUOTE="xboxiphoneps3"]
This talk about latency-sensitive memory is nonsense on an APU design. With the Onion and Garlic buses, everything on one die is much closer, and transmission latency is severely reduced...

And the PS4 rips the Xbox One apart in compute, and spits it back out.

1st, I'm more than qualified to discuss this; most of the people here trying to "debate me" aren't ;)

I'm also getting both systems, because when you're in your field of work you make a thing called money, and being in IT and networking for 10+ years means I make a good living and am not forced to choose between the two and beg my parents to buy me one, little kid ;)

You know nothing of the Onion and Garlic buses. Do you even know what the bandwidth is on them? 20 GB/s combined, with latency 10 times that of DDR3; with your bandwidth at about 1/9 of the RAM itself, the performance is going to bog down. Last gen the PS3 was all about memory leaks; this gen I'm thinking it's going to be more processor related, as the CPU tries to deal with latency it wasn't designed to handle.


#54 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ManatuBeard"]
...is probably the MOST STUPID THING I have read in this forum...

[QUOTE="ronvalencia"]
You realize Intel Xeon Phi is not a GPU.

[QUOTE="darkangel115"]
I looked it up, and you're wrong: it's not a traditional CPU, it's a "coprocessor," as Intel calls it, and more importantly it's specially designed to deal with the latency issues of GDDR5 RAM. The PS4's CPU is not. So it's irrelevant that somewhere out there somebody built a special device that allows CPUs to work well with GDDR5 RAM, because the PS4 doesn't have said device.

Intel Xeon Phi's "co-processor" tag is due to its not running the host OS.

Host-OS-capable x86 CPUs will later get 512-bit AVX SIMD and GDDR5M SO-DIMMs. Intel's AVX SIMD roadmap goes up to 1024-bit-wide SIMD.

Intel Xeon Phi runs a customised Linux-based OS.


#55 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="Cali3350"]
Graphics processing needs raw bandwidth. GPU compute actually does need latency-sensitive memory for max throughput.

[QUOTE="xboxiphoneps3"]
This talk about latency-sensitive memory is nonsense on an APU design. With the Onion and Garlic buses, everything on one die is much closer, and transmission latency is severely reduced...

And the PS4 rips the Xbox One apart in compute, and spits it back out.

[QUOTE="darkangel115"]
1st, I'm more than qualified to discuss this; most of the people here trying to "debate me" aren't ;)

I'm also getting both systems, because when you're in your field of work you make a thing called money, and being in IT and networking for 10+ years means I make a good living and am not forced to choose between the two and beg my parents to buy me one, little kid ;)

You know nothing of the Onion and Garlic buses. Do you even know what the bandwidth is on them? 20 GB/s combined, with latency 10 times that of DDR3; with your bandwidth at about 1/9 of the RAM itself, the performance is going to bog down. Last gen the PS3 was all about memory leaks; this gen I'm thinking it's going to be more processor related, as the CPU tries to deal with latency it wasn't designed to handle.

LOL... 10x the latency on an APU design? Where is the link for that? There are absolutely no synchronization issues between the PS4 GPU/CPU, and both are in perfect sync.

Once again, these are gaming consoles and that is what they are meant to do: play games. And games need raw bandwidth over anything else, which the PS4 has more of... /thread


#56 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="darkangel115"]
Yes, I do. And it didn't. Show me one PC that runs exclusively off GDDR5 RAM. You can't. They never even put it on the market, because the high latency isn't good for running a PC; instead it's only available on GPUs, built into the video card.

[QUOTE="xboxiphoneps3"]
You're not certified to talk about anything at all... you don't know anything... shut up, fool. These are gaming consoles, and gaming consoles need raw bandwidth over this supposed "latency-sensitive memory," which really makes no sense, because these are APUs and they are totally different from a traditional PC setup...

[QUOTE="Cali3350"]
Graphics processing needs raw bandwidth. GPU compute actually does need latency-sensitive memory for max throughput.

It depends on whether the GPU can hide the latency with multiple threads, i.e. NV's GigaThread and AMD's UltraThreaded dispatch were designed to hide this latency issue.
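ronvalencia's latency-hiding point can be made concrete with a Little's-law back-of-the-envelope: to keep a memory bus saturated, a GPU needs roughly bandwidth × latency worth of requests in flight, which is why thread schedulers like GigaThread keep thousands of threads resident. A rough sketch; the 176 GB/s and 300 ns figures below are illustrative assumptions, not measured console values:

```python
def requests_in_flight_needed(bandwidth_gbs: float, latency_ns: float,
                              request_bytes: int = 64) -> float:
    """Outstanding cache-line-sized requests needed to hide memory latency.

    Little's law: concurrency = throughput * latency. 1 GB/s == 1 byte/ns.
    """
    bytes_per_ns = bandwidth_gbs
    return bytes_per_ns * latency_ns / request_bytes

# Hypothetical GDDR5 setup: 176 GB/s peak, ~300 ns end-to-end GPU latency.
need = requests_in_flight_needed(176, 300)
print(round(need))  # 825 outstanding 64-byte requests
```

With hundreds of requests needed in flight at any moment, a GPU hides DRAM latency by oversubscribing with threads, which is why raw bandwidth rather than per-access latency tends to dominate graphics workloads.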


#57 btk2k2
Member since 2003 • 440 Posts
[QUOTE="darkangel115"]
lol, that's way wrong. MS chose DDR3 because that's what PCs run on; it's the universally best RAM for running a PC. It has nothing to do with density, and 8GB of GDDR5 was more than available. They have 16 512MB chips; you're saying they had 16 256MB chips years ago? Why do you all spam misinformation? You aren't computer experts, know nothing about what you're talking about, but feel the need to make crap up?

Holy shit, you are dumb. The bus width is 256-bit; GDDR5 is 32 bits per module, so you need 8 chips to get a 256-bit interface. DDR3 supports 16-bit modules, so you need 16 chips to get a 256-bit interface.

Back when MS designed the X1 there were no 1GB GDDR5 modules, and they were not really on the horizon. It would have been one hell of a risk to go with GDDR5 given what they wanted to achieve with their system, so they went the safe route with DDR3 + ESRAM to help with the bandwidth shortfall.

Back before the PS4 reveal the rumour was 4GB of RAM, because back then high-density 1GB GDDR5 modules were scarce. As it happens, Sony obviously felt confident there would be enough of them to enable their use in the console, so they ended up with 8GB of RAM as well. That was quite lucky for them, but they designed the system with a different goal in mind, hence why 4GB was acceptable to them.

If you want to stay an ignorant dumbass, that is up to you, but if you do not know what you are talking about, I recommend you not assume everybody else talks shit too, because there are a few people on this board who do know their shit.
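The chip-count arithmetic in btk2k2's post can be checked in a few lines. A minimal sketch: the 256-bit bus and per-chip interface widths come from the post, and the capacity examples assume one chip per channel slice.

```python
def chips_for_bus(bus_bits: int, chip_bits: int) -> int:
    """DRAM chips needed to fill a memory bus, one chip per channel slice."""
    assert bus_bits % chip_bits == 0
    return bus_bits // chip_bits

gddr5_chips = chips_for_bus(256, 32)  # GDDR5 chips are x32 -> 8 chips
ddr3_chips = chips_for_bus(256, 16)   # DDR3 allows x16 chips -> 16 chips
print(gddr5_chips, ddr3_chips)        # 8 16

def capacity_gb(chips: int, chip_mb: int) -> float:
    """Total capacity from chip count and per-chip density."""
    return chips * chip_mb / 1024

# Why density mattered: 8 x 256MB GDDR5 chips give only 2GB; the later
# 512MB and 1GB parts are what made 4GB and 8GB configurations reachable.
print(capacity_gb(8, 256), capacity_gb(8, 512), capacity_gb(8, 1024))  # 2.0 4.0 8.0
```

This is why the bus width pins the chip count: once you commit to 256 bits, total capacity is entirely a function of per-chip density, which is the availability problem the post describes.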

#58 btk2k2
Member since 2003 • 440 Posts
[QUOTE="darkangel115"]
I looked it up, and you're wrong: it's not a traditional CPU, it's a "coprocessor," as Intel calls it, and more importantly it's specially designed to deal with the latency issues of GDDR5 RAM. The PS4's CPU is not. So it's irrelevant that somewhere out there somebody built a special device that allows CPUs to work well with GDDR5 RAM, because the PS4 doesn't have said device.

Relative to DDR3, GDDR5 does not have any latency issues; they are pretty much the same. Page 51 for DDR3 here: ACT-to-ACT is 45ns (think ready state to ready state). Page 133 for GDDR5 here: ACT-to-ACT is 40ns. I guess you will assume that Hynix are not "computer experts" either, even though they make the modules.
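Taking the two datasheet numbers quoted above at face value, the wall-clock gap works out to about 11%, nowhere near the order-of-magnitude difference claimed earlier in the thread:

```python
# ACT-to-ACT (row-cycle) figures quoted from the Hynix datasheets above.
ddr3_trc_ns = 45.0
gddr5_trc_ns = 40.0

ratio = ddr3_trc_ns / gddr5_trc_ns
print(f"DDR3/GDDR5 tRC ratio: {ratio:.3f}x")  # 1.125x -- roughly a wash
```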

#59 Cali3350
Member since 2003 • 16134 Posts

[QUOTE="darkangel115"]
I looked it up, and you're wrong: it's not a traditional CPU, it's a "coprocessor," as Intel calls it, and more importantly it's specially designed to deal with the latency issues of GDDR5 RAM. The PS4's CPU is not. So it's irrelevant that somewhere out there somebody built a special device that allows CPUs to work well with GDDR5 RAM, because the PS4 doesn't have said device.

[QUOTE="btk2k2"]
Relative to DDR3, GDDR5 does not have any latency issues; they are pretty much the same. Page 51 for DDR3 here: ACT-to-ACT is 45ns (think ready state to ready state). Page 133 for GDDR5 here: ACT-to-ACT is 40ns. I guess you will assume that Hynix are not "computer experts" either, even though they make the modules.

You do realize that is for the specific modules they produce, and that DDR3 latency can vary quite a lot from part to part, right?


#60 btk2k2
Member since 2003 • 440 Posts
[QUOTE="darkangel115"]
1st, I'm more than qualified to discuss this; most of the people here trying to "debate me" aren't ;)

I'm also getting both systems, because when you're in your field of work you make a thing called money, and being in IT and networking for 10+ years means I make a good living and am not forced to choose between the two and beg my parents to buy me one, little kid ;)

You know nothing of the Onion and Garlic buses. Do you even know what the bandwidth is on them? 20 GB/s combined, with latency 10 times that of DDR3; with your bandwidth at about 1/9 of the RAM itself, the performance is going to bog down. Last gen the PS3 was all about memory leaks; this gen I'm thinking it's going to be more processor related, as the CPU tries to deal with latency it wasn't designed to handle.

I highly doubt the Super-Onion bus has high latency; it's on-die to on-die, so it's probably similar to the ESRAM latency on the X1, as that is also on die.

#61 btk2k2
Member since 2003 • 440 Posts

[QUOTE="btk2k2"]
Relative to DDR3, GDDR5 does not have any latency issues; they are pretty much the same. Page 51 for DDR3 here: ACT-to-ACT is 45ns (think ready state to ready state). Page 133 for GDDR5 here: ACT-to-ACT is 40ns. I guess you will assume that Hynix are not "computer experts" either, even though they make the modules.

[QUOTE="Cali3350"]
You do realize that is for the specific modules they produce, and that DDR3 latency can vary quite a lot from part to part, right?

It is not going to vary by that much though, is it? The DDR3 and the GDDR5 are going to be practically the same from a latency POV. It just debunks the myth that GDDR5 has much higher latency than DDR3.

#62 Cali3350
Member since 2003 • 16134 Posts

[QUOTE="btk2k2"]
Relative to DDR3, GDDR5 does not have any latency issues; they are pretty much the same. Page 51 for DDR3 here: ACT-to-ACT is 45ns (think ready state to ready state). Page 133 for GDDR5 here: ACT-to-ACT is 40ns. I guess you will assume that Hynix are not "computer experts" either, even though they make the modules.

[QUOTE="Cali3350"]
You do realize that is for the specific modules they produce, and that DDR3 latency can vary quite a lot from part to part, right?

[QUOTE="btk2k2"]
It is not going to vary by that much though, is it? The DDR3 and the GDDR5 are going to be practically the same from a latency POV. It just debunks the myth that GDDR5 has much higher latency than DDR3.

GDDR5 is much higher latency in terms of cycles. But since it runs so many more cycles, it's largely a wash in real-world values, absolutely.

But his quote above is simply silly. It has little to do with the RAM used in either console.
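Cali3350's "higher in cycles, a wash in nanoseconds" point can be sketched numerically. The tRC values are the ones quoted earlier in the thread; the command-clock rates (DDR3-2133 at 2133/2 MHz, GDDR5-5500 at 5500/4 MHz) are assumptions for illustration, not figures from the thread:

```python
def latency_cycles(latency_ns: float, clock_mhz: float) -> float:
    """Convert a fixed wall-clock latency into clock cycles at a given rate."""
    return latency_ns * clock_mhz / 1000.0

ddr3_cycles = latency_cycles(45, 2133 / 2)   # ~48 cycles
gddr5_cycles = latency_cycles(40, 5500 / 4)  # ~55 cycles
print(round(ddr3_cycles), round(gddr5_cycles))

# Similar ballpark in nanoseconds, but GDDR5 burns more cycles because its
# clock runs faster -- "higher latency in cycles, a wash in real time".
```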


#63 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts

[QUOTE="Cali3350"]
Graphics processing needs raw bandwidth. GPU compute actually does need latency-sensitive memory for max throughput.

[QUOTE="xboxiphoneps3"]
This talk about latency-sensitive memory is nonsense on an APU design. With the Onion and Garlic buses, everything on one die is much closer, and transmission latency is severely reduced...

And the PS4 rips the Xbox One apart in compute, and spits it back out.

[QUOTE="darkangel115"]
1st, I'm more than qualified to discuss this; most of the people here trying to "debate me" aren't ;)

I'm also getting both systems, because when you're in your field of work you make a thing called money, and being in IT and networking for 10+ years means I make a good living and am not forced to choose between the two and beg my parents to buy me one, little kid ;)

You know nothing of the Onion and Garlic buses. Do you even know what the bandwidth is on them? 20 GB/s combined, with latency 10 times that of DDR3; with your bandwidth at about 1/9 of the RAM itself, the performance is going to bog down. Last gen the PS3 was all about memory leaks; this gen I'm thinking it's going to be more processor related, as the CPU tries to deal with latency it wasn't designed to handle.

Ah, yes, the "I'm in a marginally related field, therefore I know everything regardless of whether I can back up my claims" defense. Reminds me of the guy who came on here claiming he was an expert on the PS3's Cell because he got accepted into a university for computer science and read a PDF file.

Last I checked, IT and networking don't have much to do with hardware engineering; you aren't any more qualified to talk about this than anyone else on the forum.


#64 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="vpacalypse"]
I'm no PC genius, so can someone explain why the X1 had to go for ESRAM... why not just add more normal RAM?

I'm not having a go at the X1 (because from my understanding real-world performance will be almost identical... or otherwise time will tell, at best).

Why did they take the complicated route? What was the thinking here? Am I missing something... does ESRAM have some additional qualities to it or something?

Prepare for hundreds of responses from Cows who seem to think that because they are gamers, they are also hardware engineers...


#65 btk2k2
Member since 2003 • 440 Posts
[QUOTE="Cali3350"]
GDDR5 is much higher latency in terms of cycles. But since it runs so many more cycles, it's largely a wash in real-world values, absolutely.

But his quote above is simply silly. It has little to do with the RAM used in either console.

Indeed, just trying to educate a bit. It is not as if the X1 is a poor architecture; it is just that, with the goals they had at design time, they had fewer options than Sony. If MS had known that GDDR5 would be available in high enough densities to reach 8GB, they would probably have built a very similar system to Sony's. By going DDR3 + ESRAM they blew transistor and power budget that could otherwise have been used to beef up the GPU, but at the time they had no other choice. I am glad they went back on their DRM policies, though, as it means I will consider their console in a few years, when it is cheaper and has some nice exclusives.