AMD Says the following....

This topic is locked from further discussion.

#51 Aidenfury19
Member since 2007 • 2488 Posts

Link? And is this talking about ram or GPU memory? — ShaineTheNerd

They're the same frigging thing and there is NO difference electronically. Physically you can slap an arbitrary RAM type on a GPU and it will perform the same or better than it would if you threw it onto DIMMs and used it for a CPU. The only thing that distinguishes RAM is how it is allocated (logical) and where it is located (physical).

#52 ShaineTheNerd
Member since 2012 • 1578 Posts

[QUOTE="ShaineTheNerd"]Link? And is this talking about ram or GPU memory? — Aidenfury19

They're the same frigging thing and there is NO difference electronically. Physically you can slap an arbitrary RAM type on a GPU and it will perform the same or better than it would if you threw it onto DIMMs and used it for a CPU. The only thing that distinguishes RAM is how it is allocated (logical) and where it is located (physical).

I only asked because, from what I understand, GPU memory and RAM are two different things, and the article this whole thread is based on is talking about GPU memory. I'm not the most tech-savvy guy out there, but this was my understanding of things. I may be wrong.
#53 superclocked
Member since 2009 • 5864 Posts
[QUOTE="superclocked"][QUOTE="Kaz_Son"]That's not what I was talking about. I was saying that the GPU is more important than RAM when it comes to graphics. Even if the PS4's RAM advantage isn't as big as you say, the GPU advantage is significant enough to ensure that both systems won't be identical. — Kaz_Son
True, but hopefully the upclock rumor from first party devs is true. An increase to just 950MHz would make it 1.84 TFLOPS vs 1.46 TFLOPS. But, even at 1.23 TFLOPS, the XBox One GPU is only 33% slower than the PS4 GPU. That's really not all that much. If someone wants the most powerful system, they should really be gaming on PC. My $200 GCN GPU put out 3.69 TFLOPS @ 1.2GHz...

True, PC gaming wins either way but the PS4 is still more powerful than the XB1 for $100 cheaper and that's significant IMO.

Very true, I can't argue with that. The XBox One would likely win if they released an SKU without Kinect for $300...
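The TFLOPS figures being traded above all follow from the usual GCN rule of thumb: shader count × 2 FLOPs per clock (fused multiply-add) × clock rate. A minimal sketch, assuming the commonly reported shader counts of 768 for the Xbox One and 1152 for the PS4:

```python
# Rule-of-thumb peak single-precision throughput for a GCN GPU:
# shaders x 2 ops/clock (FMA) x clock rate in GHz.
def gcn_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak TFLOPS = shaders * 2 * clock (GHz) / 1000."""
    return shaders * 2 * clock_ghz / 1000.0

print(gcn_tflops(768, 0.800))   # Xbox One at 800 MHz        -> ~1.23 TFLOPS
print(gcn_tflops(768, 0.950))   # rumored 950 MHz upclock    -> ~1.46 TFLOPS
print(gcn_tflops(1152, 0.800))  # PS4 (18 CUs x 64 shaders)  -> ~1.84 TFLOPS
```

This is a theoretical peak, not measured performance, but it reproduces the 1.23 / 1.46 / 1.84 numbers quoted in the posts.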
#54 ShaineTheNerd
Member since 2012 • 1578 Posts
[QUOTE="Kaz_Son"][QUOTE="superclocked"]True, but hopefully the upclock rumor from first party devs is true. An increase to just 950MHz would make it 1.84 TFLOPS vs 1.46 TFLOPS. But, even at 1.23 TFLOPS, the XBox One GPU is only 33% slower than the PS4 GPU. That's really not all that much. If someone wants the most powerful system, they should really be gaming on PC. My $200 GCN GPU put out 3.69 TFLOPS @ 1.2GHz... — superclocked
True, PC gaming wins either way but the PS4 is still more powerful than the XB1 for $100 cheaper and that's significant IMO.

Very true, I can't argue with that. The XBox One would likely win if they released an SKU without Kinect for $300...

It would. The most powerful console NEVER wins... it's always the cheapest that takes the gold. Sony is set up to gather the most sales, definitely, but a 200 dollar price drop for the Xbox One would essentially be game over.
#55 mrbojangles25
Member since 2005 • 60869 Posts

[QUOTE="uninspiredcup"]

Looks like the pc wins agan. Ah wells.

meconate

Was there ever a time it didn't?

wait, are PC gamers even allowed here? Thought this was console only

#56 NTSC-U
Member since 2008 • 490 Posts

[QUOTE="meconate"]

[QUOTE="uninspiredcup"]

Looks like the pc wins agan. Ah wells.

mrbojangles25

Was there ever a time it didn't?

wait, are PC gamers even allowed here? Thought this was console only

 

Haven't you figured it out by now? PC elites must throw themselves into any topic by any means.

#57 Ribnarak
Member since 2008 • 2299 Posts

I think everyone knows PC is the greatest. Why do hermits have to keep saying it over and over again?

Anyways, PS4 > Xbox One. In terms of hardware that's a fact.

#58 Aidenfury19
Member since 2007 • 2488 Posts

[QUOTE="Aidenfury19"]

[QUOTE="ShaineTheNerd"]Link? And is this talking about ram or GPU memory? — ShaineTheNerd

They're the same frigging thing and there is NO difference electronically. Physically you can slap an arbitrary RAM type on a GPU and it will perform the same or better than it would if you threw it onto DIMMs and used it for a CPU. The only thing that distinguishes RAM is how it is allocated (logical) and where it is located (physical).

I only asked because, from what I understand, GPU memory and RAM are two different things, and the article this whole thread is based on is talking about GPU memory. I'm not the most tech-savvy guy out there, but this was my understanding of things. I may be wrong.

They're not different, at least they don't have to be. DDR3 can (and is) used for lower-end GPUs because it tends to be less expensive for the OEMs and weaker GPUs don't require as much bandwidth to perform well. That is the same stuff most PCs now use for their CPU memory and in BOTH cases it is RAM (random access memory).

Higher-end GPUs and the PS4 currently use GDDR5 because it allows for much higher memory bandwidth, meaning that it can move much more data within a given period of time at the same clockspeed. When people criticize the XBONE for using DDR3, it is because of that: the memory bandwidth is much, MUCH lower than the PS4's, so it can (may) run into situations where it is unable to transfer data to memory fast enough, which would make performance suffer.

Microsoft has added quite a few things to try to minimize how often that will occur, but almost none of them are necessary on the PS4 because it's using RAM with far superior performance. GDDR5 doesn't have any downsides compared to DDR3 other than cost (which is one of the two big reasons why your CPU doesn't use it), and that includes latency.

Those are just two types of RAM and both types are what is known as volatile RAM meaning that they lose whatever data is on them if they lose power. There is also non-volatile RAM or NVRAM (e.g. Flash memory) that doesn't lose data when it's powered off. It's a very broad category, but generally when people talk about memory they're talking about RAM (a.k.a. primary memory), while when they talk about storage they're talking about HDDs (a.k.a. secondary memory).
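The bandwidth gap described above is simple arithmetic: peak bandwidth is the effective transfer rate times the bus width in bytes. A quick sketch, assuming the commonly cited configurations (5.5 GT/s GDDR5 for the PS4 and DDR3-2133 for the Xbox One, each on a 256-bit bus):

```python
def peak_bandwidth_gb_s(transfer_rate_gt_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth (GB/s) = effective transfer rate x bus width in bytes."""
    return transfer_rate_gt_s * (bus_width_bits / 8)

ps4 = peak_bandwidth_gb_s(5.5, 256)    # 5.5 GT/s GDDR5, 256-bit -> 176.0 GB/s
xb1 = peak_bandwidth_gb_s(2.133, 256)  # DDR3-2133, 256-bit      -> ~68.3 GB/s
print(ps4, xb1)
```

These are theoretical peaks for the main memory pools only; the Xbox One's ESRAM adds separate bandwidth on top of its DDR3 figure.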

#59 mrbojangles25
Member since 2005 • 60869 Posts

I think everyone knows PC is the greatest. Why do hermits have to keep saying it over and over again?

Anyways, PS4 > Xbox One. In terms of hardware that's a fact.

Ribnarak

since you brought it up, would someone please clarify who is what?

So far I"ve heard of cows, sheep, lemmings, hermits....what else is there?

Can't believe I don't know this yet lol...

#61 Ribnarak
Member since 2008 • 2299 Posts

[QUOTE="Ribnarak"]

I think everyone knows PC is the greatest. Why do hermits have to keep saying it over and over again?

Anyways, PS4 > Xbox One. In terms of hardware that's a fact.

mrbojangles25

since you brought it up, would someone please clarify who is what?

So far I"ve heard of cows, sheep, lemmings, hermits....what else is there?

Can't believe I don't know this yet lol...

Cows = Sony fans
Lems = Microsoft fans
Sheep = Nintendo fans
Hermits = PC fans
Lol how do u have so many posts and u don't know this
#62 Kaz_Son
Member since 2013 • 1389 Posts

[QUOTE="Ribnarak"]

I think everyone knows PC is the greatest. Why do hermits have to keep saying it over and over again?

Anyways, PS4 > Xbox One. In terms of hardware that's a fact.

mrbojangles25

since you brought it up, would someone please clarify who is what?

So far I"ve heard of cows, sheep, lemmings, hermits....what else is there?

Can't believe I don't know this yet lol...

Cows = Sony fanboys
Lemmings = MS fanboys
Sheep = Nintendo fanboys
Hermits = PC fanboys
Manticores = Unbiased gamers (they don't exist)
#63 superclocked
Member since 2009 • 5864 Posts
[QUOTE="superclocked"][QUOTE="Kaz_Son"]True, PC gaming wins either way but the PS4 is still more powerful than the XB1 for $100 cheaper and that's significant IMO. — ShaineTheNerd
Very true, I can't argue with that. The XBox One would likely win if they released an SKU without Kinect for $300...

It would. The most powerful console NEVER wins... it's always the cheapest that takes the gold. Sony is set up to gather the most sales, definitely, but a 200 dollar price drop for the Xbox One would essentially be game over.

Oh well, the XBox One will help push Kinect technology, which will in turn bring me one step closer to a Lawnmower Man or holodeck type setup using the Oculus Rift and PC Kinect :D
#64 GD1551
Member since 2011 • 9645 Posts

Why does power matter so much to console gamers?

The Xbox was multiple times more powerful than the PS2. It wasn't simply 50% more....More like 500% more.

Everyone chose the PS2 anyways because it had the most games.

 

There is nothing about the power difference between the PS4 and Xbox One that's going to actually change the gaming experience between the two systems. It'll be an almost indifferential amount (thanks to diminishing returns) that you won't know exists without going to pixel counting sites like Digital Foundry.

Ly_the_Fairy

Because they are going to be virtually sharing the same games so power is the deciding factor here....

#65 superclocked
Member since 2009 • 5864 Posts

[QUOTE="Ly_the_Fairy"]

Why does power matter so much to console gamers?

The Xbox was multiple times more powerful than the PS2. It wasn't simply 50% more....More like 500% more.

Everyone chose the PS2 anyways because it had the most games.

 

There is nothing about the power difference between the PS4 and Xbox One that's going to actually change the gaming experience between the two systems. It'll be an almost indifferential amount (thanks to diminishing returns) that you won't know exists without going to pixel counting sites like Digital Foundry.

GD1551

Because they are going to be virtually sharing the same games so power is the deciding factor here....

The XBox One's GPU is only 33% less powerful than the PS4's GPU. The main problem is the $100 price difference.. But, if people want the most powerful gaming system, then they should be gaming on PC...
#66 sebbi11
Member since 2004 • 1190 Posts

[QUOTE="Ly_the_Fairy"]

Why does power matter so much to console gamers?

The Xbox was multiple times more powerful than the PS2. It wasn't simply 50% more....More like 500% more.

Everyone chose the PS2 anyways because it had the most games.

 

There is nothing about the power difference between the PS4 and Xbox One that's going to actually change the gaming experience between the two systems. It'll be an almost indifferential amount (thanks to diminishing returns) that you won't know exists without going to pixel counting sites like Digital Foundry.

GD1551

Because they are going to be virtually sharing the same games so power is the deciding factor here....

Nope, you're wrong. Performance > specs; the Xbox 360 taught us that, and E3 taught us that YET AGAIN. And the quality of the online service is much more important than a theoretical advantage in specs. And then there's the massive server support, games, hardware... sorry, power is not the deciding factor.

#67 Trail_Mix
Member since 2011 • 2579 Posts

Does anyone else notice how the more diehard Cows pick and choose what threads they post in? Hilarious. No reason for this not to be in that other thread. The one most Cows are avoiding.

timbers_WSU

Strange, coming from you.

I mean, you've always come off as someone who hates when Cows go off on tangents with pointless damage control.

 

 

#68 Snugenz
Member since 2006 • 13388 Posts

...lol, I know lemmings have built up anger and sourness over the past few weeks/months. Look how fast they jump on these new threads, damage controlling, copy and pasting basically the same sentences to try and damage control as much as possible. They jump on it asap... LOL, what's up System Wars, same old shit huh — xboxiphoneps3

You're in the exact same threads doing the same thing.

#69 Phazevariance
Member since 2003 • 12356 Posts

[QUOTE="timbers_WSU"]

Does anyone else notice how the more diehard Cows pick and choose what threads they post in? Hilarious. No reason for this not to be in that other thread. The one most Cows are avoiding.

Trail_Mix

Strange, coming from you.

I mean, you've always come off as someone who hates when Cows go off on tangents with pointless damage control.

 

 

You mean the one where it was announced that PS4 will use more ram for the OS than the X1 uses for their OS? That thread? Less ram for games on the PS4 thread? Are they ignoring the thread where 360 has more ram even if its slower yet has faster l4 cache? that thread? Can i say Thread again?
#70 GD1551
Member since 2011 • 9645 Posts

[QUOTE="GD1551"]

[QUOTE="Ly_the_Fairy"]

Why does power matter so much to console gamers?

The Xbox was multiple times more powerful than the PS2. It wasn't simply 50% more....More like 500% more.

Everyone chose the PS2 anyways because it had the most games.

 

There is nothing about the power difference between the PS4 and Xbox One that's going to actually change the gaming experience between the two systems. It'll be an almost indifferential amount (thanks to diminishing returns) that you won't know exists without going to pixel counting sites like Digital Foundry.

sebbi11

Because they are going to be virtually sharing the same games so power is the deciding factor here....

Nope, you're wrong. Performance > specs; the Xbox 360 taught us that, and E3 taught us that YET AGAIN. And the quality of the online service is much more important than a theoretical advantage in specs. And then there's the massive server support, games, hardware... sorry, power is not the deciding factor.

Ok... you realize they are using the same hardware except it's better in the PS4 right? X360's performance over the PS3 was due to how complicated it was to develop for the PS3. Now Games and Hardware.. which the PS4 is going to be better with (if the 360vsPS3 is any indication). So all you got is online, which is already fine on X and PS3, so it will be fine on both.

#71 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Kaz_Son"][QUOTE="superclocked"]Uuum, I'm well aware that the PS4 has a better GPU. It was the first sentence in my post :| — superclocked
Taht's not what I was talking about. I was saying that GPU is more important than RAM when it comes to graphics. Even if PS4's RAM advantage isn't as big as you say the GPU advantage is significant enough to ensure that both systems won't be identical.

True, but hopefully the upclock rumor from first party devs is true. An increase to just 950MHz would make it 1.84 TFLOPS vs 1.46 TFLOPS. But, even at 1.23 TFLOPS, the XBox One GPU is only 33% slower than the PS4 GPU. That's really not all that much. If someone wants the most powerful system, they should really be gaming on PC. My $200 GCN GPU put out 3.69 TFLOPS @ 1.2GHz...

The PS4's 18 CUs still have slightly greater register storage, L1 cache, and local data store (LDS) than the X1's 12 CUs.

12 CU: 3 MB register file, 192 KB L1 cache, 768 KB LDS

18 CU: 4.5 MB register file, 288 KB L1 cache, 1152 KB LDS

32 CU: 8 MB register file, 512 KB L1 cache, 2048 KB LDS

These are all SRAM type memory.

---

After the L2 cache and global cache, the SRAM storage above is the on-chip compute storage before data hits external memory (with its multi-cycle latencies).

The X1 has 32 MB of low-latency ESRAM, which has lower latency than both DDR3 and GDDR5, but higher than internal SRAM.
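Those totals scale linearly from GCN's per-CU figures (256 KB vector register file, 16 KB vector L1 data cache, and 64 KB LDS per compute unit, per AMD's GCN architecture documentation). A small sketch reproducing the numbers in the list above:

```python
# GCN per-compute-unit SRAM: 256 KB vector register file,
# 16 KB L1 data cache, 64 KB local data store (LDS).
PER_CU_KB = {"register_file": 256, "l1_cache": 16, "lds": 64}

def sram_totals_kb(cu_count: int) -> dict:
    """Total on-chip SRAM per category for a GPU with cu_count CUs."""
    return {name: kb * cu_count for name, kb in PER_CU_KB.items()}

print(sram_totals_kb(12))  # {'register_file': 3072, 'l1_cache': 192, 'lds': 768}
print(sram_totals_kb(18))  # 4608 KB (= 4.5 MB), 288 KB, 1152 KB
print(sram_totals_kb(32))  # 8192 KB (= 8 MB),   512 KB, 2048 KB
```

The 12/18/32-CU rows in the post are exactly these per-CU figures multiplied out.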

References: AMD GCN compute unit diagrams (images: AMD_GCN_CUnit_689.jpg, GCN-CU_zps901d7285.png)

#72 nyzma23
Member since 2013 • 1003 Posts

[QUOTE="ShaineTheNerd"]Link? And is this talking about ram or GPU memory? — rocoswav

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx 

 

It's GPU MEMORY, not SYSTEM MEMORY. GDDR5 is only good when used for GPU memory because of its high bandwidth, but worse when used for system memory because of its high latency; that's where DDR3 shines.

There's a reason why PCs have two different types of memory: DDR3 for the system and GDDR5 for the GPU.

#73 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="rocoswav"]

[QUOTE="ShaineTheNerd"]Link? And is this talking about ram or GPU memory? — nyzma23

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx

It's GPU MEMORY, not SYSTEM MEMORY. GDDR5 is only good when used for GPU memory because of its high bandwidth, but worse when used for system memory because of its high latency; that's where DDR3 shines.

There's a reason why PCs have two different types of memory: DDR3 for the system and GDDR5 for the GPU.

GDDR5's latency example https://www.skhynix.com/products/graphics/view.jsp?info.ramKind=26&info.serialNo=H5GQ2H24AFR

Programmable CAS latency: 5 to 20 tCK

Programmable WRITE latency: 1 to 7 tCK

DDR3's latency example http://download.micron.com/pdf/datasheets/dram/ddr3/1Gb_DDR3_SDRAM.pdf

CAS READ latency (CL): 5, 6, 7, 8, 9, 10, or 11

CAS WRITE latency (CWL): 5, 6, 7, 8, based on tCK

------------

http://forums.anandtech.com/showpost.php?p=34710477&postcount=28

Memory latency is most appropriately measured in nanoseconds. In terms of nanosecond latency, DDR3 and GDDR5 are very similar. Someone already posted a link that says that GDDR5 has programmable CAS latency (from 5 to 20 cycles), which if you take the increased clockspeed of GDDR5 into account, means that the latency in terms of nanoseconds is very similar to DDR3.

GDDR5's architecture is different from typical DDR3 in only a few important performance-oriented ways:

* More banks (16 in GDDR5 versus 8 [typical] in DDR3; this is actually a latency-*reducing* feature).

* More data pins per GDDR5 device (32 in GDDR5 versus [typically] 8 or [very rarely] 16 in DDR3). This makes it so you can get all of the data for a cache line (or whatever granularity of access you're talking about) in a reasonable number of cycles from a single chip. In DDR3, all 8 chips in a rank work together to provide 64 bits per transfer of a cache line (8 bits per transfer each). This width, plus the very high clockspeed of GDDR5, has the net effect of data transfer taking *less time* with GDDR5 compared to DDR3, but data transfer is never the latency bottleneck in a DRAM system, so this part isn't very important. Suffice it to say, this does not have any negative impact on GDDR5 latency.

* Obviously also very high clockspeeds and data transfer speeds.

As a source, other than the link provided earlier in the page, I myself researched DRAM for a couple years, and my lab mate just finished the GDDR5 version of his DRAM simulator, with all the industry-correct timing parameters involved, and according to him they have the almost identical latency.

-------------------

http://www.jedec.org/standards-documents/docs/so-018a

GDDR5M/DDR4 SODIMM standard coming soon for the PC.
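The cycles-versus-nanoseconds point in the quoted post can be made concrete: absolute CAS latency is the cycle count divided by the command clock. A hedged sketch, assuming a DDR3-1600 part at CL11 and a 5.5 Gbps GDDR5 part at CL15 with CK = data rate / 4 (illustrative values, not the consoles' actual timings):

```python
def cas_latency_ns(cl_cycles: int, command_clock_mhz: float) -> float:
    """Absolute CAS latency in nanoseconds = cycles / command clock."""
    return cl_cycles / command_clock_mhz * 1000.0

# DDR3-1600: 800 MHz command clock, CL11.
print(cas_latency_ns(11, 800))   # ~13.75 ns
# 5.5 Gbps GDDR5: CK = 5500 / 4 = 1375 MHz, CL15.
print(cas_latency_ns(15, 1375))  # ~10.9 ns
```

With its higher clock, GDDR5's larger CL cycle count works out to a similar (here even slightly lower) absolute latency, which is the AnandTech poster's point.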

#74 nyzma23
Member since 2013 • 1003 Posts

[QUOTE="ronvalencia"][snip: full GDDR5 vs. DDR3 latency reply, quoted above]

If the latency is almost identical, why don't PC OEMs and Apple use unified RAM in their laptops just like the PS4? I admire AnandTech, but I think Apple & Microsoft engineers have more experience with computer hardware than AnandTech.

#75 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="nyzma23"][snip] if it's almost identical latency why pc oem and apple don't make unified ram in their laptop just like ps4 ?

Memory density vs. cost, and mainline PCs have larger/faster caches than AMD Jaguar-based CPU solutions.
#76 Mr-Kutaragi
Member since 2013 • 2466 Posts

[image: Memory_Performance_chart_v3.png]

Nuff said

— rocoswav

TLHBO
#77 faizan_faizan
Member since 2009 • 7869 Posts
Old.
#78 FoxbatAlpha
Member since 2009 • 10669 Posts
Once again, the weaker console, is more powerful.
#79 DrTrafalgarLaw
Member since 2011 • 4487 Posts

Once again, the weaker console, is more powerful. FoxbatAlpha


Can you not interpret graphs? :cool:

#80 bezza2011
Member since 2006 • 2729 Posts

Why does power matter so much to console gamers?

The Xbox was multiple times more powerful than the PS2. It wasn't simply 50% more....More like 500% more.

Everyone chose the PS2 anyways because it had the most games.

 

There is nothing about the power difference between the PS4 and Xbox One that's going to actually change the gaming experience between the two systems. It'll be an almost indifferential amount (thanks to diminishing returns) that you won't know exists without going to pixel counting sites like Digital Foundry.

Ly_the_Fairy

Yet again someone who doesn't put all the other factors into place before claiming that power isn't always the winner.

Let's put it all straight. Sony came into the game with the PS1 and a new, easier, more cost-effective CD drive versus the N64's bulky cartridge; it was always going to win because of the future tech. Sony's PS2 was top notch and was really the only platform to go to for third-party development: Sega mucked up with the Saturn, and Nintendo went a strange way with the GameCube, using those crazy small discs, so Sony was there. The Xbox came out a whole year after; it was a brand new system no one had really heard much about, and it was after Sony's PS2 that word got around about a new console. This is the reason Sony won hands down; of course the Xbox was going to be more powerful, it was designed a year or so later.

The Xbox 360 won because it came out a year before the PS3. It had a great low selling price; it did have hidden costs, but people justified those costs when purchasing the console. It was the only system out which allowed multiplayer on a mainstream level, so people paid, and it grew because of this. By the time the PS3 came out and told people about its free online service, it was a little late, and with the high price of the console, due to the high tech of the Blu-ray player, it didn't sit well, so for most of the gen the Xbox 360 won.

The Wii won, I know, but the only reason the Wii won is that it is a great little cheap party console. That is the only reason it sold well: the party games, nothing else.

This time around is completely different from any other gen. We have two main companies both producing on-par levels of performance, and we all know deep down that no matter how much more power the PS4 has, both will look the same to the general public, graphics-wise, for at least a couple of years. They are both releasing at the exact same time, and the one problem is that the Xbox One is 499 and the PS4 is 399. Does the Xbox One really show anything that I want for the extra 100? No, but that's just me. This is going to be the closest war we have ever had; no one is at an advantage this time around.

#81 clyde46
Member since 2005 • 49061 Posts
AMD also said, MOAR CORES!
#82 faizan_faizan
Member since 2009 • 7869 Posts

Once again, the weaker console, is more powerful. FoxbatAlpha

Once again?

#83 clyde46
Member since 2005 • 49061 Posts

[QUOTE="FoxbatAlpha"]Once again, the weaker console, is more powerful. faizan_faizan

Once again?

The PS2 was weaker than the original Xbox.
#84 faizan_faizan • Member since 2009 • 7869 Posts
[QUOTE="faizan_faizan"]

[QUOTE="FoxbatAlpha"]Once again, the weaker console, is more powerful. clyde46

Once again?

The PS2 was weaker than the original Xbox.

"is more powerful."
#85 clyde46 • Member since 2005 • 49061 Posts
[QUOTE="clyde46"][QUOTE="faizan_faizan"]

Once again?

faizan_faizan
The PS2 was weaker than the original Xbox.

"is more powerful."

Oh yea, his wording was suspect.
#86 Snugenz • Member since 2006 • 13388 Posts

[QUOTE="faizan_faizan"][QUOTE="clyde46"] The PS2 was weaker than the original Xbox.clyde46
"is more powerful."

Oh yea, his wording was suspect.

I read it as "popular" instead of "powerful" at first as well :P

#87 TheRealBigRich • Member since 2010 • 784 Posts

[QUOTE="Ly_the_Fairy"]

Why does power matter so much to console gamers?

The Xbox was multiple times more powerful than the PS2. It wasn't simply 50% more... more like 500% more.

Everyone chose the PS2 anyways because it had the most games.

 

There is nothing about the power difference between the PS4 and Xbox One that's going to actually change the gaming experience between the two systems. It'll be an almost imperceptible difference (thanks to diminishing returns) that you won't know exists without going to pixel-counting sites like Digital Foundry.

kalipekona

If they care so much about graphics and performance they should be gaming on a PC.

this
#88 superclocked • Member since 2009 • 5864 Posts
[QUOTE="faizan_faizan"]

[QUOTE="FoxbatAlpha"]Once again, the weaker console, is more powerful. clyde46

Once again?

The PS2 was weaker than the original Xbox.

The PS2 actually had a better CPU than the XBox, but the XBox's GPU was far more advanced...
#89 faizan_faizan • Member since 2009 • 7869 Posts

[QUOTE="clyde46"][QUOTE="faizan_faizan"]

Once again?

superclocked

The PS2 was weaker than the original Xbox.

The PS2 actually had a better CPU than the XBox, but the XBox's GPU was far more advanced...

Then why didn't Half-Life 2, Far Cry, DOOM 3, and CoR:EfBB show up on the PS2? I think the GPU might be one of the reasons, but Half-Life 2 wasn't even in the same league as the others graphics-wise; it was about physics. And as far as I remember, GPGPU computing didn't exist during the PS2/Xbox era. Half-Life 2 didn't even feature dynamic shadows.

#90 deactivated-63e66f5999640 • Member since 2005 • 495 Posts

Everyone knows the PS4 is the most powerful console in video game history. It's pointless to say the same thing over and over. Heil68

 

Every new generation is considerably more powerful than the last, so every new generation is "the most powerful gaming consoles in history." That's just a stupid thing to say. Shit, none of us has even played these systems yet or seen their full potential. People are getting really premature with this power stuff. I mean, look at the PS4 RAM news; just shut up until we can actually get our hands on them and see the actual facts.

#91 superclocked • Member since 2009 • 5864 Posts

[QUOTE="superclocked"][QUOTE="clyde46"] The PS2 was weaker than the original Xbox.faizan_faizan

The PS2 actually had a better CPU than the XBox, but the XBox's GPU was far more advanced...

What are the reasons of Half Life 2, Far Cry, DOOM 3 and CoR:EfBB not showing up on the PS2, then? I think the GPU might be one of the reasons for that, but Half Life 2 wasn't even in the league with the others when taking graphics in to consideration. It was about physics. And as far as I remember, during the PS2/XBOX era, GPGPU computing didn't exist. Half Life 2 didn't even feature dynamic shadows.

Although it was not as good, the CPU in the XBox was an x86 PC CPU. It was a Pentium 3 based Celeron. Porting games from PC to XBox was easy. The Emotion Engine in the PS2 was like the Cell in the PS3. It was an alien architecture compared to what PC devs were used to working with...
#92 faizan_faizan • Member since 2009 • 7869 Posts

[QUOTE="faizan_faizan"]

[QUOTE="superclocked"]The PS2 actually had a better CPU than the XBox, but the XBox's GPU was far more advanced...superclocked

What are the reasons of Half Life 2, Far Cry, DOOM 3 and CoR:EfBB not showing up on the PS2, then? I think the GPU might be one of the reasons for that, but Half Life 2 wasn't even in the league with the others when taking graphics in to consideration. It was about physics. And as far as I remember, during the PS2/XBOX era, GPGPU computing didn't exist. Half Life 2 didn't even feature dynamic shadows.

Although it was not as good, the CPU in the XBox was an x86 PC CPU. It was a Pentium 3 based Celeron. Porting games from PC to XBox was easy. The Emotion Engine in the PS2 was like the Cell in the PS3. It was an alien architecture compared to what PC devs were used to working with...

Makes sense, but can you list the PS2 games that had superior physics to XBOX games such as Half Life 2? Or even Batman Begins for that matter.

#93 superclocked • Member since 2009 • 5864 Posts

[QUOTE="superclocked"][QUOTE="faizan_faizan"]

What are the reasons of Half Life 2, Far Cry, DOOM 3 and CoR:EfBB not showing up on the PS2, then? I think the GPU might be one of the reasons for that, but Half Life 2 wasn't even in the league with the others when taking graphics in to consideration. It was about physics. And as far as I remember, during the PS2/XBOX era, GPGPU computing didn't exist. Half Life 2 didn't even feature dynamic shadows.

faizan_faizan

Although it was not as good, the CPU in the XBox was an x86 PC CPU. It was a Pentium 3 based Celeron. Porting games from PC to XBox was easy. The Emotion Engine in the PS2 was like the Cell in the PS3. It was an alien architecture compared to what PC devs were used to working with...

Makes sense, but can you list the PS2 games that had superior physics to XBOX games such as Half Life 2? Or even Batman Begins for that matter.

You won't find any. The Emotion Engine was mostly used for graphics, just like they planned to do with the Cell processor. Sony only put the RSX in the PS3 to combat Microsoft. Had Microsoft not entered the console market, the PS3 would've simply used a Cell processor for graphics...
#94 faizan_faizan • Member since 2009 • 7869 Posts

[QUOTE="faizan_faizan"]

[QUOTE="superclocked"]Although it was not as good, the CPU in the XBox was an x86 PC CPU. It was a Pentium 3 based Celeron. Porting games from PC to XBox was easy. The Emotion Engine in the PS2 was like the Cell in the PS3. It was an alien architecture compared to what PC devs were used to working with...superclocked

Makes sense, but can you list the PS2 games that had superior physics to XBOX games such as Half Life 2? Or even Batman Begins for that matter.

You won't find any. The Emotion Engine was mostly used for graphics, just like they planned to do with the Cell processor. Sony only put the RSX in the PS3 to combat Microsoft. Had Microsoft not entered the console market, the PS3 would've simply used a Cell processor for graphics...

Its games would have looked like shit if they only used the Cell for raster rendering.

#95 superclocked • Member since 2009 • 5864 Posts

[QUOTE="superclocked"][QUOTE="faizan_faizan"]

Makes sense, but can you list the PS2 games that had superior physics to XBOX games such as Half Life 2? Or even Batman Begins for that matter.

faizan_faizan

You won't find any. The Emotion Engine was mostly used for graphics, just like they planned to do with the Cell processor. Sony only put the RSX in the PS3 to combat Microsoft. Had Microsoft not entered the console market, the PS3 would've simply used a Cell processor for graphics...

It's games would have looked like shit if they only used the CELL for raster rendering.

Well, the PS2 did have the "graphics synthesizer" GPU, but it could only push pixels. It didn't have any shader pipelines, so developers relied heavily on the Emotion Engine for graphics...
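The division of labor described here (a fill-only rasterizer, with the CPU doing all the vertex math) can be illustrated with a rough sketch. This is generic matrix arithmetic, not actual PS2 or Emotion Engine code; the names are purely illustrative:

```python
# Minimal sketch of fixed-function-era rendering: the CPU (like the PS2's
# Emotion Engine and its vector units) transforms every vertex itself, then
# hands finished screen-space triangles to a rasterizer that only pushes pixels.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (list of rows) by a 4-component vertex."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def transform_vertices(mvp, vertices):
    """CPU-side transform: produce a clip-space position per model-space vertex."""
    return [mat_vec(mvp, v) for v in vertices]

# Identity model-view-projection matrix: vertices pass through unchanged.
IDENTITY = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]

triangle = [[0.0, 0.5, 0.0, 1.0],
            [-0.5, -0.5, 0.0, 1.0],
            [0.5, -0.5, 0.0, 1.0]]

clip_space = transform_vertices(IDENTITY, triangle)
```

On shader-capable GPUs like the Xbox's, this per-vertex loop moved onto the GPU itself, which is part of why its GPU counted as "far more advanced" even with the weaker CPU.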
#96 II_Seraphim_II • Member since 2007 • 20534 Posts
[QUOTE="Far_RockNYC"]

:lol:AMD

DrTrafalgarLaw
Your precious XB1 GPU is made by AMD, you priapismic phallus. :?

:lol: damn, the insult :lol: Take it easy with the priapismic phallus... guy is probably crying now lol.
#97 dramaybaz • Member since 2005 • 6020 Posts
No worries, PC has been using GDDR5 for graphics for years.
#98 deactivated-5ba16896d1cc2 • Member since 2013 • 2504 Posts
[QUOTE="faizan_faizan"]

[QUOTE="superclocked"]The PS2 actually had a better CPU than the XBox, but the XBox's GPU was far more advanced...superclocked

What are the reasons of Half Life 2, Far Cry, DOOM 3 and CoR:EfBB not showing up on the PS2, then? I think the GPU might be one of the reasons for that, but Half Life 2 wasn't even in the league with the others when taking graphics in to consideration. It was about physics. And as far as I remember, during the PS2/XBOX era, GPGPU computing didn't exist. Half Life 2 didn't even feature dynamic shadows.

Although it was not as good, the CPU in the XBox was an x86 PC CPU. It was a Pentium 3 based Celeron. Porting games from PC to XBox was easy. The Emotion Engine in the PS2 was like the Cell in the PS3. It was an alien architecture compared to what PC devs were used to working with...

I loved the PS2 and original Xbox... playing Splinter Cell multiplayer on Xbox Live was so fun... Spy vs. Mercenaries, man, that was too much fun. I hope this next SC will bring back that nostalgia and feeling... the PS2 had spectacular games too, like Kingdom Hearts and Final Fantasy 10 and others.
#99 deactivated-5ba16896d1cc2 • Member since 2013 • 2504 Posts
[QUOTE="DrTrafalgarLaw"][QUOTE="Far_RockNYC"]

:lol:AMD

II_Seraphim_II
Your precious XB1 GPU is made by AMD, you priaspismic phallus. :?

:lol: damn, the insult :lol: Take it easy with the priaspismic phallus...guy is probably crying now lol.

Blatantly obvious troll is obvious though; he doesn't even know what's in the Xbox One. What a fail, and he runs around these forums spamming. Why waste time, far rock?
#100 deactivated-5ba16896d1cc2 • Member since 2013 • 2504 Posts

[QUOTE="faizan_faizan"]

[QUOTE="superclocked"]You won't find any. The Emotion Engine was mostly used for graphics, just like they planned to do with the Cell processor. Sony only put the RSX in the PS3 to combat Microsoft. Had Microsoft not entered the console market, the PS3 would've simply used a Cell processor for graphics...superclocked

It's games would have looked like shit if they only used the CELL for raster rendering.

Well, the PS2 did have the "graphics synthesizer" GPU, but it could only push pixels. It didn't have any shader pipelines, so developers relied heavily on the Emotion Engine for graphics...

The PS2's fill rate was very good at the time though; I believe it even had more fill rate than the PS3. That's why you couldn't do the heat shimmer effects from GT on the PS3 while you could have on the PS2, along with other effects in different games.
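The fill-rate point lends itself to a quick back-of-envelope check. Taking the commonly quoted ~2.4 Gpixel/s peak for the PS2's Graphics Synthesizer (a spec-sheet figure, assumed here, not a measured one) and a typical 640x448 output at 60 Hz, the budget for full-screen overdraw, which is what layered effects like heat shimmer consume, is surprisingly large:

```python
# Back-of-envelope overdraw budget for the PS2's Graphics Synthesizer.
# 2.4 Gpixel/s is the commonly quoted peak fill rate; real games achieved
# less, so treat the result as an upper bound, not a guarantee.

PEAK_FILL_RATE = 2.4e9      # pixels per second (spec-sheet peak)
WIDTH, HEIGHT = 640, 448    # a typical PS2 output resolution
FPS = 60

pixels_per_frame = WIDTH * HEIGHT
fill_per_frame = PEAK_FILL_RATE / FPS
overdraw_budget = fill_per_frame / pixels_per_frame  # full-screen passes/frame
```

Roughly 139 full-screen passes per frame at peak; whether the PS3's RSX really had less usable fill rate is debatable, but the arithmetic shows why cheap multi-pass, full-screen effects suited the GS so well.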