By the time 8GB DDR5 RAM actually means something the PS4 will be a gen behind

This topic is locked from further discussion.


#51 mitu123
Member since 2006 • 155290 Posts

[QUOTE="tormentos"][QUOTE="iamrob7"]

Right now PC gamers are playing at 1080p and above with 60FPS. That's what I play at, and no game uses anything like 8GB RAM. The most RAM-heavy game I play is DayZ, the Arma 2 mod, coming in at around 2GB RAM, most likely because it is an unoptimised mod for another game.

 

Crysis 3 in ultra 1080p uses around 1.5GB RAM.

 

My GPU, a GTX 580 SC, has 1.5GB DDR5 RAM, more than enough to cover any game currently at 1080p. I am on the verge of getting a new GPU, and even medium-spec GPUs nowadays have 2-3GB DDR5 RAM. I'm looking at a GTX 680 with 4GB DDR5. Now that is a pretty decent PC, but right now that 4GB DDR5 will only be relevant if I have a multi-monitor setup and some obscene resolution.

 

I see a lot of talk of this 8GB DDR5 RAM on the PS4. Firstly, the GPU and CPU are far more important features for gaming. The 8GB is really only for longevity; it is not going to be used for years. The games coming out on the PS4 will use 1-2GB of it at most for 1080p games (which, as has already been stated, will be the standard for the PS4).

 

So this 8GB DDR5 RAM is actually completely irrelevant and will be for a number of years, until games start actually using all that RAM. In the meantime it will just sit there in the console doing nothing, waiting for the day it becomes relevant. By that time the PC will be running DDR8 or DDR9, or perhaps something even beyond that whole concept.

 

TL;DR - In short, by the time the 8GB DDR5 becomes actually useful, the standard gaming PC will be another generation ahead in terms of technology.  It will likely feature more RAM of a more advanced variety.  

 

The PS4's GPU seems to be the equivalent of a medium-range gaming PC, and the CPU is an open question. I know that playing Planetside 2 right now, my computer is CPU limited in terms of framerate, as opposed to GPU limited. In large battles my framerate dips to 35-40 FPS, which is borderline unacceptable for me. Now, that is down to the large number of people in a battle, sometimes 500-600 people in a single area. I have a 3770K (4 cores) @ 4.6GHz. Will 8 cores @ 1.6GHz really work? I don't know.

 

What I'm pleased about with the new consoles is that they are very similar to PCs now; the same architecture means that dodgy PC ports should be a thing of the past. I do, however, think that the latest console specs, while providing a big boost over the previous generation initially, will ultimately leave the consoles further behind the PC than the current generation is. The last generation of consoles was far closer to a top-end PC at release than this generation will be.
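Whether 8 slow cores can match 4 fast ones comes down to how parallel the workload is. A rough Amdahl's-law sketch (the clock speeds are the ones from the post above; the 90% parallel fraction is an assumed, illustrative number, not a real engine profile):

```python
def relative_throughput(cores, clock_ghz, parallel_fraction):
    """Throughput relative to a single 1 GHz core, per Amdahl's law.

    parallel_fraction is the share of the work that scales across
    cores; the rest runs serially on one core.
    """
    serial = 1.0 - parallel_fraction
    speedup = 1.0 / (serial + parallel_fraction / cores)
    return clock_ghz * speedup

# Assumed 90% parallel workload:
quad_fast = relative_throughput(4, 4.6, 0.9)   # 4 cores @ 4.6 GHz
octo_slow = relative_throughput(8, 1.6, 0.9)   # 8 cores @ 1.6 GHz
# With these numbers the high-clocked quad still comes out well ahead.
```

The point of the sketch: unless nearly all of a frame's work parallelises, many low-clocked cores lose to fewer high-clocked ones.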

 

faizan_faizan

At 1920x1200, which is a little higher than 1080p, Crysis 3 on Max uses 2.2GB. http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,8.html

Maybe he's using FXAA
[image: guru3d Crysis 3 VRAM usage chart]

I use 8xMSAA on Very High at 1200p and this is what I get for highest:

[screenshot: Crysis 3 VRAM usage at 1200p with 8xMSAA]
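Part of why 8xMSAA uses so much more VRAM than FXAA: FXAA is a post-process pass over a normal 1-sample framebuffer, while MSAA stores several samples per pixel. A back-of-the-envelope sketch (simplified: it ignores depth/stencil buffers, framebuffer compression, and the other render targets a real engine allocates):

```python
def color_buffer_mib(width, height, samples, bytes_per_pixel=4):
    """Rough size in MiB of one multisampled RGBA8 color buffer."""
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

fxaa = color_buffer_mib(1920, 1200, samples=1)   # post-process AA: ~8.8 MiB
msaa8 = color_buffer_mib(1920, 1200, samples=8)  # 8 samples/pixel: ~70 MiB
```

So a single 8x-multisampled render target at 1200p costs several times what the whole no-MSAA framebuffer does, which is consistent with the gap between the FXAA chart and the 8xMSAA screenshot.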

 


#52 no-scope-AK47
Member since 2012 • 3755 Posts

Yet, smelly hermits still boast about their peasantly DDR3 RAM. Call me when DDR4 has become a standard for system memory. Better yet, call me when DDR5 is actually introduced to general consumers. Call me again when people are actually able to afford 8GB of DDR5 RAM instead of sticking to 4.
DrTrafalgarLaw

Unlike most broke a$$ consolites hermits got bank. Every time we post pics the hate flows thru the lowly consolites. Why he got a 2000watt sony system just to play pc games and a bose setup for his consoles :lol:

Just our speakers cost more than most consolites whole setups but we jelly uh sure you right derp :lol:


#53 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="Wasdie"]

Exactly. Remember, the PS4 is also doing in-game video recording on a dedicated video encoding chip (to save CPU), so that will suck up a good chunk of that 8 gigs of RAM.

I imagine that most games will have like 4-5 gigs allocated to them. This is more than enough really. Even Planetside 2 runs with 3ish gigs of RAM total (about 2 gigs for system ram, 1 gig for vRAM).

tormentos

 

Planetside 2 is way out there in terms of modern games; it uses far more resources than any game I'm aware of, and a large part of that, by the developers' admission, is down to poor optimisation on a new engine. It actually uses around 0.5-1GB VRAM and between 1-2GB system RAM, depending on your settings. So at absolute worst it is roughly 3GB usage or less.

 

Where exactly do you make the jump from that to most 1080p@30 console games in the next generation using 4-5GB of RAM? When the most extreme case on the PC in a new engine only requires 3GB at 1080p@60 (or thereabouts)? Crysis 3 ultra at 1440p only just goes above 3GB overall usage. What exactly is going to be going on in "most" console games for them to be using 4-5GB RAM when they are only running at 1080p@30?

 

For the sake of argument, let's pretend a game on the newly released consoles out this year uses 5GB of RAM. With mid-range PC GPUs having 3GB DDR5 six months before the new console generation comes out, and high-end GPUs having 4GB+ DDR5, this still provides no real advantage, and certainly far less of an advantage than the previous generation of consoles had. That's on release. By the time the full 8GB DDR5 is used, the equivalent on the PC will be a generation ahead.

All these points mean little. First, if you want to know what the PS4 will do with the extra RAM, aside from bigger worlds and some of the best-looking textures, why not ask Crytek? They were very vocal about RAM limitations on consoles and asked for the minimum to be 8GB; maybe they know something you don't. Also, the fact that Crytek worked within a RAM limit doesn't mean Sony or other developers on PS4 have to follow that line. On PC YOU CAN'T PUSH FOR GAMES TO USE BIG AMOUNTS OF RAM, because there is a legacy to keep on PC, and most cards have 2GB or less. You can't expect Crytek to go all out and optimize a game for the Titan's specs, using its 6GB of memory; it simply isn't done, since it would not work on other GPUs, or would run extremely crippled.

 

Wrong consistently. I don't "want to know what the PS4 will do with the extra ram"; I know exactly how RAM is used. Here is my point: games at their most extreme at 1080p use 3GB RAM total, and that's for super bastard settings in the most advanced games at higher framerates than 30FPS. Higher quality textures = higher resolution. PS4 games are going to be 1080p on release, are they not? So this 8GB DDR5 is not going to be used in a game for years to come, not until they come out with graphical possibilities far beyond what we have now. Planetside 2 has battles featuring 2000 players: 3GB usage. Crysis 3 uses every graphical trick available right now: 3GB top-end usage. Mid-range PC GPUs these days have 2-3GB GDDR5 on top of their system memory. That's six months before the PS4 even releases.

 

The PS4 has a mid-range PC gaming GPU. It can't do anything beyond Crysis 3 at 1080p; there is nothing it can handle that can use anything like 8GB DDR5 right now. Not even close.

 

So like I said, as is patently obvious, by the time 8GB DDR5 is relevant and even 60% utilised, the PC equivalent will be years ahead. It's a meaningless statistic that "cows" are clinging to desperately, because their GPU is average and their CPU is highly questionable.


#54 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="Wasdie"]

Exactly. Remember, the PS4 is also doing in-game video recording on a dedicated video encoding chip (to save CPU), so that will suck up a good chunk of that 8 gigs of RAM.

I imagine that most games will have like 4-5 gigs allocated to them. This is more than enough really. Even Planetside 2 runs with 3ish gigs of RAM total (about 2 gigs for system ram, 1 gig for vRAM).

SaltyMeatballs

Where exactly do you make the jump from that to most 1080p@30 console games in the next generation using 4-5GB of RAM? When the most extreme case on the PC in a new engine only requires 3GB at 1080p@60 (or thereabouts)? Crysis 3 ultra at 1440p only just goes above 3GB overall usage. What exactly is going to be going on in "most" console games for them to be using 4-5GB RAM when they are only running at 1080p@30?

That is PC; it has to run on many systems with limited amounts of RAM (limited by what people have in their systems). Console games will actually be able to use all the 8GB available (8GB minus the OS and other background tasks).

 

 

ROFL, what on earth does that sentence even mean? Do you even know? I certainly can't make sense of it. You don't seem to be saying anything. Try to come up with:

 

- Something that makes sense

- A valid point

 

Only then can you take part in this discussion.


#55 faizan_faizan
Member since 2009 • 7869 Posts

[QUOTE="faizan_faizan"]

[QUOTE="tormentos"] At 1920x1200, which is a little higher than 1080p, Crysis 3 on Max uses 2.2GB. http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,8.html
mitu123

Maybe he's using FXAA
[image: guru3d Crysis 3 VRAM usage chart]

I use 8xMSAA on Very High at 1200p and this is what I get for highest:

[screenshot: Crysis 3 VRAM usage at 1200p with 8xMSAA]

 

That chart is correct.

#56 way2funny
Member since 2003 • 4570 Posts

[QUOTE="way2funny"]

[QUOTE="iamrob7"]

 

GDDR5 is EXACTLY the same thing as DDR5.  The additional G only identifies whether it is being applied solely to the GPU or not.  

 

We are talking about game performance; that's what System Wars is about, correct? So the fact that the PS4 can use its 8GB for the OS and whatever other apps you might want to run, while the PC can't use its GDDR5 for that, is completely irrelevant. We are talking about game performance.

 

I'll eagerly anticipate your apology.

iamrob7

No, GDDR5 is DDR3 that's customized for graphics workloads.

 

That's exactly what the PS4 is using.  What you see on GPU's as GDDR5 is being used on the PS4 as system memory.  The DDR5 is situated on the PS4's GPU and integrated as system memory.  It's GDDR5.  There is NO SUCH THING as normal DDR5 memory.  They are calling it DDR5 because it is GDDR5 used system wide.

 

Please feel free to google it if you don't believe me.

 

I guess that's two apologies I'll be waiting for now :S

Right, it's GDDR5; there is no such thing as DDR5. They are NOT calling it DDR5, because it's GDDR5, so they are calling it GDDR5.

Like its predecessor GDDR4, GDDR5 is based on DDR3 SDRAM memory.

The G doesn't just mean it's on the graphics card; it means it's functionally different.

[image: PS4 spec sheet]
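The "functionally different" point is mostly about bandwidth: GDDR5 trades latency for a much wider and faster interface. A rough peak-bandwidth comparison (the bus widths and transfer rates below are commonly reported figures, assumed here for illustration):

```python
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * transfer_rate_mts * 1e6 / 1e9

gddr5 = peak_bandwidth_gbs(256, 5500)  # 256-bit GDDR5 @ 5500 MT/s -> 176.0 GB/s
ddr3 = peak_bandwidth_gbs(128, 1600)   # dual-channel DDR3-1600    -> 25.6 GB/s
```

That order-of-magnitude gap is why GPUs use GDDR5 for streaming large textures, even though DDR3's lower latency suits typical CPU workloads better.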


#57 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="no-scope-AK47"]

IMO that 8GB is great; you can never have too much memory. I do video encoding, and large 1080p files will use as much memory as you have.

no-scope-AK47

 

Yes, it is great to have more RAM, undoubtedly. That's not what my post is about, though; I'm not saying 8GB of DDR5 is a bad thing. I'm saying that by the time it provides any advantage in actual games, it will be far behind the PC equivalent, rendering it meaningless in terms of SW.

All the memory is not for games, obviously. The PS4 has a host of in-game services and can encode video to upload to social networks or YouTube. I hear they have real-time cross-game video chat, plus background uploading and downloading of HD content and games. You can start playing games while you're still downloading them. You can have spectators watch you own, and if you get stuck your man can take over your game and clear the level/boss. Pretty sure I forgot some features, but even so, more features are coming that will make use of the extra memory besides just games.

 

Please re-read the first two pages, where I explain 3-4 times that this has nothing to do with whether the 8GB can be utilised for other tasks. I'm not suggesting 8GB is a bad idea; I'm saying it won't be utilised in games. It provides no advantage in games over the standard current PC setup and won't do for years. By the time it does, it will be a generation behind what the PC has.


#58 mitu123
Member since 2006 • 155290 Posts

That chart is correct.
faizan_faizan
I know, but I'm pointing out that some levels have more VRAM usage than others; I've seen 2.3GB and 2.4GB often later on.

I can show more pics of vram usage around that as well.


#59 Kinthalis
Member since 2002 • 5503 Posts

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.
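The allocated-versus-used distinction holds at the OS level too: reserving memory does not, by itself, commit physical pages. A minimal sketch (assuming typical demand-paging behavior; an anonymous `mmap` only consumes physical RAM as pages are first written):

```python
import mmap

# Reserve 256 MiB of anonymous virtual memory. On typical OSes this
# allocates address space; physical pages are committed lazily, on
# first write ("demand paging").
SIZE = 256 * 1024 * 1024
buf = mmap.mmap(-1, SIZE)

# Touch a single page: only about 4 KiB becomes resident, even though
# tools reporting "allocated" memory would show the full 256 MiB mapping.
buf[0:4] = b"used"

print(len(buf))   # the whole region is allocated...
print(buf[0:4])   # ...but only the pages we touched are resident
buf.close()
```

The same gap exists on GPUs: a driver can report a large allocation for a game while only a fraction of it is ever actually read or written.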


#60 faizan_faizan
Member since 2009 • 7869 Posts

[QUOTE="faizan_faizan"] That chart is correct.
mitu123

I know, but I'm pointing out that some levels have more VRAM usage than others; I've seen 2.3GB and 2.4GB often later on.

I can show more pics of vram usage around that as well.

ok.

#61 mitu123
Member since 2006 • 155290 Posts

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.

Kinthalis

What AA do you use?


#62 Kinthalis
Member since 2002 • 5503 Posts

[QUOTE="Kinthalis"]

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.

mitu123

What AA do you use?

SMAA 2x


#63 faizan_faizan
Member since 2009 • 7869 Posts

[QUOTE="Kinthalis"]

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.

mitu123

What AA do you use?

Hey mitu, what AA do you think is best for Crysis 3? Personally, it's TXAA.

#64 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="way2funny"]

No, GDDR5 is DDR3 that's customized for graphics workloads.

way2funny

 

That's exactly what the PS4 is using.  What you see on GPU's as GDDR5 is being used on the PS4 as system memory.  The DDR5 is situated on the PS4's GPU and integrated as system memory.  It's GDDR5.  There is NO SUCH THING as normal DDR5 memory.  They are calling it DDR5 because it is GDDR5 used system wide.

 

Please feel free to google it if you don't believe me.

 

I guess that's two apologies I'll be waiting for now :S

Right, it's GDDR5; there is no such thing as DDR5. They are NOT calling it DDR5, because it's GDDR5, so they are calling it GDDR5.

Like its predecessor GDDR4, GDDR5 is based on DDR3 SDRAM memory.

The G doesn't just mean it's on the graphics card; it means it's functionally different.

[image: PS4 spec sheet]

 

The lack of a G is to denote whether the RAM is being used system wide or not.  Calling it DDR5 makes it clear that I understand it is being used system wide as opposed to labelling it GDDR5 and saves on the responses of people saying it is not just GPU memory.  The reason it has been reported as DDR5 everywhere under the sun is to highlight the fact that it will be used as system wide memory, as opposed to just the GPU.  Effectively it is a new form of system memory as GDDR5 has not been used for this purpose before.  So the simplest solution for me is to label it DDR5 in the title as that's how it has been reported absolutely everywhere.


#65 mitu123
Member since 2006 • 155290 Posts

[QUOTE="mitu123"]

[QUOTE="Kinthalis"]

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.

Kinthalis

What AA do you use?

SMAA 2x

No wonder. I used it, and I don't see it ever getting past 2GB, and I have 4GB cards.


#66 way2funny
Member since 2003 • 4570 Posts

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.

Kinthalis

From a computer's perspective, allocating and using are the same thing. If it's allocated, nothing else can use it anyway, so it's being used. Whether or not it's effective use is another question, and unless you know the ins and outs of Crysis 3, and Nvidia's/AMD's drivers and how they allocate resources on the graphics card, it is really not your call.


#67 mitu123
Member since 2006 • 155290 Posts

[QUOTE="mitu123"]

[QUOTE="Kinthalis"]

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.

faizan_faizan

What AA do you use?

Hey mitu, what AA do you think is best for Crysis 3? Personally, it's TXAA.

That works but can blur a bit, so SMAA is also a great choice.


#68 iamrob7
Member since 2007 • 2138 Posts

I just wanted to point out that Graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2 Gigs for various reasons, but never use that amount.  In fact Crysis 3 max runs fine on my GTX 680's in SLI at 2560x1440, so it obviously doesn't need over 2 gigs.

Kinthalis

 

Good point, although effectively if it has been allocated, then it is still "using it".


#69 way2funny
Member since 2003 • 4570 Posts

[QUOTE="way2funny"]

[QUOTE="iamrob7"]

 

That's exactly what the PS4 is using.  What you see on GPU's as GDDR5 is being used on the PS4 as system memory.  The DDR5 is situated on the PS4's GPU and integrated as system memory.  It's GDDR5.  There is NO SUCH THING as normal DDR5 memory.  They are calling it DDR5 because it is GDDR5 used system wide.

 

Please feel free to google it if you don't believe me.

 

I guess that's two apologies I'll be waiting for now :S

iamrob7

Right, it's GDDR5; there is no such thing as DDR5. They are NOT calling it DDR5, because it's GDDR5, so they are calling it GDDR5.

Like its predecessor GDDR4, GDDR5 is based on DDR3 SDRAM memory.

The G doesn't just mean it's on the graphics card; it means it's functionally different.

[image: PS4 spec sheet]

 

The lack of a G is to denote whether the RAM is being used system wide or not.  Calling it DDR5 makes it clear that I understand it is being used system wide as opposed to labelling it GDDR5 and saves on the responses of people saying it is not just GPU memory.  The reason it has been reported as DDR5 everywhere under the sun is to highlight the fact that it will be used as system wide memory, as opposed to just the GPU.  Effectively it is a new form of system memory as GDDR5 has not been used for this purpose before.  So the simplest solution for me is to label it DDR5 in the title as that's how it has been reported absolutely everywhere.

No, it's GDDR5 that's being used system wide, NOT DDR5. The G is NOT there to denote whether it's being used system wide or not; the G is there to denote that it is functionally different from DDR3. You can use DDR3 for graphics, and it would NOT be called GDDR3, because GDDR3 exists and is different; it would just be called a GPU with DDR3 memory. Sony just decided to use GDDR5 as system-wide memory. It's still the same memory; now the CPU just has direct access to it. And FYI, DDR3 is a lot better than GDDR5 for CPU workloads. There's a reason we use GDDR5 in graphics and DDR3 with everything else.

Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware.


#70 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="way2funny"]

Right, it's GDDR5; there is no such thing as DDR5. They are NOT calling it DDR5, because it's GDDR5, so they are calling it GDDR5.

Like its predecessor GDDR4, GDDR5 is based on DDR3 SDRAM memory.

The G doesn't just mean it's on the graphics card; it means it's functionally different.

[image: PS4 spec sheet]

way2funny

 

The lack of a G is to denote whether the RAM is being used system wide or not.  Calling it DDR5 makes it clear that I understand it is being used system wide as opposed to labelling it GDDR5 and saves on the responses of people saying it is not just GPU memory.  The reason it has been reported as DDR5 everywhere under the sun is to highlight the fact that it will be used as system wide memory, as opposed to just the GPU.  Effectively it is a new form of system memory as GDDR5 has not been used for this purpose before.  So the simplest solution for me is to label it DDR5 in the title as that's how it has been reported absolutely everywhere.

No, it's GDDR5 that's being used system wide, NOT DDR5. The G is NOT there to denote whether it's being used system wide or not; the G is there to denote that it is functionally different from DDR3. Sony just decided to use GDDR5 as system-wide memory. It's still the same memory; now the CPU just has direct access to it. And FYI, DDR3 is a lot better than GDDR5 for CPU workloads. There's a reason we use GDDR5 in graphics and DDR3 with everything else.

Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware.

 

Are you not reading what I just wrote in the last two comments?  That's EXACTLY what I am saying.  2 comments ago I specifically said GDDR5 is DDR5 because DDR5 doesn't exist.  What is it exactly I am saying that is not getting through to you?  Please explain to me what part of my previous comments you didn't manage to take in?  Was it the part where I said DDR5 doesn't exist and the only thing in the PS4 is GDDR5?  How did that not sink in?

 

 I'll say it one more time, I put DDR5 in the title to highlight that I was aware it is being used as system wide memory because that's how it has been reported EVERYWHERE.  Dealing with a single person who is obsessing over meaningless semantics is a lot simpler than having to answer 20 responses from consolites who believe I am ignoring the fact that it is being used system wide and not just on the GPU.  Although you are pushing it now.  Labels are used to express meaning, the reality is far more people interpret it the way I've used, whether it's correct or not.  As that's how it has been reported. 

 

edit - Let me just add this as I re-read this line and it is ridiculous;

 

"Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware."

 

It has literally ZERO bearing on my point or any of the points I've made in this thread.  None whatsoever.  Please do explain to me how that affects the meaning of my point? 


#71 way2funny
Member since 2003 • 4570 Posts

[QUOTE="way2funny"]

[QUOTE="iamrob7"]

 

The lack of a G is to denote whether the RAM is being used system wide or not.  Calling it DDR5 makes it clear that I understand it is being used system wide as opposed to labelling it GDDR5 and saves on the responses of people saying it is not just GPU memory.  The reason it has been reported as DDR5 everywhere under the sun is to highlight the fact that it will be used as system wide memory, as opposed to just the GPU.  Effectively it is a new form of system memory as GDDR5 has not been used for this purpose before.  So the simplest solution for me is to label it DDR5 in the title as that's how it has been reported absolutely everywhere.

iamrob7

No, it's GDDR5 that's being used system wide, NOT DDR5. The G is NOT there to denote whether it's being used system wide or not; the G is there to denote that it is functionally different from DDR3. Sony just decided to use GDDR5 as system-wide memory. It's still the same memory; now the CPU just has direct access to it. And FYI, DDR3 is a lot better than GDDR5 for CPU workloads. There's a reason we use GDDR5 in graphics and DDR3 with everything else.

Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware.

 

Are you not reading what I just wrote in the last two comments?  That's EXACTLY what I am saying.  2 comments ago I specifically said GDDR5 is DDR5 because DDR5 doesn't exist.  What is it exactly I am saying that is not getting through to you?  Please explain to me what part of my previous comments you didn't manage to take in?  Was it the part where I said DDR5 doesn't exist and the only thing in the PS4 is GDDR5?  How did that not sink in?

 

 I'll say it one more time, I put DDR5 in the title to highlight that I was aware it is being used as system wide memory because that's how it has been reported EVERYWHERE.  Dealing with a single person who is obsessing over meaningless semantics is a lot simpler than having to answer 20 responses from consolites who believe I am ignoring the fact that it is being used system wide and not just on the GPU.  Although you are pushing it now.  Labels are used to express meaning, the reality is far more people interpret it the way I've used, whether it's correct or not.  As that's how it has been reported. 

 

edit - Let me just add this as I re-read this line and it is ridiculous;

 

"Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware."

 

It has literally ZERO bearing on my point or any of the points I've made in this thread.  None whatsoever.  Please do explain to me how that affects the meaning of my point? 

Right, I'm glad you are aware that it is being used as system memory, but it is NOT DDR5 and is not called DDR5. It's called GDDR5 no matter where or how you use it.


#72 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="way2funny"]

No, it's GDDR5 that's being used system wide, NOT DDR5. The G is NOT there to denote whether it's being used system wide or not; the G is there to denote that it is functionally different from DDR3. Sony just decided to use GDDR5 as system-wide memory. It's still the same memory; now the CPU just has direct access to it. And FYI, DDR3 is a lot better than GDDR5 for CPU workloads. There's a reason we use GDDR5 in graphics and DDR3 with everything else.

Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware.

way2funny

 

Are you not reading what I just wrote in the last two comments?  That's EXACTLY what I am saying.  2 comments ago I specifically said GDDR5 is DDR5 because DDR5 doesn't exist.  What is it exactly I am saying that is not getting through to you?  Please explain to me what part of my previous comments you didn't manage to take in?  Was it the part where I said DDR5 doesn't exist and the only thing in the PS4 is GDDR5?  How did that not sink in?

 

 I'll say it one more time, I put DDR5 in the title to highlight that I was aware it is being used as system wide memory because that's how it has been reported EVERYWHERE.  Dealing with a single person who is obsessing over meaningless semantics is a lot simpler than having to answer 20 responses from consolites who believe I am ignoring the fact that it is being used system wide and not just on the GPU.  Although you are pushing it now.  Labels are used to express meaning, the reality is far more people interpret it the way I've used, whether it's correct or not.  As that's how it has been reported. 

 

edit - Let me just add this as I re-read this line and it is ridiculous;

 

"Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware."

 

It has literally ZERO bearing on my point or any of the points I've made in this thread.  None whatsoever.  Please do explain to me how that affects the meaning of my point? 

Right, I'm glad you are aware that it is being used as system memory, but it is NOT DDR5 and is not called DDR5. It's called GDDR5 no matter where or how you use it.

 

Meaning is about consensus and relevance. That's how language is developed and used. Take the word "gay": its original meaning is essentially happiness, yet that meaning is completely different now; in fact, after 50 years or so it has an additional meaning in the dictionary. Why? Because it was used differently by the consensus. People used it to express and label something else. This is how all language has evolved.

 

Years ago I did a physics degree; there are endless concepts and labels which are applied and used completely differently in normal conversation so as to express an appropriate meaning in a given discussion, particularly with someone who lacks technical awareness.  That's how language develops and words acquire different meanings.  It is how language is supposed to be used: to express your meaning in the best way possible.

Your insistence on me using GDDR5 instead of DDR5 in the title reminds me of some academics who couldn't stand to see a scientific term misused from its written definition, even if the misuse actually produced a greater understanding of the underlying concept.  I'm not suggesting that they were wrong or that they didn't have a valid point, but for me it's entirely dependent on circumstance.  Only when the misuse of a concept/word actually affects the meaning behind the point/concept you are getting across is it worth arguing or getting upset about.  That's not the case here, and hence I don't really understand your motivation.  You understood what my point was, presumably?  You also understood that using DDR5 as opposed to GDDR5 misled nobody in any meaningful way?  You understand that GDDR5 has not been used as system memory before?  So it is not unreasonable to use a term to describe GDDR5 being used as system memory, seeing as DDR5 has been widely reported and will be understood by most people.  With popular associations in mind, why would DDR5 not be a reasonable way to express the meaning of GDDR5 used as system-wide memory?  It seems to me like a good way to express it.  DDR5 doesn't exist as a different entity, so there can be no confusion.  

 

I understood why the first guy picked on it: he found the post upsetting and had no other viable response but to pick on petty semantics.  Your motivation seems to be different, though; it's the actual semantics that you are caught up on.  Perhaps because I responded to your initial post about something else, highlighting its irrelevance.  Maybe because you had no response to that, you decided to pick on the semantics, I'm not sure.  People such as yourself will always puzzle me, but each to their own, eh.  

#73 Silenthps
Member since 2006 • 7302 Posts

 

Your insistence on me using GDDR5 instead of DDR5 in the title reminds me of some academics who couldn't stand to see a scientific term misused from its written definition, even if the misuse actually produced a greater understanding of the underlying concept.  I'm not suggesting that they were wrong or that they didn't have a valid point, but for me it's entirely dependent on circumstance.  Only when the misuse of a concept/word actually affects the meaning behind the point/concept you are getting across is it worth arguing or getting upset about.  That's not the case here, and hence I don't really understand your motivation.  You understood what my point was, presumably?  You also understood that using DDR5 as opposed to GDDR5 misled nobody in any meaningful way?  You understand that GDDR5 has not been used as system memory before?  So it is not unreasonable to use a term to describe GDDR5 being used as system memory, seeing as DDR5 has been widely reported and will be understood by most people.  With popular associations in mind, why would DDR5 not be a reasonable way to express the meaning of GDDR5 used as system-wide memory?  It seems to me like a good way to express it.  DDR5 doesn't exist as a different entity, so there can be no confusion.  

iamrob7

It's not just semantics, GDDR RAM's architecture is vastly different than DDR SDRAM
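One way to see the practical gap being pointed at here: GDDR5 is designed for raw bandwidth, DDR3 for low-latency random access. A minimal back-of-envelope sketch, assuming a PS4-class 5.5 GT/s, 256-bit GDDR5 bus against dual-channel DDR3-1600; these figures are illustrative assumptions, not official specs:

```python
# Illustrative peak-bandwidth comparison between a graphics-style GDDR5
# bus and typical desktop DDR3 system memory. Transfer rates and bus
# widths below are assumed example figures.

def peak_bandwidth_gb_s(transfer_rate_gt_s, bus_width_bits):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return transfer_rate_gt_s * (bus_width_bits / 8)

# Assumed: GDDR5 at 5.5 GT/s on a 256-bit bus vs
# dual-channel DDR3-1600 (1.6 GT/s across 2 x 64-bit channels).
gddr5 = peak_bandwidth_gb_s(5.5, 256)
ddr3 = peak_bandwidth_gb_s(1.6, 128)
print(f"GDDR5: {gddr5:.1f} GB/s vs DDR3: {ddr3:.1f} GB/s")
```

On these assumed figures GDDR5 delivers roughly 176 GB/s against about 26 GB/s, which is the trade-off behind using it as unified system memory; its latency is less favourable, which is part of why it has traditionally stayed on graphics cards.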

#74 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"] 

Your insistence on me using GDDR5 instead of DDR5 in the title reminds me of some academics who couldn't stand to see a scientific term misused from its written definition, even if the misuse actually produced a greater understanding of the underlying concept.  I'm not suggesting that they were wrong or that they didn't have a valid point, but for me it's entirely dependent on circumstance.  Only when the misuse of a concept/word actually affects the meaning behind the point/concept you are getting across is it worth arguing or getting upset about.  That's not the case here, and hence I don't really understand your motivation.  You understood what my point was, presumably?  You also understood that using DDR5 as opposed to GDDR5 misled nobody in any meaningful way?  You understand that GDDR5 has not been used as system memory before?  So it is not unreasonable to use a term to describe GDDR5 being used as system memory, seeing as DDR5 has been widely reported and will be understood by most people.  With popular associations in mind, why would DDR5 not be a reasonable way to express the meaning of GDDR5 used as system-wide memory?  It seems to me like a good way to express it.  DDR5 doesn't exist as a different entity, so there can be no confusion.  

Silenthps

It's not just semantics, GDDR RAM's architecture is vastly different than DDR SDRAM

 

That has absolutely no bearing on anything I've said.  I'm not disputing the differences between DDR and GDDR RAM in any way shape or form.  In fact I've made them clear repeatedly in this thread.

 

It's like you don't understand anything I said in that post at all and I don't have the inclination to repeat myself.  

 

#75 stayhigh1
Member since 2008 • 724 Posts

Everyone here's a developer :D

nextgenjoke

Real developers have already made statements about the 8GB of GDDR5 and are extremely happy that Sony decided to equip the PS4 with it. A lot of them mention that the PS4 is like a high-end PC, very powerful, and much easier to develop games on. It's going to give them the creative freedom to push their games to the limit. I don't know how many threads we need about the PC being superior (if you spend thousands of dollars on it) when we have acknowledged that; we are just saying the average gamer's PC out there can't outperform the PS4 in terms of specs.

#78 tormentos
Member since 2003 • 33798 Posts
Wrong consistently.  I don't "want to know what the PS4 will do with the extra RAM"; I know exactly how RAM is used.  Here is my point: 1080p@30 games at their most extreme use 3GB of RAM total, and that's for super bastard settings in the most advanced games at higher framerates than 30FPS.  Higher quality textures = higher resolution.  PS4 games are going to be 1080p on release, are they not?  So this 8GB DDR5 is not going to be used in a game for years to come, not until they come out with graphical possibilities far beyond what we have now.  Planetside 2 has battles featuring 2000 players: 3GB usage. Crysis 3 uses every graphical trick available right now: 3GB top-end usage.  Mid-range PC GPUs these days have 2-3GB of GDDR5 on top of their system memory.  That's 6 months before the PS4 even releases.  

 

The PS4 has a mid range PC gaming GPU.  It can't do anything beyond Crysis 3 at 1080p, there is nothing it can handle that can use anything like 8GB DDR5 right now.  Not even close.  

 

So like I said, as is patently obvious, by the time 8GB DDR5 is relevant and even 60% utilised, the PC equivalent will be years ahead.  It's a meaningless statistic that "cows" are clinging to desperately, because their GPU is average and their CPU is highly questionable.

iamrob7
That mentality is so damn wrong it's not even funny. So what was the most graphical game of 2005? You mean to tell me that the Xbox 360 and PS3 have not kicked the living crap out of the most graphical games of 2005-2006? Nothing on the 7800GTX runs like Halo 4 or Uncharted 3, nothing... not even at 1024x768... You will see how PS4 graphics actually surpass those of Crysis 3. Power is wasted on PC; rather than getting more visuals, in most games the power is used to get more frames. For example, the 7850 will run any game out now on max; the only difference between it and a 7970 GHz Edition is frames per second, and they output the same quality at the same settings. You will learn quickly enough when the PS4 starts to be pushed and you see PS4 games distance themselves from Crysis 3. Comparing Crysis 3 to Killzone on PS4 is a joke; it's an unfinished launch game. That is like comparing Resistance 1 vs Halo 4...
#80 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

Right now PC gamers are playing in 1080p and above with 60FPS.  That's what I play in, and no game uses anything like 8GB of RAM.  The most RAM-heavy game I play is DayZ, the Arma 2 mod, coming in at around 2GB of RAM, most likely because it is an unoptimised mod of another game.  

 

Crysis 3 at ultra 1080p uses around 1.5GB of RAM.  

 

My GPU, a GTX 580 SC, has 1.5GB of DDR5 RAM.  More than enough to cover any game currently at 1080p.  I am on the verge of getting a new GPU, and even medium-spec GPUs nowadays have 2-3GB of DDR5 RAM.  I'm looking at a GTX 680 with 4GB DDR5.  Now that is a pretty decent PC, but that 4GB DDR5 will only be relevant right now if I have a multi-monitor setup and some obscene resolution.

 

I see a lot of talk of this 8GB DDR5 RAM on the PS4; firstly, the GPU and CPU are far more important features for gaming.  The 8GB is really only for longevity; it is not going to be used for years.  The games coming out on the PS4 will use 1-2GB of it at most for 1080p games (which, as has already been stated, will be the standard for the PS4).

 

So this 8GB DDR5 RAM is actually completely irrelevant and will be for a number of years, until games start actually using all that RAM.  In the meantime it will just sit there doing nothing in the console, waiting for the day it becomes relevant.  By that time the PC will be running DDR8 or DDR9, or perhaps something even beyond that whole concept.   

 

TL;DR - In short, by the time the 8GB DDR5 becomes actually useful, the standard gaming PC will be another generation ahead in terms of technology.  It will likely feature more RAM of a more advanced variety.  

 

The PS4's GPU seems to be the equivalent of a medium-range gaming PC's, and the CPU is an open question.  I know that playing Planetside 2 right now, my computer is CPU-limited in terms of framerate, as opposed to GPU-limited.  In large battles my framerate dips to 35-40 FPS, which is borderline unacceptable for me.  Now that is down to the large number of people in a battle, sometimes 500-600 people battling in a single area.  I have a 3770K (4 cores) @ 4.6GHz.  Will 8 cores @ 1.6GHz really work?  I don't know.

 

What I'm pleased about with the new consoles is that they are very similar to PCs now; the same architecture means that dodgy PC ports should be a thing of the past.  I do, however, think that the latest console specs, whilst providing a big initial boost over the previous generation, will ultimately leave the consoles further behind the PC than the current generation.  The last generation of consoles was far closer to a top-end PC at release than this generation will be.

 

xboxiphoneps3

you clearly don't know what you are talking about... the PS4 CPU cores will be clocked at 2.0 GHz each, AND developers can use all of that RAM right away; it won't take "years" for them to finally take advantage of all of it... they can take advantage of all the RAM right now

 

You didn't understand a single thing I wrote, did you?  I didn't say they COULDN'T use that 8GB of RAM immediately; sure they could.  My point is that they WON'T be using it for years.  That's because 1080p games with all the graphical add-ons available right now use at most 3GB of RAM.  By the time they use 8GB, the equivalent PC alternative will be a generation ahead. 

 

 

Also, LOL at the 2.0GHz CPU: still less than half the clock speed of a PC I built over a year before the PS4's release.  Will the 8 cores being used properly make up for it?  That remains to be seen; there's a big question mark over it.  
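The cores-versus-clock question above can be back-of-enveloped with an Amdahl-style model. A minimal sketch, under the big simplifying assumption that per-core throughput scales linearly with clock speed and ignoring IPC differences between the chips:

```python
def relative_throughput(cores, clock_ghz, parallel_fraction):
    """Amdahl-style estimate: the serial part of the work runs on one core,
    the parallel part is split evenly across all cores."""
    serial_time = (1 - parallel_fraction) / clock_ghz
    parallel_time = parallel_fraction / (clock_ghz * cores)
    return 1 / (serial_time + parallel_time)

# Hypothetical comparison: 4 cores @ 4.6 GHz vs 8 cores @ 2.0 GHz.
for p in (0.5, 0.9, 1.0):
    quad = relative_throughput(4, 4.6, p)
    octo = relative_throughput(8, 2.0, p)
    print(f"parallel={p:.0%}: quad={quad:.1f}, octo={octo:.1f}")
```

On these assumed numbers the high-clock quad stays ahead even for perfectly parallel work (4 x 4.6 = 18.4 vs 8 x 2.0 = 16.0), which is why the low clock draws skepticism; real workloads differ in IPC, memory stalls and scheduling, so treat this purely as an illustration.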

#81 MonsieurX
Member since 2008 • 39858 Posts
[QUOTE="tormentos"][QUOTE="iamrob7"]Wrong consistently.  I don't "want to know what the PS4 will do with the extra RAM"; I know exactly how RAM is used.  Here is my point: 1080p@30 games at their most extreme use 3GB of RAM total, and that's for super bastard settings in the most advanced games at higher framerates than 30FPS.  Higher quality textures = higher resolution.  PS4 games are going to be 1080p on release, are they not?  So this 8GB DDR5 is not going to be used in a game for years to come, not until they come out with graphical possibilities far beyond what we have now.  Planetside 2 has battles featuring 2000 players: 3GB usage. Crysis 3 uses every graphical trick available right now: 3GB top-end usage.  Mid-range PC GPUs these days have 2-3GB of GDDR5 on top of their system memory.  That's 6 months before the PS4 even releases.  

 

The PS4 has a mid range PC gaming GPU.  It can't do anything beyond Crysis 3 at 1080p, there is nothing it can handle that can use anything like 8GB DDR5 right now.  Not even close.  

 

So like I said, as is patently obvious, by the time 8GB DDR5 is relevant and even 60% utilised, the PC equivalent will be years ahead.  It's a meaningless statistic that "cows" are clinging to desperately, because their GPU is average and their CPU is highly questionable.

xboxiphoneps3
That mentality is so damn wrong it's not even funny. So what was the most graphical game of 2005? You mean to tell me that the Xbox 360 and PS3 have not kicked the living crap out of the most graphical games of 2005-2006? Nothing on the 7800GTX runs like Halo 4 or Uncharted 3, nothing... not even at 1024x768... You will see how PS4 graphics actually surpass those of Crysis 3. Power is wasted on PC; rather than getting more visuals, in most games the power is used to get more frames. For example, the 7850 will run any game out now on max; the only difference between it and a 7970 GHz Edition is frames per second, and they output the same quality at the same settings. You will learn quickly enough when the PS4 starts to be pushed and you see PS4 games distance themselves from Crysis 3. Comparing Crysis 3 to Killzone on PS4 is a joke; it's an unfinished launch game. That is like comparing Resistance 1 vs Halo 4...

Word, I'm not even impressed with Crysis 3; they should be able to push much better graphics with the graphics power we have today

Kinda hard when you're held back by consoles :(
#82 Silenthps
Member since 2006 • 7302 Posts

[QUOTE="Silenthps"]

[QUOTE="iamrob7"] 

Your insistence on me using GDDR5 instead of DDR5 in the title reminds me of some academics who couldn't stand to see a scientific term misused from its written definition, even if the misuse actually produced a greater understanding of the underlying concept.  I'm not suggesting that they were wrong or that they didn't have a valid point, but for me it's entirely dependent on circumstance.  Only when the misuse of a concept/word actually affects the meaning behind the point/concept you are getting across is it worth arguing or getting upset about.  That's not the case here, and hence I don't really understand your motivation.  You understood what my point was, presumably?  You also understood that using DDR5 as opposed to GDDR5 misled nobody in any meaningful way?  You understand that GDDR5 has not been used as system memory before?  So it is not unreasonable to use a term to describe GDDR5 being used as system memory, seeing as DDR5 has been widely reported and will be understood by most people.  With popular associations in mind, why would DDR5 not be a reasonable way to express the meaning of GDDR5 used as system-wide memory?  It seems to me like a good way to express it.  DDR5 doesn't exist as a different entity, so there can be no confusion.  

iamrob7

It's not just semantics, GDDR RAM's architecture is vastly different than DDR SDRAM

 

That has absolutely no bearing on anything I've said.  I'm not disputing the differences between DDR and GDDR RAM in any way shape or form.  In fact I've made them clear repeatedly in this thread.

 

It's like you don't understand anything I said in that post at all and I don't have the inclination to repeat myself.  

 

I didn't say you were disputing it, and I understood what you wrote. By calling it DDR5 RAM all you're doing is creating more confusion.
#84 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]Wrong consistently.  I don't "want to know what the PS4 will do with the extra RAM"; I know exactly how RAM is used.  Here is my point: 1080p@30 games at their most extreme use 3GB of RAM total, and that's for super bastard settings in the most advanced games at higher framerates than 30FPS.  Higher quality textures = higher resolution.  PS4 games are going to be 1080p on release, are they not?  So this 8GB DDR5 is not going to be used in a game for years to come, not until they come out with graphical possibilities far beyond what we have now.  Planetside 2 has battles featuring 2000 players: 3GB usage. Crysis 3 uses every graphical trick available right now: 3GB top-end usage.  Mid-range PC GPUs these days have 2-3GB of GDDR5 on top of their system memory.  That's 6 months before the PS4 even releases.  

 

The PS4 has a mid range PC gaming GPU.  It can't do anything beyond Crysis 3 at 1080p, there is nothing it can handle that can use anything like 8GB DDR5 right now.  Not even close.  

 

So like I said, as is patently obvious, by the time 8GB DDR5 is relevant and even 60% utilised, the PC equivalent will be years ahead.  It's a meaningless statistic that "cows" are clinging to desperately, because their GPU is average and their CPU is highly questionable.

tormentos

That mentality is so damn wrong it's not even funny. So what was the most graphical game of 2005? You mean to tell me that the Xbox 360 and PS3 have not kicked the living crap out of the most graphical games of 2005-2006? Nothing on the 7800GTX runs like Halo 4 or Uncharted 3, nothing... not even at 1024x768... You will see how PS4 graphics actually surpass those of Crysis 3. Power is wasted on PC; rather than getting more visuals, in most games the power is used to get more frames. For example, the 7850 will run any game out now on max; the only difference between it and a 7970 GHz Edition is frames per second, and they output the same quality at the same settings. You will learn quickly enough when the PS4 starts to be pushed and you see PS4 games distance themselves from Crysis 3. Comparing Crysis 3 to Killzone on PS4 is a joke; it's an unfinished launch game. That is like comparing Resistance 1 vs Halo 4...

 

http://www.youtube.com/watch?v=jHWPGmf_A_0

 

Crysis 2 on an x1950 Pro, a graphics card from 2006.  Uncharted 3 and Halo 4 use graphical features that didn't exist in 2006.  Can a 2006 PC run them?  I don't see why not, if it can run Crysis 2 just as well as, if not better than, the consoles.  

 

Secondly, a 7850 most certainly will not "run any game out now on max" unless a horribly unplayable framerate is acceptable to you.  In which case an x1950 from 2006 can run any modern game on MAX settings, if framerate is irrelevant to you.  

 

#85 tormentos
Member since 2003 • 33798 Posts
[QUOTE="xboxiphoneps3"] you clearly don't know what you are talking about... the PS4 CPU cores will be clocked at 2.0 GHz each, AND developers can use all of that RAM right away; it won't take "years" for them to finally take advantage of all of it... they can take advantage of all the RAM right now

He actually believes that Sony will not use the RAM because PC developers are not using it... :lol: I would love to see Sony using 5 or 6 GB of RAM and throwing incredible textures, effects and things at once; I would love to see how those 660 Tis and most mid-to-high-range GPUs start to choke. It's something they never think about: they think that as long as the GPU is strong nothing can hurt them... :lol: http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/6 They don't want to understand what happens when a GPU gets VRAM-limited: http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/15 Look at the huge blow GPUs take when they are RAM-starved. In both cases, the 650 Ti and the 7850, the result was the same: as long as the game doesn't demand a lot of RAM it's OK, but as soon as a game asks for too much RAM the impact is immediately apparent. A 33FPS difference between the two models of 7850, just because one had 1GB more RAM, at the same resolution.
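For a rough sense of how texture data alone can swallow VRAM, the effect behind those benchmark gaps, here is a small estimate. The texture sizes and counts are hypothetical examples, and real engines use compressed formats (DXT/BCn) that shrink these figures considerably:

```python
def texture_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    """Uncompressed RGBA texture footprint; a full mip chain adds ~1/3 on top."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmapped else base

# One uncompressed 4096x4096 RGBA texture with mips: ~85 MiB.
one_4k_mib = texture_bytes(4096, 4096) / 2**20

# A hypothetical scene streaming 300 distinct 2048x2048 textures: ~6 GiB.
budget_gib = 300 * texture_bytes(2048, 2048) / 2**30

print(f"one 4k texture ~ {one_4k_mib:.0f} MiB; 300 2k textures ~ {budget_gib:.1f} GiB")
```

The point of the sketch is only that a few hundred high-resolution textures resident at once can plausibly exceed the 1-3GB of VRAM on the cards being discussed, at which point swapping over PCIe causes the framerate collapse the linked benchmarks show.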
#86 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="Silenthps"]It's not just semantics, GDDR RAM's architecture is vastly different than DDR SDRAM

Silenthps

 

That has absolutely no bearing on anything I've said.  I'm not disputing the differences between DDR and GDDR RAM in any way shape or form.  In fact I've made them clear repeatedly in this thread.

 

It's like you don't understand anything I said in that post at all and I don't have the inclination to repeat myself.  

 

I didn't say you were disputing it, and I understood what you wrote. By calling it DDR5 RAM all you're doing is creating more confusion.

 

What confusion am I creating?  What problem is it going to cause?  Please explain to me the major issue using "DDR5" to describe GDDR5 being used system wide is going to cause when that is how most people interpret it anyway?  GDDR5 being used system wide is an advancement on the current PC setup, so would DDR5 be if it existed.  The net effect of either is the same for the purposes of any discussion or interpretation.  Especially on this board.  

#87 glez13
Member since 2006 • 10314 Posts

LOL

DDR8...DDR9

What's up with all the nonsense threads.

#88 AmazonTreeBoa
Member since 2011 • 16745 Posts
it is already a gen behind.
Tessellation
^This. My PC blows the PS4 out of the water.
#89 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="xboxiphoneps3"] you clearly don't know what you are talking about... the PS4 CPU cores will be clocked at 2.0 GHz each, AND developers can use all of that RAM right away; it won't take "years" for them to finally take advantage of all of it... they can take advantage of all the RAM right now
tormentos
He actually believes that Sony will not use the RAM because PC developers are not using it... :lol: I would love to see Sony using 5 or 6 GB of RAM and throwing incredible textures, effects and things at once; I would love to see how those 660 Tis and most mid-to-high-range GPUs start to choke. It's something they never think about: they think that as long as the GPU is strong nothing can hurt them... :lol: http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/6 They don't want to understand what happens when a GPU gets VRAM-limited: http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/15 Look at the huge blow GPUs take when they are RAM-starved. In both cases, the 650 Ti and the 7850, the result was the same: as long as the game doesn't demand a lot of RAM it's OK, but as soon as a game asks for too much RAM the impact is immediately apparent. A 33FPS difference between the two models of 7850, just because one had 1GB more RAM, at the same resolution.

 

You don't understand that the resolution of the textures you are using is not going to be above the resolution the game is actually running at, i.e. 1080p.  So in a 1080p game the best texture resolution you will get is 1080p.  1080p textures with all the current graphical bells and whistles in game on top of them use at most 3GB of RAM in total.  

 

No game will be released for a long time that uses anything like the full 8GB available on the PS4.  If ever.  By the time the 8GB becomes relevant, the PC equivalent will be a generation ahead or more.  

 

As for the 650 Ti: of course GPU memory is important; RAM is important.  The point is that 8GB versus 3GB is going to be meaningless for the PS4 for the foreseeable future.  3GB versus 1GB is a completely different thing; that's a big advantage in plenty of games.  

#90 iamrob7
Member since 2007 • 2138 Posts

LOL

DDR8...DDR9

What's up with all the nonsense threads.

glez13

 

The idea was to express how much further PC memory would be ahead of the PS4's 8GB by the time it became relevant.  It wasn't a prediction of DDR8 or 9 actually existing.  I kinda thought that would be pretty obvious :S

 

#92 ducati101
Member since 2004 • 1741 Posts
[QUOTE="faizan_faizan"][QUOTE="DrTrafalgarLaw"] Yet, smelly hermits still boast about their peasantly DDR3 RAM. Call me when DDR4 has become a standard for system memory. Better yet, call me when DDR5 is actually introduced to the general consumers. Call me again when people are actually able to afford 8 GB of DDR5 RAM instead of sticking to 4.

Yet I have 12GB of GDDR5 RAM and 32GB of DDR3 System RAM, better luck on the PS5 ;)
#93 Silenthps
Member since 2006 • 7302 Posts

[QUOTE="Silenthps"][QUOTE="iamrob7"]

 

That has absolutely no bearing on anything I've said.  I'm not disputing the differences between DDR and GDDR RAM in any way shape or form.  In fact I've made them clear repeatedly in this thread.

 

It's like you don't understand anything I said in that post at all and I don't have the inclination to repeat myself.  

 

iamrob7

I didn't say you were disputing it, and I understood what you wrote. By calling it DDR5 RAM all you're doing is creating more confusion.

 

What confusion am I creating?  What problem is it going to cause?  Please explain to me the major issue using "DDR5" to describe GDDR5 being used system wide is going to cause when that is how most people interpret it anyway?  GDDR5 being used system wide is an advancement on the current PC setup, so would DDR5 be if it existed.  The net effect of either is the same for the purposes of any discussion or interpretation.  Especially on this board.  

Well, for one, it has already caused problems: there are still lots of people who don't know the difference between the two, and when DDR5 does come out, people will be even more confused. Sony has no problem calling it GDDR5 in its tech papers, nor do any other websites; I don't see what's so hard about calling it what it actually is. In fact it would be much clearer if you just called it "GDDR5 used as system memory" from the start, if you don't feel comfortable with just saying it's GDDR5. What if I started calling "GDDR5 being used system-wide" "hamster" instead? Are you saying that wouldn't cause confusion? Words have meaning, and the G is important.
#94 way2funny
Member since 2003 • 4570 Posts

[QUOTE="way2funny"]

[QUOTE="iamrob7"]

 

Are you not reading what I just wrote in the last two comments?  That's EXACTLY what I am saying.  2 comments ago I specifically said GDDR5 is DDR5 because DDR5 doesn't exist.  What is it exactly I am saying that is not getting through to you?  Please explain to me what part of my previous comments you didn't manage to take in?  Was it the part where I said DDR5 doesn't exist and the only thing in the PS4 is GDDR5?  How did that not sink in?

 

I'll say it one more time: I put DDR5 in the title to highlight that I was aware it is being used as system-wide memory, because that's how it has been reported EVERYWHERE.  Dealing with a single person who is obsessing over meaningless semantics is a lot simpler than having to answer 20 responses from consolites who believe I am ignoring the fact that it is being used system-wide and not just on the GPU.  Although you are pushing it now.  Labels are used to express meaning, and the reality is that far more people interpret it the way I've used it, whether that's correct or not, because that's how it has been reported. 

 

edit - Let me just add this as I re-read this line and it is ridiculous;

 

"Just because consumers like you don't understand that doesnt mean you get to change the name, and therefore the meaning, of hardware."

 

It has literally ZERO bearing on my point or any of the points I've made in this thread.  None whatsoever.  Please do explain to me how that affects the meaning of my point? 

iamrob7

Right, I'm glad you are aware that it is being used as system memory, but it is NOT DDR5 and is not called DDR5. It's called GDDR5 no matter where and how you use it.

 

Meaning is about consensus and relevance.  That's how language is developed and used.  Take the word "gay": its meaning was essentially happiness, yet that meaning is completely different now; in fact, after 50 years or so it now has an additional meaning in the dictionary.  Why?  Because it was used differently by the consensus.  People used it to express and label something else: a homosexual person.  This is how all language has evolved.

 

Years ago I did a physics degree; there are endless concepts and labels which are applied and used completely differently in normal conversation so as to express an appropriate meaning in a given discussion, particularly with someone who lacks technical awareness.  That's how language develops and words acquire different meanings.  It is how language is supposed to be used: to express your meaning in the best way possible.

Your insistence on me using GDDR5 instead of DDR5 in the title reminds me of some academics who couldn't stand to see a scientific term misused from its written definition, even if the misuse actually produced a greater understanding of the underlying concept.  I'm not suggesting that they were wrong or that they didn't have a valid point, but for me it's entirely dependent on circumstance.  Only when the misuse of a concept/word actually affects the meaning behind the point/concept you are getting across is it worth arguing or getting upset about.  That's not the case here, and hence I don't really understand your motivation.  You understood what my point was, presumably?  You also understood that using DDR5 as opposed to GDDR5 misled nobody in any meaningful way?  You understand that GDDR5 has not been used as system memory before?  So it is not unreasonable to use a term to describe GDDR5 being used as system memory, seeing as DDR5 has been widely reported and will be understood by most people.  With popular associations in mind, why would DDR5 not be a reasonable way to express the meaning of GDDR5 used as system-wide memory?  It seems to me like a good way to express it.  DDR5 doesn't exist as a different entity, so there can be no confusion.  

 

I understood why the first guy picked on it, because he found the post upsetting and had no other viable response but to pick on petty semantics.  Your motivation seems to be different though, it's the actual semantics that you are caught up on.  Perhaps because I responded to your initial post about something else highlighting its irrelevance.  Maybe because you had no response to that, you decided to pick on the semantics, I'm not sure.  People such as yourself will always puzzle me, but each to their own ey.  

I'm not caught up on the symantics, im caught up in the syntax. GDDR5 is not DDR5 and is not called DDR5 and cannot be called DDR5. It already has the name GDDR5 and that relates to something very specific. This isn't liberal arts, trying to find meaning and give clever names to things. Fact is its GDDR5 no matter how its used, and calling it DDR5 is misleading and wrong. It is a WRONG statement. Nothing about that is up for discussion. Its just how it is wether you like it or not

#96 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="tormentos"] He actually believes that Sony will not use the RAM because PC developers are not using it...:lol: I would love to see Sony using 5 or 6 GB of RAM and throwing incredible textures, effects and things at once; I would love to see how those 660 Tis and most mid to high range GPUs start to choke.. It's something they never think about: they think that as long as the GPU is strong nothing can hurt them..:lol: http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/6 They don't want to understand what happens when a GPU gets VRAM limited.. http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/15 Look at the huge blow GPUs take when they are RAM starved; in both cases, the 650 Ti and the 7850, the result was the same: as long as the game doesn't demand a lot of RAM it's OK, but as soon as a game asks for too much RAM the impact is felt.. A 33FPS difference between the two models of 7850, just because one had 1GB more of RAM, at the same resolution.xboxiphoneps3

 

You don't understand that the resolution of textures you are using is not going to be above the resolution the game is actually in, i.e. 1080p.  So in a 1080p game the best texture resolution you will get is 1080p.  1080p textures with all the current graphical bells and whistles available in game on top of them use at most 3GB RAM in total.

 

No game will be released for a long time that uses anything like the full 8GB available on the PS4.  If ever.  By the time the 8GB becomes relevant, the PC equivalent will be a generation ahead or more.  

 

As for the 650ti, of course GPU memory is important, RAM is important.  The point is 8GB worth over 3GB worth for the foreseeable future is going to be meaningless for the PS4.  3GB over 1GB is a completely different thing, it's a big advantage in plenty of games.  
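The VRAM-starvation point quoted above can be made concrete with rough arithmetic: once framebuffers plus resident textures exceed a card's memory, performance falls off a cliff. A minimal back-of-the-envelope sketch in Python (all sizes here are illustrative assumptions, not measurements from the linked reviews):

```python
# Rough VRAM budget for a 1080p game; every figure is an assumption.

BYTES_PER_PIXEL = 4                  # 32-bit RGBA
WIDTH, HEIGHT = 1920, 1080

def framebuffer_mb(buffers=3):
    """Colour + depth + one auxiliary render target, uncompressed."""
    return buffers * WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20

def texture_mb(count, size=2048, mip_overhead=1.33):
    """`count` uncompressed 2048x2048 RGBA textures, ~33% extra for mip chains."""
    return count * size * size * BYTES_PER_PIXEL * mip_overhead / 2**20

budget_mb = 1024                     # a 1GB card, like the 1GB 7850
used_mb = framebuffer_mb() + texture_mb(60)

print(f"framebuffers: {framebuffer_mb():.0f} MB")
print(f"textures:     {texture_mb(60):.0f} MB")
print(f"over the 1GB budget: {used_mb > budget_mb}")
```

With these made-up numbers the texture set alone blows past 1GB, which is the kind of cliff the 1GB vs 2GB 7850 comparison shows. Real engines stream and compress textures, so treat this only as an illustration of the budget idea.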

 

lol bro RAM is not just used for textures.... its used for tons of other things.. faster loading times, more particles in the air, more randomness in the game, bigger landscapes, more NPC's in the game, etc... a bunch of things benefit from extra RAM

 

That's why I said textures + all the available bells and whistles then.  That's why I talked about Planetside 2 and DayZ, the two games with the largest amount of players in the largest area available right now.  Those are the benefits of extra RAM and the limit with every graphical feature available in those huge environments right now is around 3GB.  8GB provides no advantages until new features are developed that utilise it.  By then the PC equivalent will be a generation ahead.

 

Faster load times are all about your HD, i.e. an SSD will make a big improvement.  Rendering is related to the above.  
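The "extra RAM means faster loading" point is essentially about caching: spare memory lets a game keep assets resident so only the first load pays the disk cost. A toy sketch in Python (the asset name and the 50 ms "disk" cost are hypothetical):

```python
# Toy asset cache: spare RAM means the slow "disk" is only hit once per asset.
import time

DISK_READ_S = 0.05                   # pretend each disk read costs 50 ms

_cache = {}                          # stands in for spare system RAM

def load_asset(name):
    """Return asset bytes, touching the simulated disk only on a cache miss."""
    if name not in _cache:
        time.sleep(DISK_READ_S)      # simulated disk access
        _cache[name] = b"data:" + name.encode()
    return _cache[name]

t0 = time.perf_counter()
load_asset("level1/terrain")         # cold load: pays the disk cost
cold = time.perf_counter() - t0

t0 = time.perf_counter()
load_asset("level1/terrain")         # warm load: served straight from RAM
warm = time.perf_counter() - t0

print(f"cold {cold * 1000:.0f} ms, warm {warm * 1000:.3f} ms")
```

An SSD shrinks the cold case, as the post says, while more RAM shrinks how often you hit the cold case at all; the two claims are compatible.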

#98 ducati101
Member since 2004 • 1741 Posts

[QUOTE="ducati101"][QUOTE="faizan_faizan"][QUOTE="DrTrafalgarLaw"] Yet, smelly hermits still boast about their peasantly DDR3 RAM. Call me when DDR4 has become a standard for system memory. Better yet, call me when DDR5 is actually introduced to the general consumers. Call me again when people are actually able to afford 8 GB of DDR5 RAM instead of sticking to 4.xboxiphoneps3

Yet I have 12GB of GDDR5 RAM and 32GB of DDR3 System RAM, better luck on the PS5 ;)

A $2500 PC that the PS4 can match in graphics? I'll take that

 

let me know when the majority of PC owners around the globe have even 8 gigs of RAM total in their systems.. don't worry, I will be waiting

8GB of RAM? The majority do. I think you're talking about GDDR, which is a completely different thing. No need for me to educate you on the pros and cons of DDR vs GDDR.
#99 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="way2funny"]

Right im glad you are aware that it is being used as system memory, but it is NOT DDR5 and is not called DDR5. Its called GDDR5 no matter where and how you use it.

I'm not caught up on the symantics, im caught up in the syntax. GDDR5 is not DDR5 and is not called DDR5 and cannot be called DDR5. It already has the name GDDR5 and that relates to something very specific. This isn't liberal arts, trying to find meaning and give clever names to things. Fact is its GDDR5 no matter how its used, and calling it DDR5 is misleading and wrong. It is a WRONG statement. Nothing about that is up for discussion. Its just how it is wether you like it or not

 

Haha, that's precisely the kind of blinkered animosity I was describing.  Syntax is the form of a sentence??  Semantics is the meaning; syntax is form and the principles of structure.  You are dealing with semantics, not syntax, when you criticise the use of DDR5 in place of GDDR5.  It's the meaning you have a problem with, is it not?

 

I'm not sitting an exam with yes or no answers, I'm expressing ideas and thoughts in the shape of a discussion.  If I expressed those ideas in a better way to more people by using an inaccurate definition of a word, when using it correctly would have hindered me and wasted my time, then that would be a smart move.

 

If the great minds that have caused jumps in our evolution throughout the ages had thought like you, we would all be banging sticks together in caves grunting at one another still.

 

Also it is Semantics, not Symantics.  Symantics alludes to symmetry of definition perhaps, a new word, ah the wonders of language ey.

#100 way2funny
Member since 2003 • 4570 Posts


Fact is its GDDR5 no matter how its used, and calling it DDR5 is misleading and wrong. It is a WRONG statement. Nothing about that is up for discussion. Its just how it is wether you like it or not