PS4 GPU already down to $144.99

This topic is locked from further discussion.

#151 -Unreal-
Member since 2004 • 24650 Posts
[QUOTE="o0squishy0o"][QUOTE="True_Gamer_"][QUOTE="AzatiS"] Remind me how bad Crysis run on X360 compared to a PC of 2007 let alone a one of 2011 ( when crysis released on consoles ).. The differences were huge in favor of PC that is. Now please...

What PC will it take to run BF4 on PS4 settings? See my point?

Probably a very mid-range card when the next PC cards come out. I don't know why people think consoles are capable of miracles.. you can get more, but it's like tuning a standard car. You can only squeeze so much out of a Corsa unless you replace parts.

Thank Xenu someone's on the same page as me.
#152 YearoftheSnake5
Member since 2005 • 9731 Posts

Since Sony is buying the GPUs in bulk, they probably get a special, lower price per unit. Plus, it's not like they're paying for a full PCI-e graphics card. They're likely paying somewhere around $100 per GPU.
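
(For the curious, here's a purely illustrative bit of arithmetic behind that kind of guess. Every figure below except the $144.99 card price from the thread title is a made-up assumption, so treat the result as a ballpark, not a real BOM number.)

```python
# Purely illustrative: roughly what a chip-only price could look like once
# the parts of a retail card Sony isn't buying are stripped out.
# The non-chip share is an assumed placeholder, not a known figure.

retail_card_price = 144.99   # card price quoted in the thread title
non_chip_share = 0.35        # assumed share for PCB, cooler, GDDR5, retail margin

chip_only_estimate = retail_card_price * (1 - non_chip_share)
print(f"Ballpark chip-only cost: ~${chip_only_estimate:.0f}")  # ~$94
```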

#153 Kjranu
Member since 2012 • 1802 Posts

Honestly, you forum posters can be totally retarded. The graphics chip in the PS3 is outdated, yet it still produces games with amazing graphics, with TLOU being the latest example of this. The difference between console games and PC games is that no PC game fully takes advantage of a graphics chip through extreme optimization. They have to make a game work well on the variety of chips available to the PC market. With consoles, they only need to work with one (or two, if we're counting the One). The consoles don't need a 780 to compete, because they only need the power of a midrange chip to make the huge leap we will see in TLOU2, which can easily look just as good as PC games, if not better.

#154 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

Honestly you forum posters can be totally retarded. The graphics chip in PS3 is outdated yet it still produces games with amazing graphics with TLOU being the latest example of this. The difference between console games and PC games is there is no PC game that fully takes advantage of a graphics chip by extreme optimizing. They have to make a game work well on a variety of chips available for to PC market. They only need to work with one with consoles (or two if we're counting the One). The consoles don't need a 780 to compete because they only need the power of a midrange chip to make that huge leap that we will see in TLOU2 that can easily look just as good as PC games if not better.

Kjranu

 

http://uk.gamespot.com/forums/topic/29355105/the-last-of-us-massive-graphics-downgrade-pics-and-video-comparison

 

Looks like ass to me.

#155 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

 

Generic animations are now NEXT GEN...LOL

#156 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

Multiplat handing TLOU its ass....

 

#157 Kjranu
Member since 2012 • 1802 Posts

[QUOTE="Kjranu"]

Honestly you forum posters can be totally retarded. The graphics chip in PS3 is outdated yet it still produces games with amazing graphics with TLOU being the latest example of this. The difference between console games and PC games is there is no PC game that fully takes advantage of a graphics chip by extreme optimizing. They have to make a game work well on a variety of chips available for to PC market. They only need to work with one with consoles (or two if we're counting the One). The consoles don't need a 780 to compete because they only need the power of a midrange chip to make that huge leap that we will see in TLOU2 that can easily look just as good as PC games if not better.

AMD655

 

http://uk.gamespot.com/forums/topic/29355105/the-last-of-us-massive-graphics-downgrade-pics-and-video-comparison

 

Looks like ass to me.

The only one who looks like an arse here is you (your arse must be smoking from all the butthurt, also). TLOU's graphics look great for a video chip that is 7 years old. The graphics are comparable with most PC games out there. Plus, by being anti-console for all the aforementioned dumb reasons, you are also being anti-AMD, because AMD is going to make huge bucks off the next-gen consoles. If you don't know why, then you don't deserve to have an account here.

#158 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts
[QUOTE="AMD655"]

[QUOTE="Kjranu"]

Honestly you forum posters can be totally retarded. The graphics chip in PS3 is outdated yet it still produces games with amazing graphics with TLOU being the latest example of this. The difference between console games and PC games is there is no PC game that fully takes advantage of a graphics chip by extreme optimizing. They have to make a game work well on a variety of chips available for to PC market. They only need to work with one with consoles (or two if we're counting the One). The consoles don't need a 780 to compete because they only need the power of a midrange chip to make that huge leap that we will see in TLOU2 that can easily look just as good as PC games if not better.

Kjranu

 

http://uk.gamespot.com/forums/topic/29355105/the-last-of-us-massive-graphics-downgrade-pics-and-video-comparison

 

Looks like ass to me.

The only one who looks like an arse here is you (your arse must be smoking from all the butthurt also). TLOU graphics looks great for a video chip that is 7 year old. The graphics are comparable with most PC games out there. Plus by being anti-console for all the aforementioned dumb reasons you are also being anti-AMD because AMD is going to make huge bucks off the next gen consoles. If you don't know why then you don't deserve to have an account here.

Anti-AMD? Anti-console? Big news flash for you.... PS3 owner, AMD rig.
#159 Kjranu
Member since 2012 • 1802 Posts
[QUOTE="Kjranu"][QUOTE="AMD655"]

 

http://uk.gamespot.com/forums/topic/29355105/the-last-of-us-massive-graphics-downgrade-pics-and-video-comparison

 

Looks like ass to me.

AMD655
The only one who looks like an arse here is you (your arse must be smoking from all the butthurt also). TLOU graphics looks great for a video chip that is 7 year old. The graphics are comparable with most PC games out there. Plus by being anti-console for all the aforementioned dumb reasons you are also being anti-AMD because AMD is going to make huge bucks off the next gen consoles. If you don't know why then you don't deserve to have an account here.

Anti-AMD? Anti-console? Big news flash for you.... PS3 owner, AMD rig.

Too bad your intelligence doesn't really reflect your ownership.
#160 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts
[QUOTE="Kjranu"][QUOTE="AMD655"][QUOTE="Kjranu"] The only one who looks like an arse here is you (your arse must be smoking from all the butthurt also). TLOU graphics looks great for a video chip that is 7 year old. The graphics are comparable with most PC games out there. Plus by being anti-console for all the aforementioned dumb reasons you are also being anti-AMD because AMD is going to make huge bucks off the next gen consoles. If you don't know why then you don't deserve to have an account here.

Anti-AMD? Anti-console? Big news flash for you.... PS3 owner, AMD rig.

Too bad your intelligence doesn't really reflect your ownership.

Oh, you mean the biased crap about TLOU and PS3? I see lol.
#161 Kjranu
Member since 2012 • 1802 Posts
[QUOTE="AMD655"][QUOTE="Kjranu"][QUOTE="AMD655"] Anti-AMD? Anti-console? Big news flash for you.... PS3 owner, AMD rig.

Too bad your intelligence doesn't really reflect your ownership.

Oh, you mean the biased crap about TLOU and PS3? I see lol.

No, not biased. I also have an AMD rig and a PS3. I have played a lot of top-of-the-line PC games and not many "blow" TLOU out of the water. Of course I'm not saying that TLOU is teh best graphics ever to have graced teh galaxy, but it looks damn, damn good for 7-year-old hardware. That was my point. If they can do that magic with 7-year-old hardware, just imagine what they can do with a 7850-like chip in the PS4. A quantum leap.
#162 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts
[QUOTE="Kjranu"][QUOTE="AMD655"][QUOTE="Kjranu"] Too bad your intelligence doesn't really reflect your ownership.

Oh, you mean the biased crap about TLOU and PS3? I see lol.

No not biased. I also have an AMD rig and a PS3. I have played a lot of top of the line PC games and not many "blow" TLOU out of the water. Of course I'm not saying that TLOU is teh best graphics ever to have graxe teh galaxy but it looks damn, damn good for a 7 year old hardware. That was my point. If they can do that magic with a 7 year old hardware, just imagine what they can do with a 7850-like chip on the PS4. A quantum leap.

You come across way too fanboy-like in the first place, you deserved what you got..... And yes, you practically did state TLOU is the be-all and end-all, when I found it to be pretty meh and generic overall.
#163 Kjranu
Member since 2012 • 1802 Posts
[QUOTE="AMD655"][QUOTE="Kjranu"][QUOTE="AMD655"] Oh, you mean the biased crap about TLOU and PS3? I see lol.

No not biased. I also have an AMD rig and a PS3. I have played a lot of top of the line PC games and not many "blow" TLOU out of the water. Of course I'm not saying that TLOU is teh best graphics ever to have graxe teh galaxy but it looks damn, damn good for a 7 year old hardware. That was my point. If they can do that magic with a 7 year old hardware, just imagine what they can do with a 7850-like chip on the PS4. A quantum leap.

You come across way too fanboy like in the first place, you deserved what you got..... And yes, you practically did state TLOU is the be all end all, when i found it to be pretty meh and generic overall.

Giving off a fanboy impression doesn't make my point one bit less valid. It's definitely not generic. It's one of the greatest-looking games out there, and they did that with 7-year-old hardware. That's it.
#164 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

[QUOTE="AMD655"][QUOTE="Kjranu"][QUOTE="AMD655"]

Oh, you mean the biased crap about TLOU and PS3?

I see lol.Kjranu

No not biased. I also have an AMD rig and a PS3. I have played a lot of top of the line PC games and not many "blow" TLOU out of the water. Of course I'm not saying that TLOU is teh best graphics ever to have graxe teh galaxy but it looks damn, damn good for a 7 year old hardware. That was my point. If they can do that magic with a 7 year old hardware, just imagine what they can do with a 7850-like chip on the PS4. A quantum leap.

You come across way too fanboy like in the first place, you deserved what you got.....

And yes, you practically did state TLOU is the be all end all, when i found it to be pretty meh and generic overall.

Giving off a fanboy impression doesn't make my point one bit less valid. It's definitely not generic. It's one of the greatest looking games out there and they did that with a 7 year old hardware. That's it.

You cannot have played many games in your time, then; TLOU looks meh...

 

[attached screenshots: 12fa809a_Screenshot140920.jpeg, 1421d464_TombRaiderOriginal2013-08-1803- (Tomb Raider, 2013)]

#165 Aidenfury19
Member since 2007 • 2488 Posts

Can we stop it with the stupid comparisons? Even ignoring the fact that the specs don't line up properly, the GPU in the PS4 is customized heavily enough that it's more of a GCN 1.5 or GCN 2.0 part than a standard GCN chip.

#166 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

Can we stop it with the stupid comparisons? Even ignoring the fact that the specs don't line up properly, the GPU in the PS4 is customized heavily enough it's more of a GCN1.5 or GCN2.0 part than it is a standard GCN chip.

Aidenfury19
No such thing as GCN 1.5/2.0
#167 Heil68
Member since 2004 • 60836 Posts
With Sony behind it, by golly we know we'll get the most bang for our buck.
#168 Aidenfury19
Member since 2007 • 2488 Posts

[QUOTE="Aidenfury19"]

Can we stop it with the stupid comparisons? Even ignoring the fact that the specs don't line up properly, the GPU in the PS4 is customized heavily enough it's more of a GCN1.5 or GCN2.0 part than it is a standard GCN chip.

AMD655

No such thing as GCN 1.5/2.0

GCN is an architecture AMD plans to stick with for a while; there will be several revisions of it, and given how heavily customized the PS4's part is, it's not unreasonable to expect that AMD will make use of some of those changes in subsequent revisions. We know, for instance, that they're trying to push hUMA, and prior GCN parts didn't support it.

#169 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

[QUOTE="AMD655"][QUOTE="Aidenfury19"]

Can we stop it with the stupid comparisons? Even ignoring the fact that the specs don't line up properly, the GPU in the PS4 is customized heavily enough it's more of a GCN1.5 or GCN2.0 part than it is a standard GCN chip.

Aidenfury19

No such thing as GCN 1.5/2.0

GCN is an architecture AMD plans to stick with for awhile, there will be several revisions of it and given how heavily customized it is, it's not unreasonable to expect that AMD will make use of some of those changes in subsequent revisions. We know for instance that they're trying to push hUMA and prior GCN parts didn't support it.

GCN parts support HSA....
#170 menes777
Member since 2003 • 2643 Posts

[QUOTE="AMD655"][QUOTE="Kjranu"] No not biased. I also have an AMD rig and a PS3. I have played a lot of top of the line PC games and not many "blow" TLOU out of the water. Of course I'm not saying that TLOU is teh best graphics ever to have graxe teh galaxy but it looks damn, damn good for a 7 year old hardware. That was my point. If they can do that magic with a 7 year old hardware, just imagine what they can do with a 7850-like chip on the PS4. A quantum leap. Kjranu
You come across way too fanboy like in the first place, you deserved what you got..... And yes, you practically did state TLOU is the be all end all, when i found it to be pretty meh and generic overall.

Giving off a fanboy impression doesn't make my point one bit less valid. It's definitely not generic. It's one of the greatest looking games out there and they did that with a 7 year old hardware. That's it.

Yet it only scored an 8.0 from GameSpot?

#171 Aidenfury19
Member since 2007 • 2488 Posts

[QUOTE="Aidenfury19"]

[QUOTE="AMD655"] No such thing as GCN 1.5/2.0AMD655

GCN is an architecture AMD plans to stick with for awhile, there will be several revisions of it and given how heavily customized it is, it's not unreasonable to expect that AMD will make use of some of those changes in subsequent revisions. We know for instance that they're trying to push hUMA and prior GCN parts didn't support it.

GCN parts support HSA....

The architecture might have, but the hardware doesn't. Not even Kaveri will fully support HSA; you'll have to wait another year for that, IIRC. Nor will the PS4 fully support HSA, but it's a heck of a lot closer than anything released to date.

#172 BlbecekBobecek
Member since 2006 • 2949 Posts

This thread is so dumb

seanmcloughlin

#173 True_Gamer_
Member since 2006 • 6750 Posts
In order to repeat the last-gen trend, the PS4 would have needed an equivalent GPU of this magnitude: http://www.newegg.com/Product/Product.aspx?Item=N82E16814150669 Sadly, the console makers turned into cheapskates.
#174 AzatiS
Member since 2004 • 14969 Posts

[QUOTE="AzatiS"][QUOTE="True_Gamer_"] What PC will it take to run BF4 on PS4 settings? See my point?AMD655
A PC with a decent GPU with 2-3 GDDR5 , 32GB Ram which is really cheap and a decent CPU like 3570k which can OC up to really nice speeds... and voila

16GB of ram is pointless, let alone 32GB for system memory as a gamer. 2GB is all you need for Vram.

Sorry, but if you run a 64-bit OS, which will be a must for next-gen gaming, 8GB is the least you need. Why not have 16GB or even 32GB if you like multitasking, since it's really cheap nowadays?

As for VRAM, 2GB is today's requirement... in 3-4 years from now I bet that will be the minimum.

We're talking about overall power here, so I just proved PCs can, as we speak, be as powerful as a future console (it's not even out yet) without too much effort or super expensive hardware. Period.

#175 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

[QUOTE="AMD655"][QUOTE="AzatiS"] A PC with a decent GPU with 2-3 GDDR5 , 32GB Ram which is really cheap and a decent CPU like 3570k which can OC up to really nice speeds... and voilaAzatiS

16GB of ram is pointless, let alone 32GB for system memory as a gamer. 2GB is all you need for Vram.

Sorry but if you run 64bit OS which will be must for next-gen gaming , 8GB is the least you need. Why not have 16GB or even 32GB if you like multitasking since its really cheap to have nowdays?

As for VRAM , 2GB is todays requirements .. in 3-4 years from now i bet that will be the minimum.

We talking about power overall here so i just proved PCs can , as we speak , be as powerful as a future console ( is not even out yet ) without too much of effort or super expensive hardware. Period

You simply do not need 16 or 32GB; it is stupid.
#176 g0ddyX
Member since 2005 • 3914 Posts

What's the price of the Wii U or Xbox One GPU?

#177 Aidenfury19
Member since 2007 • 2488 Posts

[QUOTE="AzatiS"]

[QUOTE="AMD655"] 16GB of ram is pointless, let alone 32GB for system memory as a gamer. 2GB is all you need for Vram.AMD655

Sorry but if you run 64bit OS which will be must for next-gen gaming , 8GB is the least you need. Why not have 16GB or even 32GB if you like multitasking since its really cheap to have nowdays?

As for VRAM , 2GB is todays requirements .. in 3-4 years from now i bet that will be the minimum.

We talking about power overall here so i just proved PCs can , as we speak , be as powerful as a future console ( is not even out yet ) without too much of effort or super expensive hardware. Period

You simply do not need 16 or 32GB, it is stupid.

Because you couldn't make good use of the memory bandwidth with 16 or 32GB, 8GB is anything but stupid for the PS4. On the other hand, using a dedicated 2GB of VRAM WOULD be stupid at this point.

#178 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

[QUOTE="AMD655"][QUOTE="AzatiS"] Sorry but if you run 64bit OS which will be must for next-gen gaming , 8GB is the least you need. Why not have 16GB or even 32GB if you like multitasking since its really cheap to have nowdays?

As for VRAM , 2GB is todays requirements .. in 3-4 years from now i bet that will be the minimum.

We talking about power overall here so i just proved PCs can , as we speak , be as powerful as a future console ( is not even out yet ) without too much of effort or super expensive hardware. Period

Aidenfury19

You simply do not need 16 or 32GB, it is stupid.

Because you can't make good use of the memory bandwidth available with 16 or 32GB, 8GB is anything but stupid for the PS4. On the other hand using a dedicated 2GB of VRAM WOULD be stupid at this point.

Well, go out and purchase a new GPU then. VRAM is not changeable like system memory is.
#179 Aidenfury19
Member since 2007 • 2488 Posts

[QUOTE="Aidenfury19"]

[QUOTE="AMD655"] You simply do not need 16 or 32GB, it is stupid.AMD655

Because you can't make good use of the memory bandwidth available with 16 or 32GB, 8GB is anything but stupid for the PS4. On the other hand using a dedicated 2GB of VRAM WOULD be stupid at this point.

Well go out and purchase a new GPU then. Vram is not changeable like system memory is.

The whole point of the PS4's and XBONE's memory setup is that there is almost no distinction made between VRAM and system RAM; available memory is shared between the two, and the CPU and GPU are on the same die.

It lets you get rid of the PCIe bottleneck entirely, so you can actually make better use of the 4GB+ of memory you're allocating to the GPU.
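
(For a sense of scale, here's a rough arithmetic sketch of that bottleneck. The 4 GB working set is a hypothetical figure, PCIe 3.0 x16 is taken at its ~15.75 GB/s theoretical rate, and 176 GB/s is the commonly quoted PS4 GDDR5 bandwidth, so this shows orders of magnitude only, not real game behaviour.)

```python
# Rough sketch: time to move a working set to the GPU over a discrete PCIe
# link vs. reading it in place from a unified GDDR5 pool.
# All figures are assumptions for illustration, not measurements.

working_set_gb = 4.0     # hypothetical amount of data the GPU needs
pcie3_x16_gbs = 15.75    # ~theoretical PCIe 3.0 x16 throughput
gddr5_gbs = 176.0        # commonly quoted PS4 unified memory bandwidth

copy_over_pcie_ms = working_set_gb / pcie3_x16_gbs * 1000
read_in_place_ms = working_set_gb / gddr5_gbs * 1000

print(f"Copy {working_set_gb} GB over PCIe 3.0 x16: ~{copy_over_pcie_ms:.0f} ms")
print(f"Stream {working_set_gb} GB from unified GDDR5: ~{read_in_place_ms:.0f} ms")
```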

#180 04dcarraher
Member since 2004 • 23859 Posts

[QUOTE="AMD655"][QUOTE="Aidenfury19"]

Because you can't make good use of the memory bandwidth available with 16 or 32GB, 8GB is anything but stupid for the PS4. On the other hand using a dedicated 2GB of VRAM WOULD be stupid at this point.

Aidenfury19

Well go out and purchase a new GPU then. Vram is not changeable like system memory is.

The whole point of the PS4 and XBONE's memory setup is that there is almost no distinction made between VRAM and system RAM, available memory is shared between the two and both the CPU and GPU are on the same die.

It lets you get rid of the PCIE bottleneck entirely so you actually can make better use of the 4GB+ of memory you're allocating to the GPU.

Huh? There is no PCIe bottleneck, because the bus can move data faster than what's needed. Also, the next set of GPUs from AMD and Nvidia will have native unified memory abilities, allowing the GPU to directly use system memory and the CPU to directly access the video card's memory pool. Not only that, you're putting too much trust in the consoles' unified memory: the 360 had this and it still didn't help the end results. These new consoles have a 3 to 3.5 GB memory allocation just for the OS and its features, which means they will again have to split what's left between the game cache and video memory. You will not see 4 GB of video usage; you will see a typical 2-3 GB of usage for video memory and the rest for game/system cache.
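
(A quick budget sketch of the split described above. The OS reservation and the graphics allocation are the poster's rough estimates, used here only as assumptions.)

```python
# Rough memory-budget sketch based on the figures in the post above.
# The OS reservation and the graphics split are assumptions, not official numbers.

total_gb = 8.0         # PS4's unified GDDR5 pool
os_reserve_gb = 3.0    # estimated 3-3.5 GB reserved for the OS and features
video_gb = 2.5         # assumed typical graphics allocation (2-3 GB per the post)

available_gb = total_gb - os_reserve_gb
game_cache_gb = available_gb - video_gb

print(f"Left for games: {available_gb:.1f} GB "
      f"-> ~{video_gb:.1f} GB video, ~{game_cache_gb:.1f} GB game/system cache")
```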

#181 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

Just so you all know, PS4/X1 will get decimated very soon...

 

#182 Wasdie  Moderator
Member since 2003 • 53622 Posts

[QUOTE="AzatiS"]

[QUOTE="AMD655"] 16GB of ram is pointless, let alone 32GB for system memory as a gamer. 2GB is all you need for Vram.AMD655

Sorry but if you run 64bit OS which will be must for next-gen gaming , 8GB is the least you need. Why not have 16GB or even 32GB if you like multitasking since its really cheap to have nowdays?

As for VRAM , 2GB is todays requirements .. in 3-4 years from now i bet that will be the minimum.

We talking about power overall here so i just proved PCs can , as we speak , be as powerful as a future console ( is not even out yet ) without too much of effort or super expensive hardware. Period

You simply do not need 16 or 32GB, it is stupid.

Actually, a modern gamer could easily hit 8-12 GB without trying that hard. I do all of the time. I'm happy I have 16 gigs. It's cheap anyway.

#183 deactivated-58e448fd89d82
Member since 2010 • 4494 Posts

[QUOTE="AMD655"][QUOTE="AzatiS"] Sorry but if you run 64bit OS which will be must for next-gen gaming , 8GB is the least you need. Why not have 16GB or even 32GB if you like multitasking since its really cheap to have nowdays?

As for VRAM , 2GB is todays requirements .. in 3-4 years from now i bet that will be the minimum.

We talking about power overall here so i just proved PCs can , as we speak , be as powerful as a future console ( is not even out yet ) without too much of effort or super expensive hardware. Period

Wasdie

You simply do not need 16 or 32GB, it is stupid.

Actually a modern gamer could easily hit 8-12 without trying that hard. I do all of the time. I'm happy I have 16 gigs. It's cheap anyways. 

Well, I guess it depends on the usage. I encode/video edit a lot, run multiple browser tabs, and game, and I never use all 8GB of memory, not even close.

#184 Wasdie  Moderator
Member since 2003 • 53622 Posts

Well i guess it depends on the usage, i encode/video edit a lot, run multiple browser tabs, and game, never use all 8GB of memory, not even close.

AMD655

You wouldn't if you don't turn off the page file. You should do that, then see how much RAM you actually soak up. Keeping the page file on has Windows cache stuff onto your hard drive so there is always a RAM buffer.
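
(If you'd rather measure than guess, here's a small sketch using the third-party psutil library for Python; it works on Windows. Run it with your usual games and apps open to see how much physical RAM and page file you're actually using.)

```python
# Quick check of physical RAM and page file (swap) usage.
# Requires: pip install psutil
import psutil

vm = psutil.virtual_memory()
sm = psutil.swap_memory()

gib = 1024 ** 3
print(f"Physical RAM: {vm.used / gib:.1f} / {vm.total / gib:.1f} GiB used ({vm.percent}%)")
print(f"Page file:    {sm.used / gib:.1f} / {sm.total / gib:.1f} GiB used ({sm.percent}%)")

# Heavy page file use while gaming suggests that, with the page file disabled,
# that load would land on physical RAM instead.
```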

#185 04dcarraher
Member since 2004 • 23859 Posts

[QUOTE="AMD655"] Well i guess it depends on the usage, i encode/video edit a lot, run multiple browser tabs, and game, never use all 8GB of memory, not even close.

Wasdie

You wouldn't if you don't turn off page file memory. You should do that. Then see how much RAM you actually soak up. Keeping page file memory on will have Windows cache stuff onto your harddrive so there is always a RAM buffer. 

I tried that a few weeks ago; with the paging file disabled, games were eating another 1-2 GB of RAM on average.
#186 Hexagon_777
Member since 2007 • 20348 Posts

[QUOTE="SaltyMeatballs"]

[QUOTE="True_Gamer_"] It will produce same graphics on same settings. So you where saying?Tessellation

It's the closest example, but not the same.

7850: 1024 SPs, 64 TMUs, 32 ROPs, 153 GB/s bandwidth.
PS4: 1152 SPs, 72 TMUs, 32 ROPs, 176 GB/s bandwidth.

(higher is better)

Also, for PC you need higher than the console equivalent.

that's a myth debunked long time ago,please continue trying :cool:

Can you link me to the debunking of this myth? I am curious.
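
(For anyone wanting to sanity-check the quoted figures, here's a minimal back-of-the-envelope sketch. The SP/TMU/ROP/bandwidth numbers come from the quote above; the clock speeds, roughly 860 MHz for the 7850 and 800 MHz for the PS4's GPU, are outside assumptions, so the totals are theoretical peaks, not measured performance.)

```python
# Back-of-the-envelope theoretical peaks from the quoted spec comparison.
# Clock speeds are assumed (commonly cited) values, not from this thread.

def gcn_peaks(sps, tmus, clock_ghz, bandwidth_gbs):
    return {
        "gflops": sps * 2 * clock_ghz,   # 2 FLOPs per SP per clock (FMA)
        "gtexels": tmus * clock_ghz,     # 1 texel per TMU per clock
        "bandwidth": bandwidth_gbs,
    }

hd7850 = gcn_peaks(1024, 64, 0.860, 153.6)
ps4 = gcn_peaks(1152, 72, 0.800, 176.0)

for name, g in (("HD 7850", hd7850), ("PS4", ps4)):
    print(f"{name}: ~{g['gflops']:.0f} GFLOPS, ~{g['gtexels']:.1f} GT/s, "
          f"{g['bandwidth']} GB/s")
```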

#187 R4gn4r0k
Member since 2004 • 49184 Posts

[QUOTE="R4gn4r0k"]

I doubt Sony buys them on newegg. They buy them in large numbers so they pay even less per card.

danjammer69

Good lord, people really believe this stuff? While the PS4's GPU is close to an off-the-shelf GPU, it is NOT the same thing. It is a custom-made APU (CPU/GPU on one die) that cannot be purchased right off the shelf. If Sony is 'getting' these from anywhere, it is from AMD themselves, or at the least one of AMD's manufacturers. So, to clear things up, the PS4 does not have just a GPU that you can go purchase anywhere at all. These are custom-designed parts... same for the Xbone.

Learn to read. Where did I say they buy this exact card?

Dumbass.

#188 GarGx1
Member since 2011 • 10934 Posts

[QUOTE="Wasdie"]

[QUOTE="AMD655"] You simply do not need 16 or 32GB, it is stupid.AMD655

Actually a modern gamer could easily hit 8-12 without trying that hard. I do all of the time. I'm happy I have 16 gigs. It's cheap anyways.

Well i guess it depends on the usage, i encode/video edit a lot, run multiple browser tabs, and game, never use all 8GB of memory, not even close.

In an interview, Chris Roberts stated he expects Star Citizen to be looking for 8GB of system RAM, and that's only 1 to 1.5 years away. There has been no mention of VRAM yet, other than some very loose prospective system requirements.

#189 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="AMD655"] Well i guess it depends on the usage, i encode/video edit a lot, run multiple browser tabs, and game, never use all 8GB of memory, not even close.

Wasdie

You wouldn't if you don't turn off page file memory. You should do that. Then see how much RAM you actually soak up. Keeping page file memory on will have Windows cache stuff onto your harddrive so there is always a RAM buffer. 

Does turning off page file memory boost game performance/FPS? So having more stuff in RAM and less stuff on the HDD?

I heard that this is only really applicable to XP, and not Vista or W7

(err, in terms of visible performance gains in Vista and W7 as compared to XP)

#190 True_Gamer_
Member since 2006 • 6750 Posts

[QUOTE="AMD655"]

[QUOTE="Wasdie"]

Actually a modern gamer could easily hit 8-12 without trying that hard. I do all of the time. I'm happy I have 16 gigs. It's cheap anyways.

GarGx1

Well i guess it depends on the usage, i encode/video edit a lot, run multiple browser tabs, and game, never use all 8GB of memory, not even close.

In an interview, Chris Roberts stated he expects Star Citizen to be looking for 8GB of system RAM, that's only 1 to 1.5 years away. There has been no mention of VRAM as yet other than some very loose prospective system requirements.

Star Citizen is doomed if it asks for that much RAM. And it's badly coded, BTW.