John Carmack: You need a 3.68 Teraflop GPU to match the PS4's GPU

This topic is locked from further discussion.

#351 tormentos • Member since 2003 • 33793 Posts

 

yep, overall GTX 480 is faster

Memory Bandwidth

GTX 480 : 177408 MB/sec

7850 : 153600 MB/sec

Texel Rate ( texture filtering)

GTX 480 :   42000 Mtexels/sec

7850 :  55040 Mtexels/sec

Pixel Rate ( resolution & AA)

GTX 480 : 33600 Mpixels/sec

7850 : 27520 Mpixels/sec

04dcarraher

 

Oh look who joined: the one who claimed the PS4 would not use more than 4GB of RAM for games because no PC game did it, yet Killzone SF, a launch game, already uses more than 4GB.. :lol:

So how is that crow?
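As a sanity check, the throughput figures traded in this thread follow directly from each card's unit counts and clocks. The reference specs below (TMU/ROP counts, core clock, bus width, effective memory rate) are assumed from the commonly cited reference designs, not stated anywhere in the thread:

```python
# Recomputing the quoted throughput figures from (assumed) reference specs.

def texel_rate_mtexels(tmus, core_mhz):
    # Texels filtered per second = TMUs * core clock
    return tmus * core_mhz

def pixel_rate_mpixels(rops, core_mhz):
    # Pixels written per second = ROPs * core clock
    return rops * core_mhz

def bandwidth_mb_s(bus_bits, effective_mhz):
    # Bytes per second = bus width in bytes * effective transfer rate
    return bus_bits // 8 * effective_mhz

# GTX 480: 60 TMUs, 48 ROPs, 700 MHz core, 384-bit bus @ 3696 MT/s
print(texel_rate_mtexels(60, 700))   # 42000 Mtexels/sec
print(pixel_rate_mpixels(48, 700))   # 33600 Mpixels/sec
print(bandwidth_mb_s(384, 3696))     # 177408 MB/sec

# HD 7850: 64 TMUs, 32 ROPs, 860 MHz core, 256-bit bus @ 4800 MT/s
print(texel_rate_mtexels(64, 860))   # 55040 Mtexels/sec
print(pixel_rate_mpixels(32, 860))   # 27520 Mpixels/sec
print(bandwidth_mb_s(256, 4800))     # 153600 MB/sec
```

The numbers match the list quoted above exactly, which is why each card "wins" a different row: the 480 leads on bandwidth and pixel rate, the 7850 on texel rate.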

#352 superclocked • Member since 2009 • 5864 Posts

[QUOTE="04dcarraher"]

 

yep, overall GTX 480 is faster

Memory Bandwidth

GTX 480 : 177408 MB/sec

7850 : 153600 MB/sec

Texel Rate ( texture filtering)

GTX 480 :   42000 Mtexels/sec

7850 :  55040 Mtexels/sec

Pixel Rate ( resolution & AA)

GTX 480 : 33600 Mpixels/sec

7850 : 27520 Mpixels/sec

tormentos

 

Oh look who joined: the one who claimed the PS4 would not use more than 4GB of RAM for games because no PC game did it, yet Killzone SF, a launch game, already uses more than 4GB.. :lol:

So how is that crow?

Killzone is using 1.5GB as system RAM and 3GB as video memory...
#354 04dcarraher • Member since 2004 • 23858 Posts

[QUOTE="04dcarraher"]

 

yep, overall GTX 480 is faster

Memory Bandwidth

GTX 480 : 177408 MB/sec

7850 : 153600 MB/sec

Texel Rate ( texture filtering)

GTX 480 :   42000 Mtexels/sec

7850 :  55040 Mtexels/sec

Pixel Rate ( resolution & AA)

GTX 480 : 33600 Mpixels/sec

7850 : 27520 Mpixels/sec

tormentos

 

Oh look who joined: the one who claimed the PS4 would not use more than 4GB of RAM for games because no PC game did it, yet Killzone SF, a launch game, already uses more than 4GB.. :lol:

So how is that crow?

OMG :lol:
I said the PS4 won't use more than 4GB for video use, not for whole-system usage. Also, KZ SF is only using 3GB for video and 1.5GB for system use.
It's pointless to throw more than 4GB at a mid-tier GPU that's only rendering at 1080p with AA and running at 30 fps.

[image: killzone-video-RAM-use]

#355 PinkiePirate • Member since 2012 • 1973 Posts

I understand his point. Not sure about the accuracy behind his claim.

#356 04dcarraher • Member since 2004 • 23858 Posts

To tone down the 40k-model hype: KZ SF will not always render true 40,000-polygon characters. The 40k NPC model is only used when there is a single NPC within the nearest distance band (0-2); the more NPCs on screen within a given distance, the lower the per-model LOD becomes. So, for example, with three NPCs at a distance of 1, each gets roughly a third of the polygon count. The main reason is that the GPU does not have enough processing power to render multiple high-polygon models at once.

[image: ezf5.jpg]

You can see object LOD suffering at farther distances

http://cdn.slashgear.com/wp-content/uploads/2013/05/Valient_Killzone_Shadow_Fall_Demo_Postmortem-94.jpeg
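The budget-splitting scheme described above can be sketched roughly like this. The function name and the even split are illustrative assumptions, not Guerrilla's actual implementation:

```python
# Hypothetical sketch: one NPC in the nearest distance band gets the
# full 40k-polygon model; with more NPCs in the band, the per-character
# budget is divided among them. Numbers are illustrative only.

POLY_BUDGET = 40_000  # max polygons for a single close-up character

def per_npc_polys(npcs_in_band):
    """Split the polygon budget evenly among NPCs in one distance band."""
    if npcs_in_band == 0:
        return 0
    return POLY_BUDGET // npcs_in_band

print(per_npc_polys(1))  # 40000 -- the full-detail model
print(per_npc_polys(3))  # 13333 -- roughly a third each
```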

#357 RyviusARC • Member since 2011 • 5708 Posts

It will take five years for consoles to relatively and visually match a 3.68 TFLOP PC GPU. And "optimization" to me means anything from dedicated code to to-the-metal programming to heavy LOD management. LOD management in console games is typically really awful and noticeable, which doesn't come across in comparison screenshots the way it does in actual high-res game footage. This coming generation will benefit from flexible tessellation, which should mitigate noticeable LOD changes on many game objects, especially high-polygon characters and detailed models.

The Assassin's Creed games come to mind when I think of horrid LOD pop-in.  Bleh.

PC_Otter

 

The PS4/Xbox One will never match a 3.68 TFLOP PC GPU.

Games will look better as time goes by, but that is because devs find more efficient ways to render things.

Just like how my six-year-old 8800GT still trashes any current console many times over, the same will apply to a 3.68 TFLOP PC GPU.

#358 XxR3m1xInHDn3D • Member since 2013 • 2365 Posts
[QUOTE="clyde46"][QUOTE="DrTrafalgarLaw"][QUOTE="04dcarraher"] lol, even so, if we were stuck with an 8600GTS and those games came to PC, they would look better on the 8600GTS than they do on the current consoles.

Hermits losing the ability to coherently type sentences fast.

Not everyone on this board uses English as their first language.

Aren't you from England?
#359 GioVela2010 • Member since 2008 • 5566 Posts

To tone down the 40k-model hype: KZ SF will not always render true 40,000-polygon characters. The 40k NPC model is only used when there is a single NPC within the nearest distance band (0-2); the more NPCs on screen within a given distance, the lower the per-model LOD becomes. So, for example, with three NPCs at a distance of 1, each gets roughly a third of the polygon count. The main reason is that the GPU does not have enough processing power to render multiple high-polygon models at once.

[image: ezf5.jpg]

You can see object LOD suffering at farther distances

http://cdn.slashgear.com/wp-content/uploads/2013/05/Valient_Killzone_Shadow_Fall_Demo_Postmortem-94.jpeg

04dcarraher
Looks amazing already, better than any game with my GTX 670, that's for sure
#360 GioVela2010 • Member since 2008 • 5566 Posts

[QUOTE="PC_Otter"]

It will take five years for consoles to relatively and visually match a 3.68 TFLOP PC GPU. And "optimization" to me means anything from dedicated code to to-the-metal programming to heavy LOD management. LOD management in console games is typically really awful and noticeable, which doesn't come across in comparison screenshots the way it does in actual high-res game footage. This coming generation will benefit from flexible tessellation, which should mitigate noticeable LOD changes on many game objects, especially high-polygon characters and detailed models.

The Assassin's Creed games come to mind when I think of horrid LOD pop-in.  Bleh.

RyviusARC

 

The PS4/Xbox One will never match a 3.68 TFLOP PC GPU.

Games will look better as time goes by, but that is because devs find more efficient ways to render things.

Just like how my six-year-old 8800GT still trashes any current console many times over, the same will apply to a 3.68 TFLOP PC GPU.

Your 8800GT can equal TLOU graphics at best
#361 RyviusARC • Member since 2011 • 5708 Posts

[QUOTE="RyviusARC"]

[QUOTE="PC_Otter"]

It will take five years for consoles to relatively and visually match a 3.68 TFLOP PC GPU. And "optimization" to me means anything from dedicated code to to-the-metal programming to heavy LOD management. LOD management in console games is typically really awful and noticeable, which doesn't come across in comparison screenshots the way it does in actual high-res game footage. This coming generation will benefit from flexible tessellation, which should mitigate noticeable LOD changes on many game objects, especially high-polygon characters and detailed models.

The Assassin's Creed games come to mind when I think of horrid LOD pop-in.  Bleh.

GioVela2010

 

The PS4/Xbox One will never match a 3.68 TFLOP PC GPU.

Games will look better as time goes by, but that is because devs find more efficient ways to render things.

Just like how my six-year-old 8800GT still trashes any current console many times over, the same will apply to a 3.68 TFLOP PC GPU.

Your 8800GT can equal TLOU graphics at best

 

You're hilarious!

#362 04dcarraher • Member since 2004 • 23858 Posts
[QUOTE="04dcarraher"]

To tone down the 40k-model hype: KZ SF will not always render true 40,000-polygon characters. The 40k NPC model is only used when there is a single NPC within the nearest distance band (0-2); the more NPCs on screen within a given distance, the lower the per-model LOD becomes. So, for example, with three NPCs at a distance of 1, each gets roughly a third of the polygon count. The main reason is that the GPU does not have enough processing power to render multiple high-polygon models at once.

[image: ezf5.jpg]

You can see object LOD suffering at farther distances

http://cdn.slashgear.com/wp-content/uploads/2013/05/Valient_Killzone_Shadow_Fall_Demo_Postmortem-94.jpeg

GioVela2010
Looks amazing already, better than any game with my GTX 670, that's for sure

Then you live a sheltered life
#363 AzatiS • Member since 2004 • 14969 Posts
[QUOTE="blue_hazy_basic"]Let's wait and see, huh? :) Remember before the last launch, when the Cell would end the PC, cure cancer and bring world peace?

+1 Nuff said
#364 Senor_Kami • Member since 2008 • 8529 Posts

I own a Titan, and my PC is better

Sweenix
It should be, considering the graphics card alone costs more than double the price of any next-gen system.
#365 tormentos • Member since 2003 • 33793 Posts

 

OMG :lol:
I said the PS4 won't use more than 4GB for video use, not for whole-system usage. Also, KZ SF is only using 3GB for video and 1.5GB for system use.
It's pointless to throw more than 4GB at a mid-tier GPU that's only rendering at 1080p with AA and running at 30 fps.

 

04dcarraher

 

It's using 4.5GB, idiot, and that was on the unfinished kits.. :lol:

3GB for video already, and that was done on 4GB kits. Remember that the demo was not running on final hardware; the 4.5GB figure comes from the fact that dev kits have more RAM than retail consoles.

 

Respawn Entertainment may have been the ones lobbying for the RAM bump. This would actually make sense, as Respawn has been very clear and vocal about how the 5GB of RAM allotted for gaming on the Xbox One has already caused issues in development of TitanFall.


http://www.examiner.com/article/xbox-one-rumors-point-to-a-higher-clock-speed-but-doubtful-on-increase-ram


The problem with you is that you think RAM is directly linked to how good something looks or how powerful it must be..


So how is that crow...:lol:

#366 04dcarraher • Member since 2004 • 23858 Posts

El tormentos is guessing that KZ SF will use even more RAM....

Come on now

Current dev kits are the same ones from the Feb demo, and they only have 6 of the CPU's cores open to games (7 is the goal; may or may not happen), no dedicated audio hardware so the CPU has to do that, and the compute customizations aren't on the GPU yet so the CPU is doing those too; the game is only using a total of 4.5GB of RAM (3GB of that is for video).

#367 deactivated-58e448fd89d82 • Member since 2010 • 4494 Posts

Just common sense...

 

A 7850 can only address so much VRAM; the bus width is a large indication. 2GB is already the 7850's maximum; it may address more with a tighter-coded game and a locked-down framerate.

The GTX 680 cannot make use of its full 4GB of VRAM until put in 3-way SLI, where there is enough GPU muscle to output the frame rate and data.

256-bit is the real killer here.

Tormentos, you read things like they are gospel. From what I read, 1.5GB is being used on the GPU; the rest is all stored in system RAM in order for the GPU to have time to render.

This is why it's 30FPS, and still lagging.

#368 ManatuBeard • Member since 2012 • 1121 Posts

[QUOTE="GioVela2010"]

To tone down the 40k-model hype: KZ SF will not always render true 40,000-polygon characters. The 40k NPC model is only used when there is a single NPC within the nearest distance band (0-2); the more NPCs on screen within a given distance, the lower the per-model LOD becomes. So, for example, with three NPCs at a distance of 1, each gets roughly a third of the polygon count. The main reason is that the GPU does not have enough processing power to render multiple high-polygon models at once.

[image: ezf5.jpg]

You can see object LOD suffering at farther distances

http://cdn.slashgear.com/wp-content/uploads/2013/05/Valient_Killzone_Shadow_Fall_Demo_Postmortem-94.jpeg

04dcarraher

Looks amazing already, better than any game with my GTX 670, that's for sure

 

Why can't we have games that look like this without the shooting part?

I miss Myst and Riven...

#369 Inconsistancy • Member since 2004 • 8094 Posts

Just common sense...

 

A 7850 can only address so much Vram, the bus width is a large indication, 2GB is already the 7850's maximum, it may address more with a tighter coded game and a locked down framerate.

GTX 680 cannot address the full 4GB Vram until put in 3-way SLi, where there is enough GPU muscle to output the frame rate and data.

256-Bit is the real killer here.

 

Tormentos, you read things like it is gospel, from what i read, 1.5GB is being used on the GPU, the rest is all stored in system ram in order for the GPU to have time to render.

 

This is why 30FPS, and still lagging.

AMD655

The max 32-bit address space is 4 294 967 296 bytes; 64-bit is 18 446 744 073 709 551 616. (Also, an IPv4 address is a 32-bit number, and IPv6 is 128-bit.)

The PS4's max theoretical memory traffic at 30fps is 5.87GB per frame. (That does not mean it can effectively use that much memory per frame.)

Having more GPUs in parallel doesn't increase their ability to "address" VRAM. When you run multi-GPU setups, they have to duplicate all the memory necessary for the full scene (3 SLI GPUs, each with 4GB of RAM, still effectively have only 4GB). I'm not sure how much extra VRAM is necessary to run SLI/Crossfire, but the power of the GPU has nothing to do with VRAM use; though depending on the bandwidth and processing power, you may not be able to use all of the memory in one frame.

----

I doubt the PS4 is having bandwidth issues, tbh. I'm betting the frame-rate drops are related to a job on the CPU or GPU just not getting done quickly enough (not enough processing power, or inefficient use of it).
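The figures in the post above are easy to verify. A quick sketch, assuming the commonly quoted 176 GB/s peak for the PS4's GDDR5 (the thread itself never states the exact bandwidth):

```python
# Address-space sizes for 32- and 64-bit pointers, and the PS4's
# theoretical memory traffic per frame at 30 fps.

print(2 ** 32)   # 4294967296 addresses (4 GiB)
print(2 ** 64)   # 18446744073709551616 addresses

bandwidth_gb_s = 176.0   # assumed PS4 peak memory bandwidth
fps = 30
per_frame = bandwidth_gb_s / fps
print(round(per_frame, 2))  # 5.87 GB of traffic per frame, at best
```

Note this is traffic, not capacity: it bounds how many bytes can be read or written in one frame, not how much memory a game can allocate.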

#370 faizan_faizan • Member since 2009 • 7869 Posts

[QUOTE="AMD655"]

Just common sense...

 

A 7850 can only address so much Vram, the bus width is a large indication, 2GB is already the 7850's maximum, it may address more with a tighter coded game and a locked down framerate.

GTX 680 cannot address the full 4GB Vram until put in 3-way SLi, where there is enough GPU muscle to output the frame rate and data.

256-Bit is the real killer here.

 

Tormentos, you read things like it is gospel, from what i read, 1.5GB is being used on the GPU, the rest is all stored in system ram in order for the GPU to have time to render.

 

This is why 30FPS, and still lagging.

Inconsistancy

Max 32-bit address 4 294 967 296, 64-bit is 18 446 744 073 709 551 616. (also IPv4 is a 32-bit #, and IPv6 is a 64-#)

The max theoretical bandwidth, of the PS4, at 30fps, is 5.87GB/frame. (does not mean it can effectively use that much memory 'per frame')

Having more GPU's in series doesn't increase their ability to "address" VRAM. When you run multi-GPU setups, they have to duplicate all the memory necessary for the full scene (3 SLI GPU's, each with 4GB of RAM, only effectively have 4GB of RAM). I'm not sure how much extra VRAM is necessary to run SLI/Crossfire, but the power of the GPU has nothing to do with the VRAM use; though depending on the bandwidth and processing power, you may not be able to use all of the memory in one frame.

----

I doubt the PS4 is having bandwidth issues tbh, I'm betting the frame-rate drops are related to a job on the CPU or GPU just not getting done quickly enough (not enough processing power, or inefficient use of).

I'm not sure whether volatile memory bandwidth or non-volatile (storage) bandwidth is the cause of the pop-in, but KZ SF definitely had pop-in just a few metres away from the player.

#371 tormentos • Member since 2003 • 33793 Posts

El tormentos is guessing that KZ SF will use even more RAM....

Come on now

Current dev kits are the same ones from the Feb demo, and they only have 6 of the CPU's cores open to games (7 is the goal; may or may not happen), no dedicated audio hardware so the CPU has to do that, and the compute customizations aren't on the GPU yet so the CPU is doing those too; the game is only using a total of 4.5GB of RAM (3GB of that is for video).

04dcarraher

 

WTF..:lol:

 

No, Guerrilla themselves stated that the demo was done using old kits that did not even have the sound block, and sound was being done on the CPU when it will be done on the sound block..

So no, it was different hardware; they did not have 8GB of RAM to use for the game, and the hardware was unfinished, forcing them to use the CPU for sound.

 

CPU Load

  • 60 AI characters
  • 940 entities, 300 active
  • 8200 physics objects (1500 key-framed, 6700 static)
  • 500 particle systems
  • 120 sound voices
  • 110 ray casts
  • 1000 jobs per frame

http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall

 

The PS4 has a sound chip; for the demo the sound chip wasn't present and the CPU was being used to emulate it..

 

It's funny that you somehow think it will not use more RAM, because the Killzone SF video released in February clearly shows detail pop-in, which in almost all cases is attributed to RAM starvation, and smoke that suddenly turns blacker, which shows the game needs more RAM.

#372 tormentos • Member since 2003 • 33793 Posts

[QUOTE="AMD655"]

Just common sense...

 

A 7850 can only address so much Vram, the bus width is a large indication, 2GB is already the 7850's maximum, it may address more with a tighter coded game and a locked down framerate.

GTX 680 cannot address the full 4GB Vram until put in 3-way SLi, where there is enough GPU muscle to output the frame rate and data.

256-Bit is the real killer here.

 

Tormentos, you read things like it is gospel, from what i read, 1.5GB is being used on the GPU, the rest is all stored in system ram in order for the GPU to have time to render.

 

This is why 30FPS, and still lagging.

Inconsistancy

Max 32-bit address 4 294 967 296, 64-bit is 18 446 744 073 709 551 616. (also IPv4 is a 32-bit #, and IPv6 is a 64-#)

The max theoretical bandwidth, of the PS4, at 30fps, is 5.87GB/frame. (does not mean it can effectively use that much memory 'per frame')

Having more GPU's in series doesn't increase their ability to "address" VRAM. When you run multi-GPU setups, they have to duplicate all the memory necessary for the full scene (3 SLI GPU's, each with 4GB of RAM, only effectively have 4GB of RAM). I'm not sure how much extra VRAM is necessary to run SLI/Crossfire, but the power of the GPU has nothing to do with the VRAM use; though depending on the bandwidth and processing power, you may not be able to use all of the memory in one frame.

----

I doubt the PS4 is having bandwidth issues tbh, I'm betting the frame-rate drops are related to a job on the CPU or GPU just not getting done quickly enough (not enough processing power, or inefficient use of).

 

Probably the latter; this is a launch game, and those tend to be rushed to the finish line.

Resistance looked nothing like Killzone 2.

#373 tormentos • Member since 2003 • 33793 Posts

Just common sense...

 

A 7850 can only address so much Vram, the bus width is a large indication, 2GB is already the 7850's maximum, it may address more with a tighter coded game and a locked down framerate.

GTX 680 cannot address the full 4GB Vram until put in 3-way SLi, where there is enough GPU muscle to output the frame rate and data.

256-Bit is the real killer here.

 

Tormentos, you read things like it is gospel, from what i read, 1.5GB is being used on the GPU, the rest is all stored in system ram in order for the GPU to have time to render.

 

This is why 30FPS, and still lagging.

AMD655

 

Such a joke..

 

No, it's 3GB on the GPU, learn to read, and that was on unfinished hardware..

And the game is 30 FPS because that is the target; 60 FPS rarely gets pushed on consoles. It could have a Titan GPU and the power would still be used more for detail than for frames.

#374 deactivated-58e448fd89d82 • Member since 2010 • 4494 Posts

[QUOTE="AMD655"]

Just common sense...

 

A 7850 can only address so much Vram, the bus width is a large indication, 2GB is already the 7850's maximum, it may address more with a tighter coded game and a locked down framerate.

GTX 680 cannot address the full 4GB Vram until put in 3-way SLi, where there is enough GPU muscle to output the frame rate and data.

256-Bit is the real killer here.

 

Tormentos, you read things like it is gospel, from what i read, 1.5GB is being used on the GPU, the rest is all stored in system ram in order for the GPU to have time to render.

 

This is why 30FPS, and still lagging.

tormentos

 

Such a joke..

 

No is 3GB on the GPU learn to read,and was for unsinish hardware.

 

And the game is 30 FPS because that is the target,on consoles 60 FPS get rarely push,it could have a Titan GPU and the power will be use more for detail that  for frames.

 

I think you should just shut up and go to bed, pal.

#375 ronvalencia • Member since 2008 • 29612 Posts

Just common sense...

A 7850 can only address so much Vram, the bus width is a large indication, 2GB is already the 7850's maximum, it may address more with a tighter coded game and a locked down framerate.

GTX 680 cannot address the full 4GB Vram until put in 3-way SLi, where there is enough GPU muscle to output the frame rate and data.

256-Bit is the real killer here.

Tormentos, you read things like it is gospel, from what i read, 1.5GB is being used on the GPU, the rest is all stored in system ram in order for the GPU to have time to render.

This is why 30FPS, and still lagging.

AMD655

AMD GCN can address more than 2 GB of RAM, e.g. it supports 64-bit pointers.

[image: AMD_GCN_VirtMemory_689.jpg]

AMD GCN's IOMMU is based on x86-64 IP.

Professional real time 3D applications can use the entire 4GB of VRAM on AMD FirePro W7000 (based on AMD Pitcairn ASIC).

------------

This video answers many technical questions about DX11.2's tiled resources, massive textures (e.g. 10 GB) and smaller VRAM pools.

#376 ronvalencia • Member since 2008 • 29612 Posts

[QUOTE="AMD655"]

Just common sense...

 

A 7850 can only address so much Vram, the bus width is a large indication, 2GB is already the 7850's maximum, it may address more with a tighter coded game and a locked down framerate.

GTX 680 cannot address the full 4GB Vram until put in 3-way SLi, where there is enough GPU muscle to output the frame rate and data.

256-Bit is the real killer here.

 

Tormentos, you read things like it is gospel, from what i read, 1.5GB is being used on the GPU, the rest is all stored in system ram in order for the GPU to have time to render.

 

This is why 30FPS, and still lagging.

Inconsistancy

Max 32-bit address 4 294 967 296, 64-bit is 18 446 744 073 709 551 616. (also IPv4 is a 32-bit #, and IPv6 is a 64-#)

The max theoretical bandwidth, of the PS4, at 30fps, is 5.87GB/frame. (does not mean it can effectively use that much memory 'per frame')

Having more GPU's in series doesn't increase their ability to "address" VRAM. When you run multi-GPU setups, they have to duplicate all the memory necessary for the full scene (3 SLI GPU's, each with 4GB of RAM, only effectively have 4GB of RAM). I'm not sure how much extra VRAM is necessary to run SLI/Crossfire, but the power of the GPU has nothing to do with the VRAM use; though depending on the bandwidth and processing power, you may not be able to use all of the memory in one frame.

----

I doubt the PS4 is having bandwidth issues tbh, I'm betting the frame-rate drops are related to a job on the CPU or GPU just not getting done quickly enough (not enough processing power, or inefficient use of).

The PS4's 32 color ROPs would be underutilized, e.g. compare the 7970's 32 color ROPs vs the 7870's 32 color ROPs results.
#377 ronvalencia • Member since 2008 • 29612 Posts

[QUOTE="AMD655"]

 

Shows how brainless you really are.

 

With a GPU such as the GTX 480 or 7850, the GPU will run out of performance before memory becomes a bottleneck; the 480 has the same bandwidth as a GTX 680 when you push the memory to 4GHz.

Also take note that I can play BF3 on Ultra with 2xMSAA at 2560x1440 with no memory bottleneck.

You do not murder anyone other than yourself trying to sound smart, when all you are is a foolish, inexperienced person looking stuff up on the internet, then throwing it around like it is gospel.

tormentos

 

On your PC, yeah; on the PS4, not at all. Killzone SF already uses 4GB with incredible draw distances. Port all that code to run on your GTX 480 and either resolution, frames, image quality or AA would have to drop, because you can't magically fit 4GB of data into 1.5GB.

So does the 7850; that doesn't mean more memory will not benefit the game.

You are an idiot that was proved wrong and started huge damage control; your 480 is less powerful than the PS4 and a hell of a lot less efficient too.

The smaller-VRAM issue is addressed by DX11.2's tiled resources, i.e. they convert the smaller/faster VRAM into a cache. The key point is just-in-time texture loads into the smaller/faster VRAM based on the user's view port. On AMD GCN, this feature is hardware accelerated, i.e. AMD PRT. On other GPUs, it would be done in software, e.g. NVIDIA CUDA or the CPU.
#378 Plagueless • Member since 2010 • 2569 Posts
Sure, maybe in 5 years when there are 3 new generations of cards. Right now I'm quite confident that my PC will outperform the PS4.
#379 ronvalencia • Member since 2008 • 29612 Posts


This topic may be old, but I want to reiterate wasdie's beautiful post: PCs have a lot of bullshit overhead to deal with, so devs gimp their games to the lowest specs. From the legend Carmack himself. Hermits lining up for a Titan as we speak.

Now this "My PC is better" bullshit can rest, unless you own a Titan.

DrTrafalgarLaw

From http://semiaccurate.com/2013/07/17/keynotes-for-amd-developer-summit-2013-announced/

[image: Bingo-card-APU13-wp.png]

---

The PC has "AMD HSA"-supporting 3D engines, e.g. EA DICE's Frostbite 3 and Unity3D.

http://www.slideshare.net/zlatan4177/gpgpu-algorithms-in-games

[image: GCNfullysupportsHSAIL_zpscc129eae.png]

#380 tormentos • Member since 2003 • 33793 Posts

The smaller-VRAM issue is addressed by DX11.2's tiled resources, i.e. they convert the smaller/faster VRAM into a cache. The key point is just-in-time texture loads into the smaller/faster VRAM based on the user's view port. On AMD GCN, this feature is hardware accelerated, i.e. AMD PRT. On other GPUs, it would be done in software, e.g. NVIDIA CUDA or the CPU.ronvalencia

 

Just-in-time textures are way older than GCN.

And the advantage stated there also applies to the PS4, which has even more RAM.

So the RAM-wise advantage should be even bigger; you talk as if PRT were not supported on the PS4. Stop trying to act like that.

Any advantage in RAM usage those GPUs get, the PS4 would also get, hell, even more.

#381 ronvalencia • Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"] The smaller-VRAM issue is addressed by DX11.2's tiled resources, i.e. they convert the smaller/faster VRAM into a cache. The key point is just-in-time texture loads into the smaller/faster VRAM based on the user's view port. On AMD GCN, this feature is hardware accelerated, i.e. AMD PRT. On other GPUs, it would be done in software, e.g. NVIDIA CUDA or the CPU.tormentos

Just in time textures is way older than GCN.

And that advantage that state there also apply to the PS4 which has even more ram.

So the advantage ram wise should be even bigger,you talk as if PRT was not supported on PS4 stop trying to act like that.

Any advantage in ram usage those GPU get the PS4 wouold also get them,hell even more.

AMD GCN does it on a per-polygon basis and it's hardware accelerated (i.e. nearly zero performance cost).

DX11.2's tiled resources have a similar spec to AMD PRT, i.e. both use 64KB texture tiles, etc.

AMD PRT doesn't change the GPU's potential, i.e. it fixes the smaller-VRAM-pool issue.

AMD PRT can also be applied from persistent storage devices, e.g. hard disk or Blu-ray, but remember the tile stream rate would be smaller from hard disk and Blu-ray sources. ZLib decompression could increase the effective tile stream rate.

PS: DX11.2 enables the CPU to directly access the GPU's memory areas.
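A rough sketch of why the 64KB tiles mentioned above help a small VRAM pool: only the tiles the current view actually samples need to be resident. The texture size and the 5% residency figure below are made-up illustrative numbers, not from DX11.2 or any shipping game:

```python
# Illustrative arithmetic for tiled resources / AMD PRT.

TILE_BYTES = 64 * 1024          # tile size used by DX11.2 tiled resources

def tiles_for_texture(width, height, bytes_per_texel):
    """Number of 64KB tiles needed to hold a full texture."""
    total_bytes = width * height * bytes_per_texel
    return -(-total_bytes // TILE_BYTES)  # ceiling division

# A 16384x16384 RGBA8 texture is 1 GiB in full...
total = tiles_for_texture(16384, 16384, 4)
print(total)                    # 16384 tiles (1 GiB of texture data)

# ...but if the view port only samples, say, 5% of it, the resident
# working set is far smaller than the texture itself.
resident = int(total * 0.05)
print(resident * TILE_BYTES // (1024 * 1024))  # ~51 MiB resident
```

This is the sense in which a smaller, faster VRAM pool can act as a cache over much larger texture data.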

#382 cdragon_88 • Member since 2003 • 1848 Posts

[QUOTE="killzowned24"][QUOTE="cdragon_88"]

So are we really putting words in John Carmack's mouth? If so, here's mine: [image: vum_zps7c6d6d44.jpg]

GioVela2010

https://twitter.com/ID_AA_Carmack/status/50277106856370176

Lol, what an idiot that guy is (the guy you quoted, not Carmack)

 

Topic head by TC: "John Carmack:You need a 3.68 Teraflop GPU to match PS4's GPU" :? I'm sure Carmack never said that. Who's the idiot now?

#383 GioVela2010
Member since 2008 • 5566 Posts

[QUOTE="GioVela2010"][QUOTE="killzowned24"] https://twitter.com/ID_AA_Carmack/status/50277106856370176cdragon_88

Lol what an idiot that guy is (the guy you quoted, not Carmack)

 

Topic head by TC: "John Carmack:You need a 3.68 Teraflop GPU to match PS4's GPU" :? I'm sure Carmack never said that. Who's the idiot now?

You are, thanks for playing
#384 wis3boi
Member since 2005 • 32507 Posts

3gMJtev.jpg.

#385 ronvalencia
Member since 2008 • 29612 Posts

3gMJtev.jpg.

wis3boi

http://semiaccurate.com/2013/07/17/keynotes-for-amd-developer-summit-2013-announced/

Bingo-card-APU13-wp-617x805.png

For the PC, EA DICE's Frostbite 3 and Unity3D have AMD HSA support (HSAIL software).

#386 tionmedon
Member since 2006 • 468 Posts

I think my i7-3960X and GTX 690 will be more than enough to quiet the PS4... period.

#387 wis3boi
Member since 2005 • 32507 Posts

[QUOTE="wis3boi"]

3gMJtev.jpg.

ronvalencia

http://semiaccurate.com/2013/07/17/keynotes-for-amd-developer-summit-2013-announced/

 

For the PC, EA DICE's Frostbite 3 and Unity3D have AMD HSA support (HSAIL software).

 

 

 

You really can't read a goddamn thing, can you? Stop posting random charts in replies to posts that have zero to do with your crap :|

#388 the_bi99man
Member since 2004 • 11465 Posts

3gMJtev.jpg.

wis3boi

Winner.

#389 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="wis3boi"]

3gMJtev.jpg.

wis3boi

http://semiaccurate.com/2013/07/17/keynotes-for-amd-developer-summit-2013-announced/

 

For the PC, EA DICE's Frostbite 3 and Unity3D have AMD HSA support (HSAIL software).

 

 

 

You really can't read a goddamn thing, can you? Stop posting random charts in replies to posts that have zero to do with your crap :|

My point is, PC will get its own optimizations.
#390 wis3boi
Member since 2005 • 32507 Posts

[QUOTE="wis3boi"]

[QUOTE="ronvalencia"] http://semiaccurate.com/2013/07/17/keynotes-for-amd-developer-summit-2013-announced/

 

For the PC, EA DICE's Frostbite 3 and Unity3D have AMD HSA support (HSAIL software).

 

 

ronvalencia

 

You really can't read a goddamn thing, can you? Stop posting random charts in replies to posts that have zero to do with your crap :|

My point is, PC will get its own optimizations.

Nothing to do with the picture I posted; keep being an irrelevant spammer.

#391 whitey_rolls
Member since 2006 • 2547 Posts

I don't really get what the issue is, even if Carmack is correct - who cares?

Just wait 6 months after the consoles are released and then buy a GPU that will blow them out of the water for the next 7 years.

#392 humpmasterflex
Member since 2003 • 363 Posts


This topic may be old, but I want to reiterate wasdie's beautiful post: PCs have a lot of bullshit overhead to deal with, so they gimp their games to the lowest specs. From the legend Carmack himself. Hermits lining up for a Titan as we speak.

Now this "My PC is better" bullshit can rest, unless you own a Titan.

DrTrafalgarLaw

 

The Titan runs games at maxed-out settings at 2560x1600 and 60 fps; the PS4 can't run Killzone at 1920x1080/60 fps. Seems legit. LMAO, ouch

 

console peasants can't accept the facts

#393 tormentos
Member since 2003 • 33793 Posts

[QUOTE="tormentos"]

[QUOTE="ronvalencia"] Smaller VRAM issues were addressed by DX11.2's tiled resources, i.e. it converts the smaller/faster VRAM into a cache. The key point is the just-in-time texture loads into smaller/faster VRAM based on the user's viewport. On AMD GCN, this feature is hardware accelerated, i.e. AMD PRT. On other GPUs, this feature would be done via software, e.g. NVIDIA CUDA or the CPU.ronvalencia

Just-in-time texturing is way older than GCN.

And the advantage stated there also applies to the PS4, which has even more RAM.

So the advantage RAM-wise should be even bigger. You talk as if PRT weren't supported on the PS4; stop acting like that.

Any advantage in RAM usage those GPUs get, the PS4 would also get, and even more so.

AMD GCN does it on a per-polygon basis, and it's hardware accelerated (i.e. nearly zero performance cost).

DX11.2's tiled resources have a similar spec to AMD PRT, i.e. both use 64KB texture tiles, etc.

AMD PRT doesn't change the GPU's potential; it fixes the smaller VRAM pool issue.

AMD PRT can stream from persistent storage devices, e.g. hard disk or Blu-ray, but remember the tile stream rate would be lower from hard disk and Blu-ray sources. ZLib decompression could increase the effective tile stream rate.

PS: DX11.2 enables the CPU to directly access the GPU's memory areas.

 

Nothing of what you say there proves me wrong.

 

PRT and JIT compression can be done on the PS4, period.

#394 tormentos
Member since 2003 • 33793 Posts

[QUOTE="wis3boi"]

3gMJtev.jpg.

ronvalencia

http://semiaccurate.com/2013/07/17/keynotes-for-amd-developer-summit-2013-announced/

Bingo-card-APU13-wp-617x805.png

For the PC, EA DICE's Frostbite 3 and Unity3D have AMD HSA support (HSAIL software).

 

 

 

 

You really don't read what people post. :lol:

You just hit reply quoting whatever you feel like. Look at what you quoted and tell me it has anything to do with what you posted there. :lol:

#395 DrTrafalgarLaw
Member since 2011 • 4487 Posts

3gMJtev.jpg.

wis3boi
LOL, didn't know this thread got so many people upset. :cool:
#396 GuNsbl4ziN
Member since 2010 • 285 Posts
[QUOTE="wis3boi"]

3gMJtev.jpg.

DrTrafalgarLaw
LOL, didn't know this thread got so many people upset. :cool:

That's what happens when you post such a moronic thread.
#397 blaznwiipspman1
Member since 2007 • 16916 Posts

I guess Carmack would be more credible IF he weren't paid a fat stash of Benjamins to spout BS to console fanboys. Not to mention, all his games are usually console exclusive. He knows who butters his bread and who doesn't. This would be like asking Gabe Newell, founder of Steam, what his thoughts are on the specs of console hardware. Face it, your weak hardware will NEVER come close to even a 7870, let alone a Titan lol.

#398 GioVela2010
Member since 2008 • 5566 Posts

I guess Carmack would be more credible IF he weren't paid a fat stash of Benjamins to spout BS to console fanboys. Not to mention, all his games are usually console exclusive. He knows who butters his bread and who doesn't. This would be like asking Gabe Newell, founder of Steam, what his thoughts are on the specs of console hardware. Face it, your weak hardware will NEVER come close to even a 7870, let alone a Titan lol.

blaznwiipspman1
Lmao butthurt hermit tears of epic proportions ITT
#399 pcgamingowns
Member since 2013 • 1223 Posts

Carmack is a casual now; he's been building too many rockets lately and forgot how to make games.

#400 Far_RockNYC
Member since 2012 • 1244 Posts

John Carmack :lol: