To Upgrade or to Not Upgrade? (GTX680 --> GTX970)


#51  Edited By Coseniath
Member since 2004 • 3183 Posts
@FelipeInside said:

You guys have great memory. I barely remember which card models I had, lol....

Haha, I can even type which brand each card was, e.g. MSI 6600GT... or PowerColor Voodoo 2...

PS: @04dcarraher: I got an MSI 7600GS from warranty too, for the fried MSI 6600GT. The cooler stopped working after 2 years and 8 months, and I was raiding in WoW while my GPU was at 104°C.... xD


#52 04dcarraher
Member since 2004 • 23858 Posts

@Coseniath said:
@FelipeInside said:

You guys have great memory. I barely remember which card models I had, lol....

PS: @04dcarraher: I got an MSI 7600GS from warranty too, for the fried MSI 6600GT. The cooler stopped working after 2 years and 8 months, and I was raiding in WoW while my GPU was at 104°C.... xD

Weird, my Chaintech 6600GT fan died within the first year. I called them up and asked for a new fan, and they said OK. I jerry-rigged an 80mm fan with zip ties, and a week later the new heatsink/fan arrived.

lol, my 7600GS was fried by my old CRT blowing up, and since it was XFX it had the double lifetime warranty. Turned it in, got a new one a week later.


#53 wis3boi
Member since 2005 • 32507 Posts
@FelipeInside said:
@klunt_bumskrint said:

@FelipeInside: I would have bought the 980 over the 970 too. FYI the Titan X overclocks like a beast and only has 6 and 8 pin power.

I was going to go with the 970 for budget reasons, but since the 980 ended up costing me only $300 after selling my iPad and my 680, I couldn't say no.

I wish I could buy a Titan, but I would need to sell my house, lol..... and for 1080p it's total overkill.

Now EVERYONE, let's play a game. List the (gaming) reasons you upgraded a video card:

These are the moments I remember most:

1- Bought a VooDoo GPU to play Unreal

2- Upgraded my GPU to play Half Life 2

3- Bought a GeForce 6800 Ultra to play World at War

4- Upgraded GPU and installed Windows 7 to play Just Cause 2

5- Bought a 980 to play Witcher 3

- Think it was an MX 440 or something, Nvidia. Got it to play GTA III on PC. The PC was beige colored and had a floppy drive. Good times.

- ATi X800 256MB (holy crap, 256!) to play BF2. Card ended up sucking ass and glitched like a mofo, especially in Oblivion.

- 8800 GTX, for Crysis... because foliage.

- 4870 1GB, because memory. Also blew the 8800 out of the water for about $200.

- GTX 570, just a general upgrade.

- AMD 280X, the 570 was getting old. Went through two 280s, both with defective memory. Said **** it, back to green team, got a 970, best card I've ever owned.


#54 Dark_sageX
Member since 2003 • 3561 Posts

Wait for Pascal and see what that generation of cards has to offer. The GTX 680 is still a mighty fine card, and the GTX 970 doesn't offer enough of a performance boost to justify the upgrade. If you had a GTX 660 Ti or even a GTX 670 you might have had a case, but right now just keep the GTX 680.


#55 FelipeInside
Member since 2003 • 28548 Posts

So pwetty......


#56 BassMan
Member since 2002 • 18741 Posts

Congrats! It's about time you upgraded that 680. Welcome back to Ultra settings. :)


#57  Edited By Coseniath
Member since 2004 • 3183 Posts
@FelipeInside said:

So pwetty......

I know that feel... :D

Gratz on your new GPU!


#58  Edited By RyviusARC
Member since 2011 • 5708 Posts

@Coseniath said:
@FelipeInside said:

So pwetty......

I know that feel... :D

Gratz for your new GPU!

I got mine back at release and received a partial refund (due to the 3.5GB issue), but I still kept both of my cards.

I will probably upgrade with second-gen Pascal, unless I really need more power to drive the Oculus Rift once the public release arrives in early 2016.


#59 04dcarraher
Member since 2004 • 23858 Posts

@FelipeInside said:

So pwetty......

Noice, here is mine when I got it. But on a side note, have you tried moving that top exhaust fan over to see if it helps your CPU temps?


#60 insane_metalist
Member since 2006 • 7797 Posts

Gigabyte love from red team:


#61 FelipeInside
Member since 2003 • 28548 Posts

@04dcarraher said:
@FelipeInside said:

So pwetty......

Noice, here is mine when I got it. But on a side note have you tried moving that top exhaust fan over to see if it helps your cpu temps?

You mean more to the right?

Well, it's the NZXT case. It has space for another fan to the right (you can see it in the photo), but I thought putting the fan there would be too far from the CPU.

The water cooling system does cover a quarter of the current fan, but it still does its job (I never have temp problems; it's a huge case, so the airflow is good).

That said, next build I'm going back to normal fans. Water cooling is nice and all, but it's just as noisy as a normal fan, with the added risk of leaks.


#62 04dcarraher
Member since 2004 • 23858 Posts

@FelipeInside said:

You mean more to the right?

Well, it's the NZXT case. It has space for another fan to the right (you can see it in the photo), but I thought putting the fan there would be too far from the CPU.

The water cooling system does cover a quarter of the current fan, but it still does its job (I never have temp problems; it's a huge case, so the airflow is good).

That said, next build I'm going back to normal fans. Water cooling is nice and all, but it's just as noisy as a normal fan, with the added risk of leaks.

Yeah, I meant moving it over a spot. That top fan could be causing turbulence with the fan on the liquid cooler, and it isn't doing anything for the water cooling either. Moving it over may reduce some of your fan noise.


#63 FelipeInside
Member since 2003 • 28548 Posts

@04dcarraher said:
@FelipeInside said:

You mean more to the right?

Well, it's the NZXT case. It has space for another fan to the right (you can see it in the photo), but I thought putting the fan there would be too far from the CPU.

The water cooling system does cover a quarter of the current fan, but it still does its job (I never have temp problems; it's a huge case, so the airflow is good).

That said, next build I'm going back to normal fans. Water cooling is nice and all, but it's just as noisy as a normal fan, with the added risk of leaks.

Yeah, I meant moving it over a spot. That top fan could be causing turbulence with the fan on the liquid cooler, and it isn't doing anything for the water cooling either. Moving it over may reduce some of your fan noise.

The noisy one is still the fan on the water cooler. I stopped the top fan just now, and it's the same noise.


#64 04dcarraher
Member since 2004 • 23858 Posts

@FelipeInside said:

The noisy one is still the fan on the water cooler. I stopped the top fan just now, and it's the same noise.

Are you using the Corsair fans that came with it?


#65 FelipeInside
Member since 2003 • 28548 Posts

@04dcarraher said:
@FelipeInside said:

The noisy one is still the fan on the water cooler. I stopped the top fan just now, and it's the same noise.

Are you using the Corsair fans that came with it?

Yeah, it comes as an all-in-one package.


#66  Edited By rogelio22
Member since 2006 • 2477 Posts

2011: Palit GTX 570... died a year later.

2012: EVGA GTX 570 Classified... died a year later, and I found out it was the cheap-ass power supply doing this.

2013: EVGA 770 SSC... still alive, thanks to a Corsair TX850 PSU and a new mobo.

2014: dual EVGA 970s.... I play GTA V at mostly max/high settings with no AA at 4K and a rock-solid 60 fps.

Yeah, I know, I was mostly a console gamer before 2011, but I've finally seen the light lol.

Oh yeah, I also recently bought an Alienware 15 (2015) with an i7, 16GB RAM, 256GB SSD, 1TB HDD, a 1080p screen and a GTX 980M, and it's awesome. I know most people hate Alienware, but this was cheaper than all the similar Asus and MSI models, which were going for $2500; I got mine for $2000.


#67  Edited By Xtasy26
Member since 2008 • 5593 Posts

Here's my graphics card history / the cards I used to play "3D accelerated games" — can I still use the term? :P

1. Riva 128 (STB Velocity 128) - 1998 - First graphics card I used to play "3D accelerated games" on the PC. Used it to play Forsaken, Turok, Quake 2, Incoming, Moto Racer 2, Sin, Grim Fandango, Need For Speed Hot Pursuit. Would have loved a Voodoo 2, but that was way too expensive at the time, and the Riva 128 had the perfect price/performance ratio back in 1998. Not to mention you needed a second card for 2D graphics, as the Voodoo 2 only did 3D. The Riva 128 was Nvidia's first successful GPU after the disastrous NV1.

2. GeForce4 MX 420 64MB DDR (yes, DDR technology!!) (MSI brand?? - not sure) - 2002. Offered excellent performance; used it to play Jedi Knight 2/Jedi Academy, Quake III, GTA III, Unreal Tournament 2003, Need For Speed Hot Pursuit 2, Halo 1.

3. GeForce 7600 GT (XFX) - 2006 - The card came in a wicked X-shaped box (the best looking box ever!!! lol). Used it to play Far Cry; could finally max it out with Pixel Shader 3.0. The water looked so much better, and the HDR was just wow!

4. ATI HD 4870 (Sapphire) - 2008 - Switched to ATI after being a loyal Nvidia user for 10 years. Chose ATI because it offered the same or better performance than the GTX 260 at a lower price, and even matched the GTX 280 (which cost almost twice as much) in some games, because of the power of GDDR5! Primarily bought it to play Crysis maxed out. Waited a year after the game came out to finally play it, even though I had pre-ordered it a year earlier. Later used it to play Crysis Warhead.

5. Radeon HD 6950, BIOS-flashed to an HD 6970 (XFX) - 2011 - Upgraded to Windows 7 and wanted to play DX11 games. Hoped I would be able to BIOS-flash it to an HD 6970 and save $100, and I did! Also upgraded my monitor to 1080p, so I needed something like an HD 6970. Used it to play DX11 games as DX10 was on its way out. Initially used it to play Dirt 2, one of the first games to use DX11 and those tessellations. Pretty much use this GPU to play all my DX11 games, like Crysis 2, BF3, BF4.

Notice that I purchase GPUs every 2-4 years, or every two generations, so I was looking to buy an R9-series card, as it was two generations after the HD 69XX series. Thought about getting an R9 290 and BIOS-flashing it to an R9 290X, but missed out. Then the mining craze took off and the price of R9 290s went through the roof, and by the time it came down, Nvidia released the GTX 970/980 series a couple of months later. Thought about pulling the trigger on a GTX 970 in early 2015, as it had been 4 years since I got my HD 6970. I was browsing Newegg for a GTX 970 2-3 days before news of the 3.5GB issue hit, and after that there was no way I was buying a GTX 970. I want something future-proof, given that I update GPUs every 2-4 years; I don't want issues down the road, especially since even current games stutter past the 3.5GB limit, like Total War: Attila.

Since I've waited this long, I might as well wait till the 390 series comes out, which will use HBM. If not, I may wait till 2016 and will likely do a full system build with a Zen CPU (my Phenom II is getting old!!) and an HBM2 GPU.


#68  Edited By RyviusARC
Member since 2011 • 5708 Posts

@Xtasy26 said:

[snip]

3.5GB is plenty for now, but in the future DX12 will be the norm for games that demand more, and it will allow vRAM stacking with SLI, so if you SLI two 970s you'd get 7GB of fast vRAM.

There has already been talk about a DX12 patch for The Witcher 3 and Arkham Knight.

It's fairly easy to port DX11 games to DX12; it can be done by a couple of people over the course of a week or two.

If vRAM is an issue, then wait until 2016. I think the memory yields mean the 390X will only have 4GB.

The true generational jump in GPUs comes in 2016 with 16nm chips; the 390 and 390X are still 28nm.
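The stacking claim above can be sanity-checked with some back-of-the-envelope arithmetic. This is a hedged sketch of the claim, not actual driver behavior (the function name and numbers are illustrative): under classic AFR SLI every GPU mirrors the same working set, while DX12 explicit multi-adapter lets developers partition resources across GPUs, so the theoretical ceiling is the sum of the pools.

```python
# Rough model of usable VRAM across multiple GPUs (illustrative only).
# Under AFR SLI, each GPU holds a full copy of the working set, so
# usable memory is limited by the smallest pool, not the sum.
# Under DX12 explicit multi-adapter, resources CAN be partitioned,
# so the theoretical ceiling is the combined total.

def usable_vram_gb(pools_gb, mode):
    """pools_gb: per-GPU memory sizes; mode: 'sli_afr' or 'dx12_ema'."""
    if mode == "sli_afr":
        return min(pools_gb)          # data is mirrored on every GPU
    elif mode == "dx12_ema":
        return sum(pools_gb)          # best case: no duplication at all
    raise ValueError(mode)

two_970s = [3.5, 3.5]                 # fast segments of two GTX 970s
print(usable_vram_gb(two_970s, "sli_afr"))   # 3.5
print(usable_vram_gb(two_970s, "dx12_ema"))  # 7.0
```

Note that real DX12 titles still duplicate shared resources (render targets, frame data) across GPUs, so 7GB is a best-case ceiling rather than a guarantee.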


#69 _SKatEDiRt_
Member since 2007 • 3117 Posts

I'll just buy you one in the US and ship it to you, Felipe.


#70 FelipeInside
Member since 2003 • 28548 Posts

@_SKatEDiRt_ said:

I'll just buy you one in the US and ship it to you, Felipe.

LOL, I'll hit you up next time :P


#71 Xtasy26
Member since 2008 • 5593 Posts

@RyviusARC said:
@Xtasy26 said:

[snip]

3.5GB is plenty for now, but in the future DX12 will be the norm for games that demand more, and it will allow vRAM stacking with SLI, so if you SLI two 970s you'd get 7GB of fast vRAM.

There has already been talk about a DX12 patch for The Witcher 3 and Arkham Knight.

It's fairly easy to port DX11 games to DX12; it can be done by a couple of people over the course of a week or two.

If vRAM is an issue, then wait until 2016. I think the memory yields mean the 390X will only have 4GB.

The true generational jump in GPUs comes in 2016 with 16nm chips; the 390 and 390X are still 28nm.

For now, maybe, but you still get stuttering in some games, like Total War: Attila, once you go past 3.5GB, while a full-blown 4GB R9 290X/980 doesn't have that issue. Why would I buy something that is knowingly crippled, especially since I plan to use it for up to 3-4 years?

And I agree with you that the much bigger jump comes in 2016 with 16nm/14nm. I mean, we have been on 28nm for 3+ years now (going on 4 years in 2016), for crying out loud, which is unheard of in the GPU industry in the 15+ years I have been following it.

Look how big a jump it was going from the Voodoo 2 in 1998 to the Radeon 9700 Pro in 2002, just 4 years. Such a massive increase.


#72  Edited By RyviusARC
Member since 2011 • 5708 Posts

@Xtasy26 said:
@RyviusARC said:
@Xtasy26 said:

[snip]

For now, maybe, but you still get stuttering in some games, like Total War: Attila, once you go past 3.5GB, while a full-blown 4GB R9 290X/980 doesn't have that issue. Why would I buy something that is knowingly crippled, especially since I plan to use it for up to 3-4 years?

And I agree with you that the much bigger jump comes in 2016 with 16nm/14nm. I mean, we have been on 28nm for 3+ years now (going on 4 years in 2016), for crying out loud, which is unheard of in the GPU industry in the 15+ years I have been following it.

Look how big a jump it was going from the Voodoo 2 in 1998 to the Radeon 9700 Pro in 2002, just 4 years. Such a massive increase.

Nope, Total War: Attila is a 32-bit application and cannot address more than ~3GB of vRAM.

So any stutter happening is not from vRAM.

Besides, if there is going to be a big jump in vRAM usage, then 4GB will not be enough either.

Currently 3.5GB is enough for me, and I play at 1440p.

I even maxed GTA V with 2xMSAA and experienced no stuttering at 1440p.

DX12 will make it 7GB+, so it will be enough for a while.
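The 32-bit point above is just pointer-width arithmetic: a 32-bit process has 2^32 bytes of virtual address space, and on 32-bit Windows the OS reserves part of it, leaving roughly 2GB for the process by default (about 3GB with the large-address-aware flag). A quick sketch of the numbers (illustrative; how much vRAM a game can actually touch also depends on how the driver maps GPU memory into that space):

```python
# Address-space ceiling for a 32-bit process (illustrative arithmetic).
GiB = 1024 ** 3

total_va = 2 ** 32                 # 4 GiB of virtual addresses in total
default_user_va = total_va // 2    # Windows default: 2 GiB for user space
laa_user_va = 3 * GiB              # with /LARGEADDRESSAWARE on 32-bit Windows

print(total_va // GiB)             # 4
print(default_user_va // GiB)      # 2
print(laa_user_va // GiB)          # 3
```

That address space has to hold the game's code, heap, and every mapped resource, which is why a 32-bit title hits a wall around 2-3GB regardless of how much vRAM the card has.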


#73  Edited By Xtasy26
Member since 2008 • 5593 Posts

@RyviusARC said:
[snip]

Nope, Total War Attila is a 32-bit application and cannot access more than 3GB of vRAM.

So any stutter happening is not from vRAM.

Besides, if there is going to be a big jump in vRAM usage then 4GB will not be enough either.

Currently 3.5GB is enough for me, and I play at 1440p.

I even maxed GTA V at 2xMSAA and experienced no stuttering at 1440p.

DX12 will make it 7GB+, so it will be enough for a while.

Not true. It depends on the game. Attila handles texture resolution based on available VRAM, so it allocates 4GB of VRAM on 4GB GPUs; however, since the GTX 970's last 0.5GB runs at 1/7th the speed of the 3.5GB pool, you will experience stutter in Attila. As you can see below, the GTX 970 experiences stuttering while a full-blown 4GB R9 290X and the GTX 980 don't.

http://www.pcgameshardware.de/Total-War-Attila-PC-259548/Specials/Test-Benchmarks-1151602/
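Frametime plots like the one linked capture something that average-FPS bars miss. As a rough illustration (with made-up numbers, not actual Attila measurements), a trace can have a healthy average while the 99th-percentile frame time and spike count reveal stutter:

```python
# Hypothetical frame-time trace in milliseconds (invented, not benchmark data):
# mostly smooth 16.7 ms frames with two long hitches mixed in.
frame_times_ms = [16.7] * 95 + [16.7, 55.0, 16.7, 60.0, 16.7]

def stutter_stats(times_ms, spike_factor=2.5):
    """Summarize a frame-time trace: average FPS, 99th-percentile
    frame time, and the number of 'spike' frames that take more than
    spike_factor times the median (perceived as stutter)."""
    n = len(times_ms)
    avg_fps = 1000.0 * n / sum(times_ms)
    ordered = sorted(times_ms)
    p99 = ordered[min(n - 1, int(0.99 * n))]
    median = ordered[n // 2]
    spikes = sum(1 for t in times_ms if t > spike_factor * median)
    return avg_fps, p99, spikes

avg, p99, spikes = stutter_stats(frame_times_ms)
# avg is still ~57 FPS, yet p99 is 60 ms and two spike frames show up,
# which is exactly the kind of hitching an average-FPS bar hides.
```

This is why a card can top the average-FPS chart and still fail a frametime check.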

#74  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@Xtasy26 said:
@RyviusARC said:

Nope, Total War Attila is a 32-bit application and cannot access more than 3GB of vRAM.

So any stutter happening is not from vRAM.

Besides, if there is going to be a big jump in vRAM usage then 4GB will not be enough either.

Currently 3.5GB is enough for me, and I play at 1440p.

I even maxed GTA V at 2xMSAA and experienced no stuttering at 1440p.

DX12 will make it 7GB+, so it will be enough for a while.

Not true. It depends on the game. Attila handles texture resolution based on available VRAM, so it allocates 4GB of VRAM on 4GB GPUs; however, since the GTX 970's last 0.5GB runs at 1/7th the speed of the 3.5GB pool, you will experience stutter in Attila. As you can see below, the GTX 970 experiences stuttering while a full-blown 4GB R9 290X and the GTX 980 don't.

http://www.pcgameshardware.de/Total-War-Attila-PC-259548/Specials/Test-Benchmarks-1151602/

Wrong, people were getting stuttering on 980s and SLI 980s, SLI 670s, the GTX 660 Ti, the GTX 780M, and so on, as well as on different AMD GPUs; it's not the GPU.

You really need to stop the 970 3.5GB meme. At 1600p no GPU is able to max this game out while even staying under the 3.5GB mark... even at 4K, using nearly 4GB, the 290X is in the same boat as all single GPUs. All real examples of games allocating more than 3.5GB on the 970 show no issues. It's only when you use excessive settings that there is a real issue. By the time games require more than 4GB at 1080p (with actual, tangible differences in quality), all 4GB cards will be in the same boat.

#75  Edited By RyviusARC
Member since 2011 • 5708 Posts

.

#76  Edited By Xtasy26
Member since 2008 • 5593 Posts

@04dcarraher said:
@Xtasy26 said:
@RyviusARC said:

Nope, Total War Attila is a 32-bit application and cannot access more than 3GB of vRAM.

So any stutter happening is not from vRAM.

Besides, if there is going to be a big jump in vRAM usage then 4GB will not be enough either.

Currently 3.5GB is enough for me, and I play at 1440p.

I even maxed GTA V at 2xMSAA and experienced no stuttering at 1440p.

DX12 will make it 7GB+, so it will be enough for a while.

Not true. It depends on the game. Attila handles texture resolution based on available VRAM, so it allocates 4GB of VRAM on 4GB GPUs; however, since the GTX 970's last 0.5GB runs at 1/7th the speed of the 3.5GB pool, you will experience stutter in Attila. As you can see below, the GTX 970 experiences stuttering while a full-blown 4GB R9 290X and the GTX 980 don't.

http://www.pcgameshardware.de/Total-War-Attila-PC-259548/Specials/Test-Benchmarks-1151602/

Wrong, people were getting stuttering on 980s and SLI 980s, SLI 670s, the GTX 660 Ti, the GTX 780M, and so on, as well as on different AMD GPUs; it's not the GPU.

You really need to stop the 970 3.5GB meme. At 1600p no GPU is able to max this game out while even staying under the 3.5GB mark... even at 4K, using nearly 4GB, the 290X is in the same boat as all single GPUs. All real examples of games allocating more than 3.5GB on the 970 show no issues. It's only when you use excessive settings that there is a real issue. By the time games require more than 4GB at 1080p (with actual, tangible differences in quality), all 4GB cards will be in the same boat.

Did you look at the link? It's not talking about average frames; that means NOTHING to me. The site I linked has average frames too, but what they were showing was frametimes with respect to the GTX 970. No one is talking about maxing this game out; that is not the issue here. Why aren't the GTX 980 and R9 290X experiencing this? Average frames are not going to show the stuttering effects.

#77  Edited By Coseniath
Member since 2004 • 3183 Posts
@Xtasy26 said:

Did you look at the link? It's not talking about average frames; that means NOTHING to me. The site I linked has average frames too, but what they were showing was frametimes with respect to the GTX 970. No one is talking about maxing this game out; that is not the issue here. Why aren't the GTX 980 and R9 290X experiencing this? Average frames are not going to show the stuttering effects.

You also posted a game that isn't too friendly to Nvidia hardware. In this game the GTX 970 is only 10% faster than the R9 280X (clearly the true difference between them... /sarcasm off). Isn't the R9 280X with its 3GB of VRAM affected by stuttering? I bet it isn't...

I can also post games that are not friendly to AMD hardware and make even an R9 290 stutter.

Does this mean the R9 290 runs out of memory in these situations? No. Some games just play better on AMD, others on Nvidia.

Actually I can find even weirder scenarios where a GTX 690 with its only 2GB of VRAM doesn't stutter while the R9 295X with 4GB does. Hell, the R9 295X should be like 60% faster with double the VRAM too...

GTX 970 stuttering has been observed, but not in real-world scenarios.

Stuttering has been observed at 4K (after 1.3x scaling) in Battlefield 4, where the GTX 980 gets 19.1 FPS and the GTX 970 gets 16.8 FPS.

Then again, who plays a game at 16.8 FPS?

And today, if a GPU runs out of VRAM without delivering very good visuals in return, you should obviously point the finger at the devs.

From W1zzard at TechpowerUp:

Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised.

TLDR: Games are too complicated to just say that these VRAM legends (Copyright W1zzard :P) are the only thing responsible...
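W1zzard's allocation-versus-usage distinction can be illustrated with a toy cache model (the texture names and sizes here are invented): a "keep stuffing, never remove" allocator reports a much larger VRAM footprint than an LRU cache under a fixed budget, even though both serve the same frames:

```python
from collections import OrderedDict

# Toy model of the allocation-vs-usage point: a game may *allocate*
# far more VRAM than its working set actually *needs*.
accesses = ["rock", "grass", "rock", "sky", "rock", "grass"]          # hot set
preload  = ["rock", "grass", "sky", "statue", "banner", "cloud"]       # stuffed in

size_mb = {t: 300 for t in preload}  # every texture is 300 MB (invented)

# Strategy 1: CoD:AW-style "keep stuffing textures, never remove anything".
greedy_resident = set(preload)
greedy_alloc = sum(size_mb[t] for t in greedy_resident)

# Strategy 2: LRU cache under a 1 GB budget; only what is used stays hot.
budget_mb = 1000
lru = OrderedDict()
for tex in accesses:
    if tex in lru:
        lru.move_to_end(tex)             # mark as most recently used
    else:
        lru[tex] = size_mb[tex]
        while sum(lru.values()) > budget_mb:
            lru.popitem(last=False)      # evict least-recently-used texture

lru_alloc = sum(lru.values())
# greedy_alloc is 1800 MB "used" VRAM; lru_alloc is 900 MB for the same frames.
```

So a VRAM-usage readout from a greedy allocator says little about what the game actually needs, which is the whole point of the quote above.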

#78  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@Coseniath said:

You also posted a game that isn't too friendly to Nvidia hardware. In this game the GTX 970 is only 10% faster than the R9 280X (clearly the true difference between them... /sarcasm off). Isn't the R9 280X with its 3GB of VRAM affected by stuttering? I bet it isn't...

I can also post games that are not friendly to AMD hardware and make even an R9 290 stutter.

Does this mean the R9 290 runs out of memory in these situations? No. Some games just play better on AMD, others on Nvidia.

Actually I can find even weirder scenarios where a GTX 690 with its only 2GB of VRAM doesn't stutter while the R9 295X with 4GB does. Hell, the R9 295X should be like 60% faster with double the VRAM too...

GTX 970 stuttering has been observed, but not in real-world scenarios.

Stuttering has been observed at 4K (after 1.3x scaling) in Battlefield 4, where the GTX 980 gets 19.1 FPS and the GTX 970 gets 16.8 FPS.

Then again, who plays a game at 16.8 FPS?

And today, if a GPU runs out of VRAM without delivering very good visuals in return, you should obviously point the finger at the devs.

From W1zzard at TechpowerUp:

Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised.

TLDR: Games are too complicated to just say that these VRAM legends (Copyright W1zzard :P) are the only thing responsible...

How a game handles CPU resources and hard drive usage also plays a role in how well it runs on a GPU. Major hard drive thrashing in games like Watch Dogs or GTA 5 can lead to performance issues; using an SSD or multiple drives helps a lot.

Below is BF4, a well multi-threaded game, showing AMD CPUs causing frame-timing spikes.

Here are some more frametiming/stuttering issues on both the 970 and 290 in the BF Hardline open beta, using an Intel i7 3930K at 3.8 GHz, and even at 1440p ultra it used only around 2.7GB.

Here is BioShock Infinite.

And here is another random game, Skyrim, 7950 vs GTX 660 Ti, showing a card with a gimped memory bus and less RAM getting more stable frame rates than a better GPU.

So the thing to take from this, Xtasy26, is to stop being a blind fanboy of sorts, using examples like YouTube vids running multiple instances of games at the same time to show stuttering on the 970, or using poorly coded games or games that favor one side over the other. The fact is that the 970 is a 4GB card... so unless you're running excessive GPU settings, the segmented memory pool isn't going to be a problem.

#79 deactivated-579f651eab962
Member since 2003 • 5404 Posts

As this is ongoing, I'll move it to the sparkly new hardware board.

#80  Edited By Coseniath
Member since 2004 • 3183 Posts
@klunt_bumskrint said:

As this is ongoing, I'll move it to the sparkly new hardware board.

Yeah, probably because of the new forum change; this belonged in hardware from the beginning of the discussion.

@04dcarraher said:

How a game handles CPU resources and hard drive usage also plays a role in how well it runs on a GPU. Major hard drive thrashing in games like Watch Dogs or GTA 5 can lead to performance issues; using an SSD or multiple drives helps a lot.

This is the answer W1zzard gave when people asked him why he is not posting minimum FPS.

So guys, that's the reason TechPowerUp doesn't have minimum FPS in their tests.

#81 insane_metalist
Member since 2006 • 7797 Posts

@Coseniath said:
@04dcarraher said:

How a game handles CPU resources and hard drive usage also plays a role in how well it runs on a GPU. Major hard drive thrashing in games like Watch Dogs or GTA 5 can lead to performance issues; using an SSD or multiple drives helps a lot.

This is the answer W1zzard gave when people asked him why he is not posting minimum FPS.

So guys, that's the reason TechPowerUp doesn't have minimum FPS in their tests.

I had GTA V first installed on my 3TB Seagate 7200 RPM. I reinstalled it on my SSD and I have not noticed any difference.

#82  Edited By Coseniath
Member since 2004 • 3183 Posts
@insane_metalist said:

I had GTA V first installed on my 3TB Seagate 7200 RPM. I reinstalled it on my SSD and I have not noticed any difference.

This depends on the game. I remember seeing at a friend's place a ridiculous stutter in STALKER: Call of Pripyat, where moving forwards and backwards at a certain point made the game load a different area each time, so it had to pull the data from the HDD, and there was a stutterfest just from moving a bit....

#83 insane_metalist
Member since 2006 • 7797 Posts

@Coseniath said:
@insane_metalist said:

I had GTA V first installed on my 3TB Seagate 7200 RPM. I reinstalled it on my SSD and I have not noticed any difference.

This depends on the game. I remember seeing at a friend's place a ridiculous stutter in STALKER: Call of Pripyat, where moving forwards and backwards at a certain point made the game load a different area each time, so it had to pull the data from the HDD, and there was a stutterfest just from moving a bit....

Hmm... never had that happen. I run most of my games from 1TB & 3TB Seagate Barracudas.

#84  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@insane_metalist said:
@Coseniath said:
@04dcarraher said:

How a game handles CPU resources and hard drive usage also plays a role in how well it runs on a GPU. Major hard drive thrashing in games like Watch Dogs or GTA 5 can lead to performance issues; using an SSD or multiple drives helps a lot.

This is the answer W1zzard gave when people asked him why he is not posting minimum FPS.

So guys, that's the reason TechPowerUp doesn't have minimum FPS in their tests.

I had GTA V first installed on my 3TB Seagate 7200 RPM. I reinstalled it on my SSD and I have not noticed any difference.

You won't, because chances are you have the paging file on the SSD. GTA 5 writes many GBs of data through the paging file. I have GTA 5 installed on a separate HDD and tried moving the paging file both onto that same drive and onto another mechanical drive to compare. Having the paging file on the same drive made a noticeable difference (for the worse), putting it on another drive reduced the HDD thrashing and its effects, but having the PF on the SSD yielded the best result, with no ill effect from HDD usage. Even Watch Dogs with separate drives reduced the issues quite a bit (before the SSD).

#85  Edited By Coseniath
Member since 2004 • 3183 Posts
@insane_metalist said:
@Coseniath said:
@insane_metalist said:

I had GTA V first installed on my 3TB Seagate 7200 RPM. I reinstalled it on my SSD and I have not noticed any difference.

This depends on the game. I remember seeing at a friend's place a ridiculous stutter in STALKER: Call of Pripyat, where moving forwards and backwards at a certain point made the game load a different area each time, so it had to pull the data from the HDD, and there was a stutterfest just from moving a bit....

Hmm... never had that happen. I run most of my games from 1TB & 3TB Seagate Barracudas.


Take a look at the video: this was with a 6970 2GB...

Bad engine; instead of keeping the whole area in RAM, it hit the HDD far too much :/.

Stutterfest as a word is too weak to describe it...
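The back-and-forth streaming behavior described above can be modeled as a toy area cache (the area names and walk path are invented): keeping only the current area resident forces a disk hit on every boundary crossing, while a two-area cache absorbs the pacing entirely after the first loads:

```python
def disk_reads(path, cache_areas=1):
    """Count HDD loads as the player walks a path of area IDs.
    With cache_areas=1 (the STALKER-like behavior described above),
    every boundary crossing hits the disk; with cache_areas=2 the
    back-and-forth is served from RAM after the initial streaming."""
    resident, reads = [], 0
    for area in path:
        if area not in resident:
            reads += 1                 # stream area in from disk (stutter)
            resident.append(area)
            if len(resident) > cache_areas:
                resident.pop(0)        # evict the oldest resident area
    return reads

walk = ["A", "B", "A", "B", "A", "B"]  # pacing back and forth over a boundary
# disk_reads(walk, cache_areas=1) -> 6 loads: a hitch on every crossing
# disk_reads(walk, cache_areas=2) -> 2 loads: only the initial streaming
```

Which is why an SSD only masks the problem: a faster disk shrinks each hitch, but a better engine would avoid the reloads altogether.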

#86 insane_metalist
Member since 2006 • 7797 Posts

@Coseniath: Well that really sucks. How does that happen? How come the system itself uses the HDD instead of more RAM?

#87 Coseniath
Member since 2004 • 3183 Posts
@insane_metalist said:

@Coseniath: Well that really sucks. How does that happen? How come the system itself uses the HDD instead of more RAM?

Bad engine.

The game was more than great but the engine was crap...

#88 insane_metalist
Member since 2006 • 7797 Posts

@Coseniath said:
@insane_metalist said:

@Coseniath: Well that really sucks. How does that happen? How come the system itself uses the HDD instead of more RAM?

Bad engine.

The game was more than great but the engine was crap...

So basically we can easily get screwed. Wonderful.

#89 04dcarraher
Member since 2004 • 23858 Posts

@insane_metalist said:
@Coseniath said:
@insane_metalist said:

@Coseniath: Well that really sucks. How does that happen? How come the system itself uses the HDD instead of more RAM?

Bad engine.

The game was more than great but the engine was crap...

So basically we can easily get screwed. Wonderful.

An SSD can temper the hitching.

#90 insane_metalist
Member since 2006 • 7797 Posts

@04dcarraher said:
@insane_metalist said:
@Coseniath said:
@insane_metalist said:

@Coseniath: Well that really sucks. How does that happen? How come the system itself uses the HDD instead of more RAM?

Bad engine.

The game was more than great but the engine was crap...

So basically we can easily get screwed. Wonderful.

An SSD can temper the hitching.

Well, at least an SSD saves the day. It's still poopy that great games are released on crappy engines.

#91  Edited By Xtasy26
Member since 2008 • 5593 Posts

@Coseniath said:
@Xtasy26 said:

Did you look at the link? It's not talking about average frames; that means NOTHING to me. The site I linked has average frames too, but what they were showing was frametimes with respect to the GTX 970. No one is talking about maxing this game out; that is not the issue here. Why aren't the GTX 980 and R9 290X experiencing this? Average frames are not going to show the stuttering effects.

You also posted a game that isn't too friendly to Nvidia hardware. In this game the GTX 970 is only 10% faster than the R9 280X (clearly the true difference between them... /sarcasm off). Isn't the R9 280X with its 3GB of VRAM affected by stuttering? I bet it isn't...

I can also post games that are not friendly to AMD hardware and make even an R9 290 stutter.

Does this mean the R9 290 runs out of memory in these situations? No. Some games just play better on AMD, others on Nvidia.

Actually I can find even weirder scenarios where a GTX 690 with its only 2GB of VRAM doesn't stutter while the R9 295X with 4GB does. Hell, the R9 295X should be like 60% faster with double the VRAM too...

GTX 970 stuttering has been observed, but not in real-world scenarios.

Stuttering has been observed at 4K (after 1.3x scaling) in Battlefield 4, where the GTX 980 gets 19.1 FPS and the GTX 970 gets 16.8 FPS.

Then again, who plays a game at 16.8 FPS?

And today, if a GPU runs out of VRAM without delivering very good visuals in return, you should obviously point the finger at the devs.

From W1zzard at TechpowerUp:

Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised.

TLDR: Games are too complicated to just say that these VRAM legends (Copyright W1zzard :P) are the only thing responsible...

That's funny: the GTX 980 has no problem with stuttering in Attila. The GTX 980 actually does quite well (it has the best average FPS), so your argument that the game is unfriendly to nVidia hardware is nonsense.

@04dcarraher said:
@Coseniath said:

You also posted a game that isn't too friendly to Nvidia hardware. In this game the GTX 970 is only 10% faster than the R9 280X (clearly the true difference between them... /sarcasm off). Isn't the R9 280X with its 3GB of VRAM affected by stuttering? I bet it isn't...

I can also post games that are not friendly to AMD hardware and make even an R9 290 stutter.

Does this mean the R9 290 runs out of memory in these situations? No. Some games just play better on AMD, others on Nvidia.

Actually I can find even weirder scenarios where a GTX 690 with its only 2GB of VRAM doesn't stutter while the R9 295X with 4GB does. Hell, the R9 295X should be like 60% faster with double the VRAM too...

GTX 970 stuttering has been observed, but not in real-world scenarios.

Stuttering has been observed at 4K (after 1.3x scaling) in Battlefield 4, where the GTX 980 gets 19.1 FPS and the GTX 970 gets 16.8 FPS.

Then again, who plays a game at 16.8 FPS?

And today, if a GPU runs out of VRAM without delivering very good visuals in return, you should obviously point the finger at the devs.

From W1zzard at TechpowerUp:

Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised.

TLDR: Games are too complicated to just say that these VRAM legends (Copyright W1zzard :P) are the only thing responsible...

How a game handles CPU resources and hard drive usage also plays a role in how well it runs on a GPU. Major hard drive thrashing in games like Watch Dogs or GTA 5 can lead to performance issues; using an SSD or multiple drives helps a lot.

Below is BF4, a well multi-threaded game, showing AMD CPUs causing frame-timing spikes.

Here are some more frametiming/stuttering issues on both the 970 and 290 in the BF Hardline open beta, using an Intel i7 3930K at 3.8 GHz, and even at 1440p ultra it used only around 2.7GB.

Here is BioShock Infinite.

And here is another random game, Skyrim, 7950 vs GTX 660 Ti, showing a card with a gimped memory bus and less RAM getting more stable frame rates than a better GPU.

So the thing to take from this, Xtasy26, is to stop being a blind fanboy of sorts, using examples like YouTube vids running multiple instances of games at the same time to show stuttering on the 970, or using poorly coded games or games that favor one side over the other. The fact is that the 970 is a 4GB card... so unless you're running excessive GPU settings, the segmented memory pool isn't going to be a problem.

The example I showed wasn't a YouTube video. It was hard, concrete results measuring actual frametimes. The GTX 980 doesn't have any frametime issues in Attila, so how is pointing out facts fanboyism? Explain why the GTX 980 doesn't have any frametime issues in Attila. It actually does quite well in both frametimes and average frames.

#92  Edited By Coseniath
Member since 2004 • 3183 Posts

@Xtasy26: I think you didn't even read what I posted.

So if Attila isn't AMD-friendly, then why is the 280X, with less VRAM and far less processing power, just 10% slower? (Not to mention that the R9 285 beats the GTX 770, LOL.)

And this review smells fishy from a mile away...

The GTX 970 has a lower minimum FPS than the 280X and the GTX 960, and the GTX 970 stutters while the others don't, with the same settings. Yeah right... and I am the Queen of the UK...

Either his GTX 970 is defective, or he is, let's say... a "little unfriendly" towards Nvidia.

Now after these facts if you still believe this is a "trusted" review, do it. I don't...

#93  Edited By Xtasy26
Member since 2008 • 5593 Posts

@Coseniath said:

@Xtasy26: I think you didn't even read what I posted.

So if Attila isn't AMD-friendly, then why is the 280X, with less VRAM and far less processing power, just 10% slower? (Not to mention that the R9 285 beats the GTX 770, LOL.)

And this review smells fishy from a mile away...

The GTX 970 has a lower minimum FPS than the 280X and the GTX 960, and the GTX 970 stutters while the others don't, with the same settings. Yeah right... and I am the Queen of the UK...

Either his GTX 970 is defective, or he is, let's say... a "little unfriendly" towards Nvidia.

Now after these facts if you still believe this is a "trusted" review, do it. I don't...

Ahh... did you read my first statement carefully? This particular game allocates memory based on the available memory. The R9 280X and GTX 960 are not 4GB cards, so of course those GPUs are not going to stutter, as the game doesn't allocate 4GB of memory to BEGIN with, silly.

#94 Coseniath
Member since 2004 • 3183 Posts
@Xtasy26 said:

Ahh... did you read my first statement carefully? This particular game allocates memory based on the available memory. The R9 280X and GTX 960 are not 4GB cards, so of course those GPUs are not going to stutter, as the game doesn't allocate 4GB of memory to BEGIN with, silly.

As far as we know, available memory is something decided by the OS. If a game overrides that, the devs had better know what they are doing. It shouldn't matter how much memory you have; this is clearly poor memory allocation, same as in the latest COD.

Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised.

This was TechPowerUp on whether we even need the Titan X's 12GB of VRAM. By what you're saying, Total War Attila sucks like COD: AW: "it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything"...

What is beyond me is why you want to blame a GPU for mediocre software, when the programmers are slacking instead of doing their job properly...
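The allocation behavior under dispute, a game sizing its textures to whatever VRAM figure it sees, can be sketched roughly like this (a hypothetical illustration; the tier thresholds and helper are invented, not Creative Assembly's actual code):

```python
# Invented texture-quality tiers keyed by a minimum VRAM budget in MB.
TIERS = [
    (4000, "ultra"),
    (3000, "high"),
    (2000, "medium"),
    (0,    "low"),
]

def pick_tier(reported_vram_mb, fast_vram_mb=None):
    """Choose a texture quality tier from a VRAM budget.
    If the card's genuinely fast pool (fast_vram_mb) is smaller than
    what the driver reports, e.g. the ~3584 MB fast segment of a
    '4096 MB' GTX 970, sizing textures to the reported figure
    overshoots into the slow segment, while sizing to the fast pool
    would not."""
    budget = reported_vram_mb if fast_vram_mb is None else fast_vram_mb
    for minimum, quality in TIERS:
        if budget >= minimum:
            return quality

naive = pick_tier(4096)        # "ultra": fills ~4GB, spills past 3.5GB
aware = pick_tier(4096, 3584)  # "high": stays inside the fast pool
```

Under this toy model both sides of the argument are visible: the game is just filling the budget it was told it has, and a card whose reported size exceeds its fast pool gets caught by exactly that logic.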

#95 Ribstaylor1
Member since 2014 • 2186 Posts

I'd personally wait for the next-gen cards out late this year or early next spring. A 680 at 1080p still plays games on ultra/high settings, so what's the point right now besides getting a max of 60fps? A generational leap is just around the corner, not an incremental one like the 970. So I'd just wait for either the new AMD or Nvidia series.

#96 Xtasy26
Member since 2008 • 5593 Posts

@Coseniath said:
@Xtasy26 said:

Ahh... did you read my first statement carefully? This particular game allocates memory based on the available memory. The R9 280X and GTX 960 are not 4GB cards, so of course those GPUs are not going to stutter, as the game doesn't allocate 4GB of memory to BEGIN with, silly.

As far as we know, available memory is something decided by the OS. If a game overrides that, the devs had better know what they are doing. It shouldn't matter how much memory you have; this is clearly poor memory allocation, same as in the latest COD.

Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised.

This was TechPowerUp on whether we even need the Titan X's 12GB of VRAM. By what you're saying, Total War Attila sucks like COD: AW: "it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything"...

What is beyond me is why you want to blame a GPU for mediocre software, when the programmers are slacking instead of doing their job properly...

Of course it's going to be different depending on the game and how it's programmed. That's what I have been saying from the beginning about how Total War: Attila does its memory allocation. Whether you agree with the programming or not is irrelevant; it is what it is. Secondly, even if the game does load data into memory, as the article states, that doesn't mean it will necessarily use it ("even if the texture might never or only rarely be used"), so you might not even see any stuttering effects. Again, it all depends on the game and how it's programmed. It all comes down to the programming and the memory management of different games. Clearly Total War: Attila is having an effect on the GTX 970 due to its memory partition, because that's how the game was programmed. This doesn't change the facts I have provided with respect to Total War: Attila. Rather, you have reinforced my point that, depending on how a game is programmed, you will see the effects of the GTX 970's memory partition, as was the case with Total War: Attila.

Why do you keep on ignoring that this is not an issue with the GTX 980 or the R9 290X?

#97  Edited By Coseniath
Member since 2004 • 3183 Posts
@Xtasy26 said:

Of course it's going to be different depending on the game and how it's programmed. That's what I have been saying from the beginning about how Total War: Attila does its memory allocation. Whether you agree with the programming or not is irrelevant; it is what it is. Secondly, even if the game does load data into memory, as the article states, that doesn't mean it will necessarily use it ("even if the texture might never or only rarely be used"), so you might not even see any stuttering effects. Again, it all depends on the game and how it's programmed. It all comes down to the programming and the memory management of different games. Clearly Total War: Attila is having an effect on the GTX 970 due to its memory partition, because that's how the game was programmed. This doesn't change the facts I have provided with respect to Total War: Attila. Rather, you have reinforced my point that, depending on how a game is programmed, you will see the effects of the GTX 970's memory partition, as was the case with Total War: Attila.

Why do you keep on ignoring that this is not an issue with the GTX 980 or the R9 290X?

I am not ignoring anything. I can see how well other cards perform.

However...

Also, I checked this site's other performance reviews, and either he is fooling people or his GTX 970 is defective.

Really? Really? In a game that doesn't use more than 2GB at 1440p, the GTX 970 stutters?

In the end, maybe not even Total War Attila has a problem with the GTX 970...

I think it's more than obvious that this site shouldn't be taken seriously (at least about GTX 970 performance).

edit: He also has the R9 290X on par with the GTX 970 when GameWorks is enabled, without even using AMD's Witcher performance drivers. LOL!

edit2: You can check yourself.

#98 Xtasy26
Member since 2008 • 5593 Posts

@Coseniath said:
@Xtasy26 said:

Of course it's going to differ depending on the game and how it's programmed. That's what I have been saying from the beginning about how Total War: Attila handles its memory allocation. Whether you agree with the programming or not is irrelevant; it is what it is. Secondly, even if a game loads data into memory as the article states, that doesn't mean it will necessarily use it ("even if the texture might never or only rarely be used"), so you might not even see any stuttering. It all comes down to the programming and memory management of each game. Clearly Total War: Attila is affected by the GTX 970's memory partition, because that's how it was programmed. None of this changes the facts I have provided about Total War: Attila; if anything, you have reinforced my point that, depending on how a game is programmed, you will see the effects of the GTX 970's memory partition, as was the case with Total War: Attila.

Why do you keep on ignoring that this is not an issue with the GTX 980 or the R9 290X?

I am not ignoring anything. I can see how well other cards perform.

However...

I also checked this site's other performance reviews, and either he is fooling people or his GTX970 is defective.

Really? Really? In a game that doesn't use more than 2GB at 1440p, the GTX970 stutters?

In the end, maybe not even Total War: Attila has a problem with the GTX970...

I think it's more than obvious that this site shouldn't be taken seriously (at least about GTX970 performance).

edit: He also has the R9 290X on par with the GTX970 when GameWorks is enabled, without even using AMD's Witcher performance drivers. LOL!

edit2: You can check yourself.

I guess we can agree then. :) Again, the stuttering effect on the GTX 970 depends on the game and how it's programmed, as we have agreed. The Witcher 3 honestly doesn't look bad on the GTX 970; the frametime spikes only seem to appear toward the tail end of the 120-second test, nowhere near as bad as Total War: Attila, where they run from beginning to end. So you will hardly have any issues with The Witcher 3 on a GTX 970.

#99  Edited By Coseniath
Member since 2004 • 3183 Posts
@Xtasy26 said:

I guess we can agree then. :) Again, the stuttering effect on the GTX 970 depends on the game and how it's programmed, as we have agreed. The Witcher 3 honestly doesn't look bad on the GTX 970; the frametime spikes only seem to appear toward the tail end of the 120-second test, nowhere near as bad as Total War: Attila, where they run from beginning to end. So you will hardly have any issues with The Witcher 3 on a GTX 970.

You can program even a Titan X to have VRAM issues... COD has shown the way...

"The frametime spikes only seem to appear toward the tail end of the 120-second test"?

I see stuttering for like 30-40% of the entire frametime trace. Frametimes that so easily reach 50ms (and more) aren't smooth gameplay...
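For what it's worth, that kind of eyeball estimate can be made concrete: given a frametime trace, count what share of frames spike past a stutter threshold such as 50ms. A minimal sketch in Python (the trace values below are made up for illustration, not taken from the review):

```python
# Hypothetical frametime trace in milliseconds (not real review data).
frametimes_ms = [16.7, 17.1, 16.9, 52.3, 16.8, 33.4, 71.0, 16.6, 16.7, 55.2]

def stutter_share(frametimes, threshold_ms=50.0):
    """Return the fraction of frames whose frametime exceeds the threshold."""
    spikes = sum(1 for ft in frametimes if ft > threshold_ms)
    return spikes / len(frametimes)

print(f"{stutter_share(frametimes_ms):.0%} of frames exceed 50 ms")  # 30% here
```

A review's raw frametime log (e.g. from FRAPS or a similar capture tool) could be run through the same function to check a "30-40% stutter" claim.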

His GTX970 is 100% defective...

After all these world-exclusive abysmal GTX970 results from this reviewer, I can't take his reviews seriously.

You may choose to trust this site and the performance reviews.

I don't...

#100 Xtasy26
Member since 2008 • 5593 Posts

@Coseniath said:
@Xtasy26 said:

I guess we can agree then. :) Again, the stuttering effect on the GTX 970 depends on the game and how it's programmed, as we have agreed. The Witcher 3 honestly doesn't look bad on the GTX 970; the frametime spikes only seem to appear toward the tail end of the 120-second test, nowhere near as bad as Total War: Attila, where they run from beginning to end. So you will hardly have any issues with The Witcher 3 on a GTX 970.

You can program even a Titan X to have VRAM issues... COD has shown the way...

"The frametime spikes only seem to appear toward the tail end of the 120-second test"?

I see stuttering for like 30-40% of the entire frametime trace. Frametimes that so easily reach 50ms (and more) aren't smooth gameplay...

His GTX970 is 100% defective...

After all these world-exclusive abysmal GTX970 results from this reviewer, I can't take his reviews seriously.

You may choose to trust this site and the performance reviews.

I don't...

Other people have noticed occasional stuttering on the GTX 970 in The Witcher 3 too. That doesn't mean it's consistent, or that their graphics card is "100% defective" lol. Rather, this test seems to back up the claim that you may notice slight stutter that isn't always there. Second, with respect to COD: as I have stated, just because a game allocates more memory doesn't mean it's going to use it. Plenty of review sites have pointed this out. As a matter of fact, the GTX 970 does very well even at 4K.

As the site states:

"On both the GPUs we see that frametimes are within a playable range without any crazy variance. I didn’t feel any stutter or jerkiness on either of the graphics cards."

The GTX 970 actually uses around 3.5GB even at 4K and will hardly see any stuttering.

As I and many other sites have pointed out, COD: Advanced Warfare allocates memory based on how much VRAM the graphics card has, and since it uses around 3.5GB on the GTX 970, you will hardly see any stuttering. It all depends on the graphics card. As the site states:

"Even on the highest AA setting and 4K resolution we see the maximum VRAM usage at 3575MB . For those using different GPUs and lower or higher VRAM the numbers might vary ..."

Source.
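That allocation behaviour, sizing the texture pool from the card's reported VRAM instead of requesting a fixed amount, can be sketched roughly as follows. The function name, the 512MB reserve, and the numbers are purely illustrative assumptions, not the game's actual code:

```python
def texture_pool_budget(total_vram_mb, reserve_mb=512):
    """Size a texture pool from the GPU's reported VRAM, keeping a reserve
    for framebuffers and driver overhead (numbers are illustrative)."""
    return max(total_vram_mb - reserve_mb, 0)

# A 4GB card like the GTX 970 ends up with roughly a 3.5GB pool,
# while an 8GB card is simply handed a bigger pool by the same logic.
print(texture_pool_budget(4096))  # 3584 MB, i.e. ~3.5GB
print(texture_pool_budget(8192))  # 7680 MB
```

Under that scheme the reported ~3575MB peak on a 4GB card is just the engine filling its budget, not evidence that every byte is actively sampled each frame.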

So the argument about COD: Advanced Warfare is moot.

Secondly, your argument that the site is unreliable is ludicrous. Germany is the biggest PC gaming country in Europe: Germans bought 20.4 million PC games in 2014. And PCgameshardware.de is one of the biggest, if not the biggest, PC gaming hardware sites in Germany. I have seen English-speaking users refer to it many times, as it actually has pretty good benchmarks.