NVIDIA's Kepler GTX 680 blows away the Radeon 7970 (Benchmarks inside)

This topic is locked from further discussion.

#1 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

http://wccftech.com/nvidia-kepler-geforce-gtx-680-benchmarked-blows-hd-7970/

Not overclocked: http://www.tomshardware.com/news/Nvidia-Kepler-GeForce-GTX680-gpu,15012.html

Benchmark specs:

  • Intel Core i7 3960X (Turbo Boost)
  • ASUS Rampage IV Extreme motherboard
  • 8 GB (4x 2 GB) GeIL EVO 2 DDR3-2200 MHz quad-channel memory
  • Corsair AX1200W PSU
  • Windows 7 x64
  • Note: For those who are wondering why the graphics card was not listed, each graphics card was tested on the system specified

The GTX 680 looks like the clear winner here by a wide margin. NVIDIA seems to have won this round, but we'll see what happens with future card updates and further benchmarks at other resolutions.

#2 Assassin_87
Member since 2004 • 2349 Posts

Makes me glad that I waited to upgrade. I'm still holding out to see what mid-range offerings look like.

#3 ronvalencia
Member since 2008 • 29612 Posts

LOL at the "blows away" part...

link ...

Corrected graph

For the GTX 680, 772 MHz --> 1006 MHz = a 30% core overclock.
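
(For anyone who wants to sanity-check that number, here's a quick, purely illustrative bit of arithmetic; the clock figures are just the ones quoted above, and the helper function is hypothetical.)

```python
# Illustrative check: percentage increase of the tested core clock
# over the default core clock.

def overclock_percent(default_mhz: float, tested_mhz: float) -> float:
    """Percentage increase of tested_mhz over default_mhz."""
    return (tested_mhz - default_mhz) / default_mhz * 100.0

# Clock figures quoted in this post for the GTX 680 sample:
print(f"772 -> 1006 MHz: {overclock_percent(772, 1006):.1f}% core overclock")  # ~30.3%
```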

#4 2scoopsofempty
Member since 2005 • 923 Posts
yep. time to say goodbye to my 6950 1GB and switch back to the green team. Looks like it beats the 7970 in power consumption too.
#5 ShadowriverUB
Member since 2009 • 5515 Posts
I hope they didn't use some extreme overclocking for the testing such that they need a cooler like that :P Real testing should be done at normal clocks.
#6 nameless12345
Member since 2010 • 15125 Posts

But where are the games to show off its power? Crysis 3 isn't on the near horizon, and there's no guarantee it would tax the latest graphics cards anyway.

#7 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

I hope they didn't use some extreme overclocking for the testing such that they need a cooler like that :P Real testing should be done at normal clocks. ShadowriverUB

"The clocks of GeForce GTX 680 are maintained at a 1006Mhz Core and 6008Mhz Memory clock"

Looks like they didn't overclock it, I believe the last one is the only one where they overclock. Not 100% sure though.

#8 jer_1
Member since 2003 • 7451 Posts
Sweet, this is a pretty sound spanking right here. I'll definitely keep my eye on the 680 in the future, forget about anything ATI...
#9 ronvalencia
Member since 2008 • 29612 Posts

Sweet, this is a pretty sound spanking right here. I'll definitely keep my eye on the 680 in the future, forget about anything ATI...jer_1

I hope they didn't use some extreme overclocking for the testing such that they need a cooler like that :P Real testing should be done at normal clocks. ShadowriverUB

For HKEPC's benchmarks, the GTX 680 was overclocked, i.e. 772 MHz --> 1006 MHz = a 30% core overclock.

Can jer_1 and ShadowriverUB read the GPU-Z screenshot?

#10 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ShadowriverUB"]I hope they didn't used some extreame overclocking for testing that they need cooler like that :p real testing should be on normal clocks.XVision84

"The clocks of GeForce GTX 680 are maintained at a 1006Mhz Core and 6008Mhz Memory clock"

Looks like they didn't overclock it, I believe the last one is the only one where they overclock. Not 100% sure though.

Can't you read the GPU-Z screenshot?

"Default Clock" was 772Mhz i.e. refer to GPU-Z screenshot.

#11 ZumaJones07
Member since 2005 • 16457 Posts
looks like a bunch of nerdiness
#12 ReadingRainbow4
Member since 2012 • 18733 Posts

What kind of temps were those? My idle on my GTX 580 is 30-33°C and my load never goes over 65°C.

And that's OC'd.

#13 dontshackzmii
Member since 2009 • 6026 Posts

that looks like one expensive pc. why do we need video card wars on sw?

#14 ronvalencia
Member since 2008 • 29612 Posts

that looks like one expensive pc. why do we need video card wars on sw?

dontshackzmii

It should be in PC hardware...

For end-user performance, it comes down to the AIBs' value-added efforts, i.e. the level of AIB factory overclocks.

#15 PC4lifeman2233
Member since 2012 • 479 Posts
Does it really matter which one has the fastest card? Most people can't even afford those high-end cards. I know I sure as hell can't. What's the point of upgrading beyond the performance of a GTX 580 at this point anyway? Sure, if you have a monitor that's 2560x1600. Graphics aren't going to drastically improve anytime soon. It's always going to be price vs. performance, and both AMD and Nvidia have been on par with that for the last couple of years.
#16 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

[QUOTE="XVision84"]

[QUOTE="ShadowriverUB"]I hope they didn't used some extreame overclocking for testing that they need cooler like that :p real testing should be on normal clocks.ronvalencia

"The clocks of GeForce GTX 680 are maintained at a 1006Mhz Core and 6008Mhz Memory clock"

Looks like they didn't overclock it, I believe the last one is the only one where they overclock. Not 100% sure though.

Can't you read the GPU-Z screenshot?

"Default Clock" was 772Mhz i.e. refer to GPU-Z screenshot.

Right, I said that the last one was the only one where they showed they overclocked.

That's the last pic in the link :P

#17 nameless12345
Member since 2010 • 15125 Posts

that looks like one expensive pc. why do we need video card wars on sw?

dontshackzmii

If there were no consoles, you'd be seeing threads like this on here all the time :P

#18 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

that looks like one expensive pc. why do we need video card wars on sw?

dontshackzmii

System Wars is all about systems. People commonly speak of individual consoles in threads and that's allowed, so I don't see why speaking of the PC individually shouldn't be allowed.

It also shows the advances in PC power, which is relevant to System Wars.

#19 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="XVision84"]

"The clocks of GeForce GTX 680 are maintained at a 1006Mhz Core and 6008Mhz Memory clock"

Looks like they didn't overclock it, I believe the last one is the only one where they overclock. Not 100% sure though.

XVision84

Can't you read the GPU-Z screenshot?

"Default Clock" was 706 Mhz i.e. refer to GPU-Z screenshot.

Right, I said that the last one was the only one where they showed they overclocked.

That's the last pic in the link :P

They all show a 1006 MHz core, i.e. a 30 percent overclock over the 706 MHz core.

#20 Riverwolf007
Member since 2005 • 26023 Posts

lol, quick call richie rich and tell him to get his pre-order in!

#21 nameless12345
Member since 2010 • 15125 Posts

[QUOTE="dontshackzmii"]

that looks like one expensive pc. why do we need video card wars on sw?

XVision84

System Wars is all about systems. People commonly speak of individual consoles in threads and that's allowed, so I don't see why speaking of the PC individually shouldn't be allowed.

It also shows the advances in PC power, which is relevant to System Wars.

There's the PC Hardware forum for discussing PC parts. Fortunately, all PC games work on (newer) AMD and Nvidia cards (or at least should :P ).

#22 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

Does it really matter which one has the fastest card? Most people can't even afford those high-end cards. I know I sure as hell can't. What's the point of upgrading beyond the performance of a GTX 580 at this point anyway? Sure, if you have a monitor that's 2560x1600. Graphics aren't going to drastically improve anytime soon. It's always going to be price vs. performance, and both AMD and Nvidia have been on par with that for the last couple of years. PC4lifeman2233

Many people do upgrade to high-end cards because they want the best possible experience. You'd be fine with a GTX 580, but it's still nice seeing the next generation of cards, even if it's just for future reference. When new consoles release, chances are that some people would not be able to afford them and would have no need to upgrade right away, but it's still nice to know the comparison.

And what makes you think graphics aren't going to drastically improve anytime soon? Next-gen is near, and that should result in a nice jump ahead, and what's stopping PC developers from taking advantage of the new generation of video cards?

#23 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

[QUOTE="XVision84"]

[QUOTE="ronvalencia"]

Can't you read the GPU-Z screenshot?

"Default Clock" was 706 Mhz i.e. refer to GPU-Z screenshot.

ronvalencia

Right, I said that the last one was the only one where they showed they overclocked.

That's the last pic in the link :P

They all show a 1006 MHz core, i.e. a 30 percent overclock over the 706 MHz core.

I know, that's what it said in the article and that's what you said.

You don't need to repeat yourself :)

#24 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="PC4lifeman2233"]Does it really matter which one has the fastest card? Most people can't even afford those high-end cards. I know I sure as hell can't. What's the point of upgrading anymore then the performance of a GTX 580 at this point anyway? Sure if you have a monitor that's 2560x1600. Graphics aren't going to drastically improve anytime soon. It's always going to be Price vs. performance. Both AMD and Nvidia have been on par with that the last couple years. XVision84

Many people do upgrade to high-end cards because they want the best possible experience. You'd be fine with a GTX 580, but it's still nice seeing the next generation of cards, even if it's just for future reference. When new consoles release, chances are that some people would not be able to afford them and would have no need to upgrade right away, but it's still nice to know the comparison.

And what makes you think graphics aren't going to drastically improve anytime soon? Next-gen is near, and that should result in a nice jump ahead, and what's stopping PC developers from taking advantage of the new generation of video cards?

Intel HD 3000 IGP and consoles.

#25 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

lol, quick call richie rich and tell him to get his pre-order in!

Riverwolf007

Oh god, I hate those movies so much :P

To be fair, they're not that expensive, they should drop in price in a matter of months.

#26 MlauTheDaft
Member since 2011 • 5189 Posts

that looks like one expensive pc. why do we need video card wars on sw?

dontshackzmii

Considering the antipathy for Hermits, this is rather relevant. Consolites need to understand where we're coming from ;)

#27 Riverwolf007
Member since 2005 • 26023 Posts

[QUOTE="Riverwolf007"]

lol, quick call richie rich and tell him to get his pre-order in!

XVision84

Oh god, I hate those movies so much :P

To be fair, they're not that expensive, they should drop in price in a matter of months.

dude, i just looked up the 580 at newegg and the price range was from $400 to almost $700.

i'm starting the pc components watching game because i am finally going to break down and build a new rig this summer or fall.

#28 Stevo_the_gamer  Moderator
Member since 2004 • 50069 Posts
I'll wait for final benchmarks--besides, it'll cost an arm and a leg. :P
#29 ReadingRainbow4
Member since 2012 • 18733 Posts

I got my 580 off ebay for around $350.

best part is I'll be able to sell it for near that amount as well.

#30 Stevo_the_gamer  Moderator
Member since 2004 • 50069 Posts

i'm starting the pc components watching game because i am finally going to break down and build a new rig this summer or fall.

Riverwolf007

It's about time.

#31 Slow_Show
Member since 2011 • 2018 Posts

To everyone getting their panties in a knot over the core clocks: Anandtech hinted that Kepler does something weird with its core clocks in their Acer TimelineU M3/GT640M review (if you read between the lines it sounds like it could be a TurboBoost-type feature), so it could be GPU-Z reporting incorrect numbers rather than some jackass comparing a 30% OC'd 680 to a stock 7970.

#32 topgunmv
Member since 2003 • 10880 Posts

They overclocked the card by over 40%.

Lol, try again.
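
(Purely as an illustration of where the "over 40%" figure comes from, assuming the 706 MHz "Default Clock" quoted from the GPU-Z screenshot earlier in the thread:)

```python
# Same arithmetic as earlier, using the 706 MHz default clock figure.
default_mhz, tested_mhz = 706, 1006
print(f"{(tested_mhz - default_mhz) / default_mhz * 100:.1f}% core overclock")  # ~42.5%
```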

#33 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

[QUOTE="XVision84"]

[QUOTE="Riverwolf007"]

lol, quick call richie rich and tell him to get his pre-order in!

Riverwolf007

Oh god, I hate those movies so much :P

To be fair, they're not that expensive, they should drop in price in a matter of months.

dude, i just looked up the 580 at newegg and the price range was from $400 to almost $700.

i'm starting the pc components watching game because i am finally going to break down and build a new rig this summer or fall.

A GTX 580 is still able to handle all games released today, so it's not going to drop in price to $200. Getting a GTX 680 makes your PC future-proof for some time to come. Prices should gradually come down now that the 680 is released.

#34 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts
So... how does it fare when not overclocked?
#35 Riverwolf007
Member since 2005 • 26023 Posts

[QUOTE="Riverwolf007"]i'm starting the pc components watching game because i am finally going to break down and build a new rig this summer or fall.

Stevo_the_gamer

It's about time.

lol, 7 years man, my pc is held together with spit, bailing wire, emergency ram injections and my friends old components that i got as hand me downs after they replaced their rigs.

it's actually doing pretty well for what it is but i keep running into goofy stuff like playing a newer game and the monitor and pc not agreeing on the format and deciding to just shut down and crazy crap like that.

it is hilarious to open it up and look at all the quick fixes and goony power adaptor cabling things together that should never have interacted together in the first place.

#36 moistsandwich
Member since 2009 • 25 Posts

My GTX 460 does the job I need it to..... I'll wait til the GTX 780 is released.... then I'll buy a GTX 680

#37 Iantheone
Member since 2007 • 8242 Posts

lol, 7 years man, my pc is held together with spit, bailing wire, emergency ram injections and my friends old components that i got as hand me downs after they replaced their rigs.

it's actually doing pretty well for what it is but i keep running into goofy stuff like playing a newer game and the monitor and pc not agreeing on the format and deciding to just shut down and crazy crap like that.

it is hilarious to open it up and look at all the quick fixes and goony power adaptor cabling things together that should never have interacted together in the first place.

Riverwolf007

That's how a true PC gamer games :P I remember my old PC, and even my current one is getting a bit ragged after almost 3 years.

#38 inggrish
Member since 2005 • 10503 Posts

Looks mint, and I will probably buy... but saying it blows away the 7970 is a bit of an exaggeration. Yes, it's better, but I reckon it will also be relatively more expensive.

#39 Darkslayer16
Member since 2006 • 3619 Posts

What exactly is supposed to be impressive about 9673 in 3DMark 11?

#40 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

So... how does it fare when not overclocked?ferret-gamer

That we do not know yet.

#41 GiantAssPanda
Member since 2011 • 1885 Posts
So with standard clocks they're about the same..? Buuuut.. I'm very, very happy with my new 7950. Cool & quiet with great power consumption and performance. And overclocks nicely as well. And thanks to Radeon Pro you can basically tweak your in-game settings as well as with the Nvidia Inspector tool.
#42 Riverwolf007
Member since 2005 • 26023 Posts

[QUOTE="Riverwolf007"]lol, 7 years man, my pc is held together with spit, bailing wire, emergency ram injections and my friends old components that i got as hand me downs after they replaced their rigs.

it's actually doing pretty well for what it is but i keep running into goofy stuff like playing a newer game and the monitor and pc not agreeing on the format and deciding to just shut down and crazy crap like that.

it is hilarious to open it up and look at all the quick fixes and goony power adaptor cabling things together that should never have interacted together in the first place.

Iantheone

That's how a true PC gamer games :P I remember my old PC, and even my current one is getting a bit ragged after almost 3 years.

i also didn't hook up a new fan right about 3 years ago and now every time i turn the thing on i get a warning about not having a fan even though the fan works fine.

you have to hit f1 before it will boot and i have put up with that for years instead of taking a few mins to figure out the problem. :lol:

#43 ZombieKiller7
Member since 2011 • 6463 Posts

Not sure about now, but in my day Nvidia was known as the better card at a premium price, while ATI was for the "common working mortal" at a lower MSRP.

Sort of like Intel vs AMD CPUs:

Intel is the premium product that costs more.

Nowadays AMD owns ATI, so I guess AMD GPU/CPU is the low-cost option for most people, with Intel/Nvidia being a little more expensive and a little better.

#44 deactivated-5d78760d7d740
Member since 2009 • 16386 Posts

Not sure about now, but in my day Nvidia was known as the better card at a premium price, while ATI was for the "common working mortal" at a lower MSRP.

Sort of like Intel vs AMD CPUs:

Intel is the premium product that costs more.

Nowadays AMD owns ATI, so I guess AMD GPU/CPU is the low-cost option for most people, with Intel/Nvidia being a little more expensive and a little better.

ZombieKiller7

AMD actually had the upper hand for a while, before Intel came out with their Centrino CPUs; after that, Intel has been dominating.

Nowadays NVIDIA and AMD are actually quite close in competition; you're good with either one, as both make premium products.

#45 The_Capitalist
Member since 2004 • 10838 Posts

Well, not too many people actually buy the top of the line cards because of diminishing returns.

I think my GTX 560 Ti is good for the next two or three years.

#46 deactivated-59b71619573a1
Member since 2007 • 38222 Posts

Well, not too many people actually buy the top of the line cards because of diminishing returns.

I think my GTX 560 Ti is good for the next two or three years.

The_Capitalist

Me too, but 3 years is pushing it a lot, I think.

I will probably get one of these cards way down the line. It's pretty dumb to buy one at launch. Not worth the price IMO.

Also, it will hopefully drive AMD's GPU prices down.

#47 SaltyMeatballs
Member since 2009 • 25165 Posts
Meh. It may be interesting when there are games that utilise that power to its full potential.
#48 deactivated-59b71619573a1
Member since 2007 • 38222 Posts

Meh. It may be interesting when there are games that utilise that power to its full potential. SaltyMeatballs

I'm guessing next gen might use it a bit better. Kinda pointless for games so far; we haven't had anything that's really "unplayable" in a while. Everyone thought BF3 would cripple systems, but it ran very well.

#49 metal_zombie
Member since 2004 • 2288 Posts
how much better is this card than a 6870 CF?
#50 ScreamDream
Member since 2006 • 3953 Posts

Grats to Nvidia. Your turn ATI.