GTX 680 Specs leaked!


This topic is locked from further discussion.

#1 DevilMightCry
Member since 2007 • 3554 Posts

Some pretty hard-hitting numbers, especially against the newly released AMD 7970.

Link

#2 Wasdie  Moderator
Member since 2003 • 53622 Posts

All for $999.99 and a new house when it overheats and burns your place to the ground.

[spoiler] :P [/spoiler]

#3 GTSaiyanjin2
Member since 2005 • 6018 Posts

I guess we are not going to get much of a price war.... This is really disappointing. I'll wait for GK110 myself; maybe it will come by the end of the year.

#4 wewantdoom4now
Member since 2012 • 1792 Posts

I guess we are not going to get much of a price war.... This is really disappointing. I'll wait for GK110 myself; maybe it will come by the end of the year.

GTSaiyanjin2

They're gonna have lower-end models.

#5 JigglyWiggly_
Member since 2009 • 24625 Posts

All for $999.99 and a new house when it overheats and burns your place to the ground.

:P

Wasdie

The gimped mobile version (which I want) will probably actually cost $999 >.>

#6 GTSaiyanjin2
Member since 2005 • 6018 Posts

[QUOTE="GTSaiyanjin2"]

I guess we are not going to get much of a price war.... This is really disappointing. I'll wait for GK110 myself; maybe it will come by the end of the year.

wewantdoom4now

They're gonna have lower-end models.

Yeah, but the way AMD and Nvidia are pricing their cards, I can't see them having anything of good value to offer over their existing lineup. The AMD 6850, 6870, and the Nvidia 560 Ti are still some of the best cards for the money right now.

#7 PfizersaurusRex
Member since 2012 • 1540 Posts

Are you sure you're allowed to post that? I mean, it clearly says "NVIDIA Confidential". :cool:

#8 ShadowDeathX
Member since 2006 • 11699 Posts
AMD uses a 384-bit bus instead of its normal 256-bit in its new cards. Nvidia uses 256-bit instead of its normal 384-bit in theirs. Why?
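For context, peak memory bandwidth is just bus width times effective memory data rate, so a narrower bus can be offset by faster memory. A minimal sketch of the arithmetic, using the HD 7970's published figures and the rumored GTX 680 numbers from the leak:

[code]
# Peak GDDR5 bandwidth = (bus width in bits / 8) bytes * effective data rate.
def peak_bandwidth_gb_s(bus_bits, data_rate_gbps):
    """Return peak memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(384, 5.5))  # 264.0 -> HD 7970: 384-bit at 5.5 Gbps
print(peak_bandwidth_gb_s(256, 6.0))  # 192.0 -> rumored GTX 680: 256-bit at 6 Gbps
[/code]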
#9 Lox_Cropek
Member since 2008 • 3555 Posts

1500 cores :o

#10 ShadowDeathX
Member since 2006 • 11699 Posts
If these turn out to be super great, I'll wait for the 4GB version of the GTX 680. Wish Nvidia would put at least 3GB in their cards, but no =(
#11 hartsickdiscipl
Member since 2003 • 14787 Posts

If this ends up being true, that is absurd. What impresses me the most is the fact that it runs on a pair of 6-pin power connectors. 2GB of VRAM is plenty, but the memory bandwidth would seem to be a little low for a flagship card at this point in the game, if prior reports are true. 192 GB/s of bandwidth is a LOT, but AMD has some cards on the market with more, if I'm not mistaken. If these specs are right, I'm very interested to see how it stacks up with so many cores and a relatively weak memory subsystem. VERY nice to see a TDP below 200 watts from a top-notch Nvidia GPU, if accurate.
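The two-connector observation maps directly onto the PCIe power limits: the x16 slot supplies up to 75W and each 6-pin connector another 75W. A quick sanity check of the ceiling such a board can draw in spec:

[code]
# PCIe board power budget: 75 W from the x16 slot plus 75 W per 6-pin connector.
SLOT_W = 75
SIX_PIN_W = 75

def board_power_ceiling_w(six_pin_count):
    """Maximum in-spec board power in watts."""
    return SLOT_W + six_pin_count * SIX_PIN_W

print(board_power_ceiling_w(2))  # 225 -> comfortable headroom for a sub-200 W TDP
[/code]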

#12 GTSaiyanjin2
Member since 2005 • 6018 Posts

This card reminds me a lot of the 8800 GT, just not the price :P

#13 Truth_Hurts_U
Member since 2006 • 9703 Posts
Rumor says it's the mid-range GPU made to look high-end, with a massive price tag to match, all because AMD failed at making a competitive GPU. Won't buy it unless it's 50% faster than a GTX 580. Been waiting a longgg time for an upgrade.
#14 04dcarraher
Member since 2004 • 23858 Posts

Rumor says it's the mid-range GPU made to look high-end, with a massive price tag to match, all because AMD failed at making a competitive GPU. Won't buy it unless it's 50% faster than a GTX 580. Been waiting a longgg time for an upgrade.

Truth_Hurts_U
Huh, why do you think it might be only 50% faster? Look at the specs again: the GTX 680's 1536 CUDA cores vs. the GTX 580's 512 CUDA cores is 3x the shader count alone, not counting the new architecture. Also note that there's no real difference between a 256-bit and a 384-bit memory interface at single-monitor resolutions. It's when you get into scalable multi-monitor resolutions that a 2+GB, 384-bit bus helps.
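One caveat on the 3x figure: raw core counts don't compare directly across architectures, because Fermi ran its shaders at a doubled "hot clock" that Kepler is rumored to drop. A back-of-the-envelope throughput sketch, assuming the rumored ~1GHz core clock for the 680:

[code]
# Theoretical FP32 throughput: cores * 2 ops per cycle (fused multiply-add) * clock.
def fp32_gflops(cores, shader_clock_ghz):
    return cores * 2 * shader_clock_ghz

print(fp32_gflops(512, 1.544))  # ~1581 GFLOPS -> GTX 580 (1544 MHz hot clock)
print(fp32_gflops(1536, 1.0))   # ~3072 GFLOPS -> rumored GTX 680 at ~1 GHz
[/code]

So roughly 2x on paper rather than 3x: triple the cores, but without the doubled shader clock.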

#15 mitu123
Member since 2006 • 155290 Posts

Wow, that is awesome!!!

#16 superclocked
Member since 2009 • 5864 Posts
I wonder how accurate these benchmarks of the GTX 680 are...
#17 JigglyWiggly_
Member since 2009 • 24625 Posts
[QUOTE="superclocked"]I wonder how accurate these benchmarks of the GTX 680 are...

Those statistics are so skewed-looking, lol. A .2 difference looks like 9999.
#18 dovberg
Member since 2009 • 3348 Posts

I would be shocked if that is true but would be thrilled if it is.

#19 achilles614
Member since 2005 • 5310 Posts

[QUOTE="Truth_Hurts_U"]Rumor says it's the mid range GPU made to look high end with a massive price tag to it. All because AMD failed at making a competive GPU. Wont buy it unless it's 50% faster then a GTX 580. Been waiting a longgg time for an upgrade.04dcarraher

Huh, why do you think it might be only 50% faster? Look at the specs again: the GTX 680's 1536 CUDA cores vs. the GTX 580's 512 CUDA cores is 3x the shader count alone, not counting the new architecture. Also note that there's no real difference between a 256-bit and a 384-bit memory interface at single-monitor resolutions. It's when you get into scalable multi-monitor resolutions that a 2+GB, 384-bit bus helps.

Why would you want a GPU that powerful if you weren't going to get into crazy resolutions...? (It's late here, and I saw someone mention 1000 bucks)
#20 Truth_Hurts_U
Member since 2006 • 9703 Posts
[QUOTE="achilles614"] Why would you want a GPU that powerful if you weren't going to get into crazy resolutions...? (it's late here and I saw someone mention 1000 bucks)

Why not? I want my GPU to max games for years, not just one year before I have to start turning things down. And the longer the GPU lasts, the higher the chance it breaks, and the more likely I am to get a new GPU with a lifetime warranty for free.
#21 04dcarraher
Member since 2004 • 23858 Posts
[QUOTE="04dcarraher"]

[QUOTE="Truth_Hurts_U"]Rumor says it's the mid range GPU made to look high end with a massive price tag to it. All because AMD failed at making a competive GPU. Wont buy it unless it's 50% faster then a GTX 580. Been waiting a longgg time for an upgrade.achilles614

huh why do think think it might be only 50% faster? Look at the specs again GTX 680 1536 Cuda cores vs GTX 580's 512 Cuda cores, thats 3x the shader processing power alone. Not counting the new architecture. Also note that its no real difference 256bit vs 384bit memory interface for single monitor resolutions. Its when you get into the scalable multi monitor resolution where the 2+gb 384 bit bus helps.

Why would you want a GPU that powerful if you weren't going to get into crazy resolutions...? (it's late here and I saw someone mention 1000 bucks)

We will have to wait and see if the 256bit actually limits performance for those multi monitor resolutions like 5760 x 1080 or 7680 x 3200 , 384 bit ensures that there is more then enough headroom for three monitors along with more then 2gb of memory, If these specs are true about it having 2gb then most likely 5760 x 1080 will be an acceptable limit.
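For a sense of scale on the memory side, a rough framebuffer-only estimate (a deliberate simplification; real games multiply this with extra render targets, textures, and geometry):

[code]
# Rough framebuffer footprint: width * height * 4 bytes (32-bit color),
# scaled by the MSAA sample count. Ignores textures and other buffers.
def framebuffer_mb(width, height, msaa=1):
    return width * height * 4 * msaa / 2**20

print(framebuffer_mb(1920, 1080))          # ~7.9 MB  -> single 1080p target
print(framebuffer_mb(5760, 1080, msaa=4))  # ~94.9 MB -> triple-wide with 4x MSAA
print(framebuffer_mb(7680, 3200, msaa=4))  # ~375 MB  -> before any game assets
[/code]

Even at the extreme resolution the raw framebuffer fits in 2GB with room to spare; it's the assets a game loads on top that eat the rest, which is why 2GB only starts to look tight at the largest multi-monitor setups.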
#22 msfan1289
Member since 2011 • 1044 Posts

[QUOTE="superclocked"]I wonder how accurate these benchmarks of the GTX 680 are...  JigglyWiggly_
Those statistics are so skewed looking lol. .2 difference is like 9999

all i can say is LMAO! (the chart looks so fake)

#23 blaznwiipspman1
Member since 2007 • 16914 Posts

I think I know what Nvidia's strategy against AMD is now: basically, make one of their cards' specs really incredible, then make backroom deals with devs to make games that lean heavily on what Nvidia's card does, and in this way make AMD look bad. It's pretty pathetic that Nvidia can't compete normally and has to use these underhanded tactics. The shader count on the 680 is ridiculous. Nvidia did the same thing last gen by gimping tessellation in their favor for the 5xx series.

#24 04dcarraher
Member since 2004 • 23858 Posts

I think I know what Nvidia's strategy against AMD is now: basically, make one of their cards' specs really incredible, then make backroom deals with devs to make games that lean heavily on what Nvidia's card does, and in this way make AMD look bad. It's pretty pathetic that Nvidia can't compete normally and has to use these underhanded tactics. The shader count on the 680 is ridiculous. Nvidia did the same thing last gen by gimping tessellation in their favor for the 5xx series.

blaznwiipspman1

Wow :?: Like AMD does not do the same crap with AMD-sponsored games? You need to let go of the fanboyism, because you're complaining about petty stuff, and about a GPU that actually will outclass the 7970 while using nearly the same amount of power. And how did Nvidia gimp tessellation? Another baseless claim.....

It's a win-win for everyone, and the price war will start when these come out.

#25 marcthpro
Member since 2003 • 7927 Posts
I do hope the price war goes well :D It will perhaps result in me buying my GTX 570 SLI faster than expected, unless I can sell my GTX 570 for 65% of its price and then get a GTX 680 :P if it's that powerful!
#26 V4LENT1NE
Member since 2006 • 12901 Posts

[QUOTE="blaznwiipspman1"]

i think i know what nvidias strategy is now against amd. basically make one of their cards specs really incredible, then make underground deals with devs to make games that are heavy on what nvidias card does and in this way make amd look bad. its pretty pathetic nvidia cant compete normally but use these underhanded tactics. the shader count on the 680 is ridiculous. nvidia did the same thing last gen by gimping tessellation in their favor for the 5xx series

04dcarraher

wow :?: Like AMD does not do the same crap with AMD sponsored games? You need to let go of the fanboyism, because your complaining about petty stuff and about a gpu that actually will outclass the 7970 while using nearly the same amount of power. And how did Nvidia gimp tessellation? another baseless claim.....

It's a win-win for everyone and the price war will start when these come out.

Dont worry, blazn like to go around every nvidia thread trying to trash them, he has been doing it for months.
#27 Elann2008
Member since 2007 • 33028 Posts

I do hope the price war goes well :D It will perhaps result in me buying my GTX 570 SLI faster than expected, unless I can sell my GTX 570 for 65% of its price and then get a GTX 680 :P if it's that powerful!

marcthpro

I want another GTX 580... :x

#28 arto1223
Member since 2005 • 4412 Posts

All for $999.99 and a new house when it overheats and burns your place to the ground.

Wasdie

It only has two 6-pin power connectors, so it might have fairly low power consumption and therefore a somewhat low amount of heat generated. Fingers crossed on this, but it might be high.

Anyways, looks good.

#29 blaznwiipspman1
Member since 2007 • 16914 Posts

[QUOTE="blaznwiipspman1"]

i think i know what nvidias strategy is now against amd. basically make one of their cards specs really incredible, then make underground deals with devs to make games that are heavy on what nvidias card does and in this way make amd look bad. its pretty pathetic nvidia cant compete normally but use these underhanded tactics. the shader count on the 680 is ridiculous. nvidia did the same thing last gen by gimping tessellation in their favor for the 5xx series

04dcarraher

wow :?: Like AMD does not do the same crap with AMD sponsored games? You need to let go of the fanboyism, because your complaining about petty stuff and about a gpu that actually will outclass the 7970 while using nearly the same amount of power. And how did Nvidia gimp tessellation? another baseless claim.....

It's a win-win for everyone and the price war will start when these come out.

AMD doesn't do that kind of stuff, they're in favour of open standards, and have been for a long time. Like I said, its an underhanded tactic, at least recognize it for what it is. Also, I didn't know the 680 was out already, where are you getting your insider information from? Nvidia gimped the tessellation in their favor, they slapped on so much of tessellation hardware to the 5xx series that devs even didn't know what to do with it. The games that did use tessellation to above normal levels were from devs that were in bed with nvidia. I'll call it now, the coming gen will see games slopped all over with an extra large portion of shaders.

#30 TheShadowLord07
Member since 2006 • 23083 Posts

All for $999.99 and a new house when it overheats and burns your place to the ground.

Wasdie

I'm expecting it could cost that much (or even more). The 7970 costs almost that much atm.

#31 Lox_Cropek
Member since 2008 • 3555 Posts

OK, I wanna see the price on that thing. The HD 7970 is already US$1000 here. That thing will cost at least US$1200...

#32 ravenguard90
Member since 2005 • 3064 Posts

AMD doesn't do that kind of stuff; they're in favour of open standards, and have been for a long time. Like I said, it's an underhanded tactic; at least recognize it for what it is. Also, I didn't know the 680 was out already; where are you getting your insider information from? Nvidia gimped the tessellation in their favor: they slapped so much tessellation hardware onto the 5xx series that devs didn't even know what to do with it. The games that did use tessellation to above-normal levels were from devs that were in bed with Nvidia. I'll call it now: the coming gen will see games slopped all over with an extra-large portion of shaders.

blaznwiipspman1

For someone who's really focused on tactics and their ethics, you should consider this: what AMD is doing can also be considered unethical, as they are leaving things open for others to work on rather than for their own driver team. Why else do you think everyone is complaining about their drivers? Even when AMD claimed to be stepping up their driver programming team's efforts, there were still a few caveats found when the 7970 was released (refer to TechPowerUp's Crossfire test on BF3 at 2560x1600).

When a company makes something open source, yes, it gives others opportunities to tweak it and maximize its potential. However, it is obvious that is not happening, as developers are not embracing this opportunity but rather leaving it aside in favour of Nvidia's sponsorships.

When push comes to shove, all these companies are in it for the money. Stop complaining when one company spends more money to divert developers to favour their cards, and start looking at the possibility that maybe AMD is just plain stingy and lazy.

#33 04dcarraher
Member since 2004 • 23858 Posts
Lol, blazn, AMD does not sponsor games in which AMD-based cards tend to run better? Ya, right. Stop aimlessly bashing....
#35 abuabed
Member since 2005 • 6606 Posts
I doubt these numbers; it obviously looks overpowered.
#36 wis3boi
Member since 2005 • 32507 Posts

All for $999.99 and a new house when it overheats and burns your place to the ground.

Wasdie

challenge accepted

#37 Grey_Eyed_Elf
Member since 2011 • 7971 Posts
What happened to having mid-range cards that were almost as fast as last generation's high end, for half the price?... Now we are paying the same price for the same performance as last gen, with no new features... What's the point?
#38 QQabitmoar
Member since 2011 • 1892 Posts

Around 10% faster than the 7970 in real-world gaming scenarios confirmed, after those Nvidia benchmarks showed a 40% difference. With aggressive pricing, it could be a good hit, but I was expecting something better. Hopefully the rumored 190W TDP is true, and hopefully the true Kepler juggernaut, the GK110, comes soon as well and is all we expected it to be.

#39 Idontremember
Member since 2003 • 965 Posts

Those specs are so high.... I mean, REALLY HIGH!!!!!

LIKE SUPER HIGH!!!!

The jump from the 580 to the 680 seems way too big to be true; that would leave ATI in the dark. I need to gather a few thousand $ for this monster!!!!

#40 hartsickdiscipl
Member since 2003 • 14787 Posts

[QUOTE="04dcarraher"]

[QUOTE="blaznwiipspman1"]

i think i know what nvidias strategy is now against amd. basically make one of their cards specs really incredible, then make underground deals with devs to make games that are heavy on what nvidias card does and in this way make amd look bad. its pretty pathetic nvidia cant compete normally but use these underhanded tactics. the shader count on the 680 is ridiculous. nvidia did the same thing last gen by gimping tessellation in their favor for the 5xx series

blaznwiipspman1

wow :?: Like AMD does not do the same crap with AMD sponsored games? You need to let go of the fanboyism, because your complaining about petty stuff and about a gpu that actually will outclass the 7970 while using nearly the same amount of power. And how did Nvidia gimp tessellation? another baseless claim.....

It's a win-win for everyone and the price war will start when these come out.

AMD doesn't do that kind of stuff, they're in favour of open standards, and have been for a long time. Like I said, its an underhanded tactic, at least recognize it for what it is. Also, I didn't know the 680 was out already, where are you getting your insider information from? Nvidia gimped the tessellation in their favor, they slapped on so much of tessellation hardware to the 5xx series that devs even didn't know what to do with it. The games that did use tessellation to above normal levels were from devs that were in bed with nvidia. I'll call it now, the coming gen will see games slopped all over with an extra large portion of shaders.

First you say that Nvidia "gimped" their hardware by making it better at tessellation than AMD bothered to, and then forget the fact that AMD are the ones who are heavy on cores and shaders, and have been for years now.

Saying that something was "gimped" indicates that it was somehow made weaker, not stronger.. at least anywhere I've ever heard it used. Don't whine because the GTX 400 and 500 series performed better in tessellation than their AMD counterparts.. especially the 5800's. That's the way the game works.

Secondly, even if Nvidia's new GTX 680 does have over 1500 Cuda cores, that's fewer than the number of stream processors that the AMD 5870 from 2 and a half years ago had. I'm not really sure where you're going with your arguments. They seem full of holes and obvious oversights.

#41 ShadowDeathX
Member since 2006 • 11699 Posts

[QUOTE="blaznwiipspman1"]

[QUOTE="04dcarraher"] wow :?: Like AMD does not do the same crap with AMD sponsored games? You need to let go of the fanboyism, because your complaining about petty stuff and about a gpu that actually will outclass the 7970 while using nearly the same amount of power. And how did Nvidia gimp tessellation? another baseless claim.....

It's a win-win for everyone and the price war will start when these come out.

hartsickdiscipl

AMD doesn't do that kind of stuff, they're in favour of open standards, and have been for a long time. Like I said, its an underhanded tactic, at least recognize it for what it is. Also, I didn't know the 680 was out already, where are you getting your insider information from? Nvidia gimped the tessellation in their favor, they slapped on so much of tessellation hardware to the 5xx series that devs even didn't know what to do with it. The games that did use tessellation to above normal levels were from devs that were in bed with nvidia. I'll call it now, the coming gen will see games slopped all over with an extra large portion of shaders.

First you say that Nvidia "gimped" their hardware by making it better at tessellation than AMD bothered to, and then forget the fact that AMD are the ones who are heavy on cores and shaders, and have been for years now.

Saying that something was "gimped" indicates that it was somehow made weaker, not stronger.. at least anywhere I've ever heard it used. Don't whine because the GTX 400 and 500 series performed better in tessellation than their AMD counterparts.. especially the 5800's. That's the way the game works.

Secondly, even if Nvidia's new GTX 680 does have over 1500 Cuda cores, that's fewer than the number of stream processors that the AMD 5870 from 2 and a half years ago had. I'm not really sure where you're going with your arguments. They seem full of holes and obvious oversights.

AMD: More cores but each of them are slower. Nvidia: Less cores but each of them are faster.
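The same cores-times-clock arithmetic from earlier illustrates the trade-off, using the HD 7970's published specs against the rumored GTX 680 figures; note that paper FLOPS say nothing about how well each architecture actually keeps its cores fed:

[code]
# Paper FP32 throughput = cores * 2 ops per cycle (fused multiply-add) * clock.
def gflops(cores, clock_ghz):
    return cores * 2 * clock_ghz

print(gflops(2048, 0.925))  # ~3789 GFLOPS -> HD 7970: more cores, lower clock
print(gflops(1536, 1.0))    # ~3072 GFLOPS -> rumored GTX 680: fewer, faster cores
[/code]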
#42 SinfulPotato
Member since 2005 • 1381 Posts
[QUOTE="superclocked"]I wonder how accurate these benchmarks of the GTX 680 are...

Look at the scale of that graph. Unless it is priced the same, there is zero point.
#43 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="hartsickdiscipl"]

[QUOTE="blaznwiipspman1"]

AMD doesn't do that kind of stuff, they're in favour of open standards, and have been for a long time. Like I said, its an underhanded tactic, at least recognize it for what it is. Also, I didn't know the 680 was out already, where are you getting your insider information from? Nvidia gimped the tessellation in their favor, they slapped on so much of tessellation hardware to the 5xx series that devs even didn't know what to do with it. The games that did use tessellation to above normal levels were from devs that were in bed with nvidia. I'll call it now, the coming gen will see games slopped all over with an extra large portion of shaders.

ShadowDeathX

First you say that Nvidia "gimped" their hardware by making it better at tessellation than AMD bothered to, and then forget the fact that AMD are the ones who are heavy on cores and shaders, and have been for years now.

Saying that something was "gimped" indicates that it was somehow made weaker, not stronger.. at least anywhere I've ever heard it used. Don't whine because the GTX 400 and 500 series performed better in tessellation than their AMD counterparts.. especially the 5800's. That's the way the game works.

Secondly, even if Nvidia's new GTX 680 does have over 1500 Cuda cores, that's fewer than the number of stream processors that the AMD 5870 from 2 and a half years ago had. I'm not really sure where you're going with your arguments. They seem full of holes and obvious oversights.

AMD: More cores but each of them are slower. Nvidia: Less cores but each of them are faster.

Radeon HD 7870 follows less cores (i.e. 1280 SPs) with faster reference clock speed (i.e. 1Ghz).

AMD could have designed thier own GTX 680 type config from HD7870 i.e. 1.3Ghz core, 5Ghz (1.5Ghz ref)GDDR5 memory.

#44 ShadowDeathX
Member since 2006 • 11699 Posts
[QUOTE="SinfulPotato"][QUOTE="superclocked"]I wonder how accurate these benchmarks of the GTX 680 are...

Look at the scale of that graph. Unless it is priced the same, there is zero point.

10% to 20% on average. I see these numbers leveling out, or even AMD passing Nvidia, at very high resolutions.
#45 hartsickdiscipl
Member since 2003 • 14787 Posts

[QUOTE="hartsickdiscipl"]

[QUOTE="blaznwiipspman1"]

AMD doesn't do that kind of stuff, they're in favour of open standards, and have been for a long time. Like I said, its an underhanded tactic, at least recognize it for what it is. Also, I didn't know the 680 was out already, where are you getting your insider information from? Nvidia gimped the tessellation in their favor, they slapped on so much of tessellation hardware to the 5xx series that devs even didn't know what to do with it. The games that did use tessellation to above normal levels were from devs that were in bed with nvidia. I'll call it now, the coming gen will see games slopped all over with an extra large portion of shaders.

ShadowDeathX

First you say that Nvidia "gimped" their hardware by making it better at tessellation than AMD bothered to, and then forget the fact that AMD are the ones who are heavy on cores and shaders, and have been for years now.

Saying that something was "gimped" indicates that it was somehow made weaker, not stronger.. at least anywhere I've ever heard it used. Don't whine because the GTX 400 and 500 series performed better in tessellation than their AMD counterparts.. especially the 5800's. That's the way the game works.

Secondly, even if Nvidia's new GTX 680 does have over 1500 Cuda cores, that's fewer than the number of stream processors that the AMD 5870 from 2 and a half years ago had. I'm not really sure where you're going with your arguments. They seem full of holes and obvious oversights.

AMD: More cores but each of them are slower. Nvidia: Less cores but each of them are faster.

Yes, I know this. That doesn't change the fact that he was going off about shaders and the fact is, AMD has a lot more by pure numbers.

#46 GummiRaccoon
Member since 2003 • 13799 Posts

[QUOTE="blaznwiipspman1"]

AMD doesn't do that kind of stuff, they're in favour of open standards, and have been for a long time. Like I said, its an underhanded tactic, at least recognize it for what it is. Also, I didn't know the 680 was out already, where are you getting your insider information from? Nvidia gimped the tessellation in their favor, they slapped on so much of tessellation hardware to the 5xx series that devs even didn't know what to do with it. The games that did use tessellation to above normal levels were from devs that were in bed with nvidia. I'll call it now, the coming gen will see games slopped all over with an extra large portion of shaders.

ravenguard90

For someone who's really focused on tactics and their ethics, you should consider this: What AMD is doing can also be considered unethical, as they are making things open for others to work on them, and not their own driver team. Why else do you think everyone else is complaining about their drivers? Even when AMD claimed to be stepping up their efforts for their driver programming team, there were still a few caveats found when the 7970 was released (refer to Techpowerup's Crossfire test on BF3 at 2560x1600).

When a company makes something open-source, yes, it gives opportunities for others to tweak it and maximize its potential. However, it is obvious that is not the case, as developers are not embracing this opportunity, but rather leaving it aside in favour of Nvidia's sponsorships.

Push comes to shove, all these companies are in it for the money. Stop complaining when one company uses up more money to divert developers to favour their cards, and start looking at the possibility that maybe AMD is just plain stingy and lazy.

DO YOU UNDERSTAND WHAT ETHICS MEANS?

No, you do not.

#47 marcthpro
Member since 2003 • 7927 Posts

Ethics is not the word you should use. "Policy" is what you should have said, or ToS, since even open-source things can come with a severe ToS. You should also have said "dishonest" or "disreputable", because what they do is ethical.

I do hope AMD's drivers keep the lead they have. Except for the 12.2 fiasco with 2D games, so far they seem to be doing alright for most games, compared to the days of the ATI team's drivers, like the ATI HD 47xx/48xx or, worse, the ATI HD 58xx; so many issues with those, according to a friend.

Although some of the ATI team are still there, I think they should invest twice as much in updating drivers bi-weekly until their drivers are super stable; meanwhile, they can keep upgrading their CrossFire profiles as needed to fix most of the issues, as there are still a few left to fix.

#48 hrt_rulz01
Member since 2006 • 22682 Posts
I most likely will upgrade to a GTX 680 eventually, but only for the right price.
#49 ravenguard90
Member since 2005 • 3064 Posts

DO YOU UNDERSTAND WHAT ETHICS MEANS?

No, you do not.

GummiRaccoon

Apparently, you don't either if you haven't considered the subjective nature of it.

There's always two sides to an argument. What's ethical or "right" can be different depending on where one's perspective originates from.

If you take it from a utilitarian perspective, then Nvidia is in the wrong, as it would not benefit the majority of users out there to optimize software to run better on their cards only. However, if you take it from a relativist perspective, then it is based entirely on the moral perspective that you take to judge the action. On one hand, Nvidia could be blamed for "bribing" the developers to make their software work better on their cards; on the other hand, you could say putting in the extra funding is essential to ensure all of their hardware's potential is used. If it is the latter, then it's AMD that is in the wrong, as they refuse to make the resources available so that the same optimizations can be had on their cards.

What blazn was doing was proclaiming ethical violations in Nvidia's actions. I'm just saying it's not as absolute as he's claiming.

#50 GummiRaccoon
Member since 2003 • 13799 Posts

[QUOTE="GummiRaccoon"]

DO YOU UNDERSTAND WHAT ETHICS MEANS?

No, you do not.

ravenguard90

Apparently, you don't either if you haven't considered the subjective nature of it.

There's always two sides to an argument. What's ethical or "right" can be different depending on where one's perspective originates from.

If you take it from a utilitarian perspective, then Nvidia is in the wrong, as it would not benefit the majority of users out there to optimize software to run better on their cards only. However, if you take it from a relativist perspective, then it is based entirely on the moral perspective that you take to judge the action. On one hand, Nvidia could be blamed for "bribing" the developers to make their software work better on their cards; on the other hand, you could say putting in the extra funding is essential to ensure all of their hardware's potential is used. If it is the latter, then it's AMD that is in the wrong, as they refuse to make the resources available so that the same optimizations can be had on their cards.

What blazn was doing was proclaiming ethical violations in Nvidia's actions. I'm just saying it's not as absolute as he's claiming.

There is nothing morally right or wrong about being inept; therefore, it can be neither ethical nor unethical.