GameSpot may receive revenue from affiliate and advertising partnerships for sharing this content and from purchases through links.

Revealed: Nvidia's Next Flagship Graphics Card, the GTX 980Ti

New GeForce GPU carries 6GB of memory, costs $650.

909 Comments

Nvidia has divulged new details of its GeForce 980Ti, the latest addition to its line of premium-priced graphics cards.

Starting at $650 (and bundled with a copy of Batman: Arkham Knight), the GTX 980Ti carries six gigabytes of GDDR5 memory. That's about half the capacity of Nvidia's $1000 Titan X graphics card, but Nvidia believes the 980Ti can perform to a similar standard at about two-thirds of the price.

A gallery below offers a closer look at the new GPU.


Much like the Titan X, the 980Ti has a 336GB/s maximum memory bandwidth. Its display outputs are 1x HDMI, 1x DVI, and 3x DisplayPort, and it draws power through one 6-pin and one 8-pin connector.
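For readers curious where the 336GB/s figure comes from: peak memory bandwidth is simply bus width times per-pin data rate. The sketch below assumes the 980Ti's publicly listed 384-bit bus and 7Gbps effective GDDR5 data rate, neither of which is stated in this article.

```python
# Sketch: deriving a GPU's peak memory bandwidth from its memory bus width
# and effective memory data rate. The 384-bit / 7 Gbps figures are assumptions
# taken from published 980 Ti specs, not from this article.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gb_s(384, 7.0))  # 336.0 -- matches the 336GB/s quoted above
print(peak_bandwidth_gb_s(256, 7.0))  # 224.0 -- the vanilla GTX 980, for comparison
```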

Meanwhile, the GTX 980, having dropped down the pecking order, will now be sold at $500.

Nvidia's new GeForce prices follow:

  • Titan X: $1000
  • 980 Ti: $650
  • 980: $500
  • 970: $330
  • 960: $200

Mark Aevermann, a spokesperson for Nvidia, told GameSpot that the GTX 980Ti was built with DirectX 12 features in mind.

"This is, by all rights, our flagship GeForce GTX GPU, built on the GM200 chip," he said, "and we designed it for a lot of the top-end PC features, such as 4K, VR, and G-Sync. These are the things that this card excels at."

Further reading: DirectX 12: A Game Changer for Xbox One and PC?

Got a news tip or want to contact us directly? Email news@gamespot.com

Join the conversation
GameSpot has a zero tolerance policy when it comes to toxic conduct in comments. Any abusive, racist, sexist, threatening, bullying, vulgar, and otherwise objectionable behavior will result in moderation and/or account termination. Please keep your discussion civil.

viciouskiller

Pointless considering the boost DX12 is going to give to cheaper cards.

NetGod

@viciouskiller: How old are you? Five?

Do your parents know you're on the internet like this?

PseudopsiaKite

@viciouskiller: DX12 can't give you performance beyond what the hardware is capable of. DX12 mainly removes CPU bottlenecks, where one exists and it's implemented appropriately.
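PseudopsiaKite's point can be illustrated with a toy frame-time model (the millisecond numbers below are invented purely for illustration): CPU and GPU work overlap, so the slower side sets the frame time, and cutting CPU submission cost only speeds up frames that were CPU-bound to begin with.

```python
# Toy model: a frame takes as long as its slowest stage, so reducing CPU
# time (as DX12's lower driver overhead does) only helps CPU-bound frames.
# All millisecond figures here are made up for illustration.

def frame_ms(cpu_ms: float, gpu_ms: float) -> float:
    # CPU and GPU work overlap; the slower side sets the frame time.
    return max(cpu_ms, gpu_ms)

# CPU-bound scene: halving CPU time doubles the frame rate.
print(frame_ms(cpu_ms=20.0, gpu_ms=10.0))  # 20.0 ms -> 50 fps
print(frame_ms(cpu_ms=10.0, gpu_ms=10.0))  # 10.0 ms -> 100 fps

# GPU-bound scene: the same CPU cut changes nothing.
print(frame_ms(cpu_ms=5.0, gpu_ms=16.7))   # 16.7 ms either way
print(frame_ms(cpu_ms=2.5, gpu_ms=16.7))   # 16.7 ms
```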

viciouskiller

@pseudopsiakite: It increases CPU performance by up to 50% and GPU by 20%.
So spending money on a performance boost you're going to get for free anyway is pointless.

FvckGabeNewell

@viciouskiller: DX12 will not increase performance by that much. The main thing DX12 does is make it easier to code games, so developers spend less time trying to work out bugs (I have done a lot of reading on it). Mostly it will mean games spend a bit less time in the alpha and beta stages.

It won't change what the hardware can do.

But hardware is going to see a massive change next year regardless, when they shrink the node size and go from GDDR5 to stacked HBM2.


ishsgames

@fvckgabenewell: DX12 may really improve PC optimization, since its API overhead is also greatly lower than DX11's. My point is that future SDKs will have to do far less to achieve far more (including on current hardware).

Kooken58

@fvckgabenewell: Isn't DX12 also going to make it so your machine can utilize the integrated graphics that most Intel CPUs come with? Meaning some of the load is taken off the GPU.

PseudopsiaKite

@Kooken58: Yes, it can use the integrated GPU for post-processing to gain a few FPS. However, it runs in sequence, so it adds more latency. They demoed it with Unreal Engine 4.

Google "Explicit Multiadapter control in Unreal Engine 4"

PseudopsiaKite

@viciouskiller: The key words are "up to". When marketing uses those words, it can be anywhere from 0% to 50% depending on the situation.
A lot of these DX12 previews create a situation where there is a CPU bottleneck in DX11.

FvckGabeNewell

@viciouskiller: DX12 is not going to be that big of a game changer. The big deal will be when every card has 8GB of HBM, when we go from 28nm nodes to 14nm, and when DisplayPort 1.3 is standard on everything.

GarGx1

Haven't fully decided yet, but I think I'll stick with my 780Ti and maybe get another to go SLI with. (DX12 may make SLI/CrossFire obsolete, though.)

Frozenuxx

The amount of salt in this thread is amazing lol.

InMaTe_of_Death

I think it will be wiser to just wait until the next generation of Nvidia cards.

SuppaPHly42

@inmate_of_death: That all depends on what you are upgrading from. The argument "oh, I'll just wait till the next one" can always be made; sometimes you have to just pick one and go for it, buyer's remorse be damned.

InMaTe_of_Death

@suppaphly42: Well, I have a 770, so I'd like to wait at least one more gen. The main issue with my card is that it's the 2GB version and I have a 1440p screen, which isn't enough for newer games. The other main reason for waiting is that I want a card that will run VR super smooth. I know they recommend a 970, but in my experience it's always better to be well above the recommended spec.

SuppaPHly42

@inmate_of_death: That is true, and I can't fault your reasoning, as it applies to you perfectly.

All I was getting at was that it's only the wise choice if you have a reason beyond waiting for the next one because it will be better. That kind of thinking has no end to it, as there will always be something better around the corner.


FvckGabeNewell

@suppaphly42: I'm waiting till I have the monitor I want. When they have a 34" 3440x1440 IPS that is 144Hz with DisplayPort 1.3 and HDMI 2.0, I will upgrade to that and get the correct GPU to power it.

SuppaPHly42

@fvckgabenewell: That is a wise choice, because you have a well thought out reason for it, beyond just putting it off till the next one. I'm also waiting for a screen that will make the most of the GPU. The difference for me is that I want a TV that can support 4K gaming at 60 fps at least; I want better, but I don't want to spend a bunch on it. Also, Star Citizen is the game I want all of it for, so no rush on my part anyway.

FvckGabeNewell

@inmate_of_death: Yeah, it's a long wait is all.

Cruxis27

Unless you're playing at 4K (which, chances are, you're not), you don't need this. If you have extra money to blow, then that's a different story.

FvckGabeNewell

Waiting on AMD to release their new card before I make a choice on an upgrade. I'm looking for a single card that will max out 3440x1440 ultrawide at over 60fps, and if the AMD card can't do it I might just wait till next year; we will see mid- to bottom-end cards doing 4K then.

We should see a node change from 28nm to 14nm and 8GB HBM stacks on most of the mid- to high-end cards, and maybe a 16GB Titan card with a memory bus speed of 2048GB/s, and something equivalent from AMD.

FvckGabeNewell

Also looking for DisplayPort 1.3, as you can't push more than 60Hz out of 1.2a at 4K, which makes G-Sync and FreeSync kind of useless on anything past 1440p if you can't run it any faster than 60 frames per second.

I see it as a huge mistake that the 980 Ti and Titan X don't have DisplayPort 1.3; the 980 Ti in SLI can't push a G-Sync monitor past 60FPS at 4K.
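The DisplayPort arithmetic behind this claim checks out to a first approximation. The sketch below uses the effective link rates after 8b/10b encoding (roughly 17.28Gbps for DP 1.2 and 25.92Gbps for DP 1.3 over four lanes) and ignores blanking overhead, so real-world limits are slightly tighter.

```python
# Rough check: can DisplayPort 1.2 carry 4K above 60Hz?
# Effective 4-lane data rates after 8b/10b encoding; blanking overhead ignored.

DP12_GBPS = 17.28  # DP 1.2: 4 lanes x 5.4 Gbps x 0.8
DP13_GBPS = 25.92  # DP 1.3: 4 lanes x 8.1 Gbps x 0.8

def required_gbps(w: int, h: int, hz: int, bpp: int = 24) -> float:
    """Raw pixel data rate in Gbps for a given mode (24-bit color assumed)."""
    return w * h * hz * bpp / 1e9

print(required_gbps(3840, 2160, 60))   # ~11.9 Gbps: fits within DP 1.2
print(required_gbps(3840, 2160, 120))  # ~23.9 Gbps: exceeds DP 1.2, needs DP 1.3
```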

PseudopsiaKite

@fvckgabenewell: G-Sync can work down to 30Hz; FreeSync's range varies depending on the monitor.

FvckGabeNewell

@pseudopsiakite: You don't understand what I'm saying: DisplayPort 1.2a bottlenecks. It can't push more than 60Hz at 4K. It's fine at 1440p, where it can push out 144Hz.

PseudopsiaKite

@fvckgabenewell: I know about the bandwidth limitation; however, it does not make G-Sync or FreeSync useless. If I had a 4K monitor, I would prefer it to have G-Sync/FreeSync, because it's hard to achieve a perfect 60fps.

FvckGabeNewell

@pseudopsiakite: Yeah, at least FreeSync works with HDMI 2.0, so you still have the option to go CrossFire down the road if you start out with one card. Nvidia's G-Sync runs over DisplayPort 1.2a right now, and they made a mistake by not future-proofing it and releasing their high-end cards with DisplayPort 1.3.

FvckGabeNewell

@fvckgabenewell: So it's a big-time fail on Nvidia's part not to have 1.3 on their high-end cards, for guys who want to run SLI and get over 60FPS in 4K.

SuppaPHly42

Sweet, can't wait to get one... maybe two XD. The Witcher 3 will soon be maxed.

zeca04

@suppaphly42: Hmm, what card do you have now?

SuppaPHly42

@zeca04@suppaphly42: 6970 but my other stuff is boss

KelpsterD

I'd love to grab a 970 for $330... but I'm in Canada, and 970s are $400+ here. It's only a $70 difference, but still, that's the price of a game.

I'll have to wait until August, when my budget allows for such an expenditure... but then it's Witcher 3 and Arkham Knight for days!

skipper847

My 970 is crap, is all I can say.

elessarGObonzo

@skipper847: Not crap, really. For the price it's nothing special, that's for sure. If you're doing 1080p it should suit you fine, but $350+ for that is a bit ridiculous.

zeca04

@skipper847: No, it isn't.

skipper847

@zeca04: Been having bad shimmering and jaggy edges on objects from day one. Looks like anti-aliasing isn't working. :(

NetGod

@skipper847: Just sounds like you got a bad card.

It's not all that unusual (for any company's product), but for the most part, there is nothing wrong with Nvidia's products.

oflow

What's the major difference between this and the regular 980?

Alurit

@oflow: It has the same chip as the Titan X, but with 9% fewer CUDA cores and SMs, and half the RAM. So it's a slightly cheaper version of the Titan X with almost the same performance, but they decided to name it 980 Ti for some unknown reason, when they could have named it something like Titan S.

elessarGObonzo

@oflow: 6GB of memory vs. 4GB, a 384-bit bus vs. 256-bit, and about $100.

spikepigeo

@oflow:

2816 CUDA cores compared to the 980's 2048

Slightly more texture units

Slightly lower core and memory clocks than the 980, just like the Titan X

Greater memory bus width

2GB more VRAM

3 billion more transistors

All in all, it's gonna be a little faster but not a ton. The 2GB of extra VRAM is nice, though.

mikey1611

From what I've read on other forums, this is Nvidia squirming ahead of whatever AMD is going to release next week (next two weeks? I forget). This makes me really excited for what AMD's got. Feels like competition.

PseudopsiaKite

@mikey1611: I'm surprised Nvidia released the 980 Ti before Fiji; I was expecting Nvidia to tweak it as a response to Fiji. I really hope Fiji is good. I don't want Nvidia to slow down improving GPU performance the way Intel has with CPUs.

spikepigeo

@mikey1611: Can only be good for everyone.
