
Revealed: Nvidia's Next Flagship Graphics Card, the GTX 980 Ti

New GeForce GPU carries 6GB of memory, costs $650.


Nvidia has divulged new details of its GeForce GTX 980 Ti, the latest addition to its line of premium-priced graphics cards.

Starting at $650 (and bundled with a copy of Batman: Arkham Knight), the GTX 980 Ti carries six gigabytes of GDDR5 memory. That's half the capacity of Nvidia's $1000 Titan X graphics card, but Nvidia believes the 980 Ti can perform to a similar standard at about two-thirds of the price.

A gallery below offers a closer look at the new GPU.


Like the Titan X, the 980 Ti has a maximum memory bandwidth of 336GB/s. Its display outputs are one HDMI, one DVI, and three DisplayPort, and it draws power through one 6-pin and one 8-pin connector.

Meanwhile, the GTX 980, now a step down the pecking order, will be sold at $500.

Nvidia's new GeForce prices are as follows:

  • Titan X: $1000
  • 980 Ti: $650
  • 980: $500
  • 970: $330
  • 960: $200

Mark Aevermann, a spokesperson for Nvidia, told GameSpot that the GTX 980 Ti was built with DirectX 12 features in mind.

"This is, by all rights, our flagship GeForce GTX GPU, built on the GM 200 chip," he said, "and we designed it for a lot of the top-end PC features, such as 4K, VR, and G-Sync. These are the things that this card excels at."

Further reading: DirectX 12: A Game Changer for Xbox One and PC?

Got a news tip or want to contact us directly? Email news@gamespot.com

Join the conversation
There are 909 comments about this story

lorider25:
Looks tasty, but my SLI 780 Tis still have life in them! Hope to see some benchmarks.
SphinxDemon:
Well, two years ago I built my PC with one of the best GPUs, the GTX 680. Two years later I can't run The Witcher 3 on ultra without it being slow... I guess it's time to upgrade, and this card looks like eye candy. Thank goodness for the 6GB of VRAM! Ultra 4K, here I come.
drocdoc:
I'll wait till the AMD R9 300 series comes out before jumping on this. Hell, these 900-series Nvidia cards aren't even 100% DX12 compatible. But Nvidia knew exactly when to release its GPUs, inflating prices before AMD releases theirs.
deactivated-58a78a043e9d4:
@drocdoc: It's only the Fermi cards that aren't 100% compatible; Kepler and Maxwell are.
<< LINK REMOVED >>
S4vagecabbage:
Haha, damn! I just got the GTX 980 and there's a newer one coming already... BLAST!!!
Balrogbane:
I don't mind dropping a little extra for great frame rates and graphics, but... heh... dang, I ain't payin' that much for any graphics card. Maybe in a couple of years it'll be ready for the common man.
Coseniath:
@balrogbane: Actually, just next year.

New architecture + die shrink + HBM = huge performance leap.
FvckGabeNewell:
@Coseniath @balrogbane: yeah, next year, going from 28nm to 20nm with HBM, mid-grade cards will be way stronger than what's out right now.
alaskancrab:
@fvckgabenewell @Coseniath @balrogbane: yeah, we've been waiting the last 6 years for AMD to improve the memory controllers on their CPUs... what's changed?

Great for power consumption, but I doubt it'll have any other significant benefits.
FvckGabeNewell:
@alaskancrab: so first you say the PS4 is better than what Nvidia is releasing with the 980 Ti because the PS4 has more RAM, yet it has an AMD 7870 in it, lol. Then you go on to complain about AMD afterwards.

alaskancrab, do you have anything remotely useful to say, or are you just trolling?
alaskancrab:
@fvckgabenewell: look, dude... you don't know what you're talking about... first of all, Sony is already releasing games that use 4GB+ of VRAM... so ease up on the attitude.
FvckGabeNewell:
@alaskancrab: what are you doing, hating on AMD and defending Sony at the same time? You're contradicting yourself, fool.
alaskancrab:
@Coseniath @balrogbane: AMD's the last company I'd wait on to come out with faster memory controllers.
Coseniath:
@alaskancrab: Next year, AMD will not be the only company with HBM.
FvckGabeNewell:
@alaskancrab: lol, why? They're releasing it first, so whether you want that to happen or not, it's going to happen.
DanielL5583:
@balrogbane: Yeah, this is enthusiast-grade stuff.
You could perhaps get a 970 instead, which is half the price and still offers very good performance.
Daveof89:
@DanielL5583: I just put a GTX 970 in my latest build a couple of months ago, and I love it! The 980 was a bit too steep for me, and so is the Ti.
GhostHawk196:
Sorry, but if you're going to PC game, you may as well get the absolute best. There's no point getting a PC that only just outperforms a console for an additional $400. The Titan X is the better choice. If I were to upgrade from my 780 Ti cards, it would be to the Titan X.
godfather830:
@GhostHawk196: The 780 Ti is only a tiny bit less powerful than a Titan X. The difference is negligible. It would be stupid for anyone to buy the Titan X right now.

Evidence: << LINK REMOVED >>
Daveof89:
@GhostHawk196: Wrongity wrong wrong. You could buy a console AND the 980 Ti for the price of the Titan... The difference in visuals would probably be hardly noticeable in most games.
alaskancrab:
@GhostHawk196: that's what I'm saying... another 2 years and we're set. Titan X for $300!
day3002:
There are more graphics cards available on PC than there are actual AAA games.
Bryjoered07:
@day3002: Wow, what a dumb comment. I can name a SINGLE console exclusive that's good that I wish was on PC: Bloodborne. Almost every AAA game comes out for PC and performs and looks better. What are you smoking?
Alurit:
@day3002: well, the same goes for every platform; there haven't been that many good AAA games released recently, not even console exclusives. The PS4 we have is just gathering dust.
Fursnake:
I have a 34" 1440p monitor; I need one of these. I'm getting one of these. But I'll wait until I see some non-reference-cooler cards.
FvckGabeNewell:
@Fursnake: yeah man, wait until AMD gets their cards out; you may see a price drop. I hope AMD gets something out that's faster than what Nvidia has.
Coseniath:
@Fursnake: << LINK REMOVED >>
DanielL5583:
Welp. I'm sold on this card after seeing a couple of reviews of it. I've been looking to switch from the red team for a long time, and this is my ticket.
DrFlyntCoal:
¯\_(ツ)_/¯
alaskancrab:
So let's see: the PS4 has 8 gigs of memory, and even if 2 gigs isn't available (doubtful), we're left with 6 gigs? So how am I supposed to run anti-aliasing with this card?
jeremyc99999:
@alaskancrab: The PS4's GPU only has about 2 gigs of memory to work with; the system as a whole has 8 gigs of RAM. So if I were to get this card, I'd have 6 gigs of GPU memory and 16 gigs of system RAM. System memory and GPU memory are two different things.
godfather830:
@jeremyc99999: You're right, except that the PS4 does not have dedicated RAM. It has 8GB of GDDR5 VRAM, and it uses that VRAM as system memory. Weird choice, but that's how it is.

Funnily enough, the Xbox One is the opposite.
Daveof89:
@alaskancrab: Uhhhm... that's not 8 gigs of memory for graphics, it's 8 gigs of memory total. The game engine/OS takes a lot of memory.
Alurit:
@alaskancrab: but the PS4 only has about 2GB of memory left for the GPU, since that pool is the system memory as well. A dedicated GPU's memory is purely for graphics, and a PC has separate system memory on top of it, which for people getting a card like this is usually 16GB, but at least 8GB.
DanielL5583:
@Alurit: ............Huh?
alaskancrab:
@Alurit: huh? You really lost me there.
Alurit:
@alaskancrab: you need a lot of memory to run the OS and the game, and there's only so much left for textures. It's not for nothing that devs put 4 and 8GB of RAM in system requirements; that's not video RAM, which is separate, but you still need it as well. 8GB of RAM is not that much; game engines soak up most of it, and there isn't more than 2GB left for graphics on the PS4.
FvckGabeNewell:
@alaskancrab: rofl, the GPU in the PS4 is already 2 years old and has less than half the processing power of this card.
alaskancrab:
@fvckgabenewell: that's fine... you still need gobs of memory for anti-aliasing and filtering. The whole point is to get better image quality than on the PS4. Why else would I pay $650... 60fps isn't the only important thing.
FvckGabeNewell:
@alaskancrab: well, with everything cranked right up to ultra in The Witcher 3, a GTX 980 only uses around 3GB of video memory at 1080p, and that's with the AA maxed. So it really comes down to how well things are optimized; 6GB of video RAM is more than what most games use.
alaskancrab:
@fvckgabenewell: the GTA V videos were using like 6 gigs.
DanielL5583:
@alaskancrab: Because memory isn't everything, and the PS4 doesn't just use its memory for graphics alone.
alaskancrab:
8 gigs... Nvidia... come on, why are you ripping us off? What did we ever do?
westsiderz28:
heh, the PC master race, eh? Bunch of muggins, if you ask me. A fool and his money are soon parted.

I will say no more.
zeca04:
@westsiderz28: Good.