
AMD Unveils Its New GPU Architecture, Polaris

AMD finally set to debut 14nm cards.


As expected, AMD announced its new GPU architecture for 2016 at the Consumer Electronics Show today, and it potentially marks a significant move for the company.

Polaris, as it's called, sees AMD moving from the 28nm technology both it and Nvidia have been using for years to 14nm FinFET GPUs. This will allow the company's hardware to offer a "remarkable generational jump in power efficiency."

AMD says the new line of GPUs is built for "fluid frame rates in graphics, gaming, VR, and multimedia applications running on compelling small form-factor thin and light computer designs." It also promises "industry-leading performance-per-watt" and support for HDR monitors, HDMI 2.0a, DisplayPort 1.3, and 4K H.265 encoding and decoding.

The video accompanying the announcement, which features various AMD employees talking up the advantages of Polaris, also briefly compares the power consumption of two comparable cards. Running Star Wars Battlefront at 1080p on Medium settings, the Polaris card draws about 86 watts, whereas a stock GTX 950 system draws 153 watts. (The subsequent fine print pins the latter rig's power draw at 140 watts.)
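For context, here is a quick back-of-the-envelope reading of those demo figures; the wattages are AMD's own numbers from the video and its fine print, not independent measurements:

```python
# Rough arithmetic on the wattage figures AMD showed in its demo.
# Values come from the video and its fine print, not from independent testing.
polaris_watts = 86
gtx950_meter_watts = 153        # on-screen meter during the demo
gtx950_fine_print_watts = 140   # figure given in the fine print

for label, gtx in [("on-screen meter", gtx950_meter_watts),
                   ("fine print", gtx950_fine_print_watts)]:
    savings = 1 - polaris_watts / gtx
    print(f"vs. GTX 950 ({label}): about {savings:.0%} lower power draw in this demo")
```

Either reading puts the demo's gap at roughly 40 percent, though a single canned benchmark on unspecified hardware is a thin basis for judging a whole generation.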

Further details weren't shared, but AMD says the first Polaris-based cards should begin shipping in mid-2016.


Comments

Rushaoz:

People start going down the electricity-costs route when talking about GPUs, but in reality a lower-wattage GPU mostly benefits how future-proof the other parts in your build are, not your power bill. Lemme explain...

Take one of these all-Team-Red guys out there who built a mid-range budget rig 3 years ago. They used an FX 6300 and a 7870, like my buddy did. Now, a few years later, the CPU is a bottleneck and the 7870 isn't cutting it anymore. He decides to overclock the CPU and upgrade the GPU to an R9 390. Welp, guess what: he had to spend $30 on a CPU cooler and now has to upgrade his 500-watt PSU, because the 390 uses almost 300 watts under load on its own and the CPU overclock added about 40-50 watts under load, effectively bringing the power draw to over 90% of the PSU's max output (we used a Kill A Watt). If you know anything about system building, you know you always want some reserve power in your PSU to keep it healthy. No one wants a PSU meltdown... trust me. There are plenty of horror stories out there.
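For anyone who wants to check that headroom math, here is a minimal sketch; the 110-watt figure for the rest of the system is an assumption added for illustration, not a number from the comment:

```python
# Back-of-envelope PSU headroom check using the rough numbers from this comment.
# Kill A Watt readings are whole-system wall draw, so treat this as an estimate only.
psu_capacity_w = 500
gpu_load_w = 300         # R9 390 under load, per the comment
overclock_extra_w = 50   # extra CPU draw from the FX-6300 overclock
rest_of_system_w = 110   # assumed: CPU at stock plus board, RAM, drives, fans

total_w = gpu_load_w + overclock_extra_w + rest_of_system_w
print(f"Estimated load: {total_w} W of {psu_capacity_w} W ({total_w / psu_capacity_w:.0%})")
# Sustained draw above ~80% of rated capacity leaves little headroom for spikes.
```

On those numbers the build sits around 92% of the PSU's rating, which matches the "over 90%" reading above.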

Not to mention the little overclock he was able to get didn't help much; the 390 isn't hitting its performance ceiling. So now it's a new AM3+ motherboard with better OC'ing capabilities and an upgrade to an FX 8350 sometime in the next few months. He's now spending more money than I did on my rig just to keep up with the times, while I'm still humming along.

Point is, AMD is budget-friendly NOW, but not further down the road. A decent midrange Intel/Nvidia rig will outlast an AMD rig by a good margin without needing other parts swapped just to make future upgrades possible. I plan on sticking a Pascal GPU in my rig when they release, and I won't have to upgrade a damn thing. Had I an AMD rig, I'd need to upgrade the CPU because it's a bottleneck, which means I'd have to upgrade my motherboard and PSU too, just like my buddy did.

No thanks.

I'm still rocking my i5 2500K at 4.4 GHz, which 4 years later is still keeping up with current-gen CPUs, and a GTX 680 that still plays most games admirably, the exception being The Witcher 3, which I have to run at high settings.

That being said, I hope AMD has something truly worthy with Polaris. Nvidia has been crippling its GPUs for a few gens now because AMD just isn't up to snuff. AMD is now like Intel was in the early 2000s: just adding on more and more MHz every year using the same architecture over and over again. As an ex-AMD guy (I had a Barton 3200+ and a 9800 Pro :D back 12 years ago or so), I hope AMD gets back on their game. Competition is better for all of us in the end.

Slypher9:

@Rushaoz: and that's the segment AMD always aims to please, the budget guys... most PC users don't upgrade as fast as one might think, and most gamers don't run games on max settings... The budget market is the best market to maximize profit, and that's their aim.

asmoddeuss:

@Rushaoz: An i5 2500K is a nice CPU, but it's also a bottleneck for newer cards and performance-hungry games. Trust me, my i5 3570K @ 4.2 GHz (OC) bottlenecks my GTX 980 Ti; my GPU can't reach high utilization because my cores are all at 100% (for example in Crysis 3 and Watch Dogs, both at 1080p; no way I can do 4K with stable FPS), thus dropping FPS. I would like to get a better CPU, but that means changing my whole setup. It's true, though, that I can still game with my i5 3570K at 1080p, with the exception of a few CPU-hungry games. :)

Rushaoz:

@asmoddeuss: Really? Sure there isn't anything funky going on with your set up?

https://www.youtube.com/watch?v=WZ_5p9wd2dk

Digital Foundry tested the 6600K, 4690K, 3570K, and 2500K with the Titan X, and the differences were very minimal. You might wanna recheck everything for stability; you shouldn't have any bottleneck at all.

asmoddeuss:

@Rushaoz: Oh, thanks for that link man. In all the games they showed in that benchmark I have no problems at all; it's only in Crysis 3 and Watch Dogs (too bad they don't show Watch Dogs) that I get horrible FPS. Mmm, well, my PSU is getting very old, but I will check those FPS again.

asmoddeuss:

@asmoddeuss: Alright, I was right: that video you posted is just a cutscene benchmark, not the heavy parts of the game. Check this: https://www.youtube.com/watch?v=ia3x6kkGVV0 I have the same performance as in this vid. So yeah, the i5 3570K is a bottleneck in some CPU-demanding games like Crysis 3. In the future, don't trust every benchmark you see ^^

Also check this vid: https://www.youtube.com/watch?v=uZoAuPFC1gQ Look at the core load of the CPU, 99% all the time... that is a bottleneck, and look how the FPS drops when he is running. The FPS drops to around 40 fps, lol, on a GTX 980 Ti! It's not the GPU, it's the CPU. So, bottleneck right there. :(

Karmazyn:

It's probably the same sort of bullshit as Mantle was. After BF4's success the whole idea was dropped, forgotten, and officially abandoned. Will Polaris meet Mantle's fate?

asmoddeuss:

@Karmazyn: I don't understand what Mantle has to do with a new GPU architecture?

Karmazyn:

@asmoddeuss: It's about AMD's architecture of lies.

ShimmeringSword:

@Karmazyn: That you just compared a manufacturing process to a piece of software shows you don't know what you're talking about. This is a hardware leap, something that has, does, and will continue to happen in order to make videocards and chips in general faster. It's not some gimmick, it's literally how hardware gets better.

Karmazyn:

@ShimmeringSword: I do know what I am talking about: AMD's lies and failed promises. I am looking at my two GPU boxes, and both have this on the back: "discover future of gaming with revolutionary Mantle supported by AMD". Where is this revolution? I demand a new box!

ShimmeringSword:

@Karmazyn: One, that's extremely obvious marketing speech that no one should fall for, and you read the box lol, read the benchmarks instead. Second, and once again, Mantle is software, die size is hardware. Physically better built hardware WILL be better.

SicoWolf:

@Karmazyn: ****, you are daft. Mantle accomplished exactly what it set out to do - provide a low-level API so that games could take greater advantage of GPU hardware. We would not have DirectX 12 today without Mantle. DX12 was a direct response to Mantle. Furthermore, Mantle was sold and rebranded as Vulkan, which is replacing OpenGL. AMD should be and probably are extremely proud of Mantle. You should be thanking them for pushing the industry forward.

Also, there is nothing bullshit about efficiency increases. Nvidia had a huge advantage here with their Maxwell architecture. AMD is now leapfrogging Nvidia, and maybe Nvidia will leapfrog AMD again with their next architecture. That's the life of healthy competition. The industry across the board, including CPUs, is putting more emphasis on efficiency, as mobile/lightweight devices become more and more important.

Karmazyn:

@SicoWolf: Dude, please. The proof is in the temperatures. Compare AMD and Nvidia temperatures and power consumption vs. performance. Yup, the almighty Mantle, which was only used in BF4, ceased to be.

Blue-Sky:

@SicoWolf: "We would not have DirectX 12 today without Mantle"

You had me until then

r31ya:

@Karmazyn: Polaris is mostly just moving from a 28nm process to a 14nm process.

Something Intel has already done and Nvidia is about to do. It's set to be the industry standard within two years or so.

ALLIAMOS:

time to ADD AMD cards to my HEAD,,,,

Tyson8earzz:

I am all for AMD making a push for greatness. We need more competition, which in turn will drive prices to competitive levels for us, the consumer. It's only a good thing.

Trickymaster:

It's good to see AMD is finally catching up and playing at Nvidia's level.

Rushaoz:

@Trickymaster: Let's not get ahead of ourselves here. Nvidia is still superior when it comes to building a GPU. AMD compared Polaris to a GTX 950 instead of their own cards... why do you think that is? Because had they compared it to their own cards, they'd be shooting themselves in the foot. Both companies used a 28nm process, but Nvidia still managed to be superior when it came to power draw. A GTX 970 uses about 150-160 watts under load, while the comparable R9 390 uses a ridiculous 270-290 watts depending on the card.

That being said, I hope they actually have something good with Polaris. Nvidia has been crippling their GPUs for a while now because AMD has been so far behind.
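To put that wattage gap in electricity-cost terms, here is a rough sketch; the usage hours and price per kWh are assumptions, not figures from the thread:

```python
# Hypothetical yearly electricity cost of the ~125 W gap between a GTX 970 (~155 W)
# and an R9 390 (~280 W) under load. Usage hours and kWh price are assumptions.
power_gap_w = 280 - 155
gaming_hours_per_day = 3      # assumed
price_per_kwh_usd = 0.13      # assumed residential rate

extra_kwh_per_year = power_gap_w / 1000 * gaming_hours_per_day * 365
extra_cost_usd = extra_kwh_per_year * price_per_kwh_usd
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, roughly ${extra_cost_usd:.0f} per year")
```

On those assumptions the gap works out to well under $20 a year, which is why the earlier comment argues the real cost of a hotter card is PSU and cooling headroom rather than the power bill.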

GamerYnoX:

@Rushaoz: AMD cards are bad when it comes to power efficiency, but they generally outperform Nvidia's cards for the $. The GTX 970 is comparable to the R9 390, except the R9 outperforms it at 1440p. It's clearly the best-performing card at its price point.

Rushaoz:

@GamerYnoX: A 5 FPS difference between the two cards in nearly all games barely qualifies as outperforming. Not to mention the 970 does better in some games while the 390 does better in others, but really they're on par. I'll take the much lower power usage and higher overclocking ceiling of a 970 over a 390. To top it off, the 970 is now cheaper than the 390, so this is a no-brainer unless you're an AMD fanboy.

http://www.guru3d.com/articles_pages/powercolor_radeon_r9_390_pcs_8gb_review,13.html

zinten:

@Trickymaster: I think that's pretty much their last chance at catching up. They recently cut their research budget by a huge chunk for short-term profit, so if this tech fails to beat Nvidia, the next one most likely won't stand a chance at all.

Guimengo1:

Here's hoping the AMD technologies are widely adopted, instead of the usual nVidia monopoly attempts.

jenovaschilld:

I have gotten fine results from my own AMD desktop gaming PCs and have gotten great performance per dollar from AMD GPUs and CPUs. Yes, their cards run HOT and HUNGRY, but you do get a lot of performance for the price comparatively. This next generation of GPU architecture will be good news for everyone wanting good performance at low prices, and it also means mobile tablets/laptops will get this boost in power too.

I still love Nvidia, but their costs can be higher than paying for the extra electricity.

babaelc:

That is genuinely amazing.

rasterror:

I would gladly switch to AMD cards if their drivers didn't suck. They're technically better than Nvidia but the software kills it.

jenovaschilld:

@rasterror: Their drivers are fine and their APIs are just as good as Nvidia's - the big problem is... they always arrive too late. Getting up-to-date drivers a year later helps little. And many of the companies that sell AMD cards aren't much help when it comes to getting timely drivers for their cards either. Nvidia still rolls out newer drivers for cards that are over 2 years old per brand, which you do not see with AMD.

That doesn't mean they are all bad; some brands and software drivers for certain games are very timely.

ShimmeringSword:

@jenovaschilld: What you just said is pretty much "bad drivers". Late is bad, it doesn't need a different name :)

SuperKlyph:

@rasterror: That's the exact reason I switched to Nvidia last week. Years of bad drivers.

jinzo9988:

I might be interested if I decide to go with a low-draw home theater setup that still has a bit of muscle in it for casual couch gaming. We'll see how things stack up when the time comes.

DrunkenPunk800:

Wake me when the new GeForce cards come out.

I like AMD CPUs, but their GPUs have compatibility issues.

Ice-Cube:

@DrunkenPunk800: Yep, I'm running their CPU, but I'll stick with Nvidia when it comes to graphics cards. Ease of use, and it's always updated.

lonewolf1044:

Competition is always good, and I do not want to see either AMD or Nvidia get a monopoly.

placksheep:

"...fine print pins the latter rig's power draw at 140 watts"

Nothing like using fine print to allow you to lie about the difference between your product and the competition X-O

Buck_Swaggler:

Wanted to link to a clip of Tim Taylor saying "needs more power," but couldn't find one.

Shadowdanc3r:

Launch the Polaris

The end doesn't scare us

-Megadeth

nikon133:

While this is great news - I'm wondering if AMD is planning to do something about Nvidia's exclusive solutions like GameWorks. They really need to step up their cooperation with developers and the software side of the whole equation, speed up driver updates (promised, but let's not hold our breath until we see it happening), and make some use of TressFX and whatever other tech they (might) have that is mostly sitting dormant.

I have been using AMD GPUs in my desktop for the last two refreshes - 6870 and 280X - but for the next upgrade, power consumption be damned, I'm looking at Nvidia right now. I don't care much about how much power my desktop rig burns. I'd probably be looking at 14nm parts if I was in the market for a light laptop with good battery life that can be used for light gaming on the go as well, but for desktop, it really comes down to real-life performance and features.

ssj2los:

@nikon133: Yes, AMD is doing something. It's called GPUOpen, and it's open source, so everyone can optimize for max visual benefit... Seems like a good move by AMD, and I see this becoming a standard over Nvidia's GameWorks.

nikon133:

@ssj2los: I've heard about that initiative for some time... but they seem to be a bit casual about it. But I hope you are right and it is happening for real.

BarcaAzul:

So it's an eco-friendly GPU then. That's about as much sense as it makes to me.

Other than that, the smaller size means they can fit more into a tight space.

Megawizard:

My question is: will the die size be capable of going past medium settings without stuttering (depending on the card it's installed in), or is that gonna take a further generation or two of development? A company can promise anything it wants; making it happen is another matter.

ShimmeringSword:

@Megawizard: Maybe you're new to chip manufacturing, but reduction in die size is one of the main ways chips are improved, it happens regularly, typically every other generation of chip. It's not a gimmick nor does it make the chip unstable or somehow deficient (otherwise it would still be in RnD), it just straight up makes the entire chip fit in less space.

Really this news isn't anything special, simply a new gen of cards coming.

malachi:

We need CPUs from AMD, not more GPUs!!! FFS!

wxruss8:

@malachi: THIS! I'm not fond of their APUs. I'd rather have a separate video card and CPU.

Fedor:

@malachi: Zen has already been announced and is releasing this year.
