AMD’s Polaris Shoots for the Stars
The North Star
In the final portion of our RTG Summit coverage, AMD has saved the best for last. Part one of the summit covered displays and visual technologies, part two was about software and AMD’s push to become more open (as in open source), and now it’s time to look to the North Star and find out what AMD is planning in the realm of GPU hardware.
We’ve all known for a while that 14/16nm FinFET process technology is coming to GPUs, and in December RTG was happy to show us their first working silicon. This is a big change from the “old AMD,” where we would often get very few details about a new product prior to launch. This time, AMD is providing some high-level details of their next generation GCN architecture (is that redundant: next generation Graphics Core Next?) well in advance of the expected launch date.
We may as well cut straight to the chase and let you know that Polaris isn’t slated to launch until around the middle of the year, so in about six months. That isn’t too surprising given the cadence of GPU launches, but if you were hoping to upgrade right now, you’ll have to postpone things a bit or stick with existing products.
There’s something else to discuss as well, and that’s the positioning of the Polaris part we were shown. Basically, it’s AMD’s entry-level GPU, rather than a high-end competitor; so again, you might need to wait longer if you’re hoping to get something faster than a Fury X. Of course, just because AMD was demonstrating their entry-level Polaris part, that doesn’t mean they can’t do midrange and high-end launches in the same time frame, but we would look more toward the fall for the high-end product launch.
But what does 14nm FinFET mean, what other Polaris chips might we see, and what new technologies are being baked into the fourth generation GCN architecture (which we’ll call GCN4, though admittedly it looks more like GCN 1.3)? Let’s dig into the meat of the announcement and talk about some of the cool and interesting things that are coming.