The RTX 3080 is unreservedly still the graphics card I'd buy today
AMD RDNA 3 is incoming, the RTX 4090 is awesome, but I don't care.
We're less than two weeks from the retail availability of AMD's new RDNA 3 graphics. Nvidia's own RTX 40-series has been out for nearly two months. But if I'm spending my own, personal cash on a new GPU any time soon, there's absolutely no doubt what board I would buy. It's the GeForce RTX 3080. Yup, the plain old 10GB card.
Only very recently has the RTX 3080 finally become available for anything near its original MSRP. As I tap these words out, the cheapest I can find is the Peladn Gaming RTX 3080 for $720, a whisker over the original $699 recommended price for the RTX 3080 way back in late 2020.
For sure, the 12GB version is tempting. But the cheapest non-refurb is currently running at $999 and that's just a bit too rich for the marginal gains. It also undercuts what I think, as bonkers as it sounds, is the 10GB card's excellent value proposition. That's the world we live in. $700 GPUs are now the "value" option.
Anyway, from the very beginning the RTX 3080's appeal centered on the fact that it's based on the top GA102 Ampere chip, the same silicon as the $1,500 MSRP RTX 3090. A slightly cut down version of a top GPU has always been the best long term bet when GPU buying and the RTX 3080 proves that in spades. Indeed, as I explained recently, the new RTX 4080 is such a huge disappointment very much because it's not based on Nvidia's top new AD102 Lovelace chip as found in the mighty RTX 4090, but the second-tier AD103 GPU.
So, where the RTX 3080 gives over 80% of the hardware and experience of an RTX 3090 in terms of measures like shader counts and frame rates, the RTX 4080 isn't even 60% of a 4090. More to the point, the cheapest RTX 4080 is over $500 more expensive right now. That's a premium of roughly 75% at today's prices. And the RTX 4080 isn't remotely 75% faster than an RTX 3080.
Moreover, if I'm paying $1,250 for a graphics card, I want something seriously special, not the cut-down compromise of a card that is the RTX 4080. I'd sooner wait a while for the RTX 4090 to fall back closer to its own $1,599 MSRP, at which point it looks like excellent value compared to a $1,250 RTX 4080.
Peladn Gaming RTX 3080| 10GB GDDR6X | 8,704 shaders | 1,740MHz Boost | $720.99 at Newegg
Yes, ladies and gentlemen, here is an RTX 3080 at very nearly MSRP. It's been a hella long time since we saw one at this price, and an overclocked one, too (however moderately). Even better, it's available to order and in stock. Well, it is as we type these words.
As for AMD's upcoming Radeon RX 7900 XT and XTX boards, they're simply not doing it for me. The internet has been alight with predictions that AMD's new boards will tear Nvidia apart thanks to their keen price-performance proposition. But, hello? Even at MSRP, they are $899 and $999 cards respectively. And there's a decent chance they'll sell for well beyond those figures for several months after launch. Assuming you can buy them, and they don't sell out in a matter of minutes.
Even by AMD's own claims, the new RDNA 3 architecture has substandard ray tracing performance. AMD puts the improvement at 1.5x to 1.6x versus RDNA 2's acutely weak ray tracing performance. The whole ray tracing thing is admittedly a bit of a nightmare. How much does ray tracing really matter? Is it obvious when it's enabled versus disabled in a given game? I'll admit, I'm not confident I can always tell the difference.
And yet I still find it very hard to compute the notion of paying a thousand dollars for a GPU with an obvious performance weakness. My bet is that an RX 7900 XT or XTX will barely be any better, if it is indeed any better, for playing games with ray tracing than an RTX 3080. Meanwhile, the RTX 3080 is still a beast for plain old raster games. So, I'd be much happier spending $720 on the RTX 3080 than $1,000-plus on the new AMD boards.
At launch in 2020, the RTX 3080 looked like a killer proposition. It was just spoiled by the massive spike in GPU prices. But it's a testament to just how well specified it was that, over two years later and with the card only now hitting its MSRP, it's still so appealing.
Really, the only thing that even slightly gives me pause for thought is the fact that Nvidia has locked its frame generation tech from DLSS 3 to RTX 40-series boards. But I can live with that. Frame insertion doesn't help with latency, so for my money it's less of a game changer than DLSS resolution scaling has proven to be.
What doesn't worry me much is the prospect of the upcoming RTX 4070 or RTX 4070 Ti. Given Nvidia's outrageous pricing of the RTX 4080 and that GPU's disappointing specs, it's hard to see how the RTX 4070 and RTX 4070 Ti are going to dramatically beat the RTX 3080 in terms of value. The RTX 4070 Ti will probably be a rebadged RTX 4080 12GB, the card that Nvidia infamously 'unlaunched', while the RTX 4070 will be even lower specified.
My hunch is that the plain non-Ti RTX 4070 will have about the same performance as an RTX 3080 in raster games, maybe it'll be a bit faster for ray tracing, and Frame Generation will be touted in all the comparative Nvidia benchmarks. And at least at launch early next year, and for several months afterwards, it'll be more expensive than a $720 RTX 3080. Worst-case scenario, it will be a slightly better buy. But not by enough to have me really regretting an RTX 3080 trigger pull today.
So, that's where I am at. Two years on from the RTX 3080's launch, with its replacement widely available and AMD's new GPUs just days from release, the RTX 3080 would totally, happily, and unreservedly get my money. Am I missing something? Have I got this totally wrong? Tell me why below.
Jeremy has been writing about technology and PCs since the 90nm Netburst era (Google it!) and enjoys nothing more than a serious dissertation on the finer points of monitor input lag and overshoot followed by a forensic examination of advanced lithography. Or maybe he just likes machines that go “ping!” He also has a thing for tennis and cars.