Red Dead Redemption appeared on consoles 14 years ago—here's the kind of gaming PC it could have been ported to back then
So much has changed since then, but one thing remains true: the PC is the platform to enjoy gaming at its very best.
As I'm over half a century old, going back 14 years doesn't feel like a big deal. What's significant about that very specific length of time is that's how long it's taken Rockstar to release a port of Red Dead Redemption. In the world of PC technology, it's very much a big deal, a yawning canyon of time, and an awful lot has changed with CPUs, graphics cards, RAM, storage, and displays. So let's see just what kind of gaming PC you could have been using back then.
First up, let's consider the best gaming CPUs money could buy in 2010. From AMD, you could get a Phenom II X6 1090T Black Edition. Ludicrous name aside, it boasted some pretty serious figures: six cores and six threads, a boost clock of 3.6 GHz, 6 MB of L3 cache, and a TDP of 125 W—all for a little under $300. How times have changed.
I say six cores, but a lot of the internal hardware was shared and the K10 architecture wasn't best suited to gaming. (Edit: A sharp-eyed reader has pointed out that, in my dotage, I was thinking of Bulldozer, the successor to K10.) Over at the Intel camp, the same kind of money could have got you a Core i7 930. That was a four-core, eight-thread, 3.06 GHz processor, with 8 MB of L3 cache and a TDP of 130 W. It even had a triple-channel memory controller.
Of course, these were high-end processors, and you could easily spend far less and still have a great gaming chip. Motherboards back then didn't look half as nice as they do these days and they were certainly less user-friendly, especially regarding BIOS settings. But they were seriously cheap: $50 would get you a decent board to lob your expensive Phenom into, though boards for the i7 930 were more expensive.
A great gaming PC needs a great graphics card, of course, and there were some excellent options from both AMD and Nvidia. One of my favourites from Team Red at that time was the Radeon HD 6870. Even though it only cost $240 or so, you got 1,120 shaders, 1 GB of VRAM, and 134 GB/s of memory bandwidth. And it only used up to 150 W of power.
One of Nvidia's best cards in 2010 was the GeForce GTX 480. It boasted 1.5 GB of VRAM and 178 GB/s of memory bandwidth, but only had 480 shader units and a whopping 250 W TDP. Oh, and it cost just under $500. Despite its unusual specs, the GTX 480 was faster than the 6870 in most games, though mostly at high resolutions or with AA enabled. Drop the pixel count down and there was little between them in terms of performance, on average.
The GTX 480 was, though, an over-priced, power-hungry slab of a graphics card compared to the HD 6870. Both cards would go on to be surpassed by more powerful models at the end of 2010, but it would be a few more years before we saw anything substantially better, especially when it came to handling the performance impact of anti-aliasing.
As for system RAM, well, you were looking at between 4 and 6 GB in a decent gaming PC, roughly a third of what you might find in a rig today. DDR3 was all the rage, though data transfer speeds were a fraction of what they are now: 1,333 MT/s was considered hot stuff, whereas now you'd think something was broken in your PC if the RAM was that slow.
If you had cash to flash at your gaming PC back then, you might have bought a new-fangled SSD to speed up Windows and your games. Sizes ranged from 64 to 120 GB, though the latter would have set you back a good $300 or so. Most PC gamers still used traditional hard drives at that time and the same amount of money would have got you a whopping 2 TB of storage. The fact that you can get the same capacity of ultra-fast Gen4 SSD for less than half the money shows just how much storage technology has advanced.
Lastly, the monitor. While 16:9 aspect ratio screens are standard fare these days, they were quite rare in 2010, with 16:10 being cutting-edge for gaming and 4:3 for basic PCs. Resolutions usually ranged from 1024 x 768 to 1680 x 1050, topping out at 1920 x 1200, though the most exotic displays could be had with a dizzying 2560 x 1600.
There was nothing like upscaling to boost performance back then, so if your GPU wasn't up to that pixel count, your options were to either lower the resolution or buy another GPU and use them in tandem (aka CrossFire/SLI). Or you could buy a dual-GPU card, like the Radeon HD 5970, which is still surprisingly good even today.
The best gaming monitors then used VA panels but they weren't super fast like they are today. Refresh rates were usually not much more than 60 Hz; in fact, it was so rare to see anything higher that refresh rates were hardly ever mentioned in reviews. Thank goodness I don't have to stare at those kinds of screens any more!
None of this is to poke fun at old hardware. If you had a high-end gaming PC in 2010, you were spoilt for choice with games to enjoy on it: StarCraft 2, Mass Effect 2, BioShock 2, Aliens vs Predator, Assassin's Creed 2, Mafia 2, Civilization 5, Call of Duty: Black Ops, and Just Cause 2 all racked up countless hours on my PC.
One thing doesn't seem to have changed all that much, though, and it's that game publishers just love a sequel or two. Or three, or four…
Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?