New DirectX 12 preview could pave the way for PCs to become more like an Xbox or PS5
Now the CPU and GPU can both access VRAM directly, does a gaming PC need traditional system memory anymore?
At the end of March Microsoft announced a new preview version of its Agility SDK (via Guru3D) which incorporates a new feature that could deliver gaming PCs that no longer use traditional system memory. Just like an Xbox Series X.
For now, though, GPU Upload Heaps is here to allow both the CPU and the graphics card simultaneous access to the video memory strapped to the GPU. This means your PC will no longer need to copy large chunks of data from the CPU to the GPU, nor retain duplicate copies of that data in both system memory and VRAM.
The upshot of this is that, in certain situations, you will see increased game performance because CPU and RAM utilisation should be decreased.
GPU Upload Heaps seems to be another step on the journey started by the introduction of the resizable base address register (BAR) feature a few years back. That made the graphics card's entire memory pool visible to the CPU, rather than just a small window of it, and now the new feature means your PC's processor can have direct access to that VRAM.
On the user's side that means, so long as you have resizable BAR enabled on your system, there's nothing else you need to do to get access to the new GPU Upload Heaps feature. Realistically, though, it's not likely to be something gamers can toggle on and off; it's a developer-level feature, aimed at giving them the tools to squeeze as much performance out of a system as possible.
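For the curious, here's roughly what that developer-level usage looks like. This is a minimal sketch assuming the preview Agility SDK headers, which expose a new `D3D12_HEAP_TYPE_GPU_UPLOAD` heap type and a `GPUUploadHeapSupported` capability bit; the function name and fallback behaviour are my own illustration, not Microsoft's sample code.

```cpp
// Sketch only: assumes DirectX 12 Agility SDK preview headers that expose
// D3D12_HEAP_TYPE_GPU_UPLOAD and D3D12_FEATURE_DATA_D3D12_OPTIONS16.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: create a buffer in VRAM that the CPU can write directly.
bool CreateGpuUploadBuffer(ID3D12Device* device, UINT64 size,
                           ComPtr<ID3D12Resource>& outBuffer)
{
    // The capability only reports as supported when resizable BAR is enabled.
    D3D12_FEATURE_DATA_D3D12_OPTIONS16 options16 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS16,
                                           &options16, sizeof(options16))) ||
        !options16.GPUUploadHeapSupported)
        return false;  // fall back to the classic upload-heap + copy path

    // A heap of this type lives in VRAM but is CPU-visible: the CPU can
    // Map() it and write into it directly, with no staging copy in system RAM.
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_GPU_UPLOAD;

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = size;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    return SUCCEEDED(device->CreateCommittedResource(
        &heapProps, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_COMMON, nullptr,
        IID_PPV_ARGS(&outBuffer)));
}
```

The point is the one-step path: previously an engine would fill a `D3D12_HEAP_TYPE_UPLOAD` staging buffer in system RAM and then record a GPU copy into a `D3D12_HEAP_TYPE_DEFAULT` resource, keeping two copies of the data alive; with a GPU upload heap the CPU writes land straight in video memory.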
And, because it is new, it's not likely to find its way into games for a while yet. It is, though, fundamentally similar to the way Microsoft's own Xbox Series X/S consoles interact with the shared VRAM that makes up the entirety of their memory, so you'd think it should be quite straightforward for developers used to the Xbox Velocity architecture to implement.
This is what makes the GPU Upload Heaps so interesting to me. Microsoft's console made the switch to using graphics memory entirely in this generation, and now it's creating a situation on the PC where Windows could, theoretically, operate using only a pool of traditional video memory on a graphics card.
The old Xbox One used traditional DDR3 RAM, ostensibly because it was conceived as a multi-tasking machine and not just a games console; for that, memory latency mattered more than the raw bandwidth the PlayStation 4 preferred in its own GDDR5-based setup, which was backed by only a small amount of DDR3 for background tasks.
Latency's no longer really an issue for modern VRAM, however, which is why the Series X/S have joined PlayStation in using GDDR6, and why there's now the potential for a dedicated PC gaming machine built around only graphics memory shared between CPU and GPU.
I'm thinking of either a laptop or handheld device—the sort where a closed system makes sense—using an AMD APU along the same lines as a PlayStation 5 or Xbox Series X and delivering a load of gaming performance from a compact device.
I don't think I'd want a desktop machine like that; I enjoy being able to upgrade my machine, and there are no swappable GDDR6 modules that I know of. If it were a wholesale switch, PCs would become almost entirely locked down, which doesn't work for my nerdy PC sensibilities.
But for a dedicated gaming device, it's an interesting prospect. After all, we know AMD can make APUs with massive GPUs built into them, and a gaming laptop built around one would be pretty cool. It's all still a pipedream right now, but it's no longer so far off now that Windows can be fully onboard with the CPU accessing video memory directly.
Dave has been gaming since the days of Zaxxon and Lady Bug on the Colecovision, and code books for the Commodore Vic 20 (Death Race 2000!). He built his first gaming PC at the tender age of 16, and finally finished bug-fixing the Cyrix-based system around a year later. When he dropped it out of the window. He first started writing for Official PlayStation Magazine and Xbox World many decades ago, then moved onto PC Format full-time, then PC Gamer, TechRadar, and T3 among others. Now he's back, writing about the nightmarish graphics card market, CPUs with more cores than sense, gaming laptops hotter than the sun, and SSDs more capacious than a Cybertruck.