JEDEC rubber-stamps the GDDR7 spec, mega-bandwidth VRAM is on its way for next-gen GPUs
Don't expect to see it on mainstream graphics cards for a good while, though.
When it comes to graphics cards, a significant portion of their performance comes from the onboard VRAM. That memory not only has to store masses of data for rendering and compute work, but it also needs to transfer that data very rapidly. JEDEC, the semiconductor standards body, has finalised the specification for GDDR7, the next generation of ultra-fast video RAM, offering double the bandwidth of GDDR6.
News of the spec approval (via TechPowerUp) has been well received, and for good reason, as it's not just the raw speed that's better. At the moment, if you want the fastest VRAM, you buy either 24Gbps GDDR6 from Samsung or 24Gbps GDDR6X from Micron (the only company that makes that type of memory).
GDDR7 pushes them all aside, as it will be able to reach 32Gbps and, in time, will probably go even higher. To give you an idea of what that actually means, a Radeon RX 7900 XTX uses 20Gbps GDDR6 on a 384-bit bus for a total of 960 GB/s of bandwidth. Swap that for 32Gbps GDDR7 on the same bus and you're looking at up to 1,536 GB/s.
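For the curious, the maths behind those figures is simply the per-pin data rate multiplied by the bus width. Here's a minimal Python sketch of that arithmetic, using the 7900 XTX's 384-bit bus quoted above; the function name and structure are purely illustrative, not anything from the JEDEC spec.

```python
# Minimal sketch: bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
def vram_bandwidth_gb_per_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Total memory bandwidth in GB/s for a given data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

print(vram_bandwidth_gb_per_s(20, 384))  # 960.0  -> 20Gbps GDDR6, as on the RX 7900 XTX
print(vram_bandwidth_gb_per_s(32, 384))  # 1536.0 -> 32Gbps GDDR7 on the same 384-bit bus
```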
The speed boost is achieved by using a type of signalling called PAM-3 (three-level Pulse Amplitude Modulation), which transmits three bits of data for every two clock cycles. GDDR6 uses a system called NRZ (non-return-to-zero) that sends one bit every cycle, so the newer RAM shifts 50% more data per clock.
It's a similar mechanism to the one Micron uses in its GDDR6X (found exclusively in Nvidia cards), which runs PAM-4 signalling for two bits per cycle. As to why GDDR7 isn't using PAM-4, it's all about simplicity and cost. The four-level system requires tighter electrical tolerances, so it's more expensive to manufacture and thus makes the graphics cards that use it pricier, too.
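To put the three signalling schemes side by side, here's a small, purely illustrative Python comparison of how many bits each one carries per pair of symbols (two voltage levels for NRZ, three for PAM-3, four for PAM-4). It's just the arithmetic behind that 50% figure, not a model of the actual encoding in the JEDEC spec.

```python
import math

# Bits carried per pair of transmitted symbols for each signalling scheme.
# PAM-3 has three voltage levels, so two symbols give 3^2 = 9 states, which is
# enough to encode 3 bits; NRZ manages 2 bits and PAM-4 manages 4 over the same span.
schemes = {
    "NRZ (GDDR6)":    {"levels": 2, "bits_per_pair": 2},
    "PAM-3 (GDDR7)":  {"levels": 3, "bits_per_pair": 3},
    "PAM-4 (GDDR6X)": {"levels": 4, "bits_per_pair": 4},
}

baseline = schemes["NRZ (GDDR6)"]["bits_per_pair"]
for name, s in schemes.items():
    ceiling = 2 * math.log2(s["levels"])  # information-theoretic maximum over two symbols
    gain = 100 * (s["bits_per_pair"] - baseline) / baseline
    print(f"{name}: {s['bits_per_pair']} bits per pair ({gain:+.0f}% vs NRZ, ceiling {ceiling:.2f})")
```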
GDDR7 will be able to switch between PAM-3 and NRZ, depending on the load being placed upon it. When outright performance is required, PAM-3 is used, but in situations where reduced energy consumption is important (e.g. non-gaming desktop use), NRZ kicks in to keep things nice and cool.
The new VRAM spec also increases the bit density of the memory modules. GDDR6 is limited to 16Gb (2GB) per chip, but JEDEC has ratified support for densities of up to 32Gb (4GB), and clamshell mode, where two chips share a single 32-bit channel, is supported once again. In plain English, a graphics card with a 128-bit memory bus, which would normally host four chips, could run eight 32Gb chips in clamshell mode and, in theory, be equipped with 32GB of VRAM.
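The arithmetic behind that 32GB figure works out as follows. This is a quick Python sketch, assuming the usual 32-bit interface per GDDR chip, with clamshell mode simply doubling the number of chips the bus can address:

```python
# Illustrative capacity arithmetic only, assuming each GDDR chip uses a 32-bit interface.
CHIP_INTERFACE_BITS = 32

def max_vram_gb(bus_width_bits: int, chip_density_gbit: int, clamshell: bool = False) -> float:
    """Maximum VRAM in GB for a given bus width, per-chip density (gigabits) and mode."""
    chips = bus_width_bits // CHIP_INTERFACE_BITS
    if clamshell:
        chips *= 2  # two chips share each 32-bit channel
    return chips * chip_density_gbit / 8  # gigabits -> gigabytes

print(max_vram_gb(128, 16))                  # 8.0  GB, today's 16Gb GDDR6 chips
print(max_vram_gb(128, 32))                  # 16.0 GB, 32Gb GDDR7 chips
print(max_vram_gb(128, 32, clamshell=True))  # 32.0 GB, 32Gb chips in clamshell mode
```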
I don't expect we'll see full-blown 32Gbps, 32Gb GDDR7 modules being used on graphics cards any time soon, though, as it will take a while for Samsung, Micron, and SK Hynix to hone and ramp up production. What we'll probably get to begin with are modules running at around 28 to 30Gbps but still 16Gb in density. In other words, the next generation of GPUs will be sporting VRAM that's around 25% faster than today's quickest but no larger in terms of storage capacity.
As to which GPU vendor will be first to jump onto the GDDR7 bandwagon, I suspect that it will be Nvidia, although there is a slim possibility that AMD might get there first if it launches RDNA 4 before the consumer version of Blackwell hits the shelves. However, GDDR7 is likely to be pretty expensive to start with, and with AMD leaning more towards the mainstream sector for its next round of Radeons, it will probably stick with GDDR6.
Whatever ultimately happens, the first vendor to market with a GDDR7-powered graphics card is going to make a huge fuss about it.
Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site was shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?