Nvidia might have forgotten the venerable GTX 1060 but AMD's new Frame Generation tech proves there's life in the old GPU yet

FSR 3 Frame Generation isn't officially recommended for use on Nvidia GeForce GTX 10-series graphics cards, but it turns out not only does it work, it runs exactly as intended.

When AMD first announced FSR 3 back in November 2022, during the RDNA 3 architecture launch, there was plenty of gossip as to what it would all entail. Most people assumed it would be an entirely shader-based approach to frame generation and, of course, that's exactly what it turned out to be.

FSR 3 Frame Generation is currently only available in two games (Immortals of Aveum and Forspoken) but it works well in both of them. There are caveats to using it, though, such as the fact that it works best when the game is already running at 60 fps or faster.

The technology increases the latency between frames updated by the engine, because the most recent rendered frame has to be held back while the interpolated one is created, so using it at very low frame rates can make things feel even more sluggish. And because it doesn't use the dedicated optical flow accelerators that Nvidia's tech does, AMD's take on frame gen can also add ghosting around moving characters. The lower the frame rate, the more noticeable those artifacts become.
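
To put rough numbers on that, here's a quick back-of-the-envelope sketch (my own illustration, not figures from AMD or the testing below) which assumes interpolation adds roughly one rendered frame of extra delay. The exact overhead varies by game and settings, but the shape of the problem is the same: the slower the base frame rate, the bigger the penalty feels.

```cpp
// Rough sketch: why frame generation feels worse at low base frame rates.
// Assumes ~one rendered frame of extra delay for interpolation (an
// illustrative figure, not a measured one).
#include <cstdio>

int main()
{
    const double baseFps[] = {30.0, 45.0, 60.0, 90.0};

    for (double fps : baseFps) {
        double frameTimeMs = 1000.0 / fps;   // time the engine takes per real frame
        double addedDelayMs = frameTimeMs;   // holding one frame back for interpolation

        std::printf("%5.0f fps base -> %5.1f ms per frame, roughly +%5.1f ms of added delay\n",
                    fps, frameTimeMs, addedDelayMs);
    }
    return 0;
}
```

At a 30 fps base that works out to roughly 33 ms of extra delay, three times what you'd see at 90 fps, which is why the 60 fps-or-faster guidance above exists.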

Another potential sticking point is that, while FSR 3 supports Radeon RX 6000 and RX 7000-series GPUs, AMD only recommends the tech to GeForce owners with an RTX 20-series card or newer. It doesn't say that it won't work on older cards, just that the performance gains might not be all that great, or that it could even make things worse.

Well, one tech YouTuber by the name of Daniel Owen decided to give it a go anyway, testing a GeForce GTX 1060 and a GTX 1070 in Immortals of Aveum. The game uses Unreal Engine 5 to produce some pretty spectacular graphics but, as the video shows, performance at 1080p, even on Low quality settings, leaves it barely playable.

Enter FSR 3, stage left, using Performance upscaling and Frame Generation to save the day. Well, not completely save things, as the game still feels a little laggy at times, but considering no extra hardware or machine learning features were needed, the end result is still impressive for a free performance boost on an ageing GPU.

So if it works well enough on these cards, why didn't AMD suggest it could be used on Pascal-powered graphics cards? AMD's Frame Generation wizardry relies heavily on something called asynchronous compute, where the GPU's workload is split into separate, parallel sequences of instructions rather than one long list.

While AMD has pushed its use ever since the GCN architecture first came to light in 2011, Nvidia took longer to get on board, and it wasn't until Pascal that proper support for it appeared. Even then, AMD's GPUs generally showed bigger gains from asynchronous compute than Nvidia's. As Daniel shows, though, it works well enough for FSR 3 to make an actual, worthwhile difference.
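
For the curious, here's what asynchronous compute looks like from a developer's point of view, sketched in Direct3D 12. This is purely illustrative and not AMD's actual FSR 3 code (the function name is made up), but the API calls are the standard way to get a compute queue running alongside the graphics queue.

```cpp
// Minimal Direct3D 12 sketch of asynchronous compute (illustrative only,
// not AMD's FSR 3 implementation). Work submitted to the compute queue
// can run in parallel with rendering on the graphics queue.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateAsyncComputeQueues(ID3D12Device* device,
                              ComPtr<ID3D12CommandQueue>& graphicsQueue,
                              ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // The usual "direct" queue: one long list of graphics and compute commands.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A second, compute-only queue: command lists submitted here can overlap
    // with the graphics queue, which is the parallelism frame generation leans on.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    // (Synchronisation between the two queues would be handled with ID3D12Fence
    // objects, omitted here for brevity.)
}
```

How much of that work genuinely overlaps is down to the GPU's scheduler, which is why older architectures tend to see smaller gains from it than newer ones.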

Nvidia's own Frame Generation tech is exclusively tied to its RTX 40-series and there doesn't seem to be any way of making it work on older chips as yet. So to have something that does the same job but works across multiple vendors and generations of GPUs, without the need for specialised hardware, is exactly the direction PC gaming needs to be heading in.

It'll be interesting to see whether AMD's pre-RDNA GPUs handle FSR 3 Frame Gen as well as Nvidia's GTX 10-series does, but the important takeaway from all of this is that if you're still running an older card, and if support gets expanded to a host of different games, Frame Generation might just be the thing that keeps it going until that inevitable upgrade finally comes knocking.

Nick Evanson
Hardware Writer

