Arise, Arc owners, and rejoice as Intel's new drivers offer up to 36% more performance in Dragon's Dogma 2. Maybe
Yes, I've actually tested them, but the GPU gods were rather moody today.
All graphics cards get new drivers on a pretty regular basis these days, and that's especially true of Intel and its Arc series. The chip giant has just released a new set, with performance improvements for a variety of games, one of which is Dragon's Dogma 2. So I tested them out, and you'll probably be able to guess what happened.
I'd already examined how well an Arc A770 graphics card handles Dragon's Dogma 2 for our performance analysis of the game, but those results were all collated using maximum quality settings, the Interlaced rendering mode, and the previous 101.5333 drivers. The new 101.5379 set promises up to 36% more performance at 1080p and 31% more at 1440p, using High settings and the Progressive rendering mode.
As I'm still in the midst of collecting more DD2 performance data, it made sense to dive back in and check out Intel's claims. To be honest, I wasn't expecting much, but any extra fps is a good thing in my book, as long as it doesn't mess up the game or its graphics. So I stuffed the A770 into a Ryzen 5 5600X gaming PC with 16GB of DDR4-3200 RAM. It's not a super high-end machine, but it's still perfectly decent at handling today's games.
The one area of the game I didn't bother testing with the new drivers is the main city, as it's entirely CPU-limited with an A770. Instead, I concentrated on the open world a little outside the city, where there are lots of rolling hills, trees, and grass. The quality settings were clicked over to the High preset and the Progressive rendering mode was enabled.
And without further ado, here are the results, taken at 1080p, 1440p, and, just for giggles, 4K.
Oh. Well, that's disappointing. The newer drivers did make the game feel a tad smoother at 1080p, with no hint of the stuttering that the previous set exhibited at times. So how can Intel get away with claiming such a large improvement? The devil is in the details: the release notes point out that the testing was done on a Core i9 14900K system.
Unfortunately, I don't have one and I wasn't about to dig into my main 14700KF setup, as I use that for work. Instead, I popped the A770 into a Ryzen 7 8700G platform, with 32GB of DDR5-6400. It's obviously nothing like a 14900K but it is quite a bit faster than a Ryzen 5 5600X, thanks to having two more cores, higher clocks, and a newer architecture.
So maybe that system would show some differences between the drivers, yes? Well, after an endless wait for the shader compilation to finish, I hoped I could dive in and find out for sure. Except I couldn't, as the game just hard-locked and crashed at the main menu every time on that platform, even with the latest drivers.
Ah well. I guess if you do have a Core i9 14900K gaming PC with an Intel Arc graphics card (and who am I to judge if you do), then you should certainly upgrade to the 5379 drivers, as plenty of other games have apparently been treated to a performance boost, such as Assassin's Creed Origins, Detroit: Become Human, Fortnite, God of War, Sons of the Forest, and the forthcoming Horizon Forbidden West.
Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long-defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?