Why are game install sizes getting so big?

(Image credit: Supaste at English Wikipedia [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)])

Remember just last month when I said that the HDD is dead to me? Some of the latest games are trying very hard to make me reconsider. Where I used to feel 1TB of storage was more than sufficient and 2TB was spacious, multiple games have arrived or are on the way with install sizes of more than 100GB. I certainly don't need every game I own installed on my PC, but I do like to keep a collection around for testing and such. And with Red Dead Redemption 2 apparently ready to suck up 150GB, and Call of Duty: Modern Warfare thinking it may eventually need 175GB, it got me wondering.

Just WTF is going on with game sizes that we're suddenly jumping from 50GB being a large install to three times that much or more!? I remember when entire worlds of gaming could be contained on just a single floppy disk—or maybe a collection of floppy disks. Whatever. The point is, if we were still doing physical distribution media, a 175GB game would require around 20 dual-layer DVDs, 250 CDs, or a mere 121,500 3.5-inch floppy disks. That's the 1.44MB "high density" variety, naturally, not the earlier 720KB version.

That's a lot of CDs (Image credit: https://www.pexels.com/photo/abstract-art-background-blank-270456/)

I decided to do some digging, because while the simple answer is that there are more textures, videos, audio, maps, and other files being included, that doesn't really explain everything. I ended up focusing on the first item in that list, because it was the most approachable and I've seen quite a few HD texture packs come out in the past couple of years. What I really wanted to know is why higher resolution textures are so gigantic.

Let me start off with a simple example of file sizes for a relatively modest 2048x2048 texture—or a 2K texture if you prefer. If you want to store a texture without any compression, that typically means a 32-bit value for every pixel: RGBA (Red, Green, Blue, and Alpha—alpha being the technical name for transparency). That's about 4.2 million pixels at 4 bytes per pixel, and we end up with a nice round 16MiB file size. Now imagine all of the models and textures in some of the larger open-world games out there. If every surface used a 2K uncompressed texture, even with 150GB of textures there would still only be room for 9,600 unique textures.
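
If you want to check that arithmetic yourself, here's a minimal sketch of the uncompressed size calculation, using only the numbers from the paragraph above (nothing here is engine-specific):

```python
# Uncompressed size of a 2048x2048 RGBA texture.
WIDTH = HEIGHT = 2048
BYTES_PER_PIXEL = 4  # 8 bits each for Red, Green, Blue, and Alpha

pixels = WIDTH * HEIGHT
uncompressed_bytes = pixels * BYTES_PER_PIXEL

print(f"Pixels: {pixels:,}")                                       # 4,194,304 (~4.2 million)
print(f"Uncompressed size: {uncompressed_bytes / 2**20:.0f} MiB")   # 16 MiB
```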

Okay, that actually doesn't seem too shabby, but remember that's for 2K textures. What if the game wants to use 4K textures (4096x4096 resolution)? That would quadruple the file sizes, or cut the number of unique textures down to one-fourth as many: 2,400 textures in 150GB, or only 800 textures in a 50GB game. That doesn't seem like nearly enough for a modern game with lots of unique environments, and it's not.
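
Extending the same back-of-the-envelope math, here's a rough sketch of how many uncompressed textures fit in a given install budget (using 1GB = 1,024MiB so the results line up with the figures above):

```python
# How many uncompressed RGBA textures fit in a given install budget?
def uncompressed_mib(resolution, bytes_per_pixel=4):
    return resolution * resolution * bytes_per_pixel / 2**20

for budget_gb in (150, 50):
    for res in (2048, 4096):
        count = budget_gb * 1024 / uncompressed_mib(res)
        print(f"{budget_gb}GB budget, {res}x{res}: {count:,.0f} textures")

# 150GB budget, 2048x2048: 9,600 textures
# 150GB budget, 4096x4096: 2,400 textures
#  50GB budget, 2048x2048: 3,200 textures
#  50GB budget, 4096x4096: 800 textures
```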

Lossy image compression is a bit like trash compaction (not really) (Image credit: Photo by Alex Fu from Pexels)

But let me go back to that uncompressed bit. Why should textures be stored uncompressed? Sure, it might result in slightly higher image quality, but the file sizes can be massive (relatively speaking). Anyone familiar with digital images is probably thinking JPEG or PNG files would be better, and as far as file sizes go, they'd be correct. It's like putting trash into a compactor: a high quality (but lossy, meaning there are compression artifacts) JPEG could cut the file size down to around 2MB per texture, for example, and higher compression rates (and lower quality JPEGs) could reach 1MB or even 0.5MB. Alternatively, lossless PNG compression could typically cut the file sizes roughly in half.
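
If you want to see that difference on one of your own images, a quick sketch using the Pillow library does the job—here "my_texture.png" is just a stand-in for whatever 2K texture you have handy, and the exact ratios depend heavily on the image content:

```python
# Compare JPEG (lossy) and PNG (lossless) file sizes for a texture image.
import os
from PIL import Image

img = Image.open("my_texture.png").convert("RGB")  # JPEG has no alpha channel

img.save("texture_q90.jpg", quality=90)  # high quality, mild artifacts
img.save("texture_q50.jpg", quality=50)  # higher compression, lower quality
img.save("texture_lossless.png")         # lossless compression

for name in ("texture_q90.jpg", "texture_q50.jpg", "texture_lossless.png"):
    print(f"{name}: {os.path.getsize(name) / 2**20:.2f} MiB")
```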

The problem is that both of those compression algorithms are variable rate and relatively complex to decode. Your PC might only require a fraction of a second to open and process a JPEG image, but what if it had to process thousands of such images? Game loading times would skyrocket, and gamers wouldn't be happy.

Enter DirectX Texture Compression, aka DXTC (formerly S3TC, named after the now-defunct company S3 Graphics that created the technique back in the late 90s). S3TC gained official support in DirectX 6.0, which was released way back in 1998, where it was renamed DXTC and given five modes of operation (DXT1 through DXT5). I won't dig into the low-level details, but in short DXTC is a fixed-rate image compression algorithm. Unlike JPEG and PNG, that means it's easy to determine precisely how large the resulting files will be, and it's also easier to build hardware that accelerates decoding. Basically, all graphics cards released since around 2000 have had hardware support for S3TC / DXTC.

For games that use DXTC, the most common modes of operation yield either a 6:1 or a 4:1 compression ratio. That means a game can store four to six times as many images within a given space. Unreal Engine and Unity Engine support DXTC, along with other file formats, and while I'm sure there are exceptions, my understanding is most games currently use DXTC for their textures. And of course, not all textures used in a game need to be the same resolution, so smaller objects that may only occupy a few hundred pixels of screen space don't need to be stored as a massive 2K or 4K texture.
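
Because DXTC compresses every 4x4 block of pixels down to a fixed number of bytes (8 bytes per block for DXT1, 16 bytes for DXT5), the compressed size follows directly from the resolution. Here's a small sketch of that calculation:

```python
# Fixed-rate DXTC sizes: every 4x4 pixel block compresses to a known size,
# so the output size is exactly predictable from the resolution alone.
# DXT1: 8 bytes per block (~6:1 vs 24-bit RGB); DXT5: 16 bytes (4:1 vs 32-bit RGBA).
def dxt_size_mib(resolution, bytes_per_block):
    blocks = (resolution // 4) ** 2
    return blocks * bytes_per_block / 2**20

for res in (2048, 4096):
    print(f"{res}x{res}: DXT1 = {dxt_size_mib(res, 8):.0f} MiB, "
          f"DXT5 = {dxt_size_mib(res, 16):.0f} MiB")

# 2048x2048: DXT1 = 2 MiB, DXT5 = 4 MiB   (vs. 16 MiB uncompressed RGBA)
# 4096x4096: DXT1 = 8 MiB, DXT5 = 16 MiB  (vs. 64 MiB uncompressed RGBA)
```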

Demonstration of why games use mipmapping (Image credit: BillyBob CornCob [CC0])

There's one more thing: mipmaps. MIP is apparently from the Latin phrase "multum in parvo," meaning "many things in a small place." (Yeah, I didn't know that either until I was writing this article—thanks, Wikipedia!) Mipmaps have been around since the early 80s and have been used extensively in graphics applications. The short summary: not every texture needs to be stored at full resolution, all the time, and by precalculating lower resolution versions of a texture it's possible to optimize memory and bandwidth requirements.

Each DXTC texture ends up storing mipmaps of the same texture at each successive halving of the resolution, all the way down to basically nothing—a single 1x1 pixel. If you start with a 2K texture, you need to precalculate 11 additional mipmaps (also just called mips): 1024x1024, 512x512, 256x256, 128x128, 64x64, 32x32, 16x16, 8x8, 4x4, 2x2, and 1x1. Unreal Engine's documentation even provides a helpful list of the final file sizes.
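
Precalculating that whole chain only adds about a third to the storage cost of the base texture. Here's a quick sketch of the math, assuming DXT5 at roughly 1 byte per pixel:

```python
# Mipmap chain for a 2048x2048 texture, assuming DXT5 (~1 byte per pixel).
BYTES_PER_PIXEL = 1  # DXT5 works out to about one byte per pixel

base_size = 2048
base_bytes = base_size * base_size * BYTES_PER_PIXEL

mip_bytes = 0
level = base_size
while level > 1:
    level //= 2                                  # 1024, 512, ..., 2, 1
    mip_bytes += level * level * BYTES_PER_PIXEL

print(f"Base texture:  {base_bytes / 2**20:.2f} MiB")   # 4.00 MiB
print(f"11 mip levels: {mip_bytes / 2**20:.2f} MiB")    # 1.33 MiB
print(f"Overhead:      {mip_bytes / base_bytes:.1%}")   # ~33.3%
```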

We can now at least get a reasonable idea of how much space a game will require based on the number of textures it uses, along with the texture size. It all starts with the number of unique textures, and for simplicity I'm going to ignore the fact that not all textures are the same size. For any reasonably complex game with a lot of different environments, I think it's safe to say that it will use thousands of textures. Let's just toss out 10,000 as a nice round number.

Going with Unreal Engine file sizes, for 1K textures (basically a "high quality" texture), the game would need 13.3GB of textures. Bump up the resolution to 2K textures and that jumps to 53.3GB. And if you want the latest and greatest 4K textures, you're looking at 223.7GB. Oops. And that's just for textures—most games will also have lots of audio files, especially if they're fully voiced (like Call of Duty and Red Dead Redemption 2). Add in map geometry files and some videos and any modern game that uses 4K textures is going to end up being pretty massive.
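
For anyone who wants to reproduce that ballpark, here's a minimal sketch assuming DXT5 (about 1 byte per pixel) plus the ~33% mipmap overhead; it lands close to, but not exactly on, the Unreal Engine figures quoted above:

```python
# Rough install-size estimate for 10,000 textures at various resolutions,
# assuming DXT5 (~1 byte per pixel) plus ~33% mipmap overhead.
NUM_TEXTURES = 10_000

def texture_with_mips_mb(resolution):
    base_bytes = resolution * resolution      # ~1 byte per pixel with DXT5
    return base_bytes * (4 / 3) / 1_000_000   # decimal megabytes

for res in (1024, 2048, 4096):
    total_gb = texture_with_mips_mb(res) * NUM_TEXTURES / 1_000
    print(f"{res}x{res}: ~{total_gb:.0f}GB for {NUM_TEXTURES:,} textures")

# 1024x1024: ~14GB
# 2048x2048: ~56GB
# 4096x4096: ~224GB
```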

How many different textures are in this scene? (Image credit: Rockstar)

There's obviously no single answer as to how a game stores its textures, never mind audio and video files. Still, in general, much of the game install size bloat from the latest games appears to be coming from the texturing department. The benefit is that we no longer get clearly repeating textures in games, and everything looks far more realistic. Look back at games like the original Half-Life, or Deus Ex, and the improvement in image fidelity is massive. I'm looking forward to staring off into the sunset in Red Dead Redemption 2, admiring the graphics—150GB be damned!

But higher resolution textures are clearly a case of diminishing returns, particularly at lower display resolutions. Switch from ultra to high textures (4K to 2K, or 2K to 1K) and it's often difficult to see the difference, especially on a 1080p monitor. That's because each texture usually only covers a small portion of the screen. Up close, a wall with a 2K texture should look better than the same wall with a 1K texture, but back up and the game engine will probably swap out the 2K texture for a lower resolution mipmap and you'll never even notice.

That's good news, because I'm certainly not looking forward to any games trying to include 8K resolution textures that quadruple the install size yet again. Maybe when we're all using 8K displays at some point in the future it will become a factor, but hopefully by then we'll also have affordable SSDs with capacities in the tens of terabytes. That or someone clever will figure out a better texture compression algorithm. I can only hope.

Jarred Walton

Jarred's love of computers dates back to the dark ages when his dad brought home a DOS 2.3 PC and he left his C-64 behind. He eventually built his first custom PC in 1990 with a 286 12MHz, only to discover it was already woefully outdated when Wing Commander was released a few months later. He holds a BS in Computer Science from Brigham Young University and has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance. 
