Today, AAA games easily possess an on-disk footprint of 20GB or more in texture data. If you’re developing a game, chances are textures consume a massive proportion of your memory and disk budget. That consumption also eats into your frame time: more texture data means more traffic across the hardware memory bus and more pressure on the GPU’s caches.
This trend seems to have no end in sight. Looming ahead are 4K and 8K resolutions, as well as increased adoption of displays with HDR support.
So, what’s a developer to do?
Chances are, you employ some form of compression, and not just any compression. Textures need to upload to the GPU quickly and decode on the fly to have any hope of keeping frame times tight. Thus, you likely use a compressed texture format that supports hardware decompression. Common examples include BC1-7 (block compression), ETC1-2 (Ericsson texture compression), PVRTC (PowerVR texture compression), and ASTC (Adaptive Scalable Texture Compression).
Recently, Arm collaborated with key game development stakeholders to release a new ASTC encoder sporting dramatically improved compression times for an already high-quality codec. In this article, we’ll get you up to speed on how ASTC differs from other texture codecs and why you should use it. Then, we’ll show you how you can integrate ASTC into your game or engine’s texture pipeline. Finally, we’ll demonstrate some quick benchmarks in both compression time and quality against common encoders used today.
What is ASTC?
ASTC is a form of texture compression that uses variable block sizes, rather than a single fixed size. It's a flexible texture compression scheme supported by both the OpenGL ES and OpenGL 3D graphics APIs. ASTC is an open standard, royalty-free format specification managed under the auspices of The Khronos Group.
Before getting into the details of ASTC and what makes it unique, let’s first review how a simpler codec like BC1 works.
Most hardware texture encoders are block-based, meaning the source image is subdivided into adjacent blocks, which are then independently compressed. For BC1 in particular, the image is split into 4x4 tiles, and each block is compressed to 8 bytes storing three color channels and 1 bit for alpha. Since an uncompressed 4x4 block of RGBA8 texels occupies 64 bytes, this yields a fixed 8:1 compression ratio (4 bits per texel). Textures using formats like ASTC and BC1 can stream directly to the GPU without any full-image decompression.
BC1 works if you have three color channels and 1 alpha bit, but what if you only have two channels? What if you need more than 1 bit of alpha information? What if you need more precision in one color channel than another or want to compress a 3D or volume texture?
If you’re using the BC formats, this is where all the other BC variants come in: BC3 through BC7 support various permutations of all the types of textures you might want to ship in a game.
ASTC takes a different approach.
True to its name, ASTC is adaptive in that a single format can support 2D or 3D images, LDR or HDR (low or high dynamic range) data, and one through four channels of color data. ASTC texels are always decoded into an RGBA value regardless of the number of components requested. Also, ASTC blocks don’t need to be power-of-two sized. In fact, they don’t even have to be square.
Compared to the BC formats, ASTC is most similar to BC7. While the simpler codecs use fixed-mode encodings where each block is compressed by choosing two color interpolants and assigning weights to each constituent texel, codecs like ASTC and BC7 optionally partition each block, so blocks not accurately represented by a single gradient can be encoded in terms of several gradients.
Furthermore, there is an additional degree of freedom in choosing what information is stored in the interpolants. ASTC stores the partition assignments using a procedurally generated lookup table, as shown below. Explaining the full mechanics of the ASTC encoding scheme is outside the scope of this article, but you can read more about it in the ASTC Format Overview.
Why Use ASTC?
If you’re shipping mobile games, ASTC has plenty to offer. ASTC is supported (at the time of writing) by Apple devices using the A8 chip (first introduced in 2014) or later, as well as a large number of Android devices (including most GPUs introduced since 2014). For an up-to-date list of devices supporting ASTC, refer to the documentation on recommended texture formats from the Unity game engine. ASTC is also supported by other game engines such as Unreal Engine 4 and has hardware support on other consumer platforms such as the Nintendo Switch (NVIDIA Tegra X1 GPU).
Aside from hardware support, there are a few other notable advantages of ASTC over other common formats. As illustrated in the chart below, the encoding bitrate is highly flexible, allowing more aggressive compression than other formats.
Configurable modes can encode masks and normal maps at higher quality. There is an HDR encoding mode, including a format variation that supports both LDR and HDR alpha. In addition, there is volumetric 3D texture support, which is notable because volume textures tend to have large footprints; however, the 3D extensions are optional and not as widely supported in hardware (Mali only, at the time of writing).
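To give a flavor of the 3D mode, here is a hypothetical volume encode. It assumes the -array slice-assembly option available in some astcenc releases (options vary between versions, so check your build’s help output for the exact slice-naming convention), and the file names are illustrative:

astcenc-avx2.exe -cl slice.png volume.astc 3x3x3 -medium -array 16    # hypothetical: assemble 16 numbered 2D slices into a 3D image

This would compress the assembled volume with a 3x3x3 block footprint, one of the valid 3D block sizes.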
ASTC in Practice
When working with other texture formats, the learning curve consists of developing a sense of which formats work well for a particular texture usage. For example, tangent-space normal maps are typically encoded as BC5 textures, since that format affords more precision to two vector components (the third can be reconstructed in a shader).
In contrast, with ASTC, the learning curve consists of understanding the configuration of a single adaptive format.
The basic settings available when compressing a texture are color mode, block size, quality, and texture usage (color, mask, or normal).
For color mode, you have the option of linear LDR (8-bit quantized color channels), sRGB LDR (8-bit quantized color channels with sRGB gamma encoding), or HDR (floating-point color channels). HDR support is a nice feature for textures sampled during, say, your lighting pass (for example, a skybox), or for textures whose samples feed directly into an HDR display. For HDR, you have the additional option of encoding the alpha channel as LDR or as HDR.
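In the Arm command-line encoder we’ll use later in this article, the color mode is selected by the compression flag. As a sketch, assuming the flag names documented for astcenc 2.x and placeholder file names:

astcenc-avx2.exe -cl texture.png texture.astc 6x6 -medium    # linear LDR
astcenc-avx2.exe -cs texture.png texture.astc 6x6 -medium    # sRGB LDR
astcenc-avx2.exe -ch texture.exr texture.astc 6x6 -medium    # HDR color, LDR alpha
astcenc-avx2.exe -cH texture.exr texture.astc 6x6 -medium    # HDR color and alpha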
Block sizes in ASTC are also a significant departure from other texture formats. Not only is the block size not fixed, it isn’t even necessarily square. Every ASTC block occupies 128 bits regardless of its footprint, so the bitrate is simply 128 divided by the number of texels in the block. The larger the block footprint, the higher the compression ratio, but keep in mind that the compression is lossy, so you should expect signal degradation. The available blocks range from 4x4 (8 bits/texel, like BC7) to 12x12 (0.89 bits/texel). For mobile game development, 6x6 and 8x8 block sizes are the most popular, giving a good tradeoff between image quality and bandwidth.
The quality setting allows you to trade compression speed for image quality. The options here range from fastest (sub-second compression for a 2K texture) to exhaustive (dozens of seconds or more for a 2K texture).
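To make these tradeoffs concrete, here is the same hypothetical input encoded at three points along the bitrate and speed spectrum (the file names are placeholders; the flags match the astcenc tool used later in this article):

astcenc-avx2.exe -cs albedo.png albedo_4x4.astc 4x4 -thorough     # 8.00 bpp, slow search
astcenc-avx2.exe -cs albedo.png albedo_6x6.astc 6x6 -medium       # 3.56 bpp
astcenc-avx2.exe -cs albedo.png albedo_12x12.astc 12x12 -fast     # 0.89 bpp, fast search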
Textures with special uses like a tangent-space normal map or a material mask can be encoded with a few more optional flags. If the texture is compressed as a mask, this informs the encoder that the color channels in the source image are storing uncorrelated data and should not be treated as a color for the purposes of choosing a good encoding. This will produce higher perceptual quality results.
If the encoder is told to compress a 3-channel input as a normal map, the compressed data will contain two higher-precision channels (X and Y), since Z can be reconstructed in a shader using the normalization condition: z = sqrt(1 - x*x - y*y).
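Both hints are exposed as flags in the Arm encoder covered below (flag names as documented for astcenc 2.x; verify them against the tool’s built-in help, and note the file names here are placeholders):

astcenc-avx2.exe -cl material_mask.png material_mask.astc 6x6 -medium -mask        # channels treated as uncorrelated data
astcenc-avx2.exe -cl tangent_normal.png tangent_normal.astc 6x6 -medium -normal    # two-channel X/Y normal encoding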
Defining Your Texture Pipeline
The Arm ASTC Encoder is a command-line tool for compressing and decompressing images using the ASTC texture compression standard. It can also be used as a library for users wanting more control over their pipeline. Arm’s new encoder (astcenc 2.1) is up to six times faster than its predecessor (astcenc 1.7), and Arm provides a useful set of guidelines for using the encoder. While you should read Arm’s documentation in full, here are some quick tips to get you started.
First, create a classification scheme for all the textures in your game (for example, low, medium, or high-quality variants of character and landscape textures). You’ll want enough granularity to optimize the memory footprint of your title, but you don’t want so much granularity that your artists complain about the abundance of options!
For each classification, consider what type of data the texture encodes, the relative importance of the texture (for example, character equipment albedo versus foliage billboard versus roughness/metallic texture), worst-case draw distance, and worst-case lighting conditions. Textures you see up close and under good lighting will need higher image quality.
For each classification, determine the desired compression ratio. Consider your on-disk footprint, the number of textures needed, and your budget, then work backward. Afterward, analyze representative textures encoded at that bitrate for quality to determine the quality setting. Note that for color textures in particular, you’ll want to use the sRGB color space for better perceptual quality.
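As a sketch, a finished classification scheme might boil down to per-class settings like the following. The class names, block sizes, and presets are purely illustrative, not recommendations:

astcenc-avx2.exe -cs hero_albedo.png hero_albedo.astc 4x4 -thorough          # seen up close, high importance
astcenc-avx2.exe -cs terrain_albedo.png terrain_albedo.astc 6x6 -medium      # mid-distance environment
astcenc-avx2.exe -cs foliage_billboard.png foliage_billboard.astc 8x8 -fast  # distant, low importance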
Finally, teach your offline texture pipeline how to consume your texture assets and compress using the correct settings (covered in the next section). Note that the command line encoder does not yet automate mipmap chain generation and final packing, but it can be compiled as a library allowing developers to provide their own frontend. To integrate the command line tool into your pipeline, you’ll need to downsample the image using an external tool and pack the compressed output into a texture file yourself (for example, KTX). Refer to the tracking issue on GitHub.
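As an example, here is a minimal sketch of that manual workflow for the first two mip levels, assuming ImageMagick’s magick command is available and file names of your choosing:

magick albedo.png -resize 1024x1024 albedo_mip1.png    # downsample with an external tool
magick albedo.png -resize 512x512 albedo_mip2.png
astcenc-avx2.exe -cs albedo_mip1.png albedo_mip1.astc 6x6 -medium
astcenc-avx2.exe -cs albedo_mip2.png albedo_mip2.astc 6x6 -medium

You would repeat this down the mip chain, then pack the resulting .astc payloads into a single container such as KTX with your own tooling.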
While this may seem like many steps, it’s actually similar to the sort of analysis and bucketing you would need to do with the BC or ETC texture format families. The main benefit of a pipeline like this is the flexible quality setting and bitrate to maximally pack your texture data, compared to other texture formats with a fixed bitrate. In other words, the upfront investment may be a bit higher, but it extracts a greater payoff. Of course, if you want to just get up and running, you can define your initial classification scheme as coarsely as you want.
Benchmarking and Using the ASTC Encoder
For experimentation, we’ll use a 2K albedo texture taken from a Medieval Pavement Floor material you can find at textures.com. To follow along, and to integrate the encoder in your own texture pipeline, you’ll need the encoder itself, which is distributed as prebuilt binaries on GitHub. You will also need a tool such as ImageMagick to convert TIFF images to a format the ASTC encoder understands, such as PNG.
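For example, with ImageMagick 7 installed, the conversion is a one-liner (use convert in place of magick on ImageMagick 6; the file name is a placeholder):

magick albedo.tif albedo.png    # convert TIFF to PNG for the encoder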
The release actually contains three binaries, each compiled against a different instruction set. For all the demos here, we’ll be using the binary compiled with AVX2 (Advanced Vector Extensions 2) for maximal throughput. The albedo image is reproduced below:
To encode this as an ASTC texture, use the following command:
astcenc-avx2.exe -cs albedo.png albedo.astc 4x4 -medium
This command specifies:
- (-cs) The input is an LDR sRGB image
- (4x4) Encode with a 4x4 block size (8.00 bpp)
- (-medium) Balance compression speed against image quality
The encoding time for this command on an AMD Ryzen 9 3900X (24 threads) is 3.94 seconds and the result is impressive. Here’s a comparison of the before and after shots, zoomed in at the upper right corner (source image on left).
Artifacts are minimal even at this magnification level and are comparable to BC7 quality. At a lower quality setting with a larger block size (8x8 on the right), the block artifacts become more noticeable, but the image as a whole still retains a decent amount of detail.
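You can reproduce comparisons like these yourself: the encoder can decompress an .astc file back to a viewable image, and its test mode performs a compress-then-decompress round trip while reporting error metrics such as PSNR (mode flags as documented for astcenc 2.x; file names are placeholders):

astcenc-avx2.exe -ds albedo.astc albedo_decoded.png          # decompress sRGB LDR to an image
astcenc-avx2.exe -ts albedo.png albedo_roundtrip.png 8x8 -fast    # round-trip test with error metrics

The test mode is handy when dialing in block size and quality settings for each texture classification.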
The following matrix shows compression times for a few representative quality and block-size settings (subject to some noise).
| Quality Setting | 4x4 (8.00 bpp) | 6x6 (3.56 bpp) | 10x10 (1.28 bpp) |
| --- | --- | --- | --- |
| Thorough | 7.519 s | 10.006 s | 11.012 s |
| Medium | 1.994 s | 2.489 s | 2.065 s |
| Fast | 0.381 s | 0.305 s | 0.241 s |
As you can see, the quality setting generally has a greater impact on compression time than the block size does. In my tests, the encoder performed four to eight times faster than AMD’s Compressonator operating under similar settings. For more statistics on the performance improvements in ASTC Encoder 2.0 over its predecessor (astcenc 1.7), Khronos has provided a helpful analysis. The measurements presented here were taken with the November 13, 2020 release (astcenc 2.1), which boasted further performance improvements.
Wrapping Up
If you ship games or texture-heavy applications on mobile, ASTC should be an integral part of your pipeline. With broad support across mobile devices released over the past five or more years, variable compression ratios, and high-quality results for a wide swath of texture types and uses, ASTC is a compelling texture format. The faster encoder has also come a long way in ensuring reasonable content cook times for what is considered the “final boss” of texture format complexity.
To learn more about the ASTC format, Arm provides an excellent visual tutorial in the encoder docs. A more detailed specification of the various ASTC block modes and the compression format is available from The Khronos Group. Finally, Arm provides comprehensive documentation covering best practices, ASTC usage in popular game engines, alternative encoding tools, and ASTC usage at runtime.