
NormalMapCompressor - A tool to automatically compress normal maps

4 Jul 2005
Normal maps are used for real-time 3D rendering (mostly in games) to improve the visual quality, but compressing them can make the 3D content look ugly; this tool helps to fix that problem.

Note: If you have trouble starting NormalMapCompressor even with DirectX 9 installed and you are getting a "Microsoft.DirectX.dll not found" error, you most likely don't have Managed DirectX installed (or for some reason your Managed DirectX doesn't want to start with NormalMapCompressor). To download the latest version of DirectX visit Microsoft's DirectX site. You might have to install DirectX with "dxsetup.exe /InstallManagedDX" (sorry, I can't include that 32 MB file in the installer). Sorry for the inconvenience; on some computers it works instantly, on others it does not.

Introduction

NormalMapCompressor is a useful tool to automatically compress your normal maps. If you don't know what normal maps are or what this is about, you may want to skip this article; it's rather specific. Basically, normal maps are used for real-time 3D rendering (mostly in games) to improve the visual quality by giving the impression of additional 3D detail on flat surfaces. However, normal maps are pretty big, and with today's texture sizes of 1024x1024 and above your load times can get painfully long and the graphics memory fills up very fast (and swapping big chunks of data around every frame is not really an FPS booster). One way or the other, you need to compress the color images, the normal maps and other data if you have a lot of visual content in your game. This tool uses DirectX 9 (in .NET) to do the job and assumes that the normal maps are stored in the default xyz format (NVidia/Doom3 style, as opposed to the ATI format where y is inverted). However, all normal map formats should work with this tool (it doesn't care), but you might want to exchange the fx shader file used for rendering.

This article is for an experienced audience with a basic understanding of DirectX, .NET and normal maps. If you are a beginner you might learn a thing or two, but I guess all this stuff will be too confusing if you don't know the basics. This is also not really an article about rendering normal maps with managed DirectX, but you can use my source code to see how it works (I've tried to comment it as much as possible). There are also some nifty tricks to achieve specular normal mapping with pixel shader 1.1 and to switch around the red and alpha channel, but the main focus of this article is compression and the various tricks used to achieve the best visual quality with the smallest file size possible (I read that Gf7800 has a special normal map compression format, but that doesn't help us with the current generation of graphic cards).

After installing NormalMapCompressor you should see the following screen (sorry for my crazy blue Windows theme btw ^^):

On the upper left you can select an input normal map and an optional height map (for parallax or offset mapping, or whatever you want to store in the extra alpha channel). Below that you will see some information about the currently loaded file. The important values are the red/green/blue variation percentages; as you can see, they are usually equally distributed, or the red/green components have a bit more variation. The bad thing is that the DXT compression for color images used in DirectX weighs the channels with the usual 30%/59%/11% luminance formula and therefore favors the green channel. The blue/z channel isn't that important (especially if we renormalize later), but the image quality can be improved by preserving the red channel better. The good thing is that the DXT5 format provides a separate, better compressed alpha channel, and we can use that if we just switch the red and the alpha channels. This isn't a new trick; Doom3 uses this technique and ATI and NVidia have both published articles about it.
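For reference, here is one plausible way to compute such per-channel variation percentages. This is only my guess at the metric (mean absolute deviation per channel, as a share of the r+g+b total), not the code the tool actually uses, and the helper name is made up:

    // Rough per-channel "variation" metric: mean absolute deviation from the
    // channel average, expressed as a percentage of the r+g+b total.
    // Just a guess at what the tool reports; requires System.Drawing.
    static void GetChannelVariation(Bitmap map,
        out float redPercent, out float greenPercent, out float bluePercent)
    {
        double sumR = 0, sumG = 0, sumB = 0;
        int numPixels = map.Width * map.Height;
        for (int y = 0; y < map.Height; y++)
            for (int x = 0; x < map.Width; x++)
            {
                Color col = map.GetPixel(x, y); // slow, but fine for a sketch
                sumR += col.R; sumG += col.G; sumB += col.B;
            } // for for
        double avgR = sumR / numPixels, avgG = sumG / numPixels,
            avgB = sumB / numPixels;

        double varR = 0, varG = 0, varB = 0;
        for (int y = 0; y < map.Height; y++)
            for (int x = 0; x < map.Width; x++)
            {
                Color col = map.GetPixel(x, y);
                varR += Math.Abs(col.R - avgR);
                varG += Math.Abs(col.G - avgG);
                varB += Math.Abs(col.B - avgB);
            } // for for
        double total = varR + varG + varB;
        if (total == 0)
            total = 1; // completely flat normal map
        redPercent = (float)(100.0 * varR / total);
        greenPercent = (float)(100.0 * varG / total);
        bluePercent = (float)(100.0 * varB / total);
    } // GetChannelVariation(map, redPercent, greenPercent, bluePercent)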

However, I couldn't find any useful tool that uses this red/alpha switching technique, and it's a pain in the ass for the graphic guys to convert the normal maps themselves; doing it on the fly isn't a good choice either (who wants to wait that long just to load up and convert all the textures for a simple test run?). For this reason I wrote this little app, which lets you play around with your normal maps and automates the conversion process for you (see Batch Convert).

On the right side you can specify the output parameters, like generating mipmaps, normalizing all values before compression, and the compression format. As you can see, DXT1 will give you the smallest file size, but it produces pretty bad normal map quality, especially if you have a lot of curvy stuff on your normal map. The DXT5 r-a switched format is a pretty good choice for most normal maps: it has a small file size (1/4 of rgba, 1/3 of rgb), good compression and decent normal map quality. The only thing you have to do then is to switch the r and a channels back in the pixel shader.
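On the CPU side the conversion itself is nothing more than that channel swap before the DXT5 compression step. A minimal sketch of the idea (the real code lives in BitmapHelper.cs; the helper below is just an illustration using the slow GetPixel/SetPixel API and System.Drawing):

    // Swap the red and alpha channels so the x component of the normal ends up
    // in the separately compressed DXT5 alpha block.
    static Bitmap SwitchRedAndAlphaChannels(Bitmap source)
    {
        Bitmap result = new Bitmap(source.Width, source.Height,
            System.Drawing.Imaging.PixelFormat.Format32bppArgb);
        for (int y = 0; y < source.Height; y++)
            for (int x = 0; x < source.Width; x++)
            {
                Color col = source.GetPixel(x, y);
                // a <- r (normal x), r <- a (height map or 255 if unused)
                result.SetPixel(x, y, Color.FromArgb(col.R, col.A, col.G, col.B));
            } // for for
        return result;
    } // SwitchRedAndAlphaChannels(source)

In the pixel shader the swap is undone again with the .agb swizzle shown later in this article.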

Comparison

You don't believe me? Well, I've created a test normal map to check out the differences between the compression formats. As you can see, the DXT1 compression is pretty bad (and I saved myself the trouble of testing other compression formats like 16 bit 565 RGB or even 8 bit formats; it's not worth it, check out the articles from ATI/NVidia about that), the DXT5 r-a switched mode is pretty nice, and you can get even better visual quality with the help of re-normalization in the pixel shader (see the comparison screenshot).
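To put some numbers on the file size claims: uncompressed A8R8G8B8 needs 4 bytes per pixel, DXT5 needs 16 bytes per 4x4 block (1 byte per pixel) and DXT1 needs 8 bytes per 4x4 block. A quick back-of-the-envelope calculation for the raw image data (mipmaps would add roughly another third on top):

    // Raw image data sizes for a square texture in the formats discussed here.
    static void PrintTextureSizes(int size)
    {
        int blocks = (size / 4) * (size / 4);
        Console.WriteLine("{0}x{0} A8R8G8B8: {1} KB", size, size * size * 4 / 1024);
        Console.WriteLine("{0}x{0} DXT5:     {1} KB", size, blocks * 16 / 1024);
        Console.WriteLine("{0}x{0} DXT1:     {1} KB", size, blocks * 8 / 1024);
    } // PrintTextureSizes(size)

For a 1024x1024 normal map that is 4096 KB uncompressed, 1024 KB as DXT5 and 512 KB as DXT1, which is where the 1/4 of rgba figure comes from.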

Features

Well, the NormalMapCompressor is just a nice little tool to test out compression formats and to convert single files, but usually your graphic artists produce a lot of files and you might want to convert them at the time of their creation. For this there is the Batch Convert button, which opens the following dialog:

You can select an input directory, an input filter, an optional height map filter for the alpha channel and the compression format, just like on the previous screen (btw: the normalization and generate mipmaps settings are taken from the main screen). Now just select an output directory (or leave it the same as the input directory) and press Start to convert all normal maps at once.

Because graphic artists are just as lazy as us coders (especially if you provide them with a lot of useful tools) and you may not know which files have just changed and which haven't, you can automate the checking process by using the Auto-update mode. Just let the program run, maybe hide it in the tray with the "Minimize to tray" button, and have all your (uncompressed) normal maps compressed for you. We have separate directories for the uncompressed original stuff from our graphic artists (with all those PSD, BMP, TGA, etc. files flying around) and a directory for the game to quickly load textures from. If you need more compression options or want more input filter options, just change the BatchConvert.cs file.
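The idea behind the auto-update check is nothing fancy, just a timestamp comparison between the source file and the converted .dds. A simplified sketch (the actual logic sits in BatchConvert.cs and may differ in detail; requires System.IO):

    // Only convert a normal map if its source is newer than the converted .dds.
    static bool NeedsUpdate(string sourceFile, string outputDirectory)
    {
        string outputFile = Path.Combine(outputDirectory,
            Path.GetFileNameWithoutExtension(sourceFile) + ".dds");
        if (File.Exists(outputFile) == false)
            return true;
        return File.GetLastWriteTime(sourceFile) >
            File.GetLastWriteTime(outputFile);
    } // NeedsUpdate(sourceFile, outputDirectory)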

Here is a list of useful tools if you are working with normal maps:

  • 3DS Max and Maya for 3D content creation (don't ask me the details about it ^^).
  • ZBrush is a great tool to quickly create and paint bumps; you can even make 2D normal maps with it. However, its normal map material uses the ATI inverted-y version, so use this image (link) for the normal map material (NormalRGBMat.zmt) to fix this behaviour and generate proper XYZ normal maps :)
  • Photoshop, as always, is the tool for image manipulation, effects, etc. NVidia has made two nice plugins, a DDS exporter and a Normal Map filter; download them here.
  • I still think FX Composer is the best program to test out shaders, look at the texture effects and test the new graphic features. GPU Gems I and II are also great books if you are interested in advanced shaders.

Files in the project

Overview of all the data files in the project:

  • NormalMapCompressor.exe - This is the good stuff, start it and feel happy.
  • DirectionalNormalMapping.fx - All the shader techniques are here; you can also use this file in 3DStudio Max or FX Composer, just change the comments in the first couple of lines.
  • Microsoft.DirectX.* - Files from the Feb 2005 Managed DirectX installation to make life easier (installing them is a bitch).
  • DiffuseMap.dds - Required texture for rendering the sphere, this is the base color texture.
  • NormalizeCubeMap.dds - This file is required for the ps1.1 specular normal map shaders.
  • doorNormal.bmp - This file is from the NVidia bump compression article.
  • rockNormal.bmp - Default normal map when the program starts up.
  • stoneNormal.jpg - Another stone texture; it shows that even with a good JPEG compression the normalization gets lost a bit (uncheck normalize normals to see the effect).
  • testNormal.bmp - Smaller test normal map, you can see compression errors at the sharp edges of the text. Small normal maps are good for testing the compression.

Overview of all the source files in the project:

  • Program.cs - Main entry point for the application, catches all the assembly load errors.
  • MainForm.cs - The main form used to display all the input and output options and render the 3D sphere.
  • BatchConvert.cs - Form for the Batch Convert dialog, supports minimizing to tray.
  • BitmapHelper.cs - Bitmap helper methods like normalizing normals, switching the r and a channels and combining rgb and alpha bitmaps (a simplified normalization sketch follows after this list).
  • TextureHelper.cs - Helps to convert the texture format of a loaded texture.
  • MeshHelper.cs - Helper methods to provide compatible vertices for the shaders.
  • TangentVertex.cs - Tangent vertex is used for all shader techniques and contains the following: position, normal vector, texture coordinates, tangent vector.
  • StringHelper.cs - String manipulation methods to make life easier.
  • App.ico - Application icon, in 16x16 and 32x32, rendered in 3DStudio ...
  • AssemblyInfo.cs - Assembly info (v1.0) about the application.
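Since re-normalization came up a couple of times above: what BitmapHelper.cs essentially has to do when "normalize normals" is checked looks roughly like the following. Again a simplified sketch with the slow GetPixel/SetPixel API, not the original code:

    // Renormalize every texel of a normal map so that (x, y, z) has unit length
    // again (JPEG compression, scaling and filtering slowly destroy that).
    static void NormalizeNormals(Bitmap map)
    {
        for (int y = 0; y < map.Height; y++)
            for (int x = 0; x < map.Width; x++)
            {
                Color col = map.GetPixel(x, y);
                // Decode rgb into the [-1, 1] range
                float nx = (col.R / 255.0f) * 2.0f - 1.0f;
                float ny = (col.G / 255.0f) * 2.0f - 1.0f;
                float nz = (col.B / 255.0f) * 2.0f - 1.0f;
                float length = (float)Math.Sqrt(nx * nx + ny * ny + nz * nz);
                if (length < 0.0001f)
                {
                    // Degenerate texel, just point it straight up
                    nx = 0; ny = 0; nz = 1;
                } // if (length)
                else
                {
                    nx /= length; ny /= length; nz /= length;
                } // else
                // Encode back into rgb, keep the alpha channel untouched
                map.SetPixel(x, y, Color.FromArgb(col.A,
                    (int)((nx * 0.5f + 0.5f) * 255.0f),
                    (int)((ny * 0.5f + 0.5f) * 255.0f),
                    (int)((nz * 0.5f + 0.5f) * 255.0f)));
            } // for for
    } // NormalizeNormals(map)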

Tricks and Tips

First of all we have to start the application, which isn't really a pleasure for Managed DirectX (called MDX from now on) apps, because so many things can go wrong just by loading the MDX DLLs. If you take a look at Program.cs you can see that the following code safely executes the code in Main() without pre-loading any assemblies (that only happens in StartApplication()). If anything goes wrong we can still catch the error and present it to the user instead of crashing without any error message:

    [STAThread]
    static void Main()
    {
        try
        {
            Application.EnableVisualStyles();
            StartApplication();
        } // try

        catch (System.IO.FileNotFoundException ex)
        {
            string filename = ex.FileName.Split(new char[] { ',' })[0];
            if (filename.EndsWith(".dll") == false &&
                filename.EndsWith(".exe") == false)
                filename += ".dll";
            MessageBox.Show("Important file not found (" + filename + "), " +
                "unable to execute.\nError: " + ex.ToString(),
                "Fatal Error");
        } // catch (ex)

        catch (Exception ex)
        {
            MessageBox.Show("Fatal application error: " + 
                              ex.ToString(), "Fatal error");
        } // catch (ex)

    } // Main()

    
    /// <summary>
    /// Extra function to init context of MainForm, this will throw
    /// an exception if something could not be loaded yet.
    /// </summary>

    static void StartApplication()
    {
        Application.Run(new MainForm());
    } // StartApplication()

The installer takes care of .NET and DirectX and makes sure that at least .NET 1.1 and DirectX 9 are installed, but even if we check for Managed DirectX there are so many incompatible versions around that it's not a joy ride. I've tested this on some computers, and most non-developer machines don't even have the MDX part installed, and if it was installed it was most likely another version than mine. After an hour of installing all sorts of DirectX versions and trying to find something they have in common or an easy way to install MDX (guess what, there is no easy way; MS really doesn't want anyone to have MDX), I just copied the required DLL files to the installation directory, tested it on all my test machines and it worked fine everywhere (but I heard some people still have these problems) :) I don't know if that's a "good" solution (it increased the installer by just 300 KB), but it works for now. MDX Feb 2005 was the last version with a Managed DirectX installer; maybe this will help you overcome your troubles: Feb2005_MDX.


OK, in MainForm.cs we create the form and initialize DirectX using the following steps (see lines 760-970); a rough sketch of this setup follows the list:

  • Set up the DirectX device (using pictureBoxOutput as the target control).
  • Check the pixel shader version (ps1.1 or ps2.0).
  • Create view and projection matrices and setup all lighting and material colors.
  • Start the render timer (with a 10ms interval).
  • Create the 3D sphere for rendering.
  • Load all helper textures (diffuse map, normalization cube map for ps1.1).
  • And finally load the shader effect (fx) file for all the shader business.
  • After loading the default normal map, start rendering.
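For reference, a heavily condensed sketch of what such a Managed DirectX setup can look like. This is not the original MainForm.cs code (which does quite a bit more, including error handling), just the bare idea with the usual MDX calls; loading the effect file and starting the render timer are left out:

    // using Microsoft.DirectX; using Microsoft.DirectX.Direct3D;
    PresentParameters presentParams = new PresentParameters();
    presentParams.Windowed = true;
    presentParams.SwapEffect = SwapEffect.Discard;
    presentParams.EnableAutoDepthStencil = true;
    presentParams.AutoDepthStencilFormat = DepthFormat.D16;
    device = new Device(0, DeviceType.Hardware, pictureBoxOutput,
        CreateFlags.SoftwareVertexProcessing, presentParams);

    // ps1.1 or ps2.0? Used later to pick the shader technique.
    bool canUsePS20 = device.DeviceCaps.PixelShaderVersion.Major >= 2;

    // Camera matrices
    viewMatrix = Matrix.LookAtLH(new Vector3(0, 0, -2.5f),
        new Vector3(0, 0, 0), new Vector3(0, 1, 0));
    projMatrix = Matrix.PerspectiveFovLH((float)Math.PI / 4.0f,
        1.0f, 0.1f, 100.0f);

    // Sphere mesh and helper textures
    sphere = Mesh.Sphere(device, 1.0f, 64, 64);
    diffuseMap = TextureLoader.FromFile(device, "DiffuseMap.dds");
    // Load DirectionalNormalMapping.fx with Effect.FromFile and start the
    // 10 ms render timer here (omitted, see MainForm.cs lines 760-970).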

All the buttons, checkboxes and radio buttons use very simple code; the only other important thing in MainForm.cs is the render loop:

    // Clear background and start rendering
    device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, 0, 1, 0);
    try
    {
        device.BeginScene();

        // If mouse is pressed, hold rotation!
        if (leftMouseButtonPressed == false)
            rotation += (Environment.TickCount-lastTick) / 2500.0f;
        lastTick = Environment.TickCount;

        // Init world matrix, just rotate around the y axis.
        worldMatrix = Matrix.RotationAxis(
            new Vector3(0, 1, 0), rotation)*
            Matrix.Scaling(zoomFactor, zoomFactor, zoomFactor);

        // Only use shader if right mouse button is not pressed
        if (rightMouseButtonPressed == false)
        {
            // Select technique depending on the selected shader technique
            // and if we use agbr dxt5 or not.
            effect.Technique =
                radioButtonDXT5Switched.Checked ?
                (comboBoxShaderTechnique.SelectedIndex == 0 ? "SpecularAgbr" :
                comboBoxShaderTechnique.SelectedIndex == 1 ? "SpecularAgbr20" :
                "SpecularAgbrNormalize20") :
                (comboBoxShaderTechnique.SelectedIndex == 0 ? "Specular" :
                comboBoxShaderTechnique.SelectedIndex == 1 ? "Specular20" :
                "SpecularNormalize20");

            //not required: device.Transform.World = worldMatrix;
            effect.SetValue("world", worldMatrix);
            effect.SetValue("worldViewProj",
                worldMatrix * viewMatrix * projMatrix);
            // Normal map is set in UpdateOutput,
            // rest is set at Initialization.

            // Render the shader technique with the
            // required number of passes.
            try
            {
                int passes = effect.Begin(0);
                for (int pass = 0; pass < passes; pass++)
                {
                    effect.BeginPass(pass);
                    sphere.DrawSubset(0);
                    effect.EndPass();
                } // for (pass)
            } // try
            finally
            {
                effect.End();
            } // finally
        } // if (rightMouseButtonPressed)
        else
        {
            // Code for no shader, just for testing
            device.Transform.World = worldMatrix;
            device.VertexFormat = CustomVertex.PositionNormalTextured.Format;
            sphere.DrawSubset(0);
        } // else
    } // try
    finally
    {
        // If BeginScene was called, EndScene must be called too!
        device.EndScene();
    } // finally

    // Finished
    device.Present();

So if the right mouse button is not pressed this will render the sphere with the shader effect file using the selected technique.

Shader code

The code for ps2.0 is not that complicated (at least if you have worked with shaders before ^^); all the shader code can be found in DirectionalNormalMapping.fx (the same stuff for point lights isn't much different):

// Pixel shader function
float4 PS_Specular20(VertexOutput_Specular20 In) : COLOR
{
    // Grab texture data
    float4 diffuseTexture = tex2D(diffuseTextureSampler,
        In.diffTexCoord);
    float3 normalVector = (2.0 * tex2D(normalTextureSampler,
        In.normTexCoord)) - 1.0;

    // Additionally normalize the vectors
    //not needed: normalize(In.lightVec);
    float3 lightVector = In.lightVec;
    float3 viewVector = normalize(In.viewVec);

    // Compute the angle to the light
    float bump = saturate(dot(normalVector, lightVector));
    // Specular factor
    float3 reflect =
        normalize(2 * bump * normalVector - lightVector);
    float spec = pow(saturate(dot(reflect, viewVector)),
        shininess);

    float4 ambDiffColor = ambientColor + bump * diffuseColor;
    return diffuseTexture * ambDiffColor +
        bump * spec * specularColor * diffuseTexture.a;
} // PS_Specular20(.)

The same thing applies to the DXT5 r-a switched texture mode; you just change a single line: float3 normalVector = (2.0 * tex2D(normalTextureSampler, In.normTexCoord).agb) - 1.0;

Well, that works fine for ps2.0; achieving the same effect on ps1.1 hardware is a little harder. First of all: there is no normalization function in ps1.1 and without it our viewVector will look very crappy. For that reason we use a little trick called NormalizeCubeMap.dds (the texture must be uncompressed), which contains the normalized xyz value for every 3D cube map coordinate we pass in as the viewVector. The next problem is the number of instructions we can use, which is limited to 8, and powerful functions like pow are not possible either. Using the .agb swizzle takes way too many instructions on ps1.1 anyway, and because of all these problems there is no way in hell we're going to write all this without pixel shader assembly code (which isn't pleasant either, but is more fun than playing around with 8 instructions in a high level shader language).
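If you are curious what NormalizeCubeMap.dds actually contains: for every texel on every cube face it simply stores the normalized direction vector that points at that texel, encoded into rgb the same way as a normal map. A small sketch of the value for the +x face (my own illustration in C#, the shipped .dds was of course generated beforehand; the other five faces work the same way with the axes permuted/negated):

    // Value stored at texel (u, v) of the +x face of a normalization cube map.
    static Color PositiveXFaceTexel(int u, int v, int size)
    {
        // Map the texel center to [-1, 1] on the face (D3D cube map layout)
        float fy = -(((v + 0.5f) / size) * 2.0f - 1.0f);
        float fz = -(((u + 0.5f) / size) * 2.0f - 1.0f);
        float fx = 1.0f;
        // Normalize the direction vector
        float length = (float)Math.Sqrt(fx * fx + fy * fy + fz * fz);
        fx /= length; fy /= length; fz /= length;
        // Encode it into rgb, just like a normal map texel
        return Color.FromArgb(
            (int)((fx * 0.5f + 0.5f) * 255.0f),
            (int)((fy * 0.5f + 0.5f) * 255.0f),
            (int)((fz * 0.5f + 0.5f) * 255.0f));
    } // PositiveXFaceTexel(u, v, size)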

OK, let's take a look at the shader code. First of all we have to setup all samplers, constants and texture coordinates for this shader (like we would do in a fixed pipeline code):

    sampler[0] = (diffuseTextureSampler);
    sampler[1] = (normalTextureSampler);
    sampler[2] = (NormalizeCubeTextureSampler);
    PixelShaderConstant1[0] = <ambientColor>;
    PixelShaderConstant1[2] = <diffuseColor>;
    PixelShaderConstant1[3] = <specularColor>;
    PixelShader = asm
    {
        // Optimized for ps_1_1, uses all possible 8 instructions.
        ps_1_1
        // Helper to calculate fake specular power.
        def c1, 0, 0, 0, -0.25
        //def c2, 0, 0, 0, 4
        def c4, 1, 0, 0, 1
        // Sample diffuse and normal map
        tex t0
        tex t1
        // Normalize view vector (t2)
        tex t2
        // Light vector (t3)
        texcoord t3
Well, there are already four texture instructions to load all the texture coordinates. Constant c0 holds the ambient color, c1 is a constant used later to improve the specular power, c2 is the diffuse color, c3 is the specular color and c4 holds another constant to help us in switching rgba to agbr.

t0 holds the diffuse map, t1 is the normal map texture coordinate (same as t0), t2 is the view vector, which gets normalized with the help of the normalization cube map, and t3 holds the light vector. And finally v0 is a little helper value passed from the vertex shader containing the light vector divided by 3, which helps us calculate float3 reflect = normalize(2 * bump * normalVector - lightVector); since that is obviously not possible directly in ps1.1 (btw: v1 is free and can be used for more advanced shaders like sub surface normal mapping or specular map normal mapping, or just to support point lights).

OK, back to the pixel shader code, first of all we need to convert the DXT5 compressed rgba normal map value to agbr using lerp (copying t1 to r1 and putting alpha channel into red channel):

        // v0 is lightVecDiv3!
        // Convert agb to xyz (costs 1 instruction)
        lrp r1.xyz, c4, t1.w, t1

Btw: t1.r still holds the alpha channel, if we need that. Now execute the following formulas:

// Compute the angle to the light
bump[r0] = saturate(dot(normalVector[t1], lightVector[t3]));
// Specular factor
reflect[r1] = bump[r0] * normalVector[t1] - lightVectorDiv3[v0];
spec[r1] = saturate(dot(reflect[r1], viewVector[t2]));

with the following ps1.1 assembly:

        // Now work with r1 instead of t1
        dp3_sat r0.xyz, r1_bx2, t3_bx2
        mad r1.xyz, r1_bx2, r0, -v0_bx2
        dp3_sat r1, r1, t2_bx2

And finally we apply the fake pow(spec) function, spec = saturate(2*spec*spec-0.25), in the alpha channel while simultaneously combining the ambient, diffuse and specular colors with the calculated factors in the rgb channel (combining instructions like crazy ^^). The result is return diffuseTexture * (ambientColor + bump[r0] * diffuseColor) + spec[r1.w] * specularColor;, implemented with the following ASM code:

        // Increase pow(spec) effect
        mul_x2_sat r1.w, r1.w, r1.w
        //we have to skip 1 mul because we lost 1 instruction because of agb
        //mul_x2_sat r1.w, r1.w, r1.w
        // r0 = r0 (bump) * diffuseColor + ambientColor
        mad r0.rgb, r0, c2, c0
        // Combine 2 instructions because we need 1 more to set alpha!
        // Sub 0.25 from the fake pow(spec) to make it look more realistic
        +add_sat r1.w, r1.w, c1.w
        mul r0.rgb, t0, r0
        +mul_x2_sat r1.w, r1.w, r1.w
        mad r0.rgb, r1.w, c3, r0
        // Set alpha from texture to result color!
        // Can be combined too :)
        +mov r0.w, t0.w
    };

The last instruction is used to copy any alpha information from the diffuse texture to the final color to support alpha blending and alpha tests. Pretty complicated stuff, this ps1.1 business, right? Well, you only have to do it once, or if you're working for EA you might not have to do it at all (see BattleField2, which won't even start on ps1.3 hardware). I think it's nice to support older hardware, and when Doom3 and HL2 can do it, why can't everyone else? ^^
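By the way, if you want to get a feeling for what the mul_x2_sat/add_sat chain above does to the specular term, you can evaluate it on the CPU. A throwaway C# sketch that prints the fake curve next to a real pow() falloff (the values will not match, the point is only to see that the cheap version also gives a narrow, saturated highlight):

    // Evaluate the ps1.1 "fake pow" chain:
    //   s = saturate(2*s*s); s = saturate(s - 0.25); s = saturate(2*s*s);
    static float Saturate(float value)
    {
        return Math.Max(0.0f, Math.Min(1.0f, value));
    } // Saturate(value)

    static void CompareFakePow()
    {
        for (float s = 0.0f; s <= 1.001f; s += 0.1f)
        {
            float fake = Saturate(2.0f * s * s);
            fake = Saturate(fake - 0.25f);
            fake = Saturate(2.0f * fake * fake);
            Console.WriteLine("s={0:0.0}  fake={1:0.000}  pow(s,16)={2:0.000}",
                s, fake, Math.Pow(s, 16));
        } // for (s)
    } // CompareFakePow()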

Conclusion

Let's hope this article isn't too long. Feel free to read and learn more about normal mapping, fx shader files, managed DirectX, etc. by just reading and debugging my code, there is just too much code to go into any more detail here. I hope my little tool is useful and if you have questions or improvement ideas, feel free to comment in the comment section.

NormalMapCompressor can be freely used in your projects. If you use the source code, it would be nice to mention the original author (hey, that's me).

History

  • 2005-07-04 (v1.0) - Initial version.
  • 2005-07-05 (v1.1) - Updated version.

    New features:

    • Flip x/y/z functionality to import any other saved normal map formats.
    • Mipmaps do finally work now and image data size will now display the size including the mipmap data.

    And I've fixed the following bugs:

    • Generate mipmaps no longer depends on AutoGenerateMipMap (which didn't work for saving), instead the mipmaps are generated with TextureLoader.FilterTexture.
    • Fixed saving dds directly in the main app for DXT5Switched, which previously saved the same data as plain DXT5 (wrong if you want the DXT5Switched output); Batch Convert already did this correctly.
    • Added a few more error messages in case anything goes wrong at the time of initialization.
    • Fixed the bug with Auto-Update which always updated all the images instead of only the changed ones. Auto-Updating works fine now and is a lot faster!

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.
