
Symbolic Code Isn't Always Meaningful

Overuse of #define can be ludicrous.

Introduction

We rightly provide meaningful names to numeric and string constants in our code for a variety of good reasons. For example, it's much easier to remember and use the name SPI1 than it is to use the pointer value 0x40013000, and it's much nicer, too, when porting the code to a device where SPI1 is at address 0x40014000 instead: we change the definition once and the rest of our code is magically brought up to date.
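As a minimal sketch of the idea (the macro name and surrounding code are invented for illustration; only the addresses come from the example above):

C
#include <stdint.h>

/* One place to change when porting to a part whose SPI1 sits at a different
   address. The value is used purely for illustration here -- a real project
   would take it from the vendor's device header. */
#define SPI1_BASE_ADDR  0x40013000UL

volatile uint32_t *spi1 = (volatile uint32_t *)SPI1_BASE_ADDR;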

But...

Let's Not Go Overboard

C
#ifndef ZERO
#define ZERO 0 
#endif

#ifndef ONE 
#define ONE !ZERO 
#endif

I stumbled over this in production code today. The sheer foolishness of these six lines of code baffles me. Let's look at...

Why This Is Wrong

This is a perfect example of cargo cult programming. Someone chose to follow "best practices" without understanding why they're best practices and when they should not apply.

This person heard you should always wrap your #defines in protection blocks, so they've #ifndef-protected their definition of ZERO. You know, in case someone defines ZERO to be any value other than 0... (Because #define ZERO 17 would never be confusing!)

Naturally, they've done the same for ONE. In case someone had already done #define ONE 2301331...
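For the record, the #ifndef-around-#define pattern does have a legitimate use: supplying a default that the build is allowed to override. A sketch, with BUFFER_SIZE as a purely hypothetical name:

C
/* The build may pass -DBUFFER_SIZE=1024 on the compiler command line;
   if it doesn't, this default applies. */
#ifndef BUFFER_SIZE
#define BUFFER_SIZE 512
#endif

That makes sense for a tunable parameter. It makes no sense for ZERO, whose only sane value is 0.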

But that's not where the oddness ends.

Look again at how ONE is defined: !ZERO. A numeric value is being defined in terms of a logical operator. This isn't just confusing. This is dangerous. !ZERO expands to !0, and yes, the C standard defines the result of ! to be exactly 0 or 1, so a conforming compiler gives you 1. But now the value of ONE hangs on that guarantee being honoured. I've been programming in C since before there was a C standard, and I've never encountered a compiler that was 100% standards-compliant...
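To see how that indirection reads in practice, here's a small sketch (the variable names are made up for illustration):

C
#define ZERO 0
#define ONE  !ZERO   /* the construct under discussion */

/* Each use of ONE textually expands to !0. A conforming compiler evaluates
   that to 1, but the reader has to work through the expansion and the
   precedence of ! to convince themselves of it. */
int retries  = ONE + ONE;   /* !0 + !0   -> 2   */
int high_bit = ONE << 7;    /* (!0) << 7 -> 128, because ! binds tighter than << */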

But all that's beside the point because...

Here's How to Do It Right

How do you do this right? Use the literals 0 and 1. It's that simple. The literal 0 will NEVER mean anything other than zero, and 1 will never mean anything other than one. There is no programmer on Earth who will look at 0 and wonder what it is. The same applies to 1. (The same most certainly does not apply to ZERO and ONE!) You save typing. You make things clearer. And you don't have to worry about standards compliance. (If your compiler confuses 0 and 1 in calculations with other numbers, you have far bigger problems to worry about than "best practices"!)
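A before-and-after sketch of the same point (the surrounding code is invented purely for illustration):

C
/* With the macros: */
if (error_count == ZERO) { status = ONE; }   /* expands to: if (error_count == 0) { status = !0; } */

/* Without them -- shorter, clearer, and nothing to mis-define: */
if (error_count == 0) { status = 1; }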

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)