Old school C:
#define MYVAL 7
Better approach:
const uint32_t myVal = 7;
Here are some reasons you should use const instead of #define (a short code sketch after this list pulls these together):
- #define has global scope, so you're creating a (read-only) global value every time you use #define. Global scope is evil, so don't do that. (Read-only global scope for constant values is a bit less evil than global variables per se, especially if you can't use the namespace features of C++. But gratuitous global scope is always a bad idea.) A const obeys the normal scoping rules: it can be purely local if defined inside a procedure, or, more commonly, file-scoped via the "static" keyword.
- Const lets you do more aggressive type checking (depending upon your compiler and static analysis tools, especially if you use a typedef more specific than the built-in C data types). C is a bit weak in this area compared to other languages, but the classic example is that a typed const lets you identify a number as being in feet or meters, while the #define approach is just as if you'd typed the number 7 in with no units. The #define approach can bite you if you use the wrong value in the wrong place. Type checking is an effective way to find bugs, and using #define gives up an opportunity to let static analysis tools help you with that.
- Const lets you use the value as if it were a variable when you need to (e.g., taking its address to pass to a function) without having to change how the value is defined.
- #define in general is so bug-prone that you should minimize its use just to avoid having to spend time asking "is this one OK?" in a peer review. In old-school code, most #define uses are really just constants, so converting them to const can dramatically reduce the peer review burden of sifting through hundreds of #define statements to look for problems.
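To make these points concrete, here is a minimal sketch. (The names MAX_RETRIES, maxRetries, feet_t, and logValue are made up for illustration; any reasonably recent C compiler should accept this.)

#include <stdint.h>
#include <stdio.h>

/* Old school: leaks into every file that includes this header,
   and carries no type information at all. */
#define MAX_RETRIES 7

/* File-scoped alternative: visible only within this .c file. */
static const uint32_t maxRetries = 7;

/* A more specific typedef gives static analysis tools something to check. */
typedef uint32_t feet_t;
static const feet_t runwayLength = 8000;  /* the units travel with the type */

/* A const has an address, so it can be passed like any variable. */
static void logValue(const uint32_t * valuePtr)
{
  printf("value = %lu\n", (unsigned long) *valuePtr);
}

int main(void)
{
  const uint32_t localLimit = 3;   /* purely local scope */
  logValue(&maxRetries);
  logValue(&runwayLength);
  logValue(&localLimit);
  /* logValue(&MAX_RETRIES); */    /* won't compile: the literal 7 has no address */
  return 0;
}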
Here are some common myths about this tradeoff; there's another short sketch after this list that illustrates the first and third. (Note that on some systems these statements might be true, especially if you have an old and lame compiler. But they don't necessarily have to be true, and they are often false, especially on newer chips with newer compilers.)
- "Const wastes memory." False if you have a compiler that is smart enough to do the right thing. Sure, if you want to pass a pointer to the const it will actually have to live in memory somewhere, but you can't even pass a pointer to a #define at all. One of the points of "const" is to give the compiler a hint that lets it optimize memory footprint.
- "Const won't work for X." Generally false if you have a newer compiler, and especially if you are using a mostly-C subset of the capability of a C++ compiler, as is increasingly common. And honestly, most of the time #define is just being used as a plain old integer const to get rid of magic numbers. const will work fine. (If you have magic numbers instead of #define, then you have bigger problems than this even.) Use const for the no-brainer cases. Something is probably wrong if everything about your code is so special you need #define everywhere.
- "Const hassles me about type conversions." That's a feature to prevent you from being sloppy! So strictly speaking the compiler doing this is not a myth. The myth is that this is a bad thing.
There are plenty of discussions of this topic. You'll also see that some folks advocate using enums in some situations, which we'll get to another time. For now, changing as many #defines as you can to consts is likely to improve your code quality, and perhaps flush out a few bugs you didn't realize you had.
Be careful when reading discussion group postings on this topic. There is a lot of disinformation out there about performance and other potential tradeoff factors, usually based on statements about 20-year-old versions of the C language or experiences with compilers that have poor optimization capabilities. In general, you should use const by default unless your particular compiler/system/usage presents a compelling case not to.
See also the Barr Group C coding standard, rule 1.8.b, which says to use const and includes a number of other very useful rules.