Monday, May 22, 2017

#define vs. const

Is your code full of "#define" statements?  If so, you should consider switching to the const keyword.

Old school C:
    #define MYVAL 7

Better approach:
    const uint32_t myVal = 7;

Here are some reasons you should use const instead of #define:
  • #define has global scope, so you're creating (read-only) global values every time you use #define. Global scope is evil, so don't do that.  (Read-only global scope for constant values is a bit less evil than global variables per se, especially if you can't use the namespace features of C++. But gratuitous global scope is always a bad idea.) A const alternative can obey scoping rules, including being purely local if defined inside a procedure, or more commonly file static with the "static" keyword.
  • Const lets you do more aggressive type checking (depending upon your compiler and static analysis tools, especially if you use a typedef more specific than built-in C data types). While C is a bit weak as a language in this area compared to other languages, a classic example is that a const lets you identify a number as being in feet or meters, while the #define approach is just as if you'd typed the number 7 in with no units. The #define approach can bite you if you use the wrong value in the wrong place. Type checking is an effective way to find bugs, and using #define gives up an opportunity to let static analysis tools help you with that. (See the sketch after this list.)
  • Const lets you use the value as if it were a variable when you need to (e.g., passing an address to the variable) without having to change how the variable is defined.
  • #define in general is so bug-prone that you should minimize its use just to avoid having to spend time asking "is this one OK?" in a peer review. Most #define uses tend to be const variables in old-school code, so getting rid of them can dramatically reduce the peer review burden of sifting through hundreds of #define statements to look for problems.
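
As a rough sketch of the scoping, typing, and address-of points above (the names and the feet typedef are invented for illustration; your compiler may arrange things differently):

    #include <stdint.h>
    #include <stdio.h>

    /* A #define is visible from here to the end of the translation unit. */
    #define TIMEOUT_TICKS 7

    /* File-scope alternative: "static const" is visible only in this .c file. */
    static const uint32_t kTimeoutTicks = 7u;

    /* A more specific typedef gives static analysis tools something to check. */
    typedef uint32_t feet_t;
    static const feet_t kRunwayLength = 7000u;

    /* A const has an address you can pass around; a #define is just pasted-in
     * text, so there is nothing to take the address of. */
    static void report(const uint32_t *value)
    {
        printf("value = %lu\n", (unsigned long)*value);
    }

    int main(void)
    {
        const uint32_t localLimit = 3u;   /* purely local, scoped to main() */

        report(&kTimeoutTicks);
        report(&kRunwayLength);
        report(&localLimit);
        /* report(&TIMEOUT_TICKS);   won't compile: there is no object behind
         * a #define, just the pasted-in text 7 */

        return 0;
    }

The commented-out line is the give-away: there is simply no object behind a #define to point at.
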
Here are some common myths about this tradeoff. (Note that on some systems these statements might be true, especially if you have an old and lame compiler.  But they don't have to be true, and they are often false, especially on newer chips with newer compilers.)
  • "Const wastes memory."  False if you have a compiler that is smart enough to do the right thing. Sure, if you want to pass a pointer to the const it will actually have to live in memory somewhere, but you can't even pass a pointer to a #define at all. One of the points of "const" is to give the compiler a hint that lets it optimize memory footprint.
  • "Const won't work for X." Generally false if you have a newer compiler, and especially if you are using a mostly-C subset of the capability of a C++ compiler, as is increasingly common. And honestly, most of the time #define is just being used as a plain old integer const to get rid of magic numbers. const will work fine.  (If you have magic numbers instead of #define, then you have bigger problems than this even.) Use const for the no-brainer cases. Something is probably wrong if everything about your code is so special you need #define everywhere.
  • "Const hassles me about type conversions."  That's a feature to prevent you from being sloppy!  So strictly speaking the compiler doing this is not a myth. The myth is that this is a bad thing.
There are plenty of discussions on this topic.  You'll also see that some folks advocate using enums for some situations, which we'll get to another time. For now, if you change as many #defines as you can to consts then that is likely to improve your code quality, and perhaps flush out a few bugs you didn't realize you had.

Be careful when reading discussion group postings on this topic.  There is a lot of disinformation out there about performance and other potential tradeoff factors, usually based on statements about 20-year-old versions of the C language or experiences with compilers that have poor optimization capability.  In general, you should use const by default unless your particular compiler/system/usage presents a compelling case not to.

See also the Barr Group C coding standard rule 1.8.b, which says to use const; that standard has a number of other very useful rules as well.


14 comments:

  1. Philip,

    I have a question regarding the first point of the comparison between macros and const.

    According to the ISO IEC 9899-2011 (section 6.10.3.5) "A macro definition lasts (independent of block structure) until a corresponding #undef directive is encountered or (if none is encountered) until the end of the preprocessing translation unit", so a macro defined in a source (.c) file is only visible inside that file. Isn't it possible for a macro to have global scope only if it is defined in a header file? Did you mean global "file" scope?

    Best regards,
    Fernando Mondello

    Replies
    1. This is a reasonable observation. But I don't think I've ever seen embedded software that carefully uses an #undef for every #define. And even if you did do this, it's a lot of manual book-keeping that (a) someone/something has to check, and (b) is likely to rot away over time unless constantly maintained and checked. The nice part of "const" is that the compiler does the scope book-keeping for you. So while in theory you could overcome this problem by fastidious use of #undef, in typical practice that doesn't really solve the problem, and even in theory is a lot more painful than just letting the compiler do the work for you.

      Please note also that this discussion is limited to constant values, not more general macro practices. If you do have to use a macro for some reason then yes, you should #undef it to limit scope.
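
      As a tiny sketch of the difference (names made up): with the macro you get
      whatever visibility the #define/#undef pair happens to say, while the const
      version is scoped by the compiler at the closing brace with nothing extra
      to maintain.

      /* Manual bookkeeping: the macro is visible only between these two lines,
       * and only for as long as someone remembers to keep the #undef in place. */
      #define RETRY_LIMIT 3
      int retries_left_macro(int attempts) { return RETRY_LIMIT - attempts; }
      #undef RETRY_LIMIT

      /* Compiler-enforced alternative: the constant's scope ends at the
       * closing brace automatically. */
      int retries_left_const(int attempts)
      {
          const int kRetryLimit = 3;
          return kRetryLimit - attempts;
      }
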

  2. I agree with you; I haven't seen an #undef for a previously #defined macro, except for x-macros.

    What I tried to point out was that if you define a constant with a macro, unless it is in a header file, it doesn't have global scope, just file scope (or a reduced scope if you use #undef, but as you mentioned, it would be a maintenance nightmare).

    Replies
    1. Only file scope if you compile each .c file separately. If you combine multiple .c files in a compilation unit, that #define has a much greater scope than expected.

  3. Is this insight particularly applicable to a specific product family (32-bit MCUs, as in your example) or to all available MCUs? I'd read a post on the Barr Group blog about 8-bit MCUs where the author, Nigel Jones, argued in favor of the #define macro (though that was back in 1998). Nigel Jones stated that #define used "immediate addressing" rather than the "indexed addressing" used by the const modifier. Can you please elaborate on this further?

    Replies
    1. I can believe that almost 20 years ago compilers for 8-bit MCUs were less able to optimize code involving the "const" keyword. The addressing has to do with limitations in the compiler and whether the compiler is smart enough to optimize a const variable. The C programming language construct used should not matter for code quality (unless you have a 20-year-old bad compiler, or even a recent bad compiler).

  4. A 'const' object may need to be declared 'static' to be treated efficiently, whether addressed or not: its default-extern linkage at compilation-unit scope implies it must be addressable in other units, although some compilers may emit such in sections from which a linker may drop them.

    Of course, if it is routinely to be addressed, it may need to be explicitly 'extern' and given a "home" compilation unit; but if not, the code-store "waste" due to its being 'static' in multiple units is often outweighed by its being addressable as a 'static const'.

    Mind you, a '#define' that does not incorporate an explicit type [e.g., '((foo_t)0UL)'] is something of a "sin", in my world, which covers most of the "bad" objections to 'const' objects.

    Non-'const'-correct vendor libraries are another nightmare entirely, as they force coding of wrappers with "undefined behavior" under the language standard...a pet peeve, in my particular line of work (I'm looking at you, ST).
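
    For what it's worth, a minimal sketch of those patterns (foo_t and the names
    are placeholders, not from any particular library):

    #include <stdint.h>

    typedef uint32_t foo_t;

    /* Internal linkage: each .c file that uses this gets its own copy, but the
     * compiler is free to fold it into immediates where it can. */
    static const foo_t kIdleTimeout = (foo_t)250UL;

    /* If the object really must be shared by address, declare it extern in a
     * header and give it a single "home" definition in one .c file:
     *     header:   extern const foo_t g_idle_timeout;
     *     one .c:   const foo_t g_idle_timeout = (foo_t)250UL;
     */

    /* A #define that at least carries an explicit type, per the point above. */
    #define IDLE_TIMEOUT ((foo_t)250UL)

    /* Small accessor so the example builds cleanly even with unused-variable
     * warnings enabled. */
    foo_t idle_timeout(void) { return kIdleTimeout; }
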

  5. One of the problems I have with using const for feature flags is that it tends to create code that cannot be reached. This causes static evaluation tools (like Resharper in C#) to complain about "unreachable code". This can also generate compiler warnings in C# in certain cases. And we are all treating warnings as errors, right? ;)

  6. Even better approach:
    uint32_t const myVal = 7;

    Refer to:
    www.dansaks.com/articles/1998-06%20Placing%20const%20in%20Declarations.pdf

    It took me a while before I was convinced; however, once you get involved with "pointers to const" and "const pointers", the reversed notation "uint32_t const" makes the most sense to me.
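
    A small sketch of how the right-to-left reading works out once pointers get
    involved (the variable names are just for illustration):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t       rpmNow = 0u;
        uint32_t const maxRpm = 6000u;            /* same as: const uint32_t */

        /* Read each declaration right to left: */
        uint32_t const       *pValue  = &maxRpm;  /* pointer to const uint32_t       */
        uint32_t       *const pFixed  = &rpmNow;  /* const pointer to uint32_t       */
        uint32_t const *const pLocked = &maxRpm;  /* const pointer to const uint32_t */

        *pFixed = 100u;       /* ok: the pointed-to value is not const       */
        /* *pValue = 100u;       error: the pointed-to value is const        */
        /* pFixed = &rpmNow;     error: the pointer itself is const          */

        printf("%lu %lu %lu\n", (unsigned long)*pValue,
               (unsigned long)*pFixed, (unsigned long)*pLocked);
        return 0;
    }
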

  7. You can't do:

    const int NUMBER_OF_THINGS = 5;
    things_t thingCollection[NUMBER_OF_THINGS]; // error

    This is one of the very few cases in which I sometimes use #defines to remove magic numbers.
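
    One common workaround (the post defers enums to another time, but this is the
    usual shape; the type and names below are just placeholders): in C, unlike
    C++, a const variable is not a constant expression, but an enumeration
    constant is, so it can size the array without a #define:

    enum { NUMBER_OF_THINGS = 5 };          /* true compile-time constant       */

    typedef struct { int id; } things_t;    /* placeholder type for the example */

    things_t thingCollection[NUMBER_OF_THINGS];   /* compiles fine */
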

    Replies
    1. You can't do this either, at least not in C99:

      #define FOO_IN_A_BAR (7)
      #define BAZ_IN_A_FOO (3)
      #define BAZ_IN_A_BAR (FOO_IN_A_BAR * BAZ_IN_A_FOO)

    2. Gauthier,

      Can you explain why it wouldn't compile? I tried the following code:

      #include <stdio.h>

      #define FOO_IN_A_BAR (7)
      #define BAZ_IN_A_FOO (3)
      #define BAZ_IN_A_BAR (FOO_IN_A_BAR * BAZ_IN_A_FOO)

      int main() {
          printf("%d\n", BAZ_IN_A_BAR);
          return 0;
      }

      and compiled it with the following command without errors or warnings: gcc -pedantic -Wall -std=c99 -o main main.c

    3. Gauthier Östervall:

      Your code example is legal C. I have been doing that since K&R made it into the wild.

      The important part is using it in contexts where "*" means multiply, not dereferencing a pointer.

  8. Thanks for all the comments about special cases. Something to keep in mind is that especially for small resource-constrained embedded systems, it's common to use an IDE which can do global optimization by building the whole system at once. If your compiler knows it's seeing all the code in your application at once it can do optimizations beyond what you can do with a traditional module-by-module independent compilation approach. (And even then there are tricks you can do -- it all depends on how clever your compiler and build environment are.) But the important thing is that this is all discussing special cases. My main point is that almost all the time there is no need for a #define. You should only be using one if you have to, not as your go-to constant approach.


