
All across our C code base, I see every macro defined the following way:

#ifndef BEEPTRIM_PITCH_RATE_DEGPS
#define BEEPTRIM_PITCH_RATE_DEGPS                   0.2f
#endif

#ifndef BEEPTRIM_ROLL_RATE_DEGPS
#define BEEPTRIM_ROLL_RATE_DEGPS                    0.2f
#endif

#ifndef FORCETRIMRELEASE_HOLD_TIME_MS
#define FORCETRIMRELEASE_HOLD_TIME_MS               1000.0f
#endif

#ifndef TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS
#define TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS       50.0f
#endif

What is the rationale for these define checks, instead of just defining the macros?

#define BEEPTRIM_PITCH_RATE_DEGPS                   0.2f
#define BEEPTRIM_ROLL_RATE_DEGPS                    0.2f
#define FORCETRIMRELEASE_HOLD_TIME_MS               1000.0f
#define TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS       50.0f

I can't find this practice explained anywhere on the web.

  • Changing constants somewhere else in the code is guaranteed to work this way: if someone defines one of these macros elsewhere, the preprocessor won't overwrite that definition when it parses this file. Sep 4, 2015 at 12:55
  •
    It's an example of the WET design principle.
    – stark
    Sep 4, 2015 at 12:59
  • Posted an answer with an example, try compiling it. Sep 4, 2015 at 13:04

7 Answers


This allows you to override the macros when you're compiling:

gcc -DMACRONAME=value

The definitions in the header file are used as defaults.


As I said in the comment, imagine this situation:

foo.h

#define FOO  4

defs.h

#ifndef FOO
#define FOO 6
#endif

#ifndef BAR
#define BAR 4
#endif

bar.c

#include "foo.h"
#include "defs.h"

#include <stdio.h>

int main(void)
{
    printf("%d%d", FOO, BAR);
    return 0;
}

This will print 44.

However, if the #ifndef guard were not there, the compiler would emit macro-redefinition warnings and the program would print 64.

$ gcc -o bar bar.c
In file included from bar.c:2:0:
defs.h:1:0: warning: "FOO" redefined [enabled by default]
 #define FOO 6
 ^
In file included from bar.c:1:0:
foo.h:1:0: note: this is the location of the previous definition
 #define FOO 4
 ^
  • This is compiler-specific. Redefining an object-like macro is illegal unless the redefinition is "the same" (there's a more technical specification for that, but it's not important here). Illegal code requires a diagnostic and, having issued a diagnostic (here a warning), the compiler is free to do anything, including compile the code with implementation-specific results. Sep 4, 2015 at 23:50
  • If you have conflicting defs for the same macro, wouldn't you rather get the warning in most cases? Rather than silently using the first definition (because the 2nd one uses an #ifndef to avoid redefining). Sep 5, 2015 at 5:33
  • @PeterCordes Most of the time, definitions under #ifndef are used as "fallback" or "default" values. Basically, "if the user configured it, fine. If not, let's use a default value." Sep 5, 2015 at 13:28
  • @Angew: Ok, so if you have some #defines in a library header that are part of the library's ABI, you should not wrap them in #ifndef. (Or better, use an enum.) I just wanted to make it clear that #ifndef is only appropriate when it's ok for one compilation unit, but not another, to have a custom definition of something. If a.c includes headers in a different order than b.c, they might get different definitions of max(a,b), and one of those definitions might break with max(i++, x) while the other uses temporaries in a GNU statement-expression. Still confusing at the least! Sep 5, 2015 at 23:24
  • 1
    @PeterCordes What I like to do in that case is #ifdef FOO #error FOO already defined! #endif #define FOO x
    – Cole Tobin
    Sep 6, 2015 at 7:41

I do not know the context, but this can be used to give the user the ability to override the values set by these macro definitions. If the user explicitly defines a different value for any of these macros, it will be used instead of the value given here.

For instance, with g++ you can use the -D flag at compile time to set a macro's value.


This is done so that the user of the header file can override the definitions from their own code or via the compiler's -D flag.


Any C project consists of multiple source files. When you work with a single source file, the checks may seem pointless, but in a large C project it is good practice to check for an existing definition before defining a constant. The idea is simple: you need the constant in this particular source file, but it may already have been defined in another.


Think of a framework/library that ships with default presets that let the user compile it and work with it out of the box. These defines are spread across different files, and the end user is advised to include their own config.h file, where they can configure the values. If the user forgets a define, the system keeps working thanks to the presets.

1

Using

#ifndef BEEPTRIM_PITCH_RATE_DEGPS
#define BEEPTRIM_PITCH_RATE_DEGPS                   0.2f
#endif

allows the user to define the value of the macro using the command line argument (in gcc/clang/VS) -DBEEPTRIM_PITCH_RATE_DEGPS=0.3f.

There is another important reason: it is an error to redefine a preprocessor macro with a different value. See this answer to another SO question. Without the #ifndef check, the compiler would diagnose the redefinition if -DBEEPTRIM_PITCH_RATE_DEGPS=0.3f were passed on the command line.
