Null revisited (briefly)
John Woods
john at frog.UUCP
Mon Feb 6 12:25:00 AEST 1989
In article <13068 at steinmetz.ge.com>, davidsen at steinmetz.ge.com (William E. Davidsen Jr) writes:
> Some people put a zero in a char using NULL, such as "ffoA[n]=NULL".
> I think this is portable for either type, but I have seen compilers
> which warn about type conversion if NULL is a pointer.
If NULL is #defined as 0, this will work, but if it is #defined as (void *)0
(which I think is the other ANSI-legal #definition; if you don't know, assume
that only 0 works, if you do know, quietly correct me by mail), this is not
guaranteed to work: on a sufficiently bizarre architecture, the nil pointer
need not have an all-zero-bits representation, as long as the compiler can
(a) properly recognize that the constant 0 must be coerced to the nil-pointer
pattern, and (b) coerce the (void *) nil pointer to the (foo *) nil pointer
for all types "foo". The compiler is not obligated to know how to turn
(void *)0 back into 0.
Myself, I usually use #define NUL '\000', named after the ASCII NUL character,
though I often worry about misunderstandings between NUL and NULL.
--
John Woods, Charles River Data Systems, Framingham MA, (508) 626-1101
...!decvax!frog!john, john at frog.UUCP, ...!mit-eddie!jfw, jfw at eddie.mit.edu
Presumably this means that it is vital to get the wrong answers quickly.
Kernighan and Plauger, The Elements of Programming Style