sizeof ptrs, lint, etc
Doug Gwyn <gwyn>
gwyn@brl-tgr.ARPA
Wed Feb 6 11:33:55 AEST 1985
> I hate lint. All it ever does is complain about code that I know works.
> I don't like casting funxions to (void). I don't like casting arguments
> to funxions. I don't like /*NOTREACHED*/. I do `if (exp) return exp;'
> to avoid the braces when I really mean `if (exp) { exp; return;}'
> I don't declare args as ptrs if I merely pass them on to another funxion.
> Even the UNIX REVIEW they gave away at Uniforum says that void just makes
> programs harder to read. What I do with lint is sweep it under the rug.
If "lint" (assuming a modern version such as the one I use) complains
about your code, then if it works it is most likely an ACCIDENT of the
particular C implementation you are working with and would not work on
some other system with considerably different machine architecture.
Casting function return values to (void) not only documents the fact
that the function returns a value which you are discarding; if done as
a natural action, and not as a blind response to "lint" warnings, it
also shows that you have considered the return value and have decided
that it is not needed from that invocation of the function.
Far too much C code fails to test for failure of functions such as
write() or even malloc(). Not considering what should be done in such
cases is pure sloppiness.
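For example, a minimal sketch of what that consideration looks like in
practice ("fatal" here is a hypothetical print-and-exit routine, not
anything standard; "getbuf" and "putbuf" are likewise invented names):

    #include <stdio.h>

    extern char *malloc();
    extern int fatal();     /* hypothetical: print message, exit */

    char *
    getbuf(n)
    int n;
    {
        char *p;

        if ((p = malloc((unsigned) n)) == NULL)
            (void) fatal("out of memory");  /* no point going on */
        return p;
    }

    putbuf(fd, buf, n)
    int fd, n;
    char *buf;
    {
        if (write(fd, buf, n) != n)
            (void) fatal("write error");    /* short or failed write */
        (void) close(fd);   /* value considered, not needed */
    }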
Believe it or not, C strongly supports data typing; many of us in the
world of production software think that this is a Good Thing. If you
need to pass a (struct foo *) to a function, then giving it an (int) or
a (struct bar *) is simply wrong. If you write your code cleanly, you
should seldom need to cast function arguments.
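As an illustration (a sketch with invented names; a 1985 compiler may
well accept the bad call, but "lint" will not):

    struct foo { int f; };
    struct bar { int b; };

    frob(fp)                /* takes a (struct foo *) */
    struct foo *fp;
    {
        return fp->f;
    }

    main()
    {
        struct foo f;
        struct bar b;

        f.f = b.b = 0;
        (void) frob(&f);    /* right: the types match */
        (void) frob(&b);    /* wrong: a (struct bar *) is not a
                               (struct foo *); "lint" flags it */
        return 0;
    }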
/*NOTREACHED*/ should be unnecessary in many cases, were "lint" a bit
smarter. But how is "lint" to know that abort() and exit() never
return? Presumably some other mechanism could be devised to handle
these situations. In any case, /*NOTREACHED*/, /*ARGSUSED*/,
/*VARARGS2*/, and other such pragmas add useful documentation to the
source code.
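For instance (a sketch; the handler shape is merely illustrative, and
/*VARARGS2*/ would similarly precede a function whose arguments beyond
the second vary):

    #include <stdio.h>

    int done = 0;

    /*ARGSUSED*/
    onintr(sig)
    int sig;        /* demanded by the interface, unused here */
    {
        done = 1;
    }

    usage()
    {
        (void) fprintf(stderr, "usage: prog file\n");
        exit(1);
        /*NOTREACHED*/
    }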
There is no rational defense for "return exp" when that is not what you
mean for your code to be saying to the reader. (The same applies to
using 3 XORs to swap the contents of variables under normal circumstances.)
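A small sketch of the difference ("finish" and "cleanup" are invented
names; both forms compile, but only one says what is meant):

    extern int cleanup();

    /* Clear: cleanup()'s value is deliberately discarded. */
    finish(err)
    int err;
    {
        if (err) {
            (void) cleanup();
            return;
        }
    }

    /* Misleading: reads as though cleanup()'s value mattered to
       the caller, which it does not ("lint" will also note the
       inconsistent returns). */
    finish2(err)
    int err;
    {
        if (err)
            return cleanup();
    }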
Undeclared function arguments default to int, which for very sound
reasons on many architectures may not be handled the same way as the
various pointer types. By not declaring your pointers, you guarantee
that your code will not port to some machines.
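A sketch of the hazard (invented names):

    extern char *strcpy();

    char name[100];

    /* Right: the parameter is declared as the pointer it is. */
    setname(s)
    char *s;
    {
        (void) strcpy(name, s);
    }

    /* Wrong: had "char *s;" been omitted, s would silently default
       to int, and on a machine where sizeof(int) != sizeof(char *)
       the function would receive garbage. */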
The quote from UNIX REVIEW is in Bill Tuthill's "C Advisor" column
and is his opinion. I have heard others claim that (void) correctly
used makes code EASIER to read, or at least to maintain.
There should be no lint to sweep under the rug if you have done your
coding properly. If you find yourself sweeping lint under the rug,
then you haven't understood what strong typing and "lint" are all
about.
> LET'S GET BACK TO BASICS!!! What have they done to the poor C language?
> It used to be quite clean. It was originally developed on a pdp-11, ported
> to an Interdata 8/32, Honeywell 6000, & IBM 370. On each of these machines,
> either sizeof(int) = sizeof(int *) = sizeof(??? *)
> or sizeof(long)= sizeof(long*) = sizeof(??? *).
> Of these, the h6000 is not byte addressable & so the bizarre pointer
> format of K&R page 211 is used. Note that a pointer is always 36 bits
> even tho half may be unused. So far, so good. Now somewhere along the line
> someone broke the rule & decided that maybe pointers to different objects
> should be different lengths. The pros? Possible storage savings. The cons?
> Now I have to cast pointers. The universe is out of kilter! What about
> pointers to pointers? Is sizeof(int **) = sizeof(char **)? We all want
> C & UNIX to run everywhere, but let's not bend over backwards to
> accommodate weird architectures. If space is sometimes wasted on a
> weird machine, it is for conceptual simplicity. When & if a prog is
> ported to a bizarre machine it will probably have to be tinkered with
> anyway. ALL THINGS IN MODERATION, INCLUDING PORTABILITY.
You DON'T have to cast pointers if you use them right (with a few
exceptions, such as the "generic" aligned pointers returned from
malloc()).
Who CARES whether sizeof(int **) == sizeof(char **)? It doesn't
matter in correctly-written code. (The only real exception is taken
care of by the varargs mechanism, which is clearly necessary to cope
with this on real machines.)
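For example (a sketch; pre-ANSI malloc() returns a generic, suitably
aligned (char *)):

    extern char *malloc();

    struct foo { int f; };

    struct foo *
    newfoo()
    {
        /* the one routine legitimate cast: converting malloc()'s
           generic (char *) to the type actually needed; the caller
           must still test for NULL */
        return (struct foo *) malloc(sizeof(struct foo));
    }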
The distinction among different data types should ALWAYS matter to
a conscientious programmer, even when the machine is forgiving. If
your data types do not match, you have written incorrect code! Now
that there really are architectures where effective C implementations
have to enforce the distinctions, there is additional reason for taking
care with this matter.
I consider it a BUG, and a sign of insufficient care taken in crafting
the original code, if the vast majority of one's C code (other than
obviously system-specific functions) does not port to another quite
different architecture UNCHANGED. Of course, for this to work really
well the code has to be targeted at a carefully-specified environment
such as that described in the forthcoming ANSI C standard, the /usr/group
1984 standard, or the System V Interface Definition (all of which are
very much inter-compatible, thankfully).
> The nil pointer in C *IS* a bit pattern of some size all zeros. This
> is not lisp. If you want to generate a cell called `nil' & explicitly
> compare to `(??? *) &nil' be my guest. The syntax `if (p)' or `if (!p)'
> suits me just fine.
Sorry, the null pointer is not necessarily a 0 bit pattern. We
discussed this at length a few months ago. There is no such thing as
a "generic" null pointer that works on all reasonably interesting
architectures.
In summary, strong data typing is not a conspiracy. It is required
in order to develop correct, reasonably portable applications in C.
I and others I know of have no trouble using data
types in accordance with the current (and forthcoming) rules. Indeed,
we often find that by being careful in this regard, we can use "lint"
to catch errors that otherwise would go unnoticed (and that may even
work on the system upon which the code is first developed). As
professionals we cannot afford to indulge in sloppy workmanship at the
expense of long-term maintenance costs and loss of product reliability.
As a dedicated amateur hacker, you clearly do not care about such
issues, so why not just be quiet, do your own thing, and let the rest
of us write our code in peace (knowing that we won't have much trouble
from it in the future)?