integer sizes (getting warmer)
mwm at ucbtopaz.CC.Berkeley.ARPA
Sat Mar 9 15:04:21 AEST 1985
In article <8988 at brl-tgr.ARPA> gwyn at brl-tgr.ARPA (Doug Gwyn <gwyn>) writes:
>You failed to show the #define for "uint60" on e.g. a VAX.
>How are you going to port your code to a 32-bit machine?
With more difficulty than porting it to a 64-bit machine. The code will die
at compile time, pointing straight at the variables whose types are too
large for the machine. At that point, you can either restrict the problem
range (if possible) or add support for the long integers (perhaps via mp).
This is far better than silently producing spurious answers, which is what
you'd get without such a system. Of course, you also get the benefit of
using the smallest adequate type on machines with funny byte/short sizes,
*without* having to worry about portability.
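To make that concrete, here's a minimal sketch. The header name "sizes.h"
and the variable are hypothetical, not from any real system:

    #include "sizes.h"  /* hypothetical header of minimum-width typedefs */

    uint60 weight;      /* needs at least 60 bits */

    /* On a 64-bit machine, sizes.h typedefs uint60 onto some native
     * type and this compiles.  On a VAX, sizes.h simply omits the
     * typedef, so the compiler stops right here -- pointing at the
     * variable that won't fit -- instead of silently computing wrong
     * answers. */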
>By implementing your very-long-integer data types in a way
>that does not exceed the guaranteed minimum limits of any
>ANSI C conforming implementation, you would significantly
>improve the portability of your code.
First, 60 bits is *not* a "very-long-integer" data type. Second,
portability is the problem of the porter, not the original author. I *will
not* make my job significantly harder for purposes of portability. I do
cast things (more than the compiler insists on, actually), cast functions
to void, etc. But not using the capabilities of the machine/OS/compiler to
their fullest is silly. For instance, do you likewise recommend that I make
all external variable names six characters, monocase, because that's the
"guaranteed minimum"?
>There are well-known
>techniques for implementing extended-precision arithmetic
>(e.g. Knuth Vol. 2); having such routines in the standard
>library could be useful but forcing the compiler to handle
>this issue directly is counter to the general flavor of C.
I'm well aware of those techniques, having made use of Knuth in implementing
some of them. Unix (at least post-v7 unices) has just such a library,
/usr/lib/libmp.a. Or is this another one of the nice v7 tools that didn't
make it to System X.y.z.w?
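For the curious, here is a rough sketch of what using it looks like,
assuming the V7-style mp(3X) interface (MINT, itom, rpow, mout); the
header and exact names drift between unices, so check your own manual:

    #include <mp.h>

    main()
    {
        MINT *two = itom(2);    /* multiple-precision constant 2 */
        MINT *r = itom(0);      /* will hold the result */

        rpow(two, 100, r);      /* r = 2^100 -- well past 60 bits */
        mout(r);                /* print r in decimal */
        exit(0);
    }

Link it with something like "cc pow2.c -lmp".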
I don't think the compiler should have to handle the problem. I don't think
the person writing the code should have to worry about whether things will
fit in a short/long/int on every machine the code might be run on, either.
The include file full of typedefs is a reasonable solution for C, and the
best I've been able to come up with. If you've got a better solution, I'd
like to hear it.
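For concreteness, here's an excerpt of the sort of thing I have in mind.
Everything in it is illustrative -- the file name, the machine tests, and
which typedefs each machine gets:

    /* sizes.h -- map minimum-width names onto the smallest native
     * type that holds them.  A real version would have ~100 entries.
     */
    #ifdef vax          /* 8-bit chars, 16-bit shorts, 32-bit longs */
    typedef unsigned char   uint8;
    typedef unsigned short  uint16;
    typedef unsigned long   uint32;
    /* uint33 through uint64 deliberately left undefined: code that
     * declares one dies at compile time, as described above. */
    #endif

    #ifdef cray         /* a 64-bit-word machine, say */
    typedef unsigned char   uint8;
    /* ... */
    typedef unsigned long   uint60;
    typedef unsigned long   uint64;
    #endif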
>If you really want non-portable code (e.g., for reasons of
>speed), you can still do that too, but there is no reason
>the C language should be made considerably more elaborate
>just so you can code in terms of "uint60".
Adding an include file with ~100 typedefs is making the C language
"considerably more elaborate"????? I'm not proposing any changes to C *at
all*. Just adding some typedefs to the set that programs can expect to
find, like the "jmp_buf" typedef in <setjmp.h>.
The point of this wasn't to make code with long integers more readable, but
to make it possible to write code that depends on ints having some minimum
number of bits and is still both efficient and portable. If long ints were
the only problem, I'd be writing in a language that supports integers
properly, as opposed to C.
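One last illustration of that point, again with my hypothetical header:

    #include "sizes.h"

    uint17 count;   /* smallest type with at least 17 bits: an int on
                     * a VAX, a long on a 16-bit pdp11.  The same
                     * declaration is cheap and correct on both. */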
<mike