64 bit architectures and C/C++
Blair P. Houghton
bhoughto at pima.intel.com
Sun Apr 28 12:03:50 AEST 1991
In article <168 at shasta.Stanford.EDU> shap at shasta.Stanford.EDU (shap) writes:
>It seems to me that 64 bit architectures are going to
>introduce some nontrivial problems with C and C++ code.
Nope. They're trivial if you didn't assume a 32-bit architecture,
which you shouldn't, since many computers still have 36-, 16-, 8-,
etc.-bit architectures.
>I want to start a discussion going on this topic. Here are some seed
>questions:
Here's some fertilizer (but most of you consider it that anyway,
at any time :-) ):
>1. Do the C/C++ standards need to be extended to cover 64-bit
>environments, or are they adequate as-is?
The C standard allows all sorts of data widths, and specifies
a scad of constants (#defines, in <limits.h>) to let you use
these machine-specific numbers in your code, anonymously.
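For example, a quick and dirty sketch (the printf chatter is mine;
the constants are straight out of <limits.h>):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* Nothing here is hard-coded; the headers supply every number. */
        printf("a char here is %d bits (CHAR_BIT)\n", CHAR_BIT);
        printf("a short holds up to %d (SHRT_MAX)\n", SHRT_MAX);
        printf("an int holds up to %d (INT_MAX)\n", INT_MAX);
        printf("a long holds up to %ld (LONG_MAX)\n", LONG_MAX);
        return 0;
    }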
>2. If a trade-off has to be made between compliance and ease of
>porting, what's the better way to go?
If you're compliant, you're portable.
>3. If conformance to the standard is important, then the obvious
>choices are
>
> short 16 bits
> int 32 bits
> long 64 bits
> void * 64 bits
The suggested choices are:
short <the shortest integer the user should handle; >= 8 bits>
int <the natural width of integer data on the cpu; >= a short>
long <the longest integer the user should handle; >= an int>
void * <long enough to specify any location legally addressable>
There's no reason for an int to be less than the full
register-width, and no reason for an address to be limited
to the register width.
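And if you need a guaranteed range, ask the preprocessor instead of
guessing; something like this sketch (the typedef name is my own
invention, not anything standard):

    #include <limits.h>

    /* Use a plain int where it can count to two billion,
     * otherwise fall back to long (which ANSI guarantees can). */
    #if INT_MAX >= 2147483647
    typedef int count32;
    #else
    typedef long count32;
    #endif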
An interesting side-effect of using the constants is that
you never need to know the sizes of these things on your
own machine; i.e., use CHAR_BIT (the number of bits in a char)
and `sizeof(int)' (the number of chars in an int) and you'll
never need to hard-code how many bits an int contains.
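In other words (another throwaway sketch):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* The machine tells you; no magic 16s or 32s in sight. */
        printf("an int here is %lu bits\n",
            (unsigned long)(sizeof(int) * CHAR_BIT));
        return 0;
    }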
>How bad is it for sizeof(int) != sizeof(long).
It's only bad if you assume it's not true. (I confess: I peeked.
I saw Chris' answer, and I'm not going to disagree.)
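The classic way to get bitten is something like this (a made-up
fragment, but I've seen the real thing in shipping code):

    #include <stdio.h>

    int main(void)
    {
        long n = 70000L;

        /* Wrong: "%d" expects an int.  It only *seems* to work on
         * machines where int and long happen to be the same size. */
        /* printf("%d\n", n); */

        /* Right: say what you mean and it works everywhere. */
        printf("%ld\n", n);
        return 0;
    }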
>4. Would it be better not to have a 32-bit data type and to make int
>be 64 bits? If so, how would 32- and 64- bit programs interact?
Poorly, if at all. Data transmission among architectures
with different bus sizes is a hairy issue and the cause of much aspirin.
The only portable method is to store and transmit the data
in some width-independent form, like Morse code or a text
format (yes, ASCII is 7 or 8 bits wide, but it's a
_common_ form of data-width hack, and if all else fails,
you can hire people to read and type it into your
machine).
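A crude sketch of the text-format idea (the two function names are
mine; a real protocol would dress this up considerably):

    #include <stdio.h>

    /* Ship a long as decimal text, so the receiver's word size
     * and byte order simply don't matter. */
    int put_long(FILE *out, long val)
    {
        return fprintf(out, "%ld\n", val);
    }

    int get_long(FILE *in, long *val)
    {
        return fscanf(in, "%ld", val);
    }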
>Looking forward to a lively exchange...
--Blair
"Did anyone NOT bring potato salad?"