How to convert a char into an int from 0 through 255?
Karl Heuer
karl at haddock.ima.isc.com
Fri Jan 12 05:02:02 AEST 1990
In article <7544 at cs.utexas.edu> Dan Bernstein <stealth.acf.nyu.edu!brnstnd at longway.tic.com> writes:
>((int) ch) & 0xff works and answers my original question, but it won't handle
>machines with more than 256 characters. There's no compile-time way to find
>the right bit pattern---UCHAR_MAX + 1 may not be a power of two.
I believe it is required that all of the U{type}_MAX constants be one less
than a power of two.
I used to think that U{type}_MAX+1 == 1 << (sizeof(type)*CHAR_BIT) was
required, but it seems that Cray has good reasons for violating that identity.
I suppose this is another item to send to X3J11 for interpretation.
Karl W. Z. Heuer (karl at haddock.isc.com or ima!haddock!karl), The Walking Lint
More information about the Comp.std.c mailing list