Int and Char
art at ACC.ARPA
Sat Jul 26 04:34:02 AEST 1986
> Don't C programmers know the difference between a char and an int?
> I get these great public domain programs off the net and I spend
> the next 2 weeks deciding which variables are declared as char
> but really mean short. You see, it's like this. Short, int, and
> long are all SIGNED. That means they can take on negative values.
> Chars are NOT SIGNED, there is no char called -1. If you want a
> variable that takes on negative values, use a short. It certainly
> makes your code easier for others to read as well as making it
> more portable.
>
> For my next question, why doesn't my C compiler accept declarations
> like 'signed char a;' or something like that? It says 'signed undefined'
> and dies. Is there no way to make up for the errors of others
> and artificially make chars come out signed? I am running sys V rel2
> on an att 3b20.
>
> Just had to get this off my chest.
To quote K&R 2.7:
"There is one subtle point about the conversion of characters
to integers. The language does not specify whether variables
of type char are signed or unsigned quantities. When a char
is converted to an int, can it ever produce a negative integer?
Unfortunately, this varies from machine to machine, reflecting
differences in architecture.
...
The definition of C guarantees that any character in the machine's
standard character set will never be negative, so these characters
may be used freely in expressions as positive quantities. But
arbitrary bit patterns stored in character variables may appear
to be negative on some machines, yet positive on others."
Most compilers I know of treat plain char as SIGNED and support declaring
"unsigned char".
<Art at ACC.ARPA>