is it really necessary for character values to be positive?
Karl Heuer
karl at haddock.UUCP
Fri Jan 9 18:15:09 AEST 1987
In article <548 at brl-sem.ARPA> ron at brl-sem.ARPA (Ron Natalie <ron>) writes:
>In article <289 at haddock.UUCP>, karl at haddock.UUCP (Karl Heuer) writes:
>> In article <39 at houligan.UUCP> dave at murphy.UUCP writes:
>> >Summary: invent an 8-bit character set and let some of them be negative
>>
>> Suppose I am using such a system, and one of the characters -- call it '@'
>> -- has a negative value. The following program will not work:
>> main() { int c; ... c = getchar(); ... if (c == '@') ... }
>
>Getchar returns int. The int has a character in it. Before trying to
>use it as such, you ought to either place it back into a character
>variable explicitly or use a cast to char...
>
> main() { int c; ... c = getchar(); ... if ((char) c == '@') ... }
That's one way to "fix" the problem, but the construct I wrote is valid by
current standards and is a common idiom. I don't think programmers would like
having to cast* the result of getchar() back into char before using it!
Your suggestion does make sense logically, though, and I think it supports my
contention that making getchar() an int function was a mistake in the first
place.**
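[Editor's sketch] The disagreement is easier to see with concrete numbers. A minimal sketch, assuming the imagined 8-bit character set gives '@' the code 0xC0 and that plain char is a signed 8-bit type (neither assumption is in the original exchange; `matches_plain` and `matches_with_cast` are illustrative names):

```c
#include <stdio.h>

/* Hypothetical values: if '@' had code 0xC0 in the imagined set, then
 * on a signed-8-bit-char machine the constant '@' would have the value
 * -64, while getchar() still returns the byte as the nonnegative int
 * 192 (an unsigned char converted to int).  signed char is used below
 * so the sketch behaves the same everywhere. */
int matches_plain(int from_getchar)
{
    signed char at = (signed char)0xC0;   /* stand-in for '@': -64 */
    return from_getchar == at;            /* 192 == -64: false */
}

int matches_with_cast(int from_getchar)
{
    signed char at = (signed char)0xC0;       /* stand-in for '@' */
    return (signed char)from_getchar == at;   /* Ron's cast: -64 == -64 */
}
```

With `from_getchar` set to 0xC0 (what getchar() would hand back for that byte), the plain comparison fails and the cast form succeeds, which is exactly the behavior both articles are arguing about.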
Karl W. Z. Heuer (ima!haddock!karl or karl at haddock.isc.com), The Walking Lint
*Of course, the cast is done *after* testing for EOF.
**I do have what I think is a better idea, but I'm not going to describe it in
this posting. Anyway, it's too late to change getchar() now.
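[Editor's sketch] The ordering in footnote * -- compare against EOF while the value is still an int, and only then narrow to char -- can be sketched as below. The helper `count_ats` and the use of getc() on an arbitrary stream are illustrative, not from the posting:

```c
#include <stdio.h>

/* Sketch of the footnoted idiom: keep getc()'s result in an int,
 * test for EOF first, and apply the cast to char only afterwards.
 * An int can hold every unsigned-char value plus EOF; a char cannot. */
int count_ats(FILE *fp)
{
    int c;                            /* int: EOF lies outside 0..255 */
    int n = 0;

    while ((c = getc(fp)) != EOF) {   /* EOF test done on the int */
        if ((char)c == '@')           /* the cast, after the test */
            n++;
    }
    return n;
}
```

Casting before the EOF test would be wrong the other way around: on a signed-char system, a legitimate byte could compare equal to EOF and truncate the input early.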
More information about the Comp.lang.c mailing list