side effects inside sizeof
Henry Spencer
henry at utzoo.UUCP
Sun May 13 09:26:04 AEST 1984
Here's an entertaining oddity in C. Try running the following under
your favorite C compiler:
----
#include <stdio.h>

main(){
	int x = 1;
	printf("%u\n", sizeof(x++));	/* just the size of an int */
	printf("%d\n", x);		/* one might expect 2 here */
}
----
The result of the first printf is uninteresting, just whatever the
size of an integer is on your machine. The result of the second
printf is the good part. It looks like x should have been incremented
by the x++, so it should print 2. Surprise -- on every system
handy locally, it's 1! (This includes V7, 4.1BSD, 4.2BSD, Amdahl UTS,
and several varieties of 68K.)
The fact is, most C compilers don't generate code for the expression
inside sizeof at all. Once they know what type it is, the result of
the sizeof is fully defined, so "of course" the details of the operand
no longer matter. But nowhere in K&R is there anything that would
permit this wanton disregard of side effects, unless you really work
the statement "...this expression is semantically an integer constant..."
hard.
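To see how thoroughly the operand is ignored, here is a minimal sketch
of the same effect with something more drastic than x++. (The helper
bump() is mine, for illustration only; the point is that its body is
never executed, since the compiler needs only the type of bump(&x).)
----
#include <stdio.h>

int bump(int *p)	/* would increment *p, if it ever ran */
{
	return ++*p;
}

int main(void)
{
	int x = 1;
	/* bump(&x) is never called; sizeof needs only the type of the call */
	printf("%u\n", (unsigned) sizeof(bump(&x)));
	printf("%d\n", x);	/* still prints 1 */
	return 0;
}
----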
I agree that the behavior as implemented is reasonable and should be
classed as "correct", but this needs to be more explicitly documented.
Anybody know whether the ANSI C committee has noticed this one?
--
Henry Spencer @ U of Toronto Zoology
{allegra,ihnp4,linus,decvax}!utzoo!henry