Must casting destroy lvalueness?
Jerry Leichter
leichter at yale.UUCP
Thu Nov 6 04:35:34 AEST 1986
All this has really gotten out of hand. I've found I've wanted to use casts
as lvalues in exactly one situation: Incrementing a pointer by an amount
that is NOT to be taken as a multiple of the size of the type pointed to.
A "common" - NONE of these are what I'd really want to call common, but I
HAVE run into them more than once - case is in a (machine dependent!) storage
allocator that wants to make sure of proper alignment. This doesn't arise
in something like malloc, which wants a char * (void *, in ANSI C) anyway,
but in specialized applications. For example, I need to allocate stuff out
of a large array. The "stuff" will be of a known type - a struct whose last
element is a varying-length array - and will thus be varying in size. The
beginning of anything allocated must be on an even address. So I have a pointer
to some big structure that I'd like to increment by 1. Not 1*size of the
structure, but 1. YES, THIS IS MACHINE DEPENDENT - I got tied down by such
a dependency when I cast the pointer to int to look at the bottom bit!
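A minimal sketch of both maneuvers in standard C (the struct name "blob" and the
helper names are mine, and uintptr_t is from modern <stdint.h>; the byte
arithmetic is machine dependent, as stressed above):

```c
#include <stdint.h>

/* Hypothetical record type: a struct whose last member is a
 * varying-length array, so each allocation differs in size. */
struct blob {
    int  len;
    char data[1];          /* old-style "struct hack" tail */
};

/* Advance p by one BYTE, not by one sizeof(struct blob).
 * A cast is not an lvalue, so instead of the wished-for
 * "(char *)p += 1" we must detour through char * explicitly. */
struct blob *advance_one_byte(struct blob *p)
{
    return (struct blob *)((char *)p + 1);
}

/* Round p up to the next even address.  Using uintptr_t rather
 * than int avoids the truncation trap mentioned above, though
 * inspecting address bits is still machine dependent. */
struct blob *align_even(struct blob *p)
{
    uintptr_t u = (uintptr_t)p;
    return (struct blob *)((u + 1) & ~(uintptr_t)1);
}
```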
I can think of no particular use for casting of arbitrary lvalues, but in
situations as above, the following definition for a cast argument to op=
would be handy:
(type)a op= b
shall mean:
a = (type)a op b
(except that a is only evaluated once).
Pre- and post-decrement and increment should work in the obvious way. Note
that the type of (type)a op= b (and of ++(type)a, etc.) is the type of a,
NOT the type being cast to.
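In today's C the construct can only be approximated with an explicit
re-cast on assignment. A sketch, where the macro name CAST_PLUS_EQ is mine
(and note the macro evaluates a twice, unlike the single evaluation the
proposal requires):

```c
/* CAST_PLUS_EQ(T, a, b) approximates the proposed "(T)a += b" as
 * a = (type of a)((T)a + b).  Assigning through void * recovers
 * the type of a, matching the rule that the result has a's type,
 * NOT the cast-to type.  Works only when a is a pointer lvalue
 * free of side effects, since a is written out twice here. */
#define CAST_PLUS_EQ(T, a, b)  ((a) = (void *)((T)(a) + (b)))

struct big { double d[8]; };
```

Used as CAST_PLUS_EQ(char *, p, 3), this bumps a struct big pointer p by
three bytes rather than three structs.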
I can think of no real uses for this construct where "op" is anything but
"+" or "-".
-- Jerry
More information about the Comp.lang.c mailing list