preventing sign extension
David MacKenzie
mackenzi at thor.stolaf.edu
Tue Jan 10 15:40:30 AEST 1989
Now that my confusion about varargs has been taken care of, I have run
into another problem with "my" port of Holub's integer doprnt to Unix
(it's getting to be a group project . . .).
Here are some code fragments:
/*
* INTMASK is a portable way to mask off the bottom N bits
* of a long, where N is the width of an int.
*/
#define INTMASK (long) ((unsigned) (0))
:
:
long lnum; /* used to hold numeric arguments */
:
:
/* Fetch a long or int sized argument off the
* stack as appropriate. If the fetched number
* is a base 10 int then mask off the top
* bits to prevent sign extension.
*/
:
:
[skip to where it gets an int-sized argument]
lnum = (long) va_arg (args, int);
:
:
[if (print in unsigned format)]
lnum &= INTMASK;
I'm not quite sure what this was supposed to do, but on my 68000 box
with Green Hills C, it does what it looks like it's doing: a bitwise
AND with zero. The result is that
printf ("%d\n", 4);
produces the correct output, but
printf ("%u\n", 4);
displays 0.
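My best guess (and it is only a guess) is that a "~" got dropped
somewhere between Holub's original and my copy, and that the macro
was meant to read (long) ((unsigned) ~0). Here is a little
standalone program I put together to test that theory; the test and
the expected outputs in the comments are mine, not Allen's:

#include <stdio.h>

/* My guess at the intended definition: ~0, not 0.
 * On a machine with 16-bit ints and 32-bit longs (MS-DOS),
 * (unsigned) ~0 is 0xFFFF, and widening that to long gives
 * 0x0000FFFFL: a mask that keeps the bottom 16 bits.
 * On a 68000 Unix box, int and long are both 32 bits wide,
 * so the mask comes out all ones (strictly speaking that
 * conversion is implementation-defined, but every compiler
 * I know of preserves the bit pattern) and the & is a no-op.
 */
#define INTMASK ((long) ((unsigned) ~0))

main ()
{
    long lnum;

    lnum = (long) -4;       /* sign extension sets the top bits */
    lnum &= INTMASK;
    printf ("%ld\n", lnum); /* 65532 under MS-DOS, -4 on the 68000 */
    return 0;
}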
I'm a bit fuzzy on the conditions under which sign extension takes
place; I had hoped I'd never have to deal with it. Would any of
you C wizards like to explain what Allen was trying to do, and
why it presumably works under MS-DOS but doesn't under Unix? And
how to make it work under Unix?
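In case it helps anyone answer, here is my current understanding of
the rules, boiled down to another standalone test (the example is my
own, and I'd be glad to be corrected):

#include <stdio.h>

/* What I think the rule is: widening a signed int to a long
 * sign extends (the sign bit is copied into the new top bits);
 * widening an unsigned int zero fills.  So on a 16-bit int
 * machine this should print ffffffff and then ffff.  On a
 * 32-bit int machine there are no new bits to fill and both
 * lines print ffffffff.
 */
main ()
{
    int i = -1;             /* all 16 (or 32) bits set */
    unsigned u = ~0;        /* likewise */

    printf ("%lx\n", (long) i);
    printf ("%lx\n", (long) u);
    return 0;
}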
David MacKenzie
edf at rocky2.rockefeller.edu