assigning an integer to the negation of ...
Peter Desnoyers
desnoyer@Apple.COM
Thu Dec 22 03:03:05 AEST 1988
In article <15090@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
>In article <1911@pembina.UUCP> lake@alberta.UUCP (Robert Lake) writes:
>> i = -(unsigned short)j;
>[where i is an int and j is 1---j's type is irrelevant]
>
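For the archive, here is a minimal sketch of the trap. The long
destination and the comparison are my additions (the posted code
assigned to an int, where two's-complement conversion usually masks
the difference), but j is 1 as in the original:

    #include <stdio.h>

    int main(void)
    {
        unsigned short j = 1;
        long i;

        /*
         * Where int is wider than short and the compiler uses the
         * ANSI value-preserving rules, (unsigned short)j promotes
         * to signed int, so the negation yields -1 and i is -1L.
         * Where int is 16 bits the operand promotes to unsigned
         * int instead, the negation wraps to 65535, and widening
         * preserves that value: i is 65535L.
         */
        i = -(unsigned short)j;
        printf("i = %ld\n", i);

        /* The same difference decides the sign of a comparison: */
        if (-(unsigned short)j < 0)
            printf("promoted to signed int\n");
        else
            printf("promoted to unsigned int\n");
        return 0;
    }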
It was pointed out that the context in which this problem arose was
TCP software - I would bet it was the window size calculations.
Quoting V. Jacobson and R. Braden from RFC 1072 (TCP Extensions for
Long-Delay Paths):
    The TCP header uses a 16 bit field to report the receive window
    size to the sender.  Therefore, the largest window that can be
    used is 2**16 = 65K bytes.  (In practice, some TCP
    implementations will "break" for windows exceeding 2**15,
    because of their failure to do unsigned arithmetic.)
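To make the RFC's parenthetical concrete, here is a sketch of the
failure it describes (the variable names and the 40000-byte window
are mine, picked only because the value exceeds 2**15):

    #include <stdio.h>

    int main(void)
    {
        /* a window above 2**15, as received in the 16-bit
           header field */
        unsigned short advertised = 40000;

        /*
         * A TCP that copies the field into a signed 16-bit
         * variable (or into int on a 16-bit machine) sees the
         * window as negative: typically -25536 on a two's-
         * complement machine, since the conversion is
         * implementation-defined.
         */
        short broken = (short)advertised;

        printf("unsigned window: %u\n", (unsigned)advertised);
        printf("signed window:   %d\n", (int)broken);
        if (broken <= 0)
            printf("broken TCP thinks the window is closed\n");
        return 0;
    }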
I would also guess that the broken TCP implementations actually try to
do unsigned arithmetic, but don't get it right, as in the original,
subtly flawed example.
Peter Desnoyers