Using MSC 5.0 time functions
Derek Morgan
C03601DM%WUVMD.BITNET at cunyvm.cuny.EDU
Mon Feb 8 03:54:16 AEST 1988
I have a routine called delay(), which uses the system clock to create a
timed pause that should be independent of CPU speed. The problem is that
the delay is never consistent, and it is rarely the length it should be.
Even allowing for the ~120 ms error margin, the length of the delay is
essentially random. Any ideas on why this is so? The code is below.
Please forgive any lack of C eloquence; I am still learning the ins and
outs of the language. Direct responses are preferred, because I receive
the C net in digest format and it may take a week or more to see your
reply :-)
#include <sys\types.h>
#include <sys\timeb.h>

void delay(len)
long len;
{
    struct timeb xtime;
    long t0, t2;

    ftime(&xtime);
    t0 = (long)xtime.millitm + xtime.time * 100L;
    do
    {
        ftime(&xtime);
        t2 = (long)xtime.millitm + xtime.time * 100L;
    } while (t2 - t0 < len);
}
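
For what it's worth, here is a rough sketch of the conversion I had in mind
(assuming struct timeb really does hold whole seconds in time and the
millisecond remainder in millitm; the now_ms() name is only for
illustration):

    #include <sys\types.h>
    #include <sys\timeb.h>

    /* Return the current time of day as a single millisecond count.  */
    /* The seconds field is scaled by 1000L before the millisecond    */
    /* remainder is added; ftime() under DOS only advances in clock   */
    /* ticks of roughly 55 ms, so finer resolution is not available.  */
    long now_ms()
    {
        struct timeb tb;

        ftime(&tb);
        return tb.time * 1000L + (long)tb.millitm;
    }

delay() is meant to record one such value at entry and then spin until the
difference reaches len.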
Thanks in advance,
Derek Morgan (C03601DM at WUVMD.BITNET)
Also, thanks for any replies to my earlier posting; I haven't gotten the
digest yet. The problem there turned out to be a proofreading error that
didn't raise any syntax errors or warnings. Unfortunately, I don't have
access to a lint-type program, although it wouldn't have found this one
anyway.