problems/risks due to programming language, stories requested
Peter H. Golde
phg at cs.brown.edu
Thu Mar 1 11:04:47 AEST 1990
In article <1990Feb28.213543.21748 at sun.soe.clarkson.edu> jk0 at image.soe.clarkson.edu (Jason Coughlin) writes:
>From article <6960 at internal.Apple.COM>, by chewy at apple.com (Paul Snively):
>> For what it's worth, my personal opinion is that C lends itself to
>> precisely the kinds of errors noted above--when does break work and when
>> doesn't it, and why in God's name do you need it in switch statements in
>> the first place, etc.
>
> Gee, if you read the language defn you'd know exactly when break
>applies and when break doesn't. It seems to me that it is the
>programmer's responsibility to know the language in which he is going to
>implement said project -- it's not necessarily the language's responsibility
>to know the programmer didn't read the defn.
However, every programmer, no matter how good, makes stupid mistakes -- ones
where s/he knew better but slipped up anyway. These might be simple syntax
errors, left-out statements, etc. The higher the percentage of these errors
the compiler catches, the more reliable the program will be and the less
time it will take to debug. This is why redundancy in a language can be a
good thing.
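A forgotten break in a switch is a classic instance of the sort of slip I
mean; the fragment below (the names are invented for illustration) compiles
without complaint, yet the ADD case silently falls through into SUBTRACT:

switch (op) {
case ADD:
	result = a + b;		/* missing break: control falls through */
case SUBTRACT:
	result = a - b;
	break;
}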
The C language might be made "simpler" if all undeclared variables were
automatically declared as auto int, thus saving the need for "useless"
declarations. I would not like to program in such a language -- would you?
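To see why, consider what that hypothetical rule would let through (the
variable names here are invented for illustration):

int count;
/* ... */
count = 0;
conut = count + 1;	/* misspelling: under the hypothetical rule this
			   quietly creates a new variable instead of
			   drawing an error */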
To take a more "real-life" example, I have, at times, mis-typed
a C program as follows:
c = foo(d); /* update count of flibbets *
bam_whiz(c, d); /* and propagate change to zip module */
return;
If I had used another language, this error would have been caught by
the compiler. Clearly this is a small example, but it illustrates
my point: some languages and compilers permit a larger percentage
of minor errors to slip through than others.
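(For the record, the slip is the missing "/" that should close the first
comment: the comment runs on to the "*/" at the end of the next line and
silently swallows the call to bam_whiz(). The intended code, with the
comment properly closed, is:

c = foo(d);	/* update count of flibbets */
bam_whiz(c, d);	/* and propagate change to zip module */
return;

Some compilers and lint implementations will warn when "/*" appears inside
a comment, which is precisely the sort of redundant check I am arguing for.)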
--Peter Golde