"Noalias" warning and questions
Chris Calabrese[rs]
cjc at ulysses.homer.nj.att.com
Sat Feb 13 00:28:43 AEST 1988
In article <8012 at elsie.UUCP>, ado at elsie.UUCP writes:
> X3J11 has now mailed out copies of the January 11, 1988 versions of the
> Draft Proposed C Standard and the Rationale.
>
> It looks as if folks planning to make use of standard library functions
> won't have the luxury of simply ignoring "noalias". The Rationale (which
> contains what were to me enlightening words on "noalias") notes that
> "Declaring a pointer-type function parameter to be noalias enjoins the caller
> of that function to assure that the object referenced by that pointer overlaps
> with no other parameter object." Since many standard library functions
> have prototypes that declare arguments to be "noalias", you'll need to
> ensure that there's no such overlap when you call them.
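To make that obligation concrete, here is a minimal sketch (assuming, for the
sake of illustration, that the draft's library prototype marks strcpy's
pointer arguments noalias -- I haven't checked the January draft itself):
a call whose source and destination overlap would put the caller in the
wrong, while memmove, which is specified to allow overlap, stays safe.

	#include <stdio.h>
	#include <string.h>

	int main(void)
	{
		char buf[16] = "abcdef";
		char copy[16];

		/* Fine: source and destination do not overlap. */
		strcpy(copy, buf);

		/* Suspect under the draft: if strcpy's parameters are
		 * declared noalias, the caller must guarantee the objects
		 * do not overlap -- but buf + 2 points into buf itself. */
		/* strcpy(buf + 2, buf); */

		/* memmove is defined to handle overlapping regions. */
		memmove(buf + 2, buf, 4);

		printf("%s\n", buf);
		return 0;
	}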
This is all getting a little ridiculous. When I use C, it's because
I want an extremely close coupling with the operating environment,
and I put all kinds of #define and #if and #ifdef sections into my
code to make it portable. If people really want optimizing compilers
for a multiprocessor environment, they shouldn't use C! That's not
what it was designed for.
If you are doing numerical programming, there are plenty of purely
functional languages with all of C's flexibility for user-defined
data types, and with user-defined data objects for you object-oriented
types. These languages can be analyzed in intricate detail by the
compiler, so that it can do all sorts of optimizations.
C is designed for systems work. It is also designed to run on UNIX.
If you are not doing both of these, you should consider other
languages.
The ANSI committee should not mire a perfectly good systems language,
one designed to be the root of a particular operating system, in all
sorts of standards just so it can be used in environments and for
applications it has no business being in.
C was a great language when it was developed (almost 20 years ago,
for those of you who think it's the new language on the block
[that title probably belongs to Pascal, which is only 14 years old]),
but language design has taken many new directions since then, and
if you want a flexible language that can be used in all environments
and for all applications, C just isn't it.
I'm not saying that I don't use C for 95% of the programming
that I do, or that other people should drop it like a hot
potato. I'm saying that miring a perfectly good language
just so you can get it to do a few more things is wrong.
We're not trying to create another Ada, are we?
Chris Calabrese
AT&T Bell Labs
ulysses!cjc