Unix/C program modularity

Laura Creighton laura at l5.uucp
Fri Oct 18 05:01:31 AEST 1985

In article <637 at dicomed.UUCP> papke at dicomed.UUCP (Kurt Papke) writes:
>As a manager of programmers and engineers for 5 years and a practitioner
>previous to that, I have noticed a disconcerting problem that tends to arise
>in software systems designed under the Unix family of operating systems.
>The problem I have observed is that:
>	Applications programs that have been designed to run under Unix
>	tend to have a low percentage of re-usable code.
>My observations are based on inspection of graphics applications (which
>Dicomed is in the business of producing) which tend to be predominantly
>user-interface stuff.  I am specifically NOT commenting on code used to
>support program development, operating systems, and tools, but rather
>applications programs that are used in a graphics production environment.
>Why might this be the case ??  Further inspection of much code shows that
>applications designed for the Unix environment tend to follow the spirit
>of the Unix operating system: design your system as a series of small
>programs and "pipe" them together (revelationary!)

Boy!  Maybe I should go work for you.  The applications that I see the most
are these huge megaliths which reinvent the wheel all the way down the line.
There is a body of people who think that the best sort of application program
is one that does everything.  As a result you see programs which show the
strain marks as everything including the kitchen sync was jammed in to fit.

People who are working on fairly hostile O/S's like MS/DOS may be absolutely
correct in this perception for their environment, but every time I find a
unix application program that reimplements strcpy **AGAIN** or atoi or
any number of other things...but I digress. 

>As a result of this philosophy to design systems as a network of filters
>piped together:
>	o Much of the bulk of the code is involved in argument parsing,
>	  most of which is not re-usable.

I think that you have missed out on the unix design philosophy here. There
is nothing sacred in filters, per se. A good filter does one job well. If
people are rewriting the argument parsing for every new application, rather
than reusing existing argument parsing, then either argument parsing cannot
be done by a standard filter, or you have not written the filter you need
yet.  For a long time *all* unix programs did their own argument parsing.
Now we have getopt(3) <and some earlier programs which cannot be conveniently
converted to use getopt, alas>.  Getopt solves the parsing problem -- noone
need ever write an argument parser for a standard unix program again.

If your application programs are such that it is possible that one general
parser could parse all or most of them, then you should write that and then
you will have the reusable code that you want.  If your applications are
not structured this way and cannot be restructured this way, then you
will have to write an argument parser for each application. But I fail to see
that you are going to avoid this problem if you write it in any other style -
it seems inherent in the nature of such applications.

>	o Error handling is minimal at best.  When your only link to the
>	  outside world is a pipe, your only recourse when an error
>	  occurs is to break the pipe.

This is *wrong* *wrong* *wrong*.  Most unix programs do not check for errors,
it is true, but this is because the programmers are either sloppy or do
not know how to check for errors.  See *Real Programs Dump Core* by Ian Darwin
and Geoff Collyer.  I think that this paper was in the winter 84 usenix, but
if I am wrong I am sure that they will both post corrections....

Most unix filters do not break at the pipes.  They break because
you run out of file descriptors, or because malloc fails, or because you
cannot open a file for some reason, or because you try to divide by zero.
All of these things can be, and should be checked.  There is nothing in the
unix philosophy which says that you have to be sloppy or lazy about this.

[More and more I am coming to the conclusion that the problem is not sloppiness
or laziness, just sheer ignorance, by the way.  Do the world a favour. Teach a
friend to check the return codes of system calls. Then teach him to use lint.]

>	o Programs do not tend to be organized around a package concept,
>	  such as one sees in Ada or Modula-2 programs.  The programs are
>	  small, so data abstraction and hiding seem inappropriate.  Also
>	  the C language support for these concepts is cumbersome, forcing
>	  the programmer to use clumsy mechanisms such as ".h" files and
>	  "static" variables to accomplish packaging tasks.

This is a real deficiency.  However, if you write your filters correctly, you
can view them as packages and treat them the same way. I have never used any
language which has modules for any serious work, but I have often wondered how
useful they actually are.  There are nights when I think that data abstraction
is a virtue because ``real programmers won't use lint'' and its chief virtue is
that it handles your casts for you.  Some modula-2 enthusiasts have agreed with
me about this, and I will probably get a lot of rotten tomatoes from the rest.
But I still don't know how to measure how useful classes and the like are.  I
don't know how to measure why I like programming in lisp more than programming
in C either, though.

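
For what it is worth, here is roughly what I mean by treating a C source
file as a package: hidden state behind "static", and a couple of exported
functions.  The names item_add and item_count are invented for the sketch:

```c
#include <stdio.h>

/* "private" part of the package: file-scope static, invisible elsewhere */
static int nitems = 0;

/* "public" interface; these declarations would go in the package's .h file */
void item_add(void)   { nitems++; }
int  item_count(void) { return nitems; }

int main(void)
{
    item_add();
    item_add();
    printf("items=%d\n", item_count());
    return 0;
}
```

Clumsy next to a real module system, certainly, but no other file can
touch nitems except through the two exported functions.
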
>	o Programmers invent "homebrew" data access mechanisms to supplement
>	  the lack of a standard Unix ISAM or other file management.  Much
>	  of this code cannot be re-used because the programmer implemented
>	  a primitive system to satisfy the needs of this one filter.

What you need to do is to select your standard and then write the rest of 
your code to deal with it.  This is not a problem with the unix philosophy, but
a problem because you have not set a standard and required your code to use it.

>Despite all this, the graphics community is settling in on using Unix as
>the operating system of choice.
>Are we being lulled into using an O/S and language that allows us to whip
>together quicky demos to demonstrate concepts, at the expense of long-term
>usefulness as a finished product ??

It depends on how you run your company, of course.  If you do not have
re-usable code then I think that you need to identify what you are rewriting
and then make a standard and comply with it.  If you can't get re-usable
code with unix then I can't see why you expect to get it anywhere else...the
mechanisms seem the same to me.  I may be missing something, but I can't see
what.

In addition there are a fair number of unix graphics standards already in
existence.  Couldn't you standardise around one of them?

>(Speaker steps off soapbox amid a torrent of rotting vegetables)

Well, I don't think that I was that bad, was I?

Laura Creighton		
sun!l5!laura		(that is ell-five, not fifteen)
l5!laura at lll-crg.arpa
