Hard links to directories: why not?
Phong Vo
kpv at ulysses.att.com
Fri Jul 27 05:18:39 AEST 1990
In article <1990Jul23.181554.17938 at dg-rtp.dg.com>, goudreau at larrybud.rtp.dg.com (Bob Goudreau) writes:
> In article <837 at ehviea.ine.philips.nl>, leo at ehviea.ine.philips.nl (Leo
> de Wit) writes:
> > No need for that if find only - recursively - follows those
> > subdirectories 'sub' for which the inode of 'sub/..' is the same as
> > that of '.' ...
> ... thus defeating the purpose of find, since the user doesn't get
> what he expected to get (namely, the entire directory tree descending
> from his specified target). ...
Right. In graph-theoretic terms, the find problem is to compute, in a directed
graph, the set of nodes reachable from a given node. It is well known how to
solve this with standard search techniques such as depth-first search.
I wrote an ftwalk function to do this two years ago. It's described in
"An Efficient File Hierarchy Walker" in the Summer '89 USENIX Proceedings.
The upshot is that you can find the reachable set in the presence of
hard links or symlinks, even if there are cycles, and you can do it fast.
The paper also describes reimplementations of standard utilities such as
find, ls, etc.
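The idea can be sketched without the ftwalk API itself: treat the file
hierarchy as a directed graph and keep a visited set keyed on (device, inode),
so each file is reached exactly once no matter how many hard links, symlinks,
or cycles point at it. This is a minimal illustrative sketch, not the code
from the paper; the function name walk_once is hypothetical.

```python
import os

def walk_once(root):
    """Yield each path whose (st_dev, st_ino) pair has not been seen before.

    A hard link or a symlink cycle leads back to an already-visited inode,
    so the visited-set test prunes it and the walk always terminates.
    """
    seen = set()
    stack = [root]
    while stack:
        path = stack.pop()
        try:
            st = os.stat(path)  # follows symlinks; use os.lstat to skip them
        except OSError:
            continue            # dangling link or permission problem
        key = (st.st_dev, st.st_ino)
        if key in seen:         # already reached: hard link or cycle
            continue
        seen.add(key)
        yield path
        if os.path.isdir(path) and not os.path.islink(path):
            for name in os.listdir(path):
                stack.append(os.path.join(path, name))
```

The (st_dev, st_ino) pair is the standard portable identity test for "same
file" on Unix; comparing inode numbers alone is not enough once the walk
crosses mount points.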
Phong Vo
More information about the Comp.unix.wizards
mailing list