Text-compression utilities
jim at haring.UUCP
Sun Jul 15 22:26:17 AEST 1984
I have been fiddling with the compress/uncompress programs
posted by Spencer Thomas. I found a couple of bugs and also
removed the VAX `asm's, with no appreciable performance loss
(I've tested it on a Sun). The problem is that it is a memory
hog, and I'm still wondering how to make it run with less memory.
Our interest is that we compress all our news batches, since
a lot of our traffic goes out at international rates.
The observed compression we get with compact/uncompact is around
33% on news traffic. The new compress program gives 57% on the
same 600 Kbyte sample of net.unix-wizards. But the nice thing is
that it is also much faster (more than 5 times), and I haven't
even profiled it yet.
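(For reference, I take these percentages to be space saved, i.e.
(original - compressed) / original. A minimal sketch of that
arithmetic; the compressed sizes below are hypothetical, picked
only to reproduce the quoted figures on the 600K sample.)

    #include <stdio.h>

    /* Space saved as a percentage: (original - compressed) / original * 100.
     * The 600K input size is the sample quoted above; the compressed sizes
     * are made-up numbers chosen only to reproduce the 33% and 57% figures.
     */
    static double saved(long original, long compressed)
    {
        return 100.0 * (original - compressed) / original;
    }

    int main(void)
    {
        long sample = 600L * 1024;

        printf("compact:  %.0f%%\n", saved(sample, 412000L));
        printf("compress: %.0f%%\n", saved(sample, 264000L));
        return 0;
    }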
Since we are all comparing pack/compact/compress-type programs,
perhaps we should all be running tests on (approximately) the
same data, say a large manual section, a chunk of source, etc.?
Time, compression ratio and memory demand can all be traded off,
but (unmodified) compact/uncompact looks like a loser in any case.
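If we do settle on a common sample, a small driver along these
lines would make the numbers easy to reproduce. This is only a
sketch; the program name, command string and file names are
whatever the caller passes in, nothing here is specific to
compress/compact/pack.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>
    #include <sys/types.h>
    #include <sys/stat.h>

    /* Rough driver: run one compression command over a sample file and
     * report elapsed wall-clock time and space saved.
     */

    static long filesize(const char *path)
    {
        struct stat st;

        if (stat(path, &st) != 0) {
            perror(path);
            exit(1);
        }
        return (long)st.st_size;
    }

    int main(int argc, char **argv)
    {
        long before, after;
        time_t t0, t1;

        if (argc != 5) {
            fprintf(stderr, "usage: %s name 'command' input output\n", argv[0]);
            return 1;
        }

        before = filesize(argv[3]);
        t0 = time(NULL);

        if (system(argv[2]) != 0)
            fprintf(stderr, "%s: command failed\n", argv[1]);

        t1 = time(NULL);
        after = filesize(argv[4]);

        printf("%-10s %4ld sec  %7ld -> %7ld bytes  %5.1f%% saved\n",
               argv[1], (long)(t1 - t0), before, after,
               100.0 * (before - after) / before);
        return 0;
    }

For example, `ratio compress "compress sample" sample sample.Z'
would report the elapsed time and savings for compress (and you
would uncompress again before timing the next program).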
Jim McKie, Centrum voor Wiskunde en Informatica, Amsterdam    ..mcvax!jim