Algorithm needed: reading/writing a large file
Jeffrey W Percival
jwp at larry.sal.wisc.edu
Sat Jul 8 06:41:12 AEST 1989
In article <8137 at bsu-cs.bsu.edu> dhesi at bsu-cs.bsu.edu (Rahul Dhesi) writes:
>In article <205 at larry.sal.wisc.edu> jwp at larry.sal.wisc.edu (Jeffrey W
>Percival) writes:
>[how do I sort a large file that won't fit in memory?]
>There are many variations on the merge sort. Here is a simple one:
Please be careful with your paraphrasing. I did not ask how to sort a
large file that won't fit in memory. I said I already had a fast and
efficient sort and that the sorting was already known. My question was
about optimizing the process of rearranging a disk file according to a
*given* mapping.
One helpful person suggested reading sequentially and writing randomly,
rather than vice-versa, and I tried that, but it didn't help: I suspect
whatever was gained from the input stream buffering was cancelled out
by the effective loss of the output stream buffering.
The ever-attentive and always-reliable Mr Torek suggested looking
into disk cache algorithms in OS journals, and verily that I will do,
with thanks to those who responded.
--
Jeff Percival (jwp at larry.sal.wisc.edu)
More information about the Comp.unix.wizards mailing list