Shell Database Management (?)
David Johnson x4-6506
davej at mrsvr.UUCP
Wed Sep 6 00:02:09 AEST 1989
From article <4885 at omepd.UUCP>, by merlyn at iwarp.intel.com (Randal Schwartz):
= In article <956 at mrsvr.UUCP>, davej at mrsvr (David Johnson x4-6506) writes:
= | From article <10596 at dasys1.UUCP>, by parsnips at dasys1.UUCP (David Parsons):
= |
= | > The problem... the database consists of addresses... positions 99 and 100
= | > in each record contain a two-position abbreviation for the state. It's easy
= | > to get cut to read those two characters, and grep to identify the state I
= | > want to extract, but how the ^#$&! do you then copy the ENTIRE record
= | > thus identified to another file??? Using grep alone is no good because
= | > the abbreviation appears in various other places in the record...
= | >
= | > David Parsons
= |
= | Try this:
= |
= | cut -c99,100 database | grep -n "$abbr" | sed 's/:..*$/p/' | ed - foo
=
= Hmmppph. I saw a guy removing a screw with the claw of a claw hammer
= the other day. You remind me of him. Wrong set of tools, dude!
I must take issue with this.  The question was phrased in terms of cut and
grep; my answer responded at the level of the question.  I think my solution
is easier for the "casual" shell user (who doesn't know his way around awk)
to understand.  Suppose further that his requirements changed to
"delete the ENTIRE record thus identified . . .".  The solution above is
easy for that "casual" user (i.e. one who understands ed) to modify, as
sketched below.
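Something along these lines ought to do it (untested; I'm assuming ed is
run on the database file itself, and note the sort -rn so the deletions
happen bottom-up and earlier "d" commands don't renumber the lines the
later ones refer to):

	( cut -c99,100 database | grep -n "$abbr" | sed 's/:..*$/d/' | sort -rn
	  echo w
	  echo q ) | ed - database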
= I can cut those lines in one process:
=
= awk 'substring($0,99,2) == "'"$abbr"'"' database
=
= Just another UNIX toolsmith (who can tell a claw hammer from a straight
= screwdriver...),
But who can't RTFM ;-) (try using "substr" instead).
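(For the record, the corrected one-liner reads

	awk 'substr($0,99,2) == "'"$abbr"'"' database

which does indeed do the job in one process.)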
=
= /== Randal L. Schwartz, Stonehenge Consulting Services (503)777-0095 ====\
--
David J. Johnson - Computer People Unlimited, Inc. @ GE Medical Systems
gemed!python!davej at crd.ge.com - OR - sun!sunbird!gemed!python!davej
"What a terrible thing it is to lose one's mind." - Dan Quayle