Favorite Unix Tools & Commands

Analysis
Sep 16, 2008 | 7 mins
Data Center | Linux | Open Source

After two and a half decades of working with Unix, I’ve seen a lot of change in my favorite OS and, at the same time, retained a number of favorite commands that continue to keep my job as a Unix sysadmin from boring me to tears or eating up all of my free time. Unix is still clever enough to keep me amused, competitive enough to keep me employed and innovative enough that there’s always something new to learn.

My first impression of Unix as an operating system was so wound around the idea of pipes that I am still somewhat entranced at how much one can do with a single line of text. Though I don’t pursue the high density commands that make the One-Liner Hall of Fame, I string commands together on the command line on a routine basis and never stop appreciating the utility of a carefully crafted one-liner. Examples of some of my favorite one-liners include the likes of this ls and grep command:

ls -l | grep "^d"

This simple command lists only the files in the current file system location that are directories. You can do the same thing for a non-current directory like this:

ls -l /usr/local | grep ^d

If I want to do this same kind of thing recursively, then I use find with a file type of “d”:

find . -type d -ls

Another command that I find myself using a lot is this awk command:

awk '{print $NF}' filename

It prints only the last string in each line of a file. Frankly, it’s surprising how often I need to do this. The “trick” in this command is that NF (without the $) represents the number of fields in each line of input. $NF then represents the value of the rightmost field.
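For example, piping a long listing through the same awk command pulls out just the last field of each line, which for ls -l output is the file name (the "total" line at the top of the listing aside):

ls -l /etc | awk '{print $NF}'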

Another useful command that has helped keep me cool under pressure is the "cd -" command. It takes me back to whichever directory I was in just before moving to the current one. This works in bash, anyway, the shell in which I prefer to work. This is much better than trying to remember what directory I was in before I made the last move.
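A quick illustration (the directories here are arbitrary):

cd /var/log
cd /tmp
cd -        # back in /var/log
cd -        # and back in /tmp again

Under the hood, bash keeps the previous location in $OLDPWD, so "cd -" is equivalent to "cd $OLDPWD".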

Now, say you want to view the particulars on a command that you use routinely. Try these commands:

ls -l `which scp`

cksum `which scp`

These commands list the details of an executable without you having to supply the path. They use the which command to identify the particular executable that you will be using when you call it by name and then provide you with the details.

Ever try typing \date instead of date, \rm instead of rm and so on? Prefixing a command with a backslash overrides any aliases you might have defined for the particular command. How many times have you tried to remove a bunch of junk files only to have the system ask you to reassure it, one file at a time, that you really meant rm when you typed rm? This trick makes it easy to ignore any aliases that are defined for the account you are working in. You can also use the unalias command to turn off the particular alias (e.g., unalias rm) for the remainder of the current login session.
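A minimal sketch of the difference, assuming rm has been aliased to its interactive form:

alias rm='rm -i'   # every rm now asks before removing
rm junk*           # prompts for each matching file
\rm junk*          # the backslash bypasses the alias; no prompts
unalias rm         # or drop the alias for the rest of the session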

I also appreciate the "--" option for overriding option processing. I can get rid of files that start with hyphens without having to resort to using find and the file's inode number.

# rm -- -huh

I can’t talk about favorite one-liners without mentioning "perl -p -i -e" commands for making changes to any number of files without having to open a single one of them in a text editor. This can also be typed "perl -pi -e", but I have an easier time remembering "-p -i -e". It has something to do with my favorite dessert. The command shown here will take the carriage returns out of a text file originating on a Windows box (i.e., it turns carriage return and linefeed combinations into linefeeds):

perl -pi -e 's/\r\n?/\n/' *

Other useful shortcuts include zmore and bzcat. If a text file is gzipped, the command "zmore file.gz" will allow you to page through it without first going through the process of unzipping it. If the file is bzipped, you can use "bzcat file.bz2 | more" instead.

And, of course, I will never stop appreciating grep and egrep for helping me to find a specific or several alternative strings in a set of files.
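A couple of typical invocations (the file names here are just placeholders):

grep "some string" *.conf                  # one string across a set of files
egrep "error|warn|fail" /var/log/messages  # any of several alternatives
grep -r "some string" /etc                 # recurse through a directory tree

egrep is shorthand for grep -E, which turns on the extended regular expressions that make the alternation syntax work.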

I admire rsync for its magnificent efficiency in replicating directories. Rsync can compare collections of files without transmitting any content (only checksums). Change one byte in one file in a thousand files that are replicated with rsync and rsync will find and send that one byte to make the collections the same again. What incredibly useful and efficient technology!
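A representative pair of commands (the host and path names are invented for the example):

rsync -avn /data/projects/ backup:/data/projects/   # dry run: show what would be sent
rsync -av /data/projects/ backup:/data/projects/    # replicate, preserving permissions and times

The trailing slashes tell rsync to copy the contents of the source directory rather than the directory itself.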

I still use wget for downloading web sites. It is especially useful if all the files are linked into the site so you can get them all with a single recursive command such as this:

wget -r -t1 http://www.mysite.org/ -o download.log

The output from the download goes into the download.log file in this example. The content will all land in a directory named after the site — in this case, www.mysite.org.

The sort and uniq commands still save me more time every month than I care to estimate.
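The pattern I reach for most often (the log file name is arbitrary):

sort access.log | uniq -c | sort -rn | head

Because uniq only collapses adjacent duplicate lines, the input must be sorted first; -c prefixes each unique line with its count, and the second sort ranks the counts from highest to lowest.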

I have also come to routinely make use of return codes (using $?) instead of parsing command output in my scripts. So much easier than trying to do some kind of string match on the output of whatever command was run!

if [ $? != 0 ]; then    # previous command failed
    echo "ERROR: Command failed"
    exit 1
fi

I’ve also come to appreciate how || and && can streamline my scripts. With "&&" (and), the test mimics an if-then. With "||" (or), it’s an if-else. The syntax is crisp and very nice to use if you only need to work with one outcome (success or failure).

ping $server && echo "It's UP!"

ping $server || echo "no go :-("

Another acquired favorite is the select command for building menus inside scripts — bash style!

echo "Select your release:"
select REL in `ls`
do
    if [ "$REL" != "" ]; then
        break
    fi
done

If the current directory contains files named r11, r12, r13, r14 and r15, your menu might look like this:

Select your release:
1) r11
2) r12
3) r13
4) r14
5) r15
#? 5

The ping command remains one of the most useful commands for testing basic connectivity and figuring out which systems have recovered and which are still down after a power outage. And, of course, it provides a quick and dirty way to check your access to remote systems, your default route, your network card, whether a particular domain is registered and so on.
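After an outage, a loop like this one (the host names are invented) gives a quick roll call:

for host in web1 web2 db1
do
    ping -c 1 $host > /dev/null 2>&1 && echo "$host is up" || echo "$host is DOWN"
done

The -c 1 option (Linux and BSD ping) sends a single packet; ping's exit status then tells you whether the host answered.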

I’ve come to appreciate the flexibility of using while loops that process files one line at a time as in this example:

while read arg1 arg2 rest_of_line    # split each line into two fields plus the remainder
do
    case $arg1 in
        1) echo $arg1;;
        2) echo $arg2;;
        3) echo $rest_of_line;;
    esac
done < filename

For performance issues, top (written by my old friend Bill LeFebvre!) and the Solaris prstat command are wonders for providing quick performance analysis on systems I manage. The uptime command is also very handy. I’ve been able to get many quick answers to the “Why is my system so slow?” question using these tools.
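For a quick snapshot rather than an interactive display, all three can be run one-shot (flags vary by platform; these are the Linux procps top and Solaris prstat forms):

uptime                  # load averages over the last 1, 5 and 15 minutes
top -b -n 1 | head -15  # one batch-mode pass, just the top of the output
prstat -n 5 1 1         # Solaris: the five busiest processes, one 1-second sample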

Unix commands may not constitute anyone’s favorite entertainment, but gosh, someone has actually made a YouTube video about pwd! Just think about how much drama we could add with some of the spiffy commands I’ve mentioned above!

And, last but not least, I’ve also grown particularly fond of ^D! It’s still nice to go home at the end of a busy day.


Sandra Henry-Stocker has been administering Unix systems for more than 30 years. She describes herself as "USL" (Unix as a second language) but remembers enough English to write books and buy groceries. She lives in the mountains in Virginia where, when not working with or writing about Unix, she's chasing the bears away from her bird feeders.

The opinions expressed in this blog are those of Sandra Henry-Stocker and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.