‘xargs’ – Handling Filenames With Spaces or Other Special Characters

xargs is a great little utility to perform batch operations on a large set of files. Typically, the results of a find operation are piped to the xargs command:
$ find . -iname "*.pdf" | xargs -I{} mv {} ~/collections/pdf/
The -I{} tells xargs to substitute '{}' in the statement to be executed with the entries being piped through. If these entries have spaces or other special characters, though, things will go awry. For example, filenames with spaces in them passed to xargs will result in xargs Read more [...]
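The excerpt is cut off above, but one standard remedy (whether or not it is the one the full post settles on) is to have find and xargs exchange NUL-delimited entries, so that spaces and other special characters survive intact:
$ find . -iname "*.pdf" -print0 | xargs -0 -I{} mv {} ~/collections/pdf/
Here "-print0" makes find terminate each filename with a NUL byte, and "-0" tells xargs to split its input on NULs rather than on whitespace.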

Useful diff Aliases

Add the following aliases to your '~/.bashrc' for some diff goodness:
alias diff-side-by-side='diff --side-by-side -W"`tput cols`"'
alias diff-side-by-side-changes='diff --side-by-side --suppress-common-lines -W"`tput cols`"'
You can, of course, use shorter alias names in good old UNIX tradition, e.g. 'ssdiff' and 'sscdiff'. You might be wondering (a) why I did not do so, and (b) what is the point, conversely, of having aliases that are almost as long as the commands that they are Read more [...]
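With these in place, usage is just like plain diff (the filenames here are hypothetical):
$ diff-side-by-side old.conf new.conf
$ diff-side-by-side-changes old.conf new.conf
The -W"`tput cols`" part simply sets the output width to the full width of the current terminal.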

Supplementary Command-History Logging in Bash: Tracking Working Directory, Dates, Times, etc.

Introduction

Here is a way to create a secondary shell history log (i.e., one that supplements the primary "~/.bash_history") that tracks a range of other information, such as the working directory, hostname, time, and date. Using the "HISTTIMEFORMAT" variable, it is in fact possible to store the time and date with the primary history, but storing the other information is not as readily do-able. Here, I present an approach based on this excellent post on StackOverflow. The main differences Read more [...]
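The approach in the full post differs in its details; as a minimal sketch of the general idea, a function hooked into PROMPT_COMMAND can append the last command, along with the date, time, hostname, and working directory, to a supplementary log (the log filename here is just for illustration):

## append date/time, hostname, working directory, and the last command
## to a supplementary log (~/.bash_history_ext is an illustrative name)
log_history_ext() {
    echo "$(date '+%Y-%m-%d %H:%M:%S')  $(hostname)  $PWD  $(history 1 | sed 's/^ *[0-9]* *//')" >> ~/.bash_history_ext
}
PROMPT_COMMAND="log_history_ext"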

Stripping Paths from Files in TAR Archives

There is no way to get tar to ignore directory paths of files that it is archiving. So, for example, if you have a large number of files scattered about in subdirectories, there is no way to tell tar to archive all the files while ignoring their subdirectories, such that when unpacking the archive you extract all the files to the same location. You can, however, tell tar to strip a fixed number of elements from the full (relative) path to the file when extracting using the "--strip-components" option. Read more [...]
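As a quick sketch (the archive layout here is made up for illustration): if the archive stores everything under 'data/results/', then
$ tar xvjf archive.tbz --strip-components=2
extracts the files directly into the current directory, with the leading 'data/results/' stripped from each path.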

Piping Output Over a Secure Shell (SSH) Connection

We all know about using scp to transfer files over a secure shell connection. It works fine, but there are many cases where alternate modalities of usage are required, for example, when you want to transfer the output of one program directly to be stored on a remote machine. Here are some ways of going about doing this. Let "$PROG" be a program that writes data to the standard output stream. Then: Transferring without compression: $PROG | ssh destination.ip.address Read more [...]
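For example, a compressed variant might look like this (the user, host, and remote path are placeholders):
$ $PROG | gzip -c | ssh user@destination.ip.address 'cat > /remote/path/output.gz'
Here the output is compressed locally with gzip, streamed over the SSH connection, and written to a file on the remote machine by the remote 'cat'.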

Neat Bash Trick: Open Last Command for Editing in the Default Editor and then Execute on Saving/Exiting

This is pretty slick: enter "fc" in the shell and your last command opens up for editing in your default editor (as given by "$EDITOR"). Works perfectly with vi. The "$EDITOR" variable approach does not seem to work with BBEdit though, and you have to:
$ fc -e '/usr/bin/bbedit --wait'
With vi, ":cq" aborts execution of the command.
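A quick usage sketch of my own (the command being corrected is hypothetical):
$ export EDITOR=vi
$ grep -ri "some pattern" /var/lgo    ## oops: mistyped path
$ fc                                  ## opens the previous command in vi; fix it, ":wq", and it runs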

`gcd` – A Git-aware `cd` Relative to the Repository Root with Auto-Completion

The following will enable you to have a Git-aware "cd" command with directory path expansion/auto-completion relative to the repository root. You will have to source it into your "~/.bashrc" file, after which invoking "gcd" from the shell will allow you to specify directory paths relative to the root of your Git repository no matter where you are within the working tree. gcd() { if [[ $(which git 2> /dev/null) ]] then STATUS=$(git status 2>/dev/null) Read more [...]
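The excerpt above is cut off; as a rough sketch of the core idea (my own minimal reconstruction, using git rev-parse rather than the git status check shown above, and without the auto-completion part), a bare-bones gcd might look like:

gcd() {
    ## locate the top-level directory of the current Git repository
    local root
    root=$(git rev-parse --show-toplevel 2> /dev/null) || {
        echo "gcd: not inside a Git working tree" >&2
        return 1
    }
    ## change to the given path relative to the repository root
    ## (or to the root itself if no argument is given)
    cd "$root/${1:-}"
}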

Filter for Unique Lines Adjacent or Otherwise While Preserving Original Order

There are two standard command-line utilities that help you filter input for unique lines: 'uniq' and 'sort'. One gotcha with 'uniq' is that it only filters out duplicate adjacent lines. So if your input looks like:

apple
apple
apple
chicory
chicory
chicory
banana
banana

Then running 'uniq' on it will yield:

apple
chicory
banana

But if the input has non-adjacent duplicate lines:

apple
banana
banana
chicory
apple
banana
chicory
banana
banana
apple
apple
apple
banana
chicory

Then the results are:

apple
banana
chicory
apple
banana
chicory
banana
apple
banana
chicory

The Read more [...]
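The excerpt is cut off before the solution; one common way to get unique lines while preserving the original order (which may or may not be the approach the full post takes) is a small awk filter that keeps only the first occurrence of each line:
$ awk '!seen[$0]++' input.txt
For the second input above, this yields just: apple, banana, chicory.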

Filesystem Management with the Full Power of Vim

Just discovered "vidir", a way to manipulate filenames inside your favorite text editor (better known as Vim). Previously, I would use complex and cumbersome BASH constructs using "for;do;done", "sed", "awk" etc., coupled with the operation itself:
$ for f in *.txt; do mv "$f" "$(echo "$f" | sed -e 's/foo\([0-9]\+\)_\(.*\)\.txt/bar_\1_blah_\2.txt/')"; done
This almost always involved a "pre-flight" dummy run to make sure the reg-ex's were correct:
$ for f in *.txt; do echo mv $f $(echo Read more [...]
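With vidir (it ships with the moreutils package), the same sort of bulk rename becomes an interactive edit; for example:
$ vidir .                           ## edit the names of the files in the current directory
$ find . -iname "*.txt" | vidir -   ## edit an arbitrary list of files piped in on standard input
Editing a name in the editor renames the corresponding file on save, and deleting a line deletes the file.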

Dealing with ‘Argument list too long’ Problems

The solution to the "Argument list too long" error when trying to archive a large number of files is to use the "-T" option of the "tar" command to pass in a list of files generated by a "find" command. Create a list of the files to be archived using the "find" command:
$ find . -name "*.tre" > filelist.txt
Then use the "-T" option of the "tar" command to pass in this list of filenames:
$ tar cvjf archive.tbz -T filelist.txt
If you want to delete a long list Read more [...]
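The excerpt is cut off at this point, but the usual pattern for the deletion case (not necessarily the exact command the full post gives) is to stream the file list to 'rm' in batches via 'xargs':
$ find . -name "*.tre" -print0 | xargs -0 rm
Because xargs splits the list into chunks that fit within the argument-length limit, the "Argument list too long" error does not arise.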

Add the Following Lines to Your `~/.bashrc` and You Will Be Very Happy

I added the following to my `~/.bashrc` and I am loving it!

## Up Arrow: search and complete from previous history
bind '"\eOA": history-search-backward'
## alternate, if the above does not work for you:
#bind '"\e[A":history-search-backward'
## Down Arrow: search and complete from next history
bind '"\eOB": history-search-forward'
## alternate, if the above does not work for you:
#bind '"\e[B":history-search-forward'

(see the comments below for explanation of the alternate codes) The Read more [...]
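As an aside of my own (not from the post): the same bindings can be made available to every readline-enabled program, not just Bash, by putting the key sequences in "~/.inputrc" instead:
"\e[A": history-search-backward
"\e[B": history-search-forward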