Tuesday, February 24, 2015

Changing to a sibling directory easily

I deal with multiple source trees on a daily basis.  Occasionally it's handy to switch from a subdirectory in one to the same subdirectory in a different source tree.  So I put together the following set of shell functions (similar to command aliases, but with greater flexibility for checking parameters) to make this easier.

For example, let's suppose my current directory is ~/src/tree1/subdir1/subdir2/subdir3/ and I want to switch to the same path in the directory ~/src/tree2.  With these bash functions defined, I can type the command "cds tree1 tree2".  If I want to return to the previous directory, I can hit the up arrow key to recall the command, cursor over, and change the "cds" command to "cdsb".

# cd sideways (replaces one portion of current path with new string and changes to that directory)
function    cds
{
    if [ -z "$2" ]                           # Is parameter #2 zero length?
    then
        echo "Usage: cds DirPatternToReplace DirNewPattern"
    else
        NWD=`echo "$PWD" | sed -e "s/$1/$2/"`
        echo "Changing directory"
        echo "from: $PWD"
        echo "to:   $NWD"
        cd "$NWD"
    fi
}

# cd sideways back (same as previous command but parameters are reversed to go backwards)
function    cdsb
{
    if [ -z "$2" ]                           # Is parameter #2 zero length?
    then
        echo "Usage: cdsb DirPatternToReplace DirNewPattern"
    else
        NWD=`echo "$PWD" | sed -e "s/$2/$1/"`
        echo "Changing directory"
        echo "from: $PWD"
        echo "to:   $NWD"
        cd "$NWD"
    fi
}
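
Here's a hypothetical session showing the functions in action (the home directory and paths are made up for illustration):

$ cd ~/src/tree1/subdir1/subdir2/subdir3
$ cds tree1 tree2
Changing directory
from: /home/me/src/tree1/subdir1/subdir2/subdir3
to:   /home/me/src/tree2/subdir1/subdir2/subdir3

Note that since the sed expression has no g flag, only the first match of the pattern gets replaced, so pick a pattern that's unique within the path.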

Saturday, February 21, 2015

Bash history

I've always had a strong preference for command line interfaces (AKA CLIs) over GUIs.  I can get tasks accomplished much faster using a Unix bash shell than I can on any GUI.  Plus CLIs lend themselves to greater levels of customization than GUIs do.  If I can customize a user interface, I can adapt it to the way I prefer doing things, which makes the CLI even faster to use.

One of the customizations I use in Unix style shell interfaces is to modify how commands are stored in the shell command history.  I prefer the Bash shell (AKA Bourne Again Shell).  If you don't understand the humor in that, a little time on Google can clear it up for you.

The first step is to specify a permanent file for my shell history.  This allows multiple shells to share the same command history, which frees me from having to remember which shell I entered a command of interest into before I can recall it.  The following lines in a .bashrc file set this up for me.

# override default history size and file settings
export HISTSIZE=500
export HISTFILESIZE=500
export HISTFILE=~/.bash_history

I also need to prevent my history from being wiped out when a shell is closed.

# prevent closing a shell from overwriting history
shopt -s histappend
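
With histappend alone, bash still only writes the history file when a shell exits.  If you also want shells that are already running to see each other's commands, one common refinement (optional, and not something the settings above require) is to flush and re-read the history file from PROMPT_COMMAND:

# optional: append new commands to the history file after every prompt,
# then reload it so commands from other shells show up here too
export PROMPT_COMMAND="history -a; history -c; history -r"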

I also find it helpful to store timestamps for each command stored in the history.  This can be useful for shared computers where you may not be the only user entering commands.

# Store timestamp information for each command
export HISTTIMEFORMAT="%m%d %T  "

I also hate seeing duplicate commands in my command history.  One of each is sufficient, and any more than that just clutters up the history unnecessarily.

# don't store duplicate commands
export HISTCONTROL=ignoredups:erasedups

And last but not least, I hate to waste space in my command history on short commands.  Typing ls is faster than looking it up in the command history, so why waste space that could store more complicated commands that are harder to remember?

# ignore certain commonly issued commands
export HISTIGNORE=ls:ps:pwd
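
One detail worth knowing: HISTIGNORE patterns are colon-separated shell globs that must match the entire command line, so the setting above still records variants like "ls -l".  If you want to skip those as well, the patterns can include wildcards (this extended version is just an illustration, not part of my setup):

# also ignore ls with any arguments
export HISTIGNORE="ls:ls *:ps:pwd"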


These lines added to your .bashrc should work on Linux, Cygwin under Windows, or Mac OS X terminal sessions.

Sunday, March 17, 2013

Sharing data

These days I split my computing time between a desktop computer at work, a desktop computer at home, a tablet device, and a smartphone.  Frequently I find myself wanting to save data in the form of a bookmark, a link to a web page, or a note on one of these devices to access later on others.  Fortunately, there are a number of applications which make this easy.  Here are the applications I've picked to do the job.

Dropbox - Installing this application on computers and mobile devices allows effortless sharing of all types of files between devices.  Dropbox gets used by a number of other applications like PlainText (listed below) to make life easier.

PlainText - Allows easy viewing and/or editing of text files stored on your Dropbox account from your mobile devices.  Only available on iOS devices like iPhone and iPad but you can find similar applications for Android devices.

Xmarks - Makes keeping bookmarks synchronized between browsers on desktop and laptop computers dead simple.

Instapaper - Ideal for those URLs you stumble upon on an application on one device that you want to save for later viewing.  A number of mobile device applications such as Twitter feature integration with Instapaper to simplify the task of saving interesting web pages.

Saturday, February 9, 2013

Windows development tools

It's no secret that given my own choice, I'd abandon the use of Windows PCs altogether.  However, it's a sad fact of life that many of the development tools I need to use at work are commercial Windows based tools.  In order to make using a Windows PC on a daily basis more bearable, I add the following tools.  An unmodified Windows PC is almost unusable to me these days; I've no idea how anyone gets anything done on one.

All of the following tools are free except VMware.

  • 7zip - The best archive utility I've found for Windows.  It handles all the archive formats I need to use like ZIP, RAR, and TGZ.
  • Ack - A handy little Perl script, similar to but superior to grep, which restricts its searches to source files.
  • ctags - Creates tags files which many editors, including Vim, can use to make source code navigation dramatically easier.
  • cygwin - Well worth it for the Unix style shell alone but you can add Windows ports of most Unix tools using this.
  • DropBox - Makes sharing files between multiple systems possible.  I take it one step further and have added it to my phone as well so my files are now easily portable.
  • FeedReader - I'm faced with periodic downtime at work where I have to wait for software builds, downloads, and tests to complete.  This RSS reader allows me to stay up-to-date on development tools and techniques during these intervals.
  • Sumatra PDF reader - Using a less popular PDF reader lowers the chances that you'll fall prey to malware using PDF files as a delivery mechanism.
  • Irfanview - Handy for cropping screenshots and other light image file manipulation.
  • Pidgin - Our office uses IM to stay in touch.  This is a nice little IM program with support for multiple IM protocols.
  • putty - I periodically need to connect to remote systems using telnet or ssh protocols.  This program makes that easy.
  • source navigator - Useful for familiarizing yourself with large bodies of source code.
  • sysinternals - These utilities proved so handy that Microsoft purchased the company which developed them.
  • TeraTerm - A decent terminal emulator.  Handles both telnet and serial port connections but I only use it for its serial capabilities.
  • Thunderbird - I use this to monitor my home email account.
  • TortoiseSvn - Integrates the Subversion source code control system with Windows Explorer.
  • TrueCrypt - A useful program for encrypting files, directories, and disk images.  I use it for some of the files I store on DropBox.
  • VMware - I need to run Linux software occasionally.  VMware is the fastest and easiest method I've found of doing this without using a separate PC. 
  • Winmerge - This is the best visual tool I know of for displaying differences between files and for merging changes from one file to another.
  • winscp - Handy for transferring files between systems using ftp, sftp, or scp protocols.
  • Wireshark - The best Ethernet packet sniffer.  It understands lots of protocols and can be extended to understand new ones if necessary.
  • Vim - My favorite editor.  It's a Vi clone with modern features like color syntax highlighting and column editing.
  • Xvi32 - My favorite hex editor.

Sunday, January 27, 2013

Using grep in vim

One of the things I like about Vim (and vi) is the ability to invoke Unix utilities to manipulate text in ways that might be hard or impossible with just the regular editor commands.  It's definitely written with the Unix Philosophy in mind.

One thing I do frequently while debugging problems is to add log messages with a distinct pattern so I can find them easily in the log file.  For purposes of this example, let's assume the pattern I use is XYZ.  If I open the log file in Vim, I can issue the following command to isolate just the lines in the log file which contain XYZ.

^[ggVG!grep XYZ^M

That looks like a pretty complicated command, doesn't it?  But if we examine it piece by piece, it's not really that bad.

At the beginning of the command we've got ^[, which is the escape key.  Look at an ASCII chart if you're not familiar with the caret-followed-by-a-letter shorthand for control characters.  I issue the escape key to make sure Vim is in command mode.  While we're talking about control characters, the ^M at the end of the command is shorthand for the carriage return (AKA the Enter key).  That causes the command to be executed.

The ggVG in the command serves to do a visual selection of all the text in the file.  The gg causes the cursor to be placed at the first line of the file.  The V invokes visual marking of text.  And finally, the G causes the cursor to be placed on the final line of the file.  That causes the marked area to contain all the lines of the file.

The real meat of the command is the next part - !grep XYZ.  The exclamation mark pipes the marked text through the external Unix utility that follows, which in this case is the grep command.  This particular grep invocation keeps only the lines that match the pattern XYZ.

Issuing this command causes Vim's current buffer (the full contents of the log file) to be replaced with the output of the external command, which here is just the lines of the log file that contain the pattern XYZ.

That makes it really simple to isolate just the log messages I've added.  Once I'm done, I can either exit Vim without saving or just issue the u command (undo the last change) to leave the log file untouched.
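
For what it's worth, the same filtering can be done without the visual selection by giving the filter command a range covering the whole file.  This is simply an equivalent spelling of the command above:

:%!grep XYZ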

Sunday, January 20, 2013

Unix tool: xargs

One of the reasons I like Unix style operating systems so much is the Unix Philosophy.  One of its principles is that it's better to provide a bunch of small, fast tools which can be combined to accomplish a variety of tasks than it is to build large, special-purpose tools which are complicated to use.  One of my favorite of these Unix tools is xargs.  What xargs is good at is taking a bunch of separate lines of input and turning them into arguments for another command.

Perhaps an example will serve to illustrate better than a dry explanation.

Let's imagine we want to find all the source files from the current directory (recursively) which contain the string "stdio.h" and to edit each of those files using vi.  The following line will accomplish that.  Of course in Unix, there are numerous ways to accomplish the same task.  This happens to be my favorite way to perform the task and serves to illustrate the use of xargs nicely.

find . -name "*.c" -print | xargs grep -l "stdio.h" | xargs vi

The first part of the command (the part before the first pipe character) is a simple find command.  The only thing worthy of explanation is the quotes around the file pattern.  Unix shells substitute matches for a wildcard like this before the find command gets invoked.  Without the quotes, if we ran this long command in a directory containing files which match the pattern *.c, those matches would be substituted on the command line instead of the actual *.c pattern being passed to find.

The second part of the command is a simple grep command combined with xargs.  The xargs command takes the file names output by find and passes them as arguments to grep.  Thanks to the -l option, grep lists the files which contain the string "stdio.h", a common C library header file.

The third part of the command simply takes the matching files found by grep and passes them to vi.
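
One caveat worth mentioning (a general xargs gotcha rather than anything specific to this example): because xargs splits its input on whitespace, file names containing spaces will confuse the pipeline above.  If that's a concern, find and xargs can be told to separate names with NUL characters instead:

find . -name "*.c" -print0 | xargs -0 grep -l "stdio.h"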

This example should work on either Linux or Mac OS X.  It will also work on Windows provided you've installed a Unix environment such as Cygwin.  Cygwin is one of the first things I install on any Windows machine I have to use on more than a casual basis.

Play around with the xargs command to get a feel for what it can do.  It's a handy part of any Unix tech's grab bag of tricks.

Tuesday, January 15, 2013

Portable tools

Having written software on a variety of development systems including Sun workstations, Windows PCs, Linux PCs, and Macs, I've never regretted the decision to choose easily portable tools for use in my daily work.  By doing so, I learn a tool once and am able to use it regardless of what type of development system I am given at a new company.  I also have a preference for free software.  In the past it's been difficult if not impossible to get companies to pay for software tools.  Having a ready arsenal of free tools for any system I use is a huge bonus.

My editor of choice these days is one of the GUI versions of Vim (MacVim at home and GVim on the Windows PCs at work).  Besides the obvious enhancements which either version of Vim adds to the standard vi commands, the developer who started the project, Bram Moolenaar, runs the project as charity-ware with donations going to help children in Uganda.  I've got no problems donating to such a worthy cause.

I continue to use vi clones because vi was available on most of the systems I've used over the past 20+ years.  Using PCs in the early days, I chose vi over emacs because early ports of emacs were extremely resource intensive.  While vi would fit easily in a 40 KB .COM file which could be executed from a floppy disk on base PCs, the emacs .EXE file (necessary because it required more than 64 KB of code space) typically required a PC with an expensive EMS board.  

Were I starting out these days, I believe I'd probably choose to learn UltraEdit or SlickEdit since both have Linux, Windows, and Mac versions.  Both editors also have a much less daunting learning curve.  At best, vi commands are perplexing to new users.  You really need to understand the history of vi to grasp why the commands are designed the way they are.  In a nutshell, software in the early days of Unix had to support a variety of dumb terminals.  Commands had to be formed from the common subset of keys available on all terminals.

If you combine Vim with Exuberant CTAGS, you have a very powerful programming editor which makes it easy to learn and to navigate unfamiliar source code.
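
The basic workflow is short enough to sketch here (the project directory and file name are made up for illustration):

cd ~/src/myproject   # hypothetical project directory
ctags -R .           # build a tags file covering the whole source tree
vim main.c           # inside Vim, Ctrl-] jumps to the definition under the
                     # cursor and Ctrl-T jumps back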

I've also chosen the Bash shell as my daily command line interface.  It's available on Linux and Mac.  It can also easily be added to Windows systems as part of Cygwin.  Bash is much more powerful than the standard Windows command shell.  I've heard good things about PowerShell but since it's only available on Windows, it's not really an attractive option for me.

Aside from the tools above, I'm fond of AWK and SED.  Used either separately or combined with their Unix-based brethren like xargs, find, and grep, they give a nearly endless variety of ways to manipulate text files.
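
As a trivial taste of what I mean (the data file name here is just an example):

# print the second whitespace-separated column of a file
awk '{ print $2 }' data.txt

# replace the first occurrence of foo with bar on every line
sed -e 's/foo/bar/' data.txt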

I've got one last recommendation.  There's a great grep replacement for software developers called ack.  It makes it easy to restrict searches to just source files.  You can do the same thing with grep, but the command line gets very complicated.
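
To give a rough idea of the difference (the search term is just an example, and the grep flags shown are the GNU grep spelling):

ack stdio.h                                          # searches recognized source files by default
grep -rn --include="*.c" --include="*.h" stdio.h .   # a roughly equivalent grep command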