
Friday, September 05, 2025

The benefits of a varied technical background

This week I helped a colleague solve a strange problem he had encountered.  He was modifying some application level code which had to read from and write to a device driver I had created to control two LEDs on our device.  Previously the application code had only written to the driver, but the decision was made to control the two LEDs independently, which required reading the old state from the driver prior to updating it.  Fortunately, I had already added the ability to read the LED status since it improved my ability to debug the device driver.

The problem was caused by the need to interleave reads and writes to the device driver, which under Linux is treated as a file.  Unbeknownst to my colleague, any read or write on a file stream advances the file pointer, which tracks the location within the file to be accessed next.  A simple device driver has no need of the file pointer concept, but since Linux treats devices as files, the standard library code which mediates access to devices and files maintains the supposed file pointer even when it doesn't need to.  With a regular file, an fseek (file seek) back to the current position between the read and write calls should have fixed the issue.  Unfortunately, since my device driver is very bare bones, I suspect it lacked an extra entry point needed to handle seek requests.  So I used a brute force fix: closing the device and re-opening it within the application code.
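
Here's a minimal C sketch of that brute force workaround.  The device node name (/dev/leds) and the one-byte LED state are assumptions made for the example, not the real driver's interface.

#include <fcntl.h>
#include <unistd.h>

/* Hypothetical sketch only: the device node name and one-byte LED
   state are illustrative, not the real driver's interface. */
int update_leds(unsigned char new_bits)
{
    unsigned char state;
    int fd = open("/dev/leds", O_RDWR);
    if (fd < 0)
        return -1;

    /* read the current LED state */
    if (read(fd, &state, 1) != 1) {
        close(fd);
        return -1;
    }

    /* Brute force: close and re-open to reset the file position.
       A driver which implements a seek handler could instead use
       lseek(fd, 0, SEEK_SET) here. */
    close(fd);
    fd = open("/dev/leds", O_RDWR);
    if (fd < 0)
        return -1;

    /* merge in the bits for the LED being changed, then write back */
    state |= new_bits;
    if (write(fd, &state, 1) != 1) {
        close(fd);
        return -1;
    }

    close(fd);
    return 0;
}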

Somehow this makes me think of the common wisdom from early in my career which suggested one shouldn't change jobs too often lest one be labeled a "job hopper".  It turns out that job hopping has given me a very diverse background which has improved my chances of finding jobs.  Changing jobs more frequently has also helped me escape from jobs where the work was boring or which placed me under managers who were difficult to deal with.

Friday, August 08, 2025

More machine language fun

When I first started working as a Senior System Analyst at GEISCO (GE Information Systems) in the mid-1980s, they had us logging into mini and mainframe computers via terminals.  Several of the commands we had to use needed elevated privileges, which required us to enter a password of the day.  To get this special password, they gave us a small script which retrieved it, and most people put a call to this script in their network login to automatically show the password of the day.  Being a curious sort, I wanted to know how the script worked.  Most people found it cryptic since it consisted of several groups of 12-digit numbers in which no digit was larger than 7.  I knew this likely meant the digits were octal numbers, which require 3 bits each to represent.  That, coupled with the 12-digit groupings, told me the groups represented 36-bit words.  Since I knew GE made heavy use of Honeywell mainframe computers at the time, I concluded that the script was some type of interpreted machine language program.  So I dug out my old Honeywell assembly language documentation and discovered that the script was a simple little program which issued a system call (MME - Master Mode Entry) and then printed out the results.  To test my theory further, I modified the program to shift the characters of the master password so they would print out backwards.  It basically served to entertain me each time I logged in.  It's amazing the little challenges which I find amusing, huh?

While I was working at GE, a project was launched to upgrade the storage device on the CC (Central Concentrator) network node.  One of the tasks performed by the CC was to load software onto the other, smaller network nodes, and its original 2 MB device was deemed too small to handle network expansion.  Believe it or not, that 2 MB storage device was a magnetic drum from Vermont Research.  I had signed up for this project because the replacement storage device was originally specified as a 10 MB hard drive similar to those used on higher end PCs of the time.  I was eager to get experience with these disk devices, which were cutting edge technology at the time, and writing a device driver from scratch sounded like fun.  Somehow Vermont Research found out about the project and submitted a lower bid for an upgrade to a 10 MB drum device.  So my dream of writing a device driver became the much less interesting task of updating the old device driver to extend the addressing to accommodate the extra storage.

The only challenging part of the project was that the diagnostic program also needed to be updated, and somehow the source code for the diagnostic had been lost.  So I was forced to read the punched card deck into the mainframe in order to print out the binary data it contained so I could disassemble it.  Then I had to figure out how to write a patch for the diagnostic program.  And finally, I had to figure out how to get the mainframe's card punch to reproduce the same punch card format used by the diagnostic.  For a few days the computer operators got used to me making multiple daily attempts to convert the binary file containing my patches into something which could be punched in the same format as the diagnostic deck; each attempt required tweaking my conversion program to produce a slightly different format.  They told me they hadn't seen anyone use the card punch in many years.  It wasn't as much fun as I had hoped for, but it did prove pretty challenging.

Thursday, July 31, 2025

The joys of machine language programming

When I started my career as a field engineer for Honeywell mainframe computers in the late 1970s, I worked a lot of swing and midnight shifts.  While day shift was always pretty busy, the night shifts were often boring.  To entertain myself, I read the CPU manuals with the goal of being able to modify the diagnostic programs used to test the computers.  Occasionally it proved handy to load one of the diagnostics and then patch it in memory to loop through operations which were failing.  This allowed using an oscilloscope to trace signals of interest through the 80 wire-wrap boards which made up the CPU.

Eventually writing these machine language programs became my favorite pastime on slow nights.  Part of the draw was the maintenance panel switches which made it easy to read and write memory locations.  There was a definite thrill to getting a program working and watching its progress via the flashing lights on the maintenance panel.

For those who aren't familiar with low level programming, machine language programming involves directly entering the binary-encoded instructions into memory locations for later execution.  More people are familiar with assembly language programming, which replaces the binary with mnemonic names for the instructions and any modifiers.  For example, a Honeywell mainframe had an instruction called LDA which loaded the A (or accumulator) register with some value.  In machine language programming, that LDA instruction had the opcode of octal 235.  Older mainframes often used octal encoding instead of the hexadecimal encoding which is more common today.  The other convenience offered by assembly language over machine language is that the assembler calculates addresses automatically rather than forcing you to work out the address offsets by hand, which was painful.
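
To make the hand-assembly concrete, here's a small C sketch which packs an opcode and an address into a single 36-bit word, printed as the 12 octal digits a machine language programmer would key in.  The field layout (address in the upper bits, opcode below it) is invented for this illustration; it is not the actual Honeywell instruction format, and the octal values are only examples.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Illustrative field layout only -- NOT the real Honeywell format */
    uint64_t opcode  = 0235;    /* LDA, written in octal       */
    uint64_t address = 0100;    /* operand address, also octal */

    /* pack the fields into one 36-bit instruction word */
    uint64_t word = (address << 18) | (opcode << 9);

    /* print the word as 12 octal digits (12 x 3 bits = 36 bits) */
    printf("%012llo\n", (unsigned long long)word);
    return 0;
}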

My second job was as a field engineer for DEC PDP-11 minicomputers.  These smaller machines were so much less complex than the mainframes that fixing the hardware wasn't much of a challenge.  The saving grace was that the PDP-11 instruction set was simple enough to allow me to quickly come up to speed on its machine language.  When I was in Boston for training, I wrote a machine language program to determine which terminal connected to the PDP-11 had had data entered on its keyboard.  Apparently the way I approached programming was different from most people's, because the instructors had trouble figuring out how my program worked.

Believe it or not, the ability to decipher machine language is still useful when I have to use gdb to debug a program.

Tuesday, February 25, 2025

Configuring Windows/Mac/Linux for embedded development

A few days ago Scott Hanselman posted an interesting question on Bluesky.  He asked how much stuff people needed to add to Windows to make it useful for day to day work.  He also asked a similar question of Mac users.

Admittedly, my use case differs from that of most people.  I do embedded firmware development.  For me, my company Windows laptop mostly acts as a way to connect with the Linux build machines and target machines I use.  It's really little more than a glorified terminal except for running Outlook, Office, and Slack.

Windows

Having made the switch to a Mac at home 24 years ago, I only use Windows at work now.  On any new Windows machine, I first install the following software.  It's all free software, as most companies I've worked for make it so difficult to justify purchasing commercial software that it's not worth the effort.

  • Gvim - I occasionally need to do some local editing on Windows, and for that a graphical version of vi is an absolute necessity for me.  I've been using some version of vi for 35+ years, and while I've had occasional dalliances with other programming editors, I've always returned to vi.
  • VcXsrv - Being able to launch graphical applications remotely makes my life much easier.  That means using an X11 server.  I know there's pressure to move to Wayland but it strikes me as more effort than it's worth at this point.  It's the same feeling I have when I hear someone suggest that I try writing a device driver in Rust.  I just want to get work done, not spend time blazing a trail.
  • PuTTY - I need to connect via SSH or serial communications to a number of Linux machines (build servers, target systems, etc.), and PuTTY is my hands-down favorite way of accomplishing this.  I make sure to enable X11 forwarding on PuTTY SSH sessions because this allows me to launch GUI programs and have them display on my Windows laptop.
  • WinSCP - This allows me to easily copy files back and forth between Linux machines and my Windows laptop.  It also enables easy remote editing of files which reduces the pain of editing a file on a remote machine over a slow Internet link.

Mac

When I first started using a Mac at home, I loved the development environment which the combination of Mac OS X, Xcode, and the Quartz X11 server provided.  It was the best development platform I had seen since my days last using a Sun workstation in 1996.  Over time, with Apple's push to combine features of iOS and macOS, it's become much harder for me to set up a reasonable development environment on the Intel Mac Mini which serves as my desktop machine at home these days.  Since most of my embedded development is done for work, that's not a deal breaker.

  • MacVim - As mentioned above in the Gvim section, I need to edit files locally on my Mac.  MacVim gives me a version tailored for use on Macs.
  • Homebrew - Unfortunately, many of the tools I've come to rely upon are only available through an alternate install path.  Homebrew gives me access to a number of development tools not available through the Mac AppStore.
  • XQuartz - This X11 server used to be available in the Xcode tools but now the best version seems to require being installed via Homebrew.
  • Unfortunately, I have not yet found a free GUI SCP application for the Mac that I like, so I resort to using the standard Mac Terminal app and the command line scp tool.

Linux

I use a Raspberry Pi 5 at home since Linux is orders of magnitude better at interfacing with a variety of small embedded machines than either Windows or the Mac.  I typically use a pared down Linux distribution because I don't need the typical blend of applications like OpenOffice.  I've been using Debian Bookworm with the Xfce desktop environment.

It's easy to install X11 apps, Gvim, and PuTTY on Linux.  The IT group at work has our Windows laptops very locked down, so installing new software, such as the GUI software for a USB protocol analyzer, sometimes requires approval which can take a few days.  The Mac has also gotten harder about running third party application software, moving closer to the very locked down iOS App Store model.  Development goes so much faster when I can install any software I need without facing roadblocks.

Linux is also good at doing compiles for the firmware and application software I create for the newest embedded device at work, since the Pi and that device are both 64-bit ARM processors.  Linux has better USB support too; Windows often requires the installation of device drivers for various USB serial devices, which can be hard to do when using a laptop with limited admin rights.

Sunday, February 23, 2025

WordStar

I recently commented on a Mastodon post about WordStar by Tom Jennings (yes, the one associated with FidoNet).  In his post Tom extolled the virtues of WordStar and what a good piece of software it was, and I completely agree.  Not only was it good for its time, but it compares quite favorably against modern software.  It needed to be configurable because software which ran on CP/M often required customization of the display and printer settings to match the hardware connected to the user's machine.  It was also quite robust: I last used it around 1989 and I don't recall it ever crashing.  I cannot say the same about any modern word processor I use.  Finally, it was remarkably full featured for its time.  I recall being excited to discover that it had a column editing mode, which at the time I had only seen in IBM's PE (Personal Editor).

I appreciate both Mastodon and Bluesky because they allow me to see what favorite authors, scientists, engineers, and artists are up to at the moment. 


Sunday, February 04, 2024

Making changes to your development setup easier

As I'm occasionally assigned new projects at work and as those projects progress, I find the need to update my development environment on a regular basis.  Having the ability to customize my setup allows me to be more productive.

To that end, I have defined a few bash aliases which help make this process easier for me.  These aliases are defined in my ~/.bash_aliases file.

#  config bash
alias   cfgb='gvim ~/.bash_aliases'
# config vim
alias   cfgv='gvim ~/.vimrc'
# source my bash aliases to pick up any new changes
alias   redot='source /home/rod/.bash_aliases'

You'll notice that I have an alias to make changes to my vim setup.  It makes sense to also have a vim macro to read any new configuration settings or vim macros.  This macro is defined in my ~/.vimrc file.

" source .vimrc ro pick up any new macros
map \.  :source /home/rod/.vimrc^M

Friday, December 29, 2023

Default apps for 2023

Lately, I've seen a lot of the "what apps am I using" type posts, apparently prompted by a podcast episode.  You can see a collection of what a number of people are using here.  I find these interesting as they're a good way to discover new apps which I hadn't been aware of previously.  These are the apps I use for my iPhone, Mac, and Raspberry Pi at home as well as the Linux and Windows machines I use at work.


  • Mail Client: Apple Mail, Outlook at work (barely tolerable but dictated by the IT team)
  • Notes: Apple Notes for shared notes, Editorial for notes I don't want on the cloud (those containing sensitive information such as birthdays)
  • Chat: iMessage
  • Camera: Apple Camera, Night Capture for evening sky photos
  • Photo Management: manual sync into directories on my Mac
  • Photo Editing: Preview on Mac, IrfanView on Windows
  • Calendar: Google Calendar - I like it for its ability to do custom repeats and for ease of having more than 2 notification reminders
  • Browser: On Macs I use Safari for lightweight browsing for its ease of sharing bookmarks with mobile devices, Firefox for general browsing because it has a richer collection of security plugins.  At work I use Chrome (barely tolerable but dictated by IT).
  • Backup: ChronoSync
  • Read It Later: Instapaper
  • RSS: Inoreader (web and app)
  • News: RSS feeds from news sites which lets me keep up with events without being distracted by things I'm not interested in such as sports
  • Podcasts: Overcast (great performance and the best UI of any app on my phone, bar none)
  • Books: Kindle, Audible, Libby, Hoopla
  • Database: Tap Forms (good sync between mobile and Mac)
  • Personal Finance: GNUcash
  • Password Management: 1Password - with 600+ non-trivial passwords to remember, this is a must
  • Music: Apple Music, plus the Remote app to control the desktop from my phone
  • Editing (code and general): MacVim on Mac, GVim on work machines (Windows and Linux)
  • X Servers: XQuartz on Mac, VcXsrv on Windows
  • SCP client: WinSCP on Windows, scp from the shell on Linux/Mac machines
  • Terminal emulation: PuTTY on Windows and Linux machines

Saturday, March 18, 2023

Exploding windows

I've encountered "exploding windows" on both Windows and Linux lately.  I try to resize a window by clicking near the title bar, and the window suddenly and unexpectedly expands to fill the entire vertical area of the monitor it inhabits.  Sometimes I'm able to do the simple resize operation I wanted, but randomly the operation results in an exploding window.  Perhaps the UI designer thought this was a neat feature, but it causes me to curse each time I encounter it.  I'd love to figure out how to disable this behavior, but each time I try to describe it in a Google search, I'm led down a rabbit hole of unrelated results.  And it seems silly to spend much time researching something which, while annoying, costs me only a few seconds to correct.

Sunday, February 19, 2023

Developers need IT skills

Software developers can often benefit from having skills typically found in IT support engineers.  For example, I occasionally bring small embedded systems home from work since it makes working from home more productive.  It's hard to physically reconfigure equipment I've left at the office, and the reset button is tough to hit across a VPN.

In this scenario I need to be able to access the small embedded system on our home LAN as well as my build desktop machine which stays at work.  Having a VPN client which supports split tunneling makes this possible.  It requires that there be no overlap in IP addresses between the two LANs.  It also requires that I change the static IP address on the equipment I bring home to match the subnet which my home router uses.
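
For example (with hypothetical numbers), if the office LAN uses 10.10.0.0/16 addresses and my home router hands out 192.168.1.x addresses, the VPN client can route office traffic through the tunnel while everything else stays on the home LAN; the equipment I bring home just needs a static 192.168.1.x address.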

The diagram gives a rough idea of how the split tunneling works.

[Diagram: VPN split tunneling]

Sunday, January 15, 2023

Getting paid to solve puzzles


The thing I enjoy most about my job developing firmware for embedded systems is that it's a lot like being paid to solve puzzles.  Many of the projects I work on are just as challenging as the old text based adventure games such as Infocom's Zork series.

The datasheets which describe how the chips in embedded hardware are supposed to function can be challenging to decipher.  Vendors do their best, but it's not unusual for datasheets to be incomplete or to contain subtle inaccuracies.  A chip I recently developed a device driver for had an accurate datasheet, yet the device driver gave the wrong results because of an issue with the C compiler for the ARM processor.  It turns out gcc for ARM CPUs makes the plain char type unsigned by default, while this chip required signed values.  Fortunately gcc includes a compiler flag, "-fsigned-char", which allows this strange default to be overridden.
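
Here's a minimal C sketch of the pitfall, with the chip and driver specifics omitted.  Compiled by gcc for ARM without -fsigned-char, the plain char below ends up holding 255 instead of -1 and the negative test fails.

#include <stdio.h>

int main(void)
{
    /* gcc for ARM makes plain "char" unsigned by default, so the -1
       is stored as 255 unless the code is built with -fsigned-char
       (or the variable is declared "signed char"). */
    char value = -1;

    if (value < 0)
        printf("char is signed on this target\n");
    else
        printf("char is unsigned here -- consider -fsigned-char\n");
    return 0;
}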

Friday, December 30, 2022

My worst technical interview

About 38 years ago I interviewed for a software job at GEISCO (GE Information Systems).  This would be my second software position after making the switch from field engineering.  It was a systems software position, writing software to run on one of their network nodes.  At the time GEISCO supposedly ran the largest private data network in the world.

One of the engineers I interviewed with felt obligated to test my software skills by asking me to write a short program on his whiteboard.  The problem was that most systems software was written in assembly language at the time, and the four assembly languages I knew didn't overlap at all with the ones he knew.  Upon learning that we didn't share a language, I asked if he still wanted me to write a program since there was no way he could determine its correctness.  Being a good corporate drone, he insisted that I still needed to complete this step so he could mark it off on the interview form.  So I proceeded to write some code in GMAP, the assembly language used by Honeywell mainframes, since that was the one I was most familiar with.  I guess I talked my way through it successfully because I ended up getting the job.

My program didn't do much: I wrote systems software, so it was rare for me to need system calls, and I had definitely never written a system call which performed any I/O.  I believe I loaded an immediate value into a register, performed a left shift, saved the result into a memory location, and then ended the program with a MME GEFINI system call, which terminates the current program.

Even though I first encountered GMAP assembly language 45 years ago, I still remember the numeric value for the opcodes of a number of the instructions.  Perhaps some day I'll once again need to know that a 235 opcode is a LDA instruction (load accumulator) but I'm skeptical.  There's a lot of old knowledge I'd love to be able to purge to reclaim the memory space.

Sunday, February 14, 2021

Remote Desktop replacement

At work I have 5 Linux machines (1 desktop and 4 tiny embedded machines) and a Windows laptop in my office which I need to use.  There are KVM switches which would allow me to connect my monitor, keyboard, and mouse to that many machines, but they're expensive, the cabling would be a nightmare, and I'd be stuck using just one machine at a time.  I tried using Remote Desktop and VNC to access the desktop of another machine remotely, but that's cumbersome and slow across a VPN.  To be fair, there are situations where Microsoft's Remote Desktop works wonderfully across a VPN.  At my last job I used the Mac version of Remote Desktop to access my Windows laptop at work.  Since a Windows machine was the destination, Remote Desktop did a great job of data compression, and it allowed the client machine to arrange windows differently than they were arranged on my Windows laptop at the office.  So my old 27" iMac, with its nearly 4K resolution, gave me a better viewing setup than I had at work.

Now I use X11 (aka X Windows) to access all the Linux machines I need.  Making this work requires a couple of pieces of software.

First off, I need an X server on my Windows laptop.  That is used to display the windows created by the remote Linux machine.  I use VcXsrv, a version of the classic X.Org server recompiled under Visual C++.  This recompilation gives it better access to Windows resources, and I find it works better than any other X server I've tried, such as Xming or Cygwin's xwin.  VcXsrv needs to be running on your Windows machine prior to attempting a connection to another machine.

The next thing required is an SSH client which is capable of X11 forwarding.  That allows any programs launched using the SSH client to send their displays back to the client machine.  I use PuTTY, which has been around a long time and which can handle serial and SSH connections.

Once you've run VcXsrv on your Windows machine and used PuTTY to connect to a Linux machine (taking care to first enable X11 forwarding in the configuration), launching a remote program and having it display on your client machine is as simple as typing the program's name.  As an example, you can try xterm, which will create an X11 terminal window.  Just be sure to place it in the background by appending an ampersand; otherwise, your shell's input and output will remain tied up by the launched program.  The command to launch xterm this way would be "xterm &".

Saturday, February 06, 2021

GNU Screen can make SSH sessions persistent

I do a fair number of Linux kernel builds at work.  Doing a Yocto build for our target machine takes about an hour and 40 minutes for the build to complete and the root filesystem to be prepared.  If I do that from home, there's a chance that a hiccup on my VPN connection might disrupt the build in progress.  Even at work, I ssh into my Linux build machine from a Windows laptop and use xterm sessions for most of my development tasks.  If I kick off a build before heading home, there's always a chance that Windows will decide to do an update which could also disrupt my build.

Fortunately GNU Screen can make your sessions persistent through disruptions due to network disconnections or other reasons.  Now I make sure to start a GNU Screen session before doing my build.  I start named screen sessions with a command such as "screen -S build".  That way, if my session is disconnected, I can rejoin it with the command "screen -r build".

The one thing I don't like about GNU Screen is its default escape key, Control-A, since that interferes with the default Emacs command line editing mode in Bash.  So I've created a ~/.screenrc configuration file with the following command to change the escape key to Control-T.

escape ^tt
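
With the escape key remapped this way, Control-T followed by d detaches the running session, and "screen -r build" reattaches it later, even from a brand new SSH connection.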

Sunday, January 19, 2020

Running Ubuntu Linux in a VirtualBox VM

I've recently started a new job which requires me to build software under Ubuntu Linux.  Since I don't particularly care for the Ubuntu distribution, and I have only a single development PC, which runs Windows 10, the natural solution was to create a VirtualBox virtual machine and install Ubuntu in it.  However, the combination of VirtualBox and Ubuntu caused a few problems for me.

The first problem I encountered is that VirtualBox seems determined that new VMs will all run at very low resolutions.  I've seen 640x480 used when running VirtualBox on both Macs and PCs running Windows 10, and it's ridiculous to be stuck with a resolution which first appeared in 1987.  At that resolution, it's difficult to get through the Ubuntu installation setup because some of the buttons appear off the screen.  That issue, I believe, is Ubuntu's fault; they should make it possible to see all buttons regardless of screen resolution.  Poking around on Google, I discovered that there's a command line utility included with VirtualBox which allows you to run higher resolutions in the guest OS.  First make sure that VirtualBox is not running.  Then start a command prompt with administrative privileges and navigate to the directory where VirtualBox is installed, since the utility is not located in the command path; the directory you want will most likely be something like "C:\Program Files\Oracle\VirtualBox".  Issue the command "VBoxManage setextradata global GUI/MaxGuestResolution any".  This, plus installing the Guest Additions tools in the guest OS, should allow you to change your VM to a higher resolution.

The second problem appeared after I installed the GUI version of my favorite editor, Vim, using the command "sudo apt install vim-gnome".  I was able to launch gvim, but the menu bar was hidden, which makes accessing a few features more difficult than necessary.  It turns out Gnome on Ubuntu has a bug which prevents some applications from displaying a menu bar.  The workaround is to define a command alias which undefines the UBUNTU_MENUPROXY variable before launching Vim.
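
A minimal version of that alias, assuming the standard gvim executable name, would look like this in a bash startup file.

# launch gvim with Ubuntu's menu proxy disabled so the menu bar appears
alias gvim='UBUNTU_MENUPROXY= gvim'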

Wednesday, November 06, 2019

Vi/Vim macros

Of all the editors I've used (and there have been a lot) I like Vi/Vim the most.  Once you get past the very steep learning curve, you can become extremely productive, in part because you almost never need the mouse.  I've been using Vi on and off since 1990 which has imprinted the commands into my fingers so I no longer think about them.  A few of the macros below have existed in some form since that time.

The macro commands below are from the _vimrc file I use to customize Vim under Cygwin running on Windows 10.  Note that ^M and ^[ below are single control characters, not two-character sequences: ^M is control-M, the code produced by the Enter key on Windows keyboards, and ^[ is the code produced by the Escape key.

One of the things which saves time for me is having the ability to easily copy and paste lines from one file being edited to another.  These macros help me accomplish that.

" set temporary yank and paste filename
let @p="c:\\Users\\ViUser\\Documents\\vim\\vim.tmp"

" mark beginning of a line block (uses the a mark)
map \m ma

" yank lines to temp file (from mark to cursor pos. - uses b mark, b buffer)
map \Y :'a,.w! p^Mmb"ay'a`b

" paste lines previously yanked or deleted to temp file at cursor pos.
map \P k:r p^M

Another thing I do frequently is edit log files and search for my initials which I include in any log messages I add.  The macro below searches the log file being edited for the letters "TBD" and removes all other text.  Just remember to quit without saving or use the Vim undo command.

map \gt ^[ggVG!grep TBD^M

Monday, November 04, 2019

Customizing the bash prompt

I prefer to use a bash prompt which displays both the date and time.  That makes it easy to see when commands were issued while scrolling back through the terminal's buffer.  Here's what my prompt looks like:

[Screenshot: the prompt, a colored bar showing the date and time]

The bash settings for this prompt are shown below.  You can add these lines to your .bashrc file.  For operating systems which support an admin or root user, I change the prompt's background from blue to red to give quick confirmation of that elevated privilege level.

# Attribute codes:
# 00=none 01=bold 04=underscore 05=blink 07=reverse 08=concealed
# Text color codes:
# 30=black 31=red 32=green 33=yellow 34=blue 35=magenta 36=cyan 37=white
# Background color codes:
# 40=black 41=red 42=green 43=yellow 44=blue 45=magenta 46=cyan 47=white
PBG=44
PFG=37
TFG=36

PS1='\n\[\e]1;Term ${$} \a\e]2;\u@\h - ${PWD}\a\e[1;${PFG};${PBG}m\][\D{%m-%d} \t] eh? \[\e[m\] '
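
For a root shell, changing PBG from 44 to 41 produces the red background mentioned above, per the background color codes in the comments.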

Sunday, August 12, 2018

Searching for source files which contain multiple keywords

Recently I was presented with the challenge of searching for the code which built a SQL query.  I knew what the query looked like from a log file but searching for individual terms used in the query such as SELECT, FROM, or WHERE produced hundreds of matches which I didn't want to waste time checking one at a time.

So I created the following bash script to search files for each term in order.  I used ack instead of grep for the first search so it would pick only source files of interest for me.  The results from my bash script returned only a single file which contained all the terms and it was this one which I needed to modify.

if [ -z "$2" ]                       # Is parameter #1 zero length?
then
    echo "usage -- rgt term1 term2 [term3]..."
    exit 1
fi

files=`ack -l "$1"`


# shift past the first search term argument since 
# we've already used that one 
shift  

for var in "$@"
do
    files=`echo "$files" | xargs grep -l "$var"`
done

if [ -z "$files" ]; then    # are results zero length?
    echo "no matching files found"
else
    echo "matching files:"
    echo "-------------------------------------"
    echo "$files"
fi
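
For the SQL hunt described above, the invocation would have looked something like "rgt SELECT FROM WHERE", which narrowed hundreds of per-term matches down to the single file containing all three terms.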

Quick and easy code searches

Getting presented with a large and unfamiliar code base can sometimes prove daunting.  One of the biggest challenges is how to efficiently search for things when you need to make changes.

Since I'm a big fan of platform independent tools, I had been using recursive grep searches, but those frequently return too many results.  It can be difficult to limit the files searched to just the source files of interest: perhaps you have a unit test directory you don't wish to search, or you want to skip directories containing library files or other binaries.  I prefer a tool called ack to limit the unwanted files being searched, which prevents lots of false positives you'd otherwise need to wade through.

Creating a file called .ackrc in your home directory is an easy way of setting the default settings for ack.  The following sample .ackrc file configures ack to ignore a few directories which normally don't contain code of interest.  It also sets an alternate file extension for makefiles and configures the default source languages of interest.

--ignore-dir=.svn
--ignore-dir=dist
--ignore-dir=docs
--ignore-dir=examples
--ignore-dir=lib
--ignore-dir=tests
--type-add=make:ext:make
--asm
--cc
--make

Using ack and an appropriately configured .ackrc file can significantly reduce the number of lines of matches you must check when searching your source tree for a keyword.

Tuesday, July 17, 2018

Making code searches easier


When modifying source code I'm not familiar with, it's usually necessary to do some recursive searches to figure out what to change.  You can use a simple recursive grep, but I prefer a tool called ack, which you can download in one of several forms; I like the all-in-one perl script version.  What sells me on ack is that it can be configured, via an .rc (run command) file or the command line, to limit searches to files of interest: you can ignore files with certain extensions or entire directories.  Few things are more disheartening than searching for a term within the code only to find it appears on many hundreds of lines.  Who feels like wading through those to find what you're actually looking for?
 
I also use the following bash function to pipe ack's output to gvim so I can weed out unwanted matches quickly.  This is handy if the search term is a common word which is also used as a variable or function name.  I delete lines containing any variations which aren't of interest, which reduces the job to something more manageable.

The ack script may be replaced with a simple recursive grep if you'd like.  You can also use a different editor if you don't care for Vim/Gvim.  You'll just need an editor capable of accepting input from stdin.  The following function also tells gvim to highlight the search term which I find to be a time saver.

function    ackvr    # invoke the ack perl script and edit the matches
{
    if [ -z "$1" ]; then
        echo "Usage -- ackvr SearchPattern"
    else
        echo "ack searching for $1"
        ack "$1" | gvim -c "/$1" -
    fi
}
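
Invoking it as, say, "ackvr WHERE" opens gvim on all the matching lines with the search pattern already primed, so the matches are highlighted and the unwanted lines can be deleted.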

Saturday, November 04, 2017

Using RSS feeds to save time

I've long been a fan of RSS feeds as a way to save time when monitoring blogs.  They afford me the ability to see the headlines of new posts (and only the new posts) as they appear.  I can visit the blog if a headline sounds interesting, or, if none of the articles catches my eye, mark all headlines read, which makes the web site disappear in my RSS reader until the next time it adds content.  It is so much faster than actually visiting a bunch of web sites that it's not funny.

I have three separate lists of RSS feeds I monitor.  I use a cross-platform app called QuiteRSS on both my Mac at home and on my Windows laptop at work.  On my phone and tablet device, I use a mobile app called Feedly to monitor a third set of RSS feeds.

At home I follow RSS feeds for beer news, book releases, and information about RetroComputing.  I don't read about these topics at work because I don't feel right about asking a company to pay me to do that.

At work I follow a number of technical RSS feeds on computer security, software development, embedded systems, computer communications, and a few other tech topics.  These are all directly or indirectly related to my job.  Having these RSS feeds to read allows me to take small breaks at work while my software is building or when I'm waiting for code to download to one of my development boxes.  I'm a firm believer in the studies which say these micro-breaks improve productivity.

Whenever I have a bit of downtime during the day such as while waiting at an appointment or in line at a store, I can look at the local news sites I monitor in the Feedly app on my phone.  Feedly synchronizes feeds so when I look at those same feeds at home on my tablet device, I only see articles I haven't marked read on either device.