Tuesday, February 25, 2025

Configuring Windows/Mac/Linux for embedded development

A few days ago Scott Hanselman posted an interesting question on Bluesky.  He asked how much stuff people needed to add to Windows to make it useful for day-to-day work.  He also asked a similar question of Mac users.

Admittedly, my use case differs from that of most people.  I do embedded firmware development.  For me, my company Windows laptop mostly acts as a way to connect with the Linux build machines and target machines I use.  It's really little more than a glorified terminal except for running Outlook, Office, and Slack.

Windows

Having made the switch to a Mac at home 24 years ago, I only use Windows at work now.  On any new Windows machine, I first install the following software.  It's all free software, as most companies I've worked for make it so difficult to justify the purchase of commercial software that it's not worth the effort.

  • Gvim - I occasionally need to do some local editing on Windows and for that a graphical version of vi is an absolute necessity for me.  I've been using some version of vi for 35+ years and while I've had occasional dalliances with other programming editors, I've always returned to vi.
  • VcXsrv - Being able to launch graphical applications remotely makes my life much easier.  That means using an X11 server.  I know there's pressure to move to Wayland but it strikes me as more effort than it's worth at this point.  It's the same feeling I have when I hear someone suggest that I try writing a device driver in Rust.  I just want to get work done, not spend time blazing a trail.
  • PuTTY - I need to connect via SSH or serial communications to a number of Linux machines (build servers, target systems, etc) and PuTTY is my hands-down favorite way of accomplishing this.  I make sure to enable X11 forwarding on PuTTY SSH sessions because this allows me to launch GUI programs and have them display on my Windows laptop.
  • WinSCP - This allows me to easily copy files back and forth between Linux machines and my Windows laptop.  It also enables easy remote editing of files which reduces the pain of editing a file on a remote machine over a slow Internet link.

Mac

When I first started using a Mac at home, I loved the development environment which the combination of Mac OS X, Xcode, and the Quartz X11 server provided.  It was the best development platform I had seen since my days last using a Sun workstation in 1996.  Over time, with Apple's push to combine features of iOS and Mac OS, it's become much harder for me to set up a reasonable development environment on the Intel Mac Mini which serves as my desktop machine at home these days.  Since most of my embedded development is done for work, that's not a deal breaker.

  • MacVim - As mentioned above in the Gvim section, I need to edit files locally on my Mac.  MacVim gives me a version tailored for use on Macs.
  • Homebrew - Unfortunately, many of the tools I've come to rely upon are only available through an alternate install path.  Homebrew gives me access to a number of development tools not available through the Mac AppStore.
  • XQuartz - This X11 server used to be available in the Xcode tools but now the best version seems to require being installed via Homebrew.
  • Unfortunately I have not found a free GUI SCP application for Mac I like yet so I resort to using the standard Mac Terminal app and the command line scp tool.

Linux

I use a Raspberry Pi 5 at home since Linux is orders of magnitude better at interfacing with a variety of small embedded machines than either Windows or Mac are.  I typically use a pared down Linux distribution because I don't need the typical blend of applications like OpenOffice.  I've been using Debian Bookworm with the Xfce desktop environment.

It's easy to install X11 apps, Gvim, and PuTTY on Linux.  The IT group at work has our Windows laptops very locked down, so installing new software such as the GUI software for a USB protocol analyzer sometimes requires getting it approved, which can take a few days.  The Mac has also gotten harder about running third party application software, much like the very locked down iOS App Store.  Development goes so much faster when I can install any software I need without facing roadblocks.

Linux is also good at doing compiles for the firmware and application software I create for the newest embedded device at work, which uses a 64-bit ARM processor.  It has better USB support too.  Windows often requires the installation of device drivers for various USB serial devices, which can be hard to do when using a laptop with limited admin rights.

Sunday, February 23, 2025

Experience versus enthusiasm

We live in a somewhat rural area, which means we have a well as there's no municipal water supply available.  Last week we discovered that we had no water pressure.  Since our house is 24 years old and well pumps supposedly have an average lifetime of about 20 years, this was an inconvenience but not a huge surprise.

What I found interesting was observing what it took to get the problem resolved.  The plumbing company we called did an excellent job.  They had a young plumber out to diagnose the issue within 4 hours of our call.  The plumber they sent was very nice and extremely diligent.  Since he dealt mostly with houses in the eastern and more suburban portion of our county, he wasn't familiar with wells.  However, he was able to get advice on how to troubleshoot the problem from more experienced plumbers at his company and after 3 hours, he determined that our well pump had finally died.

The next day he returned early with a more experienced plumber (one closer to my age) who was familiar with wells and rural water supply equipment.  The two of them worked hard to replace our well pump in very cold temperatures (15-20°F).  During the times they came into the house, I had a few chances to chat with the more experienced plumber and found him to be not only very knowledgeable but also a really nice guy.

It struck me after they had left that the older plumber and I have found ourselves in somewhat similar situations.  I'm one of the two oldest engineers on our team at work and I'm definitely the oldest who still works full time.  I work on things that the younger engineers don't have experience with such as firmware, device drivers, and operating systems.  From time to time, the need to deal with old technology such as a serial port crops up and I'm happy to do it because it brings back memories of a simpler time.  I also seem to get all the core dumps to analyze which I find to be challenging puzzles.  Who needs brain teasers like Wordle when I can spend hours solving a crash?

I guess the lesson to be learned is that it's useful to have engineers of varying degrees of experience on a team, as learning from people who have been around some type of technology longer is more efficient than younger techs having to learn everything on their own.

WordStar

I recently commented on a post on Mastodon about WordStar by Tom Jennings (yes, the one associated with FidoNet).  In his post Tom extolled the virtues of WordStar as an exceptionally good piece of software and I completely agree.  Not only was it good for its time but it compares quite favorably against modern software.  It needed to be configurable because software which ran on CP/M often required customization of the display and printer settings to match the hardware connected to the user's machine.  It was also quite robust.  I last used it around 1989 and I don't recall it ever crashing.  I cannot say the same about any modern word processor I use.  Finally, it was remarkably full featured for its time.  I recall being excited to discover that it had a column editing mode which at the time I had only seen in IBM's PE (Personal Editor).

I appreciate both Mastodon and Bluesky because they allow me to see what favorite authors, scientists, engineers, and artists are up to at the moment. 

 

Friday, November 29, 2024

A few of my favorite programming environments

In my 40+ year career in software/firmware engineering, only 3 of the computers I've used have had what I considered superior development environments.  This type of environment makes it easy to be more productive.

The first was the Sun 3/80 workstation I used at work in 1990 while writing software for packet switching communications equipment.  It had a Motorola 68030 CPU, whose instruction set remains my favorite assembly language, and offered good performance.  The Sun offered my first extended exposure to Unix and it was love at first sight.  Unix was mature by that time and the development tools were top notch.  The only possible drawback was it only had a monochrome display, albeit a high resolution one.  I enjoyed the way it allowed me to define a default window layout so I could have 4 terminal windows in standard locations each time I logged on.

The second machine I thought offered a great development experience was the Apple PowerMac G4 I bought for home use in 2002.  It was purchased to replace a PC because I had grown tired of Windows XP plus I had picked up an iPod 10 GB music player which integrated much better with Mac machines than it did with Windows.  The PowerMac G4 ran OS X, a Unix like OS derived from NeXT's NeXTSTEP.  It had a PowerPC G4 CPU which was my first exposure to the RISC architecture.  OS X came with X11 (aka the X Window System), which was also used on the Sun workstation.  X11 enabled me to run many of the same tools I had grown used to using on the Sun.  At the time I was working for an optical networking startup.  Some of the boards in our networking equipment used PowerPC G3 and G4 CPUs so it was handy having one at home to experiment with.  The Mac also gave me an environment which was very close to the Sun I had started using 12 years earlier with the added benefit of supporting a color display.  I've come to love color syntax highlighting of source files.  I eventually made the switch to Intel Macs and then Apple's own CPUs.  Sadly, they also seem to be making Mac OS look more like the iOS used on their mobile devices at the cost of making it less useful as a development machine.

The third (and my current favorite) development environment I've considered exceptional is a Raspberry Pi 5 which runs a multicore 64-bit ARM CPU.  It still amazes me that I could build a reasonable development machine (8 GB memory, 500 GB SSD storage) for less than $200.  Since it runs Linux with X11, I'm once again able to run the same tools (or their successors) on this machine.  It integrates better with the small embedded ARM machines I write firmware for than does my work laptop running Windows 10 or my Mac Mini which runs the latest iteration of MacOS.  Sadly X11 is no longer easy to get running on the Mac.  I know some will think I need to migrate to Wayland but I rarely feel the need to use tools which aren't fully mature yet.

All of these machines have run Unix or Unix-derived OSes which has been my strong preference for 35 years now.  This allows me to edit source files using vi (now vim) which had a steep learning curve but rewards the initial effort by being fast and quite powerful.  Having ctags allows me to quickly jump from a reference in a source file to where the function or variable is defined.  I've used it for so long that it has become second nature to me.  I'm happy to be able to search my source code efficiently again using grep.  And I love being able to use a shell with a proper scripting language (currently bash but I've used sh, csh, and ksh in the past).  Best of all, I've still got good support for the C programming language which I view as a portable assembly language and which has been my favorite since I first learned it in 1985.

These aren't the only environments which can improve productivity.  I've worked with a few engineers who swear by Visual Studio or Eclipse and I've seen how both can help speed up common tasks.  However both have a sufficiently steep learning curve for me to ignore them.  I feel the same way about Emacs when friends point out its advantages over vim.  At this stage in my career, I'm not willing to make radical changes to a workflow I've been comfortable with for a long time at the cost of weeks to months of feeling less productive.

Sunday, October 06, 2024

Antisocial with a reason

I'm sure people have noticed that I've been much less social since the start of the pandemic.  That's in large part due to the fact that my wife has autoimmune issues which put her at increased risk from viruses such as Covid-19.  To limit the likelihood of her getting infected, we both mask up whenever we're inside in public spaces.  While this is acceptable for quick shopping trips, it doesn't work well for meeting friends in restaurants or at parties unless there's an option for outdoor dining.

The need for outdoor dining has limited when it's feasible to meet people to times of the year which offer reasonable temperatures to brave the patio.  Those times happen to coincide with when I've got 5+ hours of weekly yard work to complete.  The need for dry weather while mowing can make it difficult to find a mutually agreeable time to meet.  Sadly, patios tend to fill up during prime dining hours.

It may come as a surprise to people that some of us need to be careful because the media offers so little coverage of Covid-19 these days.  The common wisdom seems to be that Covid-19 is no more dangerous than the flu which is definitely not true.  It's easy with an Internet search to find wastewater data which helps track infections through monitoring of sewage treatment (see link below).  Sadly this doesn't always paint a completely accurate picture of infection levels in more rural areas such as ours where municipal water and sewer service are not available so we're forced to infer using the data from surrounding areas.

https://www.cdc.gov/nwss/rv/COVID19-currentlevels.html

My wife and I both miss seeing friends as often.  We also miss live music and dining out.  Hopefully we'll be able to eventually return to doing the things we enjoy without the need to be so cautious.

Wednesday, September 04, 2024

Sometimes it pays to be skeptical

I may have been born a skeptic.  I've been questioning things I was told for as long as I can remember.  I'm sure many of my teachers were happy to see me advance out of their classroom because of that.  In many situations that doesn't make you popular, however it can serve you well in an engineering career.

On occasion I've needed to be skeptical of things colleagues tell me.  Such misinformation was most prevalent when I was a field engineer (aka FE) 40+ years ago.  If you're not familiar with that title, it's basically a mechanic for computers.  In my first job in the computer industry, I worked on mainframes and minicomputers.  For part of that time I was a specialist which meant I got called in on difficult problems after other engineers had tried and failed to fix them.  I started these visits by asking questions of the FEs onsite only to sometimes have them tell me that of course they had checked the things I was asking about.  I learned which engineers I could trust to admit they hadn't checked something which seemed a logical troubleshooting step.  The challenge with engineers I didn't know well, or with those I knew were too proud to admit they had missed something, was to suggest, without embarrassing them too much, that we check together something they had assured me they had already done.

These days my skepticism allows me to discover the discrepancies inherent in technical documentation.  I don't recall ever seeing a chip datasheet which didn't have a few errors (or instances of wishful thinking on the part of the documentation team).  Accepting the idea that the documentation can be wrong allows one to move beyond seemingly impossible situations such as a device register which occasionally isn't as persistent as the manufacturer's docs suggest.  Software documentation is frequently more error prone than hardware documentation.  I don't think I've ever seen an API document without a few mistakes.

Comments in code is another area it's dangerous to trust blindly.  Engineers will often add extensive comments in code when a function is first created.  Subsequent revisions may not see those comments updated to reflect changes in logic.

That makes the world of engineering seem somewhat bleak.  How do we combat it?  For my part, I try to report errors I discover.  That doesn't always work.  I've reported errors in compilers my company has had to pay healthy amounts of money to license only to be told that the compiler is EOL (end of life) and that no errors would be addressed.  I couldn't even convince the vendor to add my discovery to the list of known bugs.  The thing which keeps me trying is that occasionally someone at a vendor will be appreciative of having a bug reported.

Friday, June 14, 2024

The non-traditional start of my software engineering career

Unlike most software engineers, I started my career as a field engineer responsible for maintaining those large mainframe computers like the ones you see in movies.  You know the type with the line of tape drives whose reels keep spinning back and forth.  I got that job by attending the last full time field engineering class taught entirely by live instructors at the Arlington campus of Control Data Institute in 1976.  It was a 6 month full time program which taught electronics repair, debugging, and machine language programming.  They delved into topics such as optimizing Boolean logic which we then implemented by wiring together small boards each of which contained a single Boolean gate made up of discrete components such as transistors.  You haven't lived until you've built an adder circuit this way.  One of the big attractions was Control Data consistently managed to find jobs for more than 90% of their graduates from each class.  For the class which followed mine, they started using an early version of a computerized training system called PLATO for part of the training and eventually switched entirely to computer-based instruction.

In 1977, I completed my course and got hired as a field engineer by Honeywell Information Systems with the responsibility for maintaining their Level 6000 and 6600 series mainframes.  Much of the time I worked on peripheral equipment such as disk drives, tape drives, and line printers.  It's a sad fact of life that machinery with moving parts breaks down a lot more often than purely electronic equipment does.  Fixing mechanical problems is a necessary part of the job for field engineers but it's not always interesting and it often means getting covered in printer ink or grease.

My favorite part of the job was the more difficult debugging required when a computer fails to boot or when one crashes.  When the computer itself had problems, I occasionally had to troubleshoot the CPU the way the field engineer in the picture below is doing. 

The CPU for Honeywell's 6000 series of mainframes contained about 80 logic boards, each of which measured about 12" square and contained over 100 integrated circuits.  For the most difficult problems, we sometimes had to put the boards on board extenders similar to the one in the picture above which provided access for easier debugging using an oscilloscope to trace signals from board to board within the CPU.  For me, the best part of those mainframe computers was the large maintenance panels they had.  From that panel I could stop the CPU to examine the contents of registers, single step the CPU to see how it behaved while executing a section of software, and insert breakpoints to automatically stop on certain conditions.  That was when I was infected by the love of programming.  I'd get 2+ hour windows for preventive maintenance and if I rushed through running the system diagnostics, I'd have time left over to enter machine language programs on those maintenance panels.  Seeing the lights on the panel blink in ways I expected was a thrill.  I spent 5 years working at Honeywell but ultimately left because I was frustrated by the obstacles they presented to employees who want to make the switch from hardware to software.

My second job was with a small company called Atex which used DEC PDP-11 minicomputers to run publishing software used by magazines and newspapers.  For a while I was the onsite field engineer at USA Today and helped install the 12 PDP-11 minicomputers they used when they launched.  Most of my work was maintaining the 200+ terminals used by newspaper staff.  The terminals were simple but company policies declared the ORU (optimal repairable unit) to be the logic board, which was time consuming to replace.  So I started saving the chips which commonly caused failures from all of the defective boards I sent back to be repaired and I learned which chip to replace for various symptoms.  A vertical line across the screen meant the horizontal deflection chip was likely bad.  That allowed me to reduce my repair time from the 45 minutes required to replace the logic board to about 5 minutes needed to pop a new chip into a socket.  The time savings left me ample time to write machine language programs I could try during preventive maintenance windows.

Since the PDP was a smaller computer, its maintenance panel was much simpler (see below) than the one used by mainframes.  The CPU's instruction set was much simpler as well.  Maintaining minicomputers proved much less challenging than working on mainframes was.  Their CPUs only had a couple boards and instead of doing chip level repairs, we just replaced the bad board.  Frankly, I found the job a bit boring.  After a mishap where a minicomputer fell on top of me (which is a story for another time), I decided to make the switch from computer hardware to software. 


What I discovered at my first software job was thanks to my unorthodox introduction to programming, I had a much better understanding of low level programming than most of my colleagues did.  I also appreciated that as a software engineer, I was able to write assembly language programs rather than entering instructions as machine language which is just a group of numbers.  Writing assembly language was much less labor intensive.  The instruction "LDA" (load the accumulator register) was easier to remember than the fact that a 235 was the CPU's "op code" for a LDA instruction.  I also didn't have to manually calculate the jump offsets between instructions.  Yes sir, assembly language programming was much easier than what I was used to doing.

At GE, which was my 4th job, I wrote code for Honeywell 4500 series computers.  These were process control systems with a 24 bit CPU which had been modified for use in GE's packet switched network.  The 4500 was yet another machine with a front maintenance panel.  Most of the code we wrote was in assembly language but the assembler wasn't very sophisticated and it wouldn't give a proper warning when a jump location was too far away.  What could happen was the jump offset might get large enough that it would set the top bit of the offset field, changing the jump direction from forward in memory to backward, which caused unpredictable behavior.  I could catch these errors pretty quickly because the assembly listings from our programs also showed the machine language instructions on one side.  In the machine language column the direction of the jump offset was obvious to people who understood the instruction set.  Eventually I was put on a project to upgrade the network nodes to more modern hardware.  I was sent off to be trained on the new machines where I discovered that my management had completely ignored the class prerequisite of knowledge of the C programming language, much to the instructor's chagrin.  So I was forced to learn C during a several week course where I was expected to already have expertise in C and was also learning about the equipment's API.  It was hard until I finally realized that C is really just a portable assembly language.  After that, C became my favorite programming language and remains a favorite to this day.

Eventually I joined Sprint International where I was responsible for developing software for the equipment used in their packet switched network.  It was at Sprint that I became obsessed with Unix because I had a Sun 3/80 workstation on my desk.  To this day, the Sun remains my idea of the ideal development environment.  It was also there where I first used a Motorola 680x0 CPU which features my favorite instruction set of any CPU.  One thing I found frustrating was Sun's version of vi (a visual editor) had restrictions on the width of a window and the overall size of the files which could be edited.  My machine language skills allowed me to patch my workstation's copy of vi to expand those restrictions to values I found easier to live with.

The vast majority of my career has been spent writing system software (operating system, device drivers, bootloaders, etc) or firmware.  The line between system software and firmware is sometimes hard to draw.  Firmware can range from simple control loops which monitor and control hardware to what it has become today: hardware specific drivers on any one of multiple embedded operating systems, up to and including Linux.  Regardless of the platform I'm working on, the ability to decode machine language remains a valuable tool for me.  Since my firmware jobs often involve custom hardware, knowing how to read schematics has proved an essential skill as well.