Monday, May 04, 2009

Windows data recovery

Last night while poking about the web in my typical pseudorandom fashion, I found a new tool which may help me prepare for the next time Windows decides not to boot. TestDisk is a tool designed to run under multiple OSes that can perform many handy data-recovery functions, including repairing and restoring an NTFS boot sector.

It turns out it's already available on the System Rescue CD I've been using. I just needed to poke around a bit more.

I'll be making backups of the MBR and boot record since I don't expect this to be the last time that Windows goes belly up. *sigh*
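A rough sketch of those backups (the device names and the USB-disk destination are assumptions based on my setup; adjust for yours):

```shell
# Assumed device names -- /dev/sda is the disk, /dev/sda1 the Windows partition.
# Save the MBR (partition table plus boot code, the first 512 bytes of the disk):
dd if=/dev/sda of=/mnt/usbdisk/mbr.bin bs=512 count=1
# Save the NTFS boot sector (the first sector of the Windows partition):
dd if=/dev/sda1 of=/mnt/usbdisk/ntfs-boot.bin bs=512 count=1
```

TestDisk can rebuild these structures from scratch, but having exact copies on hand makes a restore a one-line dd in the other direction.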

Sunday, May 03, 2009

Windows strikes again

Despite the fact that I made the switch to a Mac about 6 years ago, Windows still manages to be a thorn in my side from time to time. Yesterday morning my wife discovered that her XP laptop, which had been working Friday night, had decided not to boot. What was even more entertaining was that it displayed a completely blank screen, so there was no clue about what was causing the boot to fail. What's even more disturbing is that this is the second time I've seen these same symptoms.

I poked around a bit after booting the system under Linux using a System Rescue CD. That allowed me to see which files had been added or modified since the last full backup a month ago. To do so, I had to issue a few Linux commands.

Here's the command to mount the Windows NTFS file system on the /mnt/windows mount point.

ntfs-3g /dev/sda1 /mnt/windows

Here are the commands to mount an external USB disk containing a Windows FAT32 file system on the /mnt/usbdisk mount point. Note that you have to create the mount point first. Also note that the device name of the USB drive may be different from /dev/sdb1. You can figure this out by issuing the command "ls -l /dev/sd*" before and after connecting the USB disk. The device name which appears only after connecting the disk is the correct one.

mkdir /mnt/usbdisk
mount -t vfat /dev/sdb1 /mnt/usbdisk

Here's the command to list the files which have a later modification date than the last backup. This gives us an idea of which files need to be backed up before we perform any potentially destructive operations on the old disk.

find /mnt/windows -newer /mnt/usbdisk/backup -print
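Those changed files can then be copied to the USB disk before any risky repairs. Here's one way to do it in a single pipeline (a sketch; the archive name is my own invention, and it assumes the GNU find and tar shipped on the rescue CD):

```shell
# Archive every file modified since the last backup onto the USB disk.
# -print0 and --null keep Windows file names containing spaces intact.
find /mnt/windows -newer /mnt/usbdisk/backup -type f -print0 |
    tar --null --files-from=- -czf /mnt/usbdisk/changed-files.tar.gz
```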

Once I did this, I had a quick look at the partition table/MBR to see if that might be the cause of the failure to boot. That required dumping sector 0 of the disk via a program which can display binary data in hexadecimal form. The xxd command will display hexadecimal equivalents for binary data, and we can use the dd command to pipe the first sector to xxd. Note that the MBR is the first sector of the whole disk (/dev/sda), not of the partition (/dev/sda1). Here's the command I used.

dd if=/dev/sda bs=512 count=1 | xxd

A cursory glance indicated the partition table looked okay. It ended with the required 0xaa55 (since PCs are little endian this appears as 0x55aa when viewed as bytes).
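For a quicker check next time, dd can seek straight to the signature instead of eyeballing the whole sector (a sketch; /dev/sda is an assumption, and 510 is the fixed byte offset of the signature within the 512-byte sector):

```shell
# Dump only the 2-byte boot signature at offset 510 (0x1FE).
# A healthy MBR shows the bytes 55 aa here.
dd if=/dev/sda bs=1 skip=510 count=2 | xxd
```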

I verified the rest of the files involved in the early portion of the XP boot process appeared to be present and had a reasonable size and time stamp. All appeared fine.

Now I was almost ready to try to fix the problem. Since this involved issuing some potentially destructive commands, I booted from the Acronis True Image CD to create a new "since save" backup of the data which had changed since the last full backup.

First I tried booting into the XP Recovery Console to issue the fixmbr command. Sadly, that didn't fix the problem. It did issue some stern-looking warnings about the drive having a nonstandard partition table. Since I had used Windows to set up the disk, I found this very irritating. Why would Windows use a partitioning scheme which it would later declare to be nonstandard, especially when the default method was chosen? By this point, there aren't many of Windows' quirks which surprise me. They do make me all the happier that I deal mostly with MacOS and Linux these days.

I also tried an XP Repair Installation. Strangely enough that didn't change the boot failure symptoms at all.

What finally fixed the problem? Doing a full restore with Acronis True Image followed by a restore of the Acronis "since save" data. I may be completely disillusioned with Windows but Acronis makes a very nice product. I don't think I'd consider running Windows without it.

Monday, April 20, 2009

Apple makes switching phones incredibly easy

One of the things I've always dreaded about switching phones is the laborious process of entering the phone numbers of friends and family members. I've got all of those in my Mac Address Book, so why can't I sync it straight to my phone? I've gotten used to doing so with my Palm Centro but wasn't sure it could easily be accomplished with my old Razr.

Apple comes to the rescue again. It turns out they've included enough intelligence in their Bluetooth software that I can choose groups of contacts from my Address Book to sync.

But wait, there's more. I was also able to add my favorite wallpaper image to the old phone by using Apple's Bluetooth browsing feature. In the future, I'll be able to transfer photos taken with my cell phone via Bluetooth instead of being charged a fee for using the phone's data connection.

The procedure for accomplishing this was pretty easy to figure out... after all, it is an Apple. But here's the procedure spelled out if you don't feel like noodling it out yourself.

I did a little work with Google but I didn't see a similar procedure available for Windows. That doesn't mean it doesn't exist but if it does, it must not be quite so obvious as Apple's.

Sunday, April 12, 2009

A farewell to Palms

The phone portion of my Palm Centro has been slowly dying for the past few months. The signal strength varies wildly over time. Occasionally I'll look at the Centro's status display and see the full 4 bars indicating a strong signal. Minutes later at the same location it may show a single bar or, worse yet, the dreaded "SOS Service Only" message. A visit to Google indicates that this is a very common problem with Centros.

So much for my attempt to combine PDA and phone functionality into a single device. I had such high hopes for the Centro. Having been a Palm PDA user since I got the original Pilot as part of a special employee purchase back when I worked for U.S. Robotics, I've grown to depend on my PDA heavily. Palm PDAs provided the ideal combination of application support and built-in features. If only their phone capability worked as well as the rest of the device.

I've been doing a bit of research and so far I don't see a suitable replacement device, at least not among the subset of devices available from my cellular provider. The iPhone comes closest but I'm too cheap to want to be saddled with a $30 per month data plan when a little planning can make it completely unnecessary. The iPhone doesn't make any sense without the data plan. Many of the other smart phones suffer from poor battery life, poor signal quality, crashes, or insufficient application support.

For the time being I've switched back to my old Motorola Razr phone which despite its age has much better signal strength and battery life than the Centro did when it was new. I'm using the Centro without a SIM card as a standalone PDA device.

I'm considering the Motorola Q Windows Mobile device. I can find Windows Mobile equivalents for all the applications I've come to depend upon (contact management, calendar, to-do list, memos/notes, database, and secure password storage). I've got a few months before I'm eligible for a reduced cost equipment upgrade. I'm hoping a better option presents itself between now and then as the prospect of switching to a Windows Mobile device doesn't thrill me.

Saturday, March 28, 2009

FAT32 sucks!

I'm doing my monthly offsite backups and am seriously frustrated by the limitations of the FAT32 file system again. FAT32 is something of a de facto standard since most external hard drives and USB flash storage devices come pre-formatted with FAT32. However, FAT32 carries with it the limitation that an individual file may not grow beyond 4 GB. Many backup programs don't handle this limitation gracefully since working around it requires that the backup program create multiple output files.
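For backup programs that can't span output files on their own, the pieces can be made by hand (a sketch; the file names are made up, and 4000 MB keeps each piece safely under FAT32's ceiling):

```shell
# Split a large backup image into FAT32-safe chunks...
split -b 4000m backup.img backup.img.part-
# ...and reassemble them later in one shot:
cat backup.img.part-* > backup.restored.img
```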

So what are my options for getting around this limitation? I'm currently using external drives split equally into 2 partitions: a FAT32 partition so the Windows machines can write to the drive, and an HFS partition so our Macs won't be limited by the arbitrarily small maximum file size. Another option is formatting the entire drive with NTFS and using MacFUSE/NTFS-3g to allow the Macs to read and write the NTFS partition. I like this approach better as it doesn't force me to correctly predict how much space I'll need for each type of machine.

Ultimately I think I'd prefer drivers which allow me to mount, read, and write Linux ext file systems, mainly because I really hate the thought of trusting my Mac data to NTFS, but this requires more investigation.

Vim continues to amaze me

Every time I think I've learned all of Vim's tricks, it manages to surprise me with some new feature. There are two things I've discovered recently which have made it more enjoyable to use.

The first is Vim's ability to edit files on remote systems which can be reached via scp or ftp. Since I frequently need to edit files on embedded Linux systems which don't have X11 installed, this feature is a huge timesaver for me. The following commands illustrate editing a file over scp or ftp.

gvim scp://username@system//home/username/filename
gvim ftp://username@system//home/username/filename

This feature is provided by Vim's netrw plugin.

The other thing I discovered was completely by accident. I was using MacVim (my favorite Mac port of the GUI version of Vim, made available by the fine folks at Google) when I found I could drag the file icon off the title bar and into a bash shell. Once dropped into the window containing the bash shell, the full pathname of the file being edited was placed on the command line. Granted, this isn't something I need to do often, but when I do, it saves me from having to manually type a long pathname.

Saturday, February 28, 2009

Python is slowly winning me over

I've been learning Python at work and I have to admit that it's slowly winning me over.

I was skeptical at first. I've been a low-level programmer (firmware, diagnostics, operating systems, device drivers, etc.) for most of my lengthy career. A mixture of C and assembly language has served me well for most of that time, with a little shell scripting thrown in for good measure.

When I started my current job, I discovered they use a mixture of C and Python. The parts of the product which aren't performance sensitive such as the GUI are implemented in Python. This code is much smaller than C code to accomplish the same purpose would be. Thanks to the interactive nature of Python it's also much quicker to develop this code and to get instant feedback.

What finally won me over was examining a short script designed to generate some proprietary TCP/IP packets. This script had a small problem when talking to one of our products. It produced the integer and float portions of the structure with the wrong endianness. A little searching with Google showed me that adding a single "<" character to the format string used by the "struct" module (which packs individual fields into a binary structure) would correct the problem.

It took a minute for the full impact of that to sink in to my deeply ingrained C mindset. A single character would correct my problem. Couple that with the fact that the script to generate these packets was probably less than a third of the length of a comparable C program and I was forced to admit that Python is much more useful than I originally thought.
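To see that one character at work, here's a quick demonstration from the shell (a sketch using a made-up two-field layout, not our proprietary packet format; it assumes a python3 interpreter on the path, with "I" being a 4-byte unsigned integer and "H" a 2-byte one):

```shell
# Big-endian packing of the integers 1 and 2:
python3 -c 'import struct; print(struct.pack(">IH", 1, 2).hex())'
# -> 000000010002
# The same values packed little-endian -- a single "<" flips every field:
python3 -c 'import struct; print(struct.pack("<IH", 1, 2).hex())'
# -> 010000000200
```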

I was tempted to end with a comment about old dogs learning new tricks but I suspect that phrase is far too tired to carry the proper impact. I am finding that old engineers can learn new tools with sufficient incentive. Seeing how much more productive I could be with solid Python experience under my belt is more than enough incentive.