The relief of a backed-up PC

After last weekend’s disaster with the dead hard disk, my PC is finally getting back to normal. Despite Mick’s recommendation not to buy a drive from a manufacturer starting with the letter “M”, I ordered a new 80GB Maxtor D740X from Scan on Tuesday. (Scan have a reputation for low prices, but relatively poor customer service, especially when it comes to returns. Their order tracking system certainly cannot be faulted: my order generated five emails informing me of its status at various points.) It arrived the next day.

Since then, I’ve spent a large amount of my spare time reinstalling Windows in various configurations, and shuffling files back and forth between drives. Some notes on this experience:

  • An external USB hard drive is wonderfully convenient, but it’s also kinda slow. (I think I see an external firewire HDD enclosure in my future…)
  • If you have a hard disk with Windows XP installed on it, and you want to put a new installation on a second, separate hard disk, remove the old hard disk completely. If you don’t, you’ll find that the new installation ends up on drive “E” or something weird like that. XP does give you a nice interface for changing drive letters, but it won’t allow you to change the letter of your boot drive, or your system drive.
  • If you’re using Windows XP Home edition, or Professional edition in a stand-alone configuration (not attached to a domain), you get a pretty startup screen with a list of users who can log in to the computer. “Administrator” is not one of these users. It can be useful, though, to log in as Administrator for performing system tasks (like installing new programs). To do this, you can press ctrl-alt-del twice at the startup screen, and you’ll be presented with a standard login box, which allows you to log in as any valid user (with login rights), not just the ones that appear in the list.

Also, I’m considering buying branded CD-Rs for “serious” backup purposes. Normally, we just buy a spindle of unbranded CD-Rs as cheaply as we can, but while I was reviewing some of our older archives and backups (redundant ones, fortunately), I found two that could not be read by my DVD-ROM drive. The CD writer drive could still read the data on them, but the DVD drive just gave up the ghost. Lessons:

  • Not all CD-ROM, DVD-ROM and CD writers are created equal.
  • Not all CD-R disks are created equal.

Tom’s Hardware Guide has an article about backing up copy-protected CDs that mirrors these sentiments. It makes me wary about entrusting “final” archives (stuff that we move off of on-line or nearline storage completely) to the cheapest CD-Rs we can find. But the main wisdom to take away from the whole dead disk experience should be: never entrust critical data to just a single location. Always make sure you have a copy elsewhere.

Of course, I should have learned this lesson about five years ago, when I accidentally reformatted my hard disk (yes, it does really happen) and then found that the tape drive we had was shagged. At that time, we were able to recover the data by running the tape drive in the dark. The problem lay with the LED and the sensor, you see… By completely eliminating all external light sources, the sensor was just able to function properly.

I’ve been lucky twice now. What are the chances of me being lucky a third time? In five years’ time, will I have forgotten how gut-wrenchingly awful catastrophic data loss feels? Time will tell…

The Dead Disk Blues, part 3

The relief I’m feeling right now is almost indescribable. I’ve got my data back!

It wasn’t as a result of sticking the hard disk in the fridge for the afternoon, as suggested by The Register. Nor did I have to shell out £1040, as quoted by The Data Clinic, who specialise in recovering data from this particular batch of Fujitsu drives. (Mine was a MPG3409AT, 40GB.)

No, the (free!) fix came courtesy of a very kind Australian gentleman, who had posted a message on a newsgroup thread discussing this very problem. He had also suffered from a disk failure. But when he got in touch with Fujitsu Australia, they sent him a piece of software they use internally for recovering these drives. It’s a bootable disk with a rescue program on it. He forwarded it to me, I ran it, and…my PC recognized the disk again.

Not for very long, though. I’ve just spent the last two hours or so racing against the clock to get all of my data off of the disk before it died on me again. The rescue program had worked once, but I had no intention of relying on it to work a second time!

But yes, I was able to grab everything I thought I had lost: most importantly the photos we’d taken and downloaded onto the disk in the last two months. To celebrate, below is one of the pictures we thought we’d lost. It’s one of Alex looking moody on our day trip to Glasgow two weeks ago. It’s a beautiful photo, and I am simply filled with joy that we still have it.

Alex on the train to Glasgow, looking out of the window.

To my benefactor in Australia, “thank you” doesn’t say it strongly enough, but it’s all I can do on a simple web page. Thank you!

Trade it on Trodo!

John Rhodes of Webword has just launched the secret project he has been working on and enigmatically referring to for the last few months. Trodo is an on-line trading community, where people can hook up and swap books, CDs, DVDs, and other stuff.

Note that it’s swap, not sell. Unlike Ebay, you’re not trying to maximize your profits from the items you list. One book gains you one “credit” for another book. When you sign up, you have to list at least three items you’re willing to trade, and this gives you an initial three credits. With these credits you can then request items from other users.

The economics of this credit system are very interesting. You might think that it would be easily abused. Because each book (or video, or DVD) is worth exactly the same as any other (1 credit), what is there to stop someone from only listing old, decrepit pulp paperback novels, but only requesting shiny new computer books? Well, the credit system itself acts as a brake on this kind of behaviour, because apart from your initial three signup credits, you only get more credits when someone requests one of your items, and you fulfil that request.

If you list nothing but rubbish, no-one will request your items, so you won’t be able to request other people’s “good” stuff. You therefore have an incentive to list items that others will find attractive enough to request from you.
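The mechanics of the credit system can be sketched as a toy model. To be clear, this is my own illustration, not Trodo’s actual code; the class name, the exception, and the numbers are all mine:

```python
class Trader:
    """Toy model of the one-item-one-credit economy (my sketch, not Trodo's code)."""

    SIGNUP_CREDITS = 3  # granted for listing your initial three items

    def __init__(self, name):
        self.name = name
        self.credits = Trader.SIGNUP_CREDITS

    def fulfil_request(self):
        # Someone requested one of your items and you sent it: you earn a credit.
        self.credits += 1

    def request_item(self):
        # Requesting someone else's item costs a credit.
        if self.credits == 0:
            raise ValueError("no credits: list something people actually want")
        self.credits -= 1
```

A freeloader who lists only rubbish burns through the three signup credits and then stalls, because nobody ever requests their items, so `fulfil_request` never fires. That is the brake.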

There are many other interesting aspects to the system. Why don’t you hop on over there and take a look?

The Dead Disk Blues, part 2

Sigh. It’s not like I didn’t have any warning there was something wrong with the disk. A few weeks ago, I had got a “delayed write failure” error message when I was downloading some stuff. Windows suggested that this might be due to a faulty network connection or faulty hardware. Because I had been performing a network operation at the time, and because traditionally our wireless network has always been a bit tricky, I was inclined to think that the network was the problem.

When I rebooted the PC at that point, the checkdisk program kicked in, and reported that the file I had been working with had something wrong with it. It was able to fix it, though, and the problem didn’t repeat itself. Fine, I thought. The network problem must have screwed up a couple of sectors on the disk, but everything’s okay now.

Despite having read several web pages that spoke of imminent disk failure, did I even consider this as a possibility? Uh-uh. Did I take the opportunity to do a complete data backup just in case? Nope.

Then yesterday evening I was performing some very disk-intensive tasks: converting a bundle of SHN files (Shorten audio files, a lossless way of compressing sound files) into WAV files I could play. There was about 400MB of SHN data, which, when extracted, turned into about 750MB of WAV.

This was the first time I had used the SHN format, and the first time I had used the program for extracting them (mkwACT). So when I started getting more “delayed write failure” messages, I blamed it on the application. I thought that maybe the program was producing data faster than the hard disk could write. Hence, a “delayed write”. This might have been “failing” because of a bug in the application code.


Repeat after me: “Delayed write failure” means back up your data NOW because your hard disk is about to DIE.

In the process of completely extracting the WAV files, I had to reboot my PC four times because of complete lock-up system crashes. That’s more crashes in one evening than I’ve had in the whole year I’ve been running Windows XP at home.

On two of the reboot occasions, scandisk ran, and reported a couple of disk errors. It also reported that it was able to repair them. I went into the Disk Management section of “My Computer”, but it told me that the drive was “healthy”. It wouldn’t complete a scan of it, though. And then there were the funny clicking sounds…

So repeat after me: Funny clicking noises mean back up your data NOW because your hard disk is about to DIE.

But did I take the opportunity to back up my data? Nope.

How many warning signs did I need???

Repeat after me: AAARGH.

So what have I lost? Well, fortunately or unfortunately (I haven’t decided yet), it was my data drive that died. I have (or had) two hard disks in my computer. The first one contains my Windows installation, all program files, and most programs’ immediate data and settings. (Which is why I’m still able to write and post this from my PC.) It also holds my email, and my address books. The second hard disk holds (held) all of my music, photos, software, backups (!), documents, spreadsheets, and other “stuff.”

Very fortunately, we invested in a brand new hard drive a few months ago, which we are using as a nearline storage unit kind of thing. This was just before I started my annual Linux experiment, and as a precaution I backed up both internal drives onto this external unit. So it’s only two months of data that is gone.

The software I had on the drive was all downloaded from the net, and is all replaceable. (Our broadband connection will be very handy here.) Likewise, the music on my drive was all ripped from my CD collection, and can be ripped again without too much effort.

In terms of documents and spreadsheets, I haven’t actually done much in the last few months. Most of what I have written in that time has gone up on the Sunpig web site, or has been sent via email, and is therefore not actually lost. And most of the spreadsheets we update regularly (book catalogue, and accounts spreadsheet) are located on our server, and not on my hard disk. So document-wise, I’m not too badly off, either.

What really hurts is the photos. All of the photos we’d taken during October and November were on the disk that died, and those were the only copies. A very small number of them had been uploaded to the Sunpig web site, and a few more are currently serving as our desktop background pictures, but we have probably lost about 150-200 photos, and maybe 5-10 video clips.

It’s possible that we could get these back. There are companies that specialise in recovering data from dead hard disks. However, this can be a very expensive process. An optimistic estimate would probably be a couple of hundred pounds. And are the photos really worth that much? It’s very sad to lose them, but that price is probably too high. Instead, we’re just going to try to learn a lesson from this disaster, and move on.

The lesson is, if you insist on using computers, sooner or later, you will suffer a catastrophic hardware failure. The question you have to ask yourself is, “how much data are you comfortable losing?”

A day, I can put up with. A week would be a nuisance, but the chances are I wouldn’t have actually made very many file changes, or saved new photographs. In any given month, there will usually be a bundle of photos of Alex that would be a shame to lose. In two months, that shame doubles, and becomes actively painful.

Amongst the photos we lost, there are some from a day trip to Glasgow, a bunch we took when Andy came up to visit, and a whole big stack of pictures that Abi took on a trip to Hewit’s. It makes me sad to know that we’ve lost them.

So what am I going to do?

Well, the first thing is to define a new backup strategy. I think what I’m going to do is get a new, large disk (60GB or so), and use this as my main PC disk (Disk A). I’ll put everything on it: Windows, apps, and data. Then I’ll take my current main disk (13GB) and use it as a “staging” backup unit (Disk B). The nearline unit will serve as our primary backup unit (Disk C), and then we’ll have the usual variety of CD-Rs for longer-term and off-line archiving.

The process will be: on a nightly basis, a backup script will copy fresh data from Disk A to Disk B. Once a month, I’ll do a manual copy from Disk B to Disk C. Quarterly (probably), I’ll burn new CD-Rs from Disk C. Disk B and C will only be cleaned up when they get to 90% capacity, so that there will generally be a certain amount of overlap between them.
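The nightly Disk A → Disk B step could be a small script along these lines. This is just a sketch of the idea: the drive letters and folder names are hypothetical, and “fresh data” is taken to mean files that are missing from the backup or modified since the last run:

```python
import os
import shutil

# Hypothetical locations: D: is Disk A (live data), E: is Disk B (staging backup).
SOURCE = r"D:\Data"
DEST = r"E:\Backup\Data"

def backup(source, dest):
    """Copy files that are new, or newer than the backed-up copy.

    Returns the number of files copied."""
    copied = 0
    for dirpath, dirnames, filenames in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        target_dir = os.path.join(dest, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            # "Fresh" = not yet backed up, or modified since the last run.
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied += 1
    return copied
```

Something like `backup(SOURCE, DEST)` could then be run every night via Scheduled Tasks. Because `copy2` preserves modification times, a second run right after the first copies nothing.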

And most importantly: stick to this procedure. It’s worthless unless I actually do the backups on the intended schedule.

On the positive side, when I had my PC open, I noticed that the light thrumming noise it makes is not coming from the CPU fan, as I’d previously thought. It’s being produced by the fan on my graphics card. This is a good thing, because it means I can shut my PC up completely by paying another visit to the nice folks at, and buying one of their silent video card heatsinks. Nifty!

Also, I had been planning to buy a bunch of new computer components in the new year. After this whole disaster, I’m going to have to add a new hard disk into the mix as well. And I might just be forced to buy them all before Christmas… 🙂

What makes the web? Part 2

Last week, I wrote about what I think makes the Web what it is: where its true identity lies, and what its key qualities are. I identified four main points:

  • Content (primarily the sheer volume that is available)
  • Indexing (how easy it is to find information)
  • Community (how it brings people together)
  • Connectedness (how any web page can be linked to any other)

I’ve been thinking about this some more, and I’ve come to the conclusion that these four qualities are also the cornerstone of all good web sites. The properties that make the web as a whole such a successful medium are exactly the same as those that determine the strength of its building blocks.

Content

Content is what makes someone visit a web site in the first place. This can be content in its classical form, such as news stories, articles, or fiction. But it can just as easily be a product: a book to order and have delivered to your home address, or piece of software to download.

Note that the strength of a site’s content lies not just in its quality, but also in its volume, its freshness, and its speed of delivery.

Indexing

A web site is generally useless if you can’t find what you’re looking for when you go there. “Indexing,” in the context of a web site means more than just a list of keywords, hyperlinked to the relevant pages. It means more than just a site map, or a search box, or an outline tree, although these are all useful elements. It means findability in general. It means that you must have some way of mapping your visitors’ content desires onto the structure of your web site. This is where Information Architecture comes into play.

Information Architecture draws on library science and cognitive science to bring people and information closer together. It helps make sure that when you arrive at a web site, you leave with what you were looking for.

Community

An on-line message board, where a web site’s patrons chat with each other, is a very simple example of community. More generally, though, community is about being and feeling in touch with a web site’s owners, as well as its other users. An email newsletter brings the web site to you, even when you haven’t visited it for a while. On an e-commerce site, showing feedback from customers can build a sense of community.

Connectedness

The great strength of the web in general is also a great strength of individual web sites. If you’re showing a visitor a particular product, you can instantly hook them into related or complementary products. If you’re presenting classical content, keywords can be hyperlinked to useful definitions, references, or more in-depth material. The web is determinedly non-linear, and people will jump around at the slightest mention of something interesting.

Successful web sites turn this to their advantage by making sure that these connections add value to the user’s experience, therefore ensuring that the user will come back for more usefulness!

Combining the principles

When I’m talking about the “success” of web sites, I’m talking about the success of the sites themselves, not of the businesses that may underlie them. It is quite possible for a successful web site to be a lousy value proposition for a business running it. Conversely, just because a web site is poor, doesn’t mean that it can’t be making heaps of money for its owners.

A successful web site is one that is recognized within the ecosystem of the web as a whole as a strong entity. In a Darwinian sense, it is one that is capable of survival.

Numerous examples of strong, successful web sites spring to mind: Amazon, IMDB, Ebay, Slashdot. All of these score highly on all four of the principles I’ve outlined above:

  • Amazon’s product reviews build both content and community. It cross-links to an almost obsessive degree, and few would fault how easy it is to find stuff there.
  • IMDB is the place for information about movies. It has a strong reviewing community. You can search on almost anything, and once you arrive at a given movie, or actor, you can hop around to your heart’s content (or at least until you’ve found the link to Kevin Bacon).
  • Ebay may seem like it has little community on board, but it concentrates this into its user ratings. On Ebay, as well as in real life, the community is what determines your reputation. Ebay scores relatively low on connectedness, but interestingly I have seen it taking steps to improve this aspect, by doing things like showing your recently viewed items under new searches. Its low connectedness also indicates where it could potentially reap large benefits.
  • Slashdot has good content, good community features, and excellent linking to external web sites. Its indexing is poor, because the only way to get at its old content is through a relatively primitive search box and search page. Slashdot could improve its holistic web “strength” according to my ratings by working on this aspect. On the other hand, it is primarily a news site, and being able to find things in its archives is of less importance to the site’s visitors.

I think that these four principles (dimensions?) are a decent way of classifying a web site’s strength. It doesn’t make for a perfect analysis, but it allows for quick identification of a site’s weaknesses, and where resources could be applied to improve it. Of course, using these success criteria is only of use if you want to increase a site’s Darwinian potential within the social ecosystem of the Web. The criteria say nothing about a site’s commercial prospects.
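As a toy illustration of “quick identification of weaknesses”, you could score a site on the four dimensions and pick the lowest. The numeric scores below are invented, just my rough reading of the Slashdot example above:

```python
def weakest_dimension(scores):
    """Given per-dimension scores (0-10), return the dimension to invest in first."""
    return min(scores, key=scores.get)

# Invented example scores, loosely matching my assessment of Slashdot:
# strong content, community and connectedness, but poor indexing.
slashdot = {"content": 8, "indexing": 3, "community": 9, "connectedness": 8}
```

Here `weakest_dimension(slashdot)` points at indexing, which is exactly where I suggested Slashdot could work on its web “strength”.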

I suppose I need to work on the commercial aspects of this classification system… It’s all fine and well improving your web site in an abstract sense, but most businesses are probably more interested in how it can make them more money.