Categories
Blogging Techie

Absorbing the “Second Best” blog

At the end of last year, I created a new blog called “The Second Best Swordsman In Caribastos”. The title is a reference to a quote from the book Paladin of Souls by Lois McMaster Bujold. I had intended to post techie content to that blog, and to keep my main blog here for more personal and irrelevant stuff. Anyone who was interested in the techie stuff could subscribe to the feed over there, and ignore everything else I post here.

The problem is that that’s a false distinction for me. I am fundamentally a techie geek, and my personal life is intricately interwoven with technology, coding, and the web. Having to figure out which of the two blogs a potential entry should go in has on several occasions frozen me into such complete indecision that I ended up not writing anything at all. That’s just not good.

So from now on, the other blog is dead, and its content (what little of it there was) has been absorbed here. There are redirects in place, so any links to the old pages won’t break. At some point in the future I might set up tag-specific feeds here to allow a more filtered view of my brain, but for the moment you’re just going to have to live with whatever randomness I decide to spout. (Hey, at least I’m not posting cat pictures.)

Categories
Second Best

Speed up your laptop with a new hard disk

If you’re feeling dissatisfied with the speed of your laptop, there are a couple of quick and relatively low-cost ways to give it a bit of extra zing. The first option is to add more RAM. Most modern laptops make their memory slots easily accessible by means of a small panel that you can open up with nothing more than a screwdriver. Check with somewhere like Crucial to find out what memory modules you need, slap ’em in, and watch it go.

The second option is less well known, but it is definitely the connoisseur’s choice: upgrade the hard disk. The ease with which this can be done varies greatly between manufacturers. With some, there’s a simple panel you unscrew; with others, you might have to consult Google to find a guide to fully opening up the case. The reason a new hard disk can give you a speed boost is that most present-day laptops are fitted with a 4200 rpm disk, which is slow, slow, slow. The faster a hard disk spins, the faster it can provide your processor with data, and the faster your machine will load programs, read and save files, and even boot up and shut down. In fact, the whole machine will just feel snappier.
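To see why spindle speed matters, consider that on average the disk has to wait half a revolution for the data to come round under the read head. That average rotational latency is (60 / rpm) / 2, so going from 4200 rpm to 7200 rpm nearly halves the wait. A quick back-of-the-envelope calculation in any POSIX shell with awk:

```shell
# Average rotational latency = half a revolution = (60 / rpm) / 2 seconds.
for rpm in 4200 5400 7200; do
  awk -v r="$rpm" 'BEGIN { printf "%s rpm: %.2f ms average rotational latency\n", r, 60 / r / 2 * 1000 }'
done
# 4200 rpm: 7.14 ms, 5400 rpm: 5.56 ms, 7200 rpm: 4.17 ms
```

Seek times and sustained transfer rates improve too on faster drives, but the latency figure alone shows why the difference is so noticeable.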

As a rough illustration, have a look at these (informal) timings from a WinXP laptop (Dell Inspiron 5150) I just upgraded:

Operation                               4200rpm drive   7200rpm drive
From power up to Windows logon screen   46s             29s
Start up Outlook 2003 (cold start)      9s              5s
Start up Firefox 1.5.0.1 (cold start)   13s             7s

Faster disks are more expensive than slower ones, but you can get a sweet little 80GB Hitachi 7K100 Travelstar drive (7200rpm) for around £100. If you’re thinking about buying a new laptop because your current one has lost its zing, you might consider upgrading its disk rather than splashing out many times that price for a new machine.

One thing that always puts me off upgrading my main hard disk is the hassle of reinstalling Windows, applying patches, and configuring the system–and that’s all before I get round to installing all the applications I need for my everyday life. But…there is a BETTER WAY! Oh boy, is it better. Basically, you clone your disk (and its whole Windows installation) using open-source (free) tools.

The key requirements for this solution are a network connection, and access to an FTP server with LOTS OF SPACE (enough to hold your old hard disk). Ideally, this FTP server should be on your local network, because if it’s out on the internet, data transfers are going to take ages.

First of all, go to http://www.feyrer.de/g4u/ and download the g4u ISO image. Burn this image to a blank CD, and then use it for booting up the machine whose hard disk you’re upgrading (with the old hard disk still inside it). By following the instructions on the g4u web site, you can use the uploaddisk tool to upload a bit-for-bit copy of your main hard disk to an FTP server of your choice. (It will upload as a single, gzip-compressed file to save space.) Depending on the speed of the machine and the network, this may take some time (i.e. several hours), but it’s hands-off time. Set it going overnight, and come back the next morning.
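As a rough sketch, the upload step looks something like this at the g4u prompt. The FTP hostname, image name, and disk name below are placeholders, not real values — check the g4u documentation for the device name on your hardware (under g4u’s NetBSD naming, the first IDE disk is typically wd0), and note that g4u logs in to the FTP server as user “install” and prompts you for the password:

```shell
# At the g4u boot prompt -- hostname, image name, and disk name are examples only
uploaddisk ftp.example.local olddisk.gz wd0
```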

Next, replace the old hard disk with the new one. Don’t bother with formatting. (YAY.)

Boot up with the g4u disk again, and now use the slurpdisk tool to download the old hard disk onto the new one. Again, this may take some time, but it will probably be faster than the upload.
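The restore step is the mirror image of the upload — same placeholder hostname, image name, and disk name as in the g4u documentation’s examples, so substitute your own:

```shell
# Back at the g4u prompt, now with the new disk fitted
slurpdisk ftp.example.local olddisk.gz wd0
```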

Remove the g4u disk, and restart the machine. Assuming it was Windows you were running before, your machine should boot up as normal without any further changes.

If your new disk is the same size as the old one, you’re done. If the new disk is bigger than the old one, though (not an unreasonable assumption), there is one further step to take. If you go into the Disk Management tool, you’ll see that your new hard disk has a partition on it of the same size as the old disk, and a bunch of unallocated space. If you’re happy with adding a new drive letter to your machine, you can create a new logical drive in the unallocated space. However, if you want your system drive to make use of the whole disk’s space, you’ll need additional tools, because Windows doesn’t have a built-in ability to resize your system partition.

Enter Knoppix. Go to the Knoppix web site and download the ISO image for version 4.0.2 (or later), and burn it to a CD. Boot up from this CD into Linux, and use the QTParted tool (already present on the Knoppix disk) to resize your system partition. (Further instructions are available at the ntfsresize page.)

End result: a faster, bigger hard disk, containing an identical clone of your previous system installation. Although the elapsed time may be longer than the time for a complete Windows reinstall, you don’t have to worry about configuration or post-installation tasks. In my book, that’s a BIG win.

Categories
Second Best

Windows Forms applications: Just Say No

Of all the things I’ve learned this year, the most important is this: in a corporate/enterprise environment, you need to have a damn good reason for building a client-side forms application instead of a web app.

I reached this conclusion after building an in-house stock control system as a Windows forms app. Here’s how the architecture ended up:

Order system architecture: actual implementation

There’s nothing too controversial here, I’m sure you’ll agree. The most unusual thing is perhaps placing the burden of document generation (stock spreadsheets, production orders, invoices, etc.) on the client. The reason for this was cost and (supposed) simplicity: all the client machines using the application are equipped with MS Office, and so I could use Office automation to interact with a set of existing document templates. Buying a server-side component to generate Office documents (such as SoftArtisans OfficeWriter) seemed too expensive given the small size of the app, and creating a new set of templates in order to use a less expensive (or open source) PDF creator seemed too elaborate. (Don’t even get me started on working with WordML.)

In fact, document generation was the deciding factor in building a client-side app. In retrospect, this is probably the worst decision I made in 2005. The downsides were numerous:

Deployments

The obvious one has probably been the least painful. My experiences with .NET zero-touch deployments have been mixed at best. I’ve seen it working, and I’ve had it working myself, but the experience was awkward. Same with application updaters. Distributing a .msi setup package is simple and mostly foolproof, though. Nevertheless, it means the clients have to reinstall whenever a new version is available. If I had to do this again, I would choose one of the hands-off approaches, and work through the pain to get it up and running. Still, if this were a web app, I wouldn’t have to deal with any of this.

Asynchronous communication

Easy enough in theory, but a bugger to get right in practice. The main idea is to keep the UI responsive while it is talking to the server by doing all the web service calls on a secondary thread. It was a lot of effort just to get progress bars working properly, and in the end I’m not entirely convinced it was worth it. As a UI specialist I am fully aware of the need for continuous feedback and a snappy feel, but for a small project like this I think it was overkill.

The .NET Datagrid component

Bane. Of. My. Life. Looks fine on paper, or in a demo, but COMPLETELY USELESS in any kind of real-world scenario. The amount of code you have to produce (either by writing it yourself, or by copying it from those who have suffered before you) to get even simple functionality working, like setting row heights, colouring the grid, or adding a drop-down list to a cell, is staggering. If you want to do any serious client-side development with grids, you really must buy a third-party component.

In fact, the whole “rich user interface” benefit that has traditionally been the advantage of forms applications needs to be completely re-examined in the light of modern web apps, which draw upon JavaScript for better and more responsive interaction (Prototype, Script.aculo.us, Rico et al.), and CSS for visual flair. I can see a trend these days (in corporate environments) towards making client-side forms applications look and feel more like web pages, whereas just a few years ago it was the other way round.

Office automation with .NET

Not nearly as good as it should have been. Sure, I was able to re-use the existing templates to produce nicely formatted documents, but the Office API hasn’t improved significantly since 2000. Add to that the painful burning sensation of accessing it through COM Interop, and you get a whole heap of…yuckiness.

So with the benefit of hindsight, what should I have done instead? I’m glad you asked. Here’s the architecture for version 2:

Order system architecture: v2

The Forms UI is going away, and will be replaced by a clean HTML/CSS/JS front-end. The business logic, which was distributed between the client and the server, will now be purely server-based. (There will still be a certain amount of client-side validation via the ASP.NET validator controls, but that will be controlled from server-side code.) It might include some Ajax further down the line, but the initial development will be a simple, traditional page-based web app.

And document generation? Version 2 will be using the OpenDocument (OpenOffice.org) format. This is an XML format that is an awful lot easier to get right than WordML, meaning that I can use simple XmlDocument-based code on the server to create documents. The client machines will get OpenOffice 2.0 installed, and upon clicking a “Print” button will receive a response with a MIME type of application/vnd.oasis.opendocument.text. The web browser can then pass it straight to OpenOffice for viewing and printing. OpenOffice has come a long way in the last few years, and version 2.0 is excellent. It happily converts Word templates to its own format, so I don’t even have to do much work in converting all the existing assets.

There is definitely still a need for client-side forms applications. If you want to make use of local hardware features, such as sound (e.g. audio recording), graphics (dynamic graphing and charting), and peripheral devices (barcode scanners), or if you want to have some kind of off-line functionality, you’re going to have to stick closely to the client. But for typical corporate/enterprise applications–staff directory, timesheets, CRM, and every bespoke data-entry package under the sun–I can see no compelling reason to consider a forms application as the default architecture.