I have finally got our database backups running here on Sunpig. Our site is made up of half a dozen Movable Type blogs and a couple of phpBB forums. There are a bunch of photos scattered around the site, but apart from that, almost all of the text content resides in a single MySQL database. (Abi’s BookWeb is the notable exception.) Even if all of the pages on sunpig.com were accidentally deleted, we could regenerate about 90% of them by reinstalling the basic MT and phpBB applications, and restoring that master database. So it’s quite important that we keep it safe.
I’ve knocked together a little perl script that runs as a cron job every night. It produces a backup of the database, compresses it to save space, and then stores the file somewhere safe (i.e. not in our webroot). It tags the file with the weekday name (“mon”, “tue”, etc.), so these files only last seven days before they get overwritten with newer versions. They will mostly be useful in case the live database becomes corrupted, or if I do something stupid, like accidentally delete a blog. (It does happen, you know.)
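For the record, the crontab entry is just a single line, something like this (the time and the path to the script here are only examples; yours will be different):

# Run the backup script every night at 3:30am
30 3 * * * /usr/bin/perl /home/sunpig/dbbackup.pl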
Then, once a week, the script takes the backup file it has produced, and uses FTP to transfer a copy to a different server halfway around the world. Each remote file gets tagged with the full date in YYYYMMDD format, so they never get overwritten. I’ve only got 30MB of space on this other server, and the gzipped backup files are about 2MB each, so I’ll have to do a manual download and archive of these files every three months or so. Still, this is a vast improvement over manual backups on a “whenever I remember” basis.
In case you’re interested, here is the script:
dbbackup.pl
# Connection details for local database
$dbhost = 'aaa';
$dbuser = 'bbb';
$dbpass = 'ccc';
$dbname = 'ddd';

# Get the day of the week
$day = `date "+%a" | tr 'A-Z' 'a-z'`;
chomp $day;

# Construct a file name consisting of the database name and the day of the week
$localfilename = "/home/sunpig/sqldump/${dbname}_$day.sql.gz";

# Dump the database to file, and gzip it up
system "/usr/local/bin/mysqldump --host=$dbhost --user=$dbuser --password=$dbpass $dbname | gzip > $localfilename";

# If today is friday, ftp the dump file to the remote server
if ($day eq 'fri') {
    use Net::FTP;

    # Connection details for remote FTP server
    $ftpdest = 'eee';
    $ftpuser = 'fff';
    $ftppass = 'ggg';

    $date = `date "+%Y%m%d"`;
    chomp $date;
    $remotefilename = join "", $dbname, "_", $date, ".sql.gz";

    # Connect to server and put the file
    $ftp = Net::FTP->new($ftpdest) or die "Couldn't connect to host\n";
    $ftp->login($ftpuser, $ftppass) or die "Couldn't login\n";
    $ftp->put($localfilename, $remotefilename) or die "Couldn't put file\n";
    $ftp->quit;
}