Simple Drupal Remote Backup

Like many web types I have a number of small websites I look after, and I've never been happy with my backup solution. Until today.

My requirements are simple: create a backup of the database and files and save it on a local backup drive. The sites don't warrant anything fancy (like a cloud-based solution that costs money). The procedure until now involved multiple fussy steps. It turns out a single-script solution is simple once a few key technologies are in place.

Prerequisites

  • SSH no-password/public-key access
    Use your favourite search engine to search on the words: ssh public key authentication for a list of articles on setting up no-password, public-key ssh access. If you don't have public-key access, I suspect the script could be modified to use a password.
  • Drush 4.5 on the host
    If you have root access, just follow the drush installation instructions. This article by Robin Monks provided the missing step for my shared-host sites: Installing Drush on a Shared DreamHost Account. The information on setting up an alias (i.e. @site-name) in this article is still valid in drush 4.5: New features in Drush 3
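For reference, a remote site alias for drush 4.x lives in a file like ~/.drush/aliases.drushrc.php. This is a sketch with placeholder values (the alias name, paths, host, and user are examples; substitute your own):

```php
<?php
// ~/.drush/aliases.drushrc.php -- example alias; all values are placeholders.
$aliases['g42live'] = array(
  'root' => '/home/site/public_html',   // Drupal root on the remote host
  'uri' => 'www.example.com',           // site URI
  'remote-host' => 'example.com',       // host drush will ssh to
  'remote-user' => 'user',              // ssh user (public-key access assumed)
);
```

With this in place, commands like drush @g42live status run against the remote site.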

The Backup Script

The backup script uses the drush archive-dump command, available as of drush 4.5, to create a backup tarball that includes the database and files. The backup tarball is scp'ed to the backup drive and deleted from the host.

#!/bin/bash
#
# Backup the Group 42 live site
#
# Dated tarball name, e.g. group42-2011-10-01.tar
FILENAME=$(date "+group42-%Y-%m-%d.tar")
# Dump the database and files into one tarball on the host
drush @g42live archive-dump --destination=/home/site/backuptemp/$FILENAME
# Copy the tarball to the local backup drive
scp user@example.com:/home/site/backuptemp/$FILENAME /Volumes/Memory-Alpha/Backups/Group42/.
# Clean up the tarball on the host
ssh user@example.com rm /home/site/backuptemp/$FILENAME

Four lines of script (plus comments), and whenever I need a site backup it's done in one command!

Comments

You could also do this without drush, by using the mysqldump command directly. What I do is set up a read-only database user on the host with access to all the databases, and use that to do the dump. Of course, that makes the script less portable: drush will read the settings.php file and figure out the database, user name, and password to use, but if you use mysqldump you have to put all of that on the command line, so you have to customize that line for each installation.
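A minimal sketch of that mysqldump-over-ssh variant — not the commenter's exact setup; the read-only user "ro_backup", the database "example_db", the host, and the paths are all placeholders to customize per site:

```shell
#!/bin/bash
# Hypothetical direct-mysqldump backup, no drush required.
backup_db() {
  local dumpfile
  dumpfile=$(date "+example-site-%Y-%m-%d.sql.gz")
  # mysqldump runs on the remote host as the read-only user; the compressed
  # dump streams back over the ssh connection to the local backup drive.
  ssh user@example.com \
    "mysqldump --single-transaction --skip-dump-date -u ro_backup example_db | gzip" \
    > "/Volumes/Memory-Alpha/Backups/ExampleSite/$dumpfile"
}
```

Note this only captures the database; unlike archive-dump, the files directory would need a separate rsync or tar step.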

I don't use a dated filename; I dump into a git repo and use:

cd /home/site/backuptemp/siteRepo
git add -A
git commit -m "autocommit $(date +%Y%m%d)"

in my bash script. If you use the mysqldump arg "--skip-dump-date" and your dump file has no changes, the git commit is a no-op (very rare if you get any traffic).

Jennifer, good point for people who can't install the command line drush.

I have done something similar using Backup and Archive module's scheduled backup feature to create the mysql dump in the files directory, and then tarring the Drupal directory tree (which includes the mysql dump).

It probably isn't clear enough in the post: the advantage of the drush archive-dump command is that you can use the drush archive-restore command to restore both the database and the file system from the tarball in one command.
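For the record, a restore could be sketched like this (assumes drush 4.5+; the tarball name and destination path are examples, not the post's actual setup):

```shell
#!/bin/bash
# Hypothetical one-command restore from an archive-dump tarball.
restore_site() {
  local tarball=$1
  # archive-restore unpacks the file tree and reloads the database
  # from the tarball produced by archive-dump.
  drush archive-restore "$tarball" --destination=/home/site/public_html
}
```

Called as, e.g., restore_site /Volumes/Memory-Alpha/Backups/Group42/group42-2011-10-01.tar.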

John, that's an interesting sqldump parameter I didn't know about. My use case doesn't call for it, but definitely good to know.