
Backing up with Capistrano

We all know not backing up has consequences. While losing sentimental files would definitely ruin your day, losing your web server's data could be even worse. I've mentioned before that I use Linode for my server hosting, and while they do offer an automated backup service, I decided I'd rather set up my own solution to back up periodically to my local machine.

Many people use rsync to do their server backups. In fact, Linode even has a guide on how to set it up (there's a better one here). I decided that instead of a 1-for-1 directory mirror, I would prefer to have a tarball of the contents. While I could've easily done this with a few bash commands from the server, that's not particularly ideal for my setup. My local machines don't run 24/7, so if I automated the backup on the server every week, it might try to initiate the transfer while my machine was off (I could try to guess when it's on every week, but that's not ideal either).
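
For comparison, the rsync route I passed on would look roughly like this, assuming the site lives in /srv/www (as in the Capfile below) and you're syncing into the same Dropbox folder:

# mirror the server's www directory into a local backup folder
rsync -avz --delete username@1.1.1.1:/srv/www/ ~/Dropbox/Backups/Server/www/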

The obvious solution is to run the backup from my local machine every week instead. That way, once a week when it's powered up, it logs in to the server, creates the tarball, and pulls it down. Insert Capistrano ([sudo] gem install capistrano), a RubyGem for 'Remote multi-server automation.' So I wrote a very basic Capfile to automate this for me (replace the path to your www folder accordingly).

load 'deploy'

# Server credentials -- swap in your own user and IP
$SERVER_USER = "username"
$SERVER_IP   = "1.1.1.1"

# Tell Capistrano which user to SSH in as
set :user, $SERVER_USER

desc "Backs up server www files"
task :backup, :hosts => $SERVER_IP do
  # Create the tarball on the server...
  run "cd /srv; tar -pvczf ~/backup.tar.gz www/"
  # ...then copy it down to the local backup folder
  run_locally "scp #{ $SERVER_USER }@#{ $SERVER_IP }:~/backup.tar.gz ~/Dropbox/Backups/Server"
end
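
Before wiring it into cron, it's worth a quick manual run to make sure the task works (either from the directory containing the Capfile, or with -f pointing at it):

cap backup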

Then I added this to my crontab on my local machine by running crontab -e and adding the line:

@weekly /Users/ksmiley/.rbenv/shims/cap -f ~/path/to/Capfile backup

I included the full path to the Capistrano executable since cron (on OS X) executes tasks with sh, which isn't set up with my $PATH.
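
If you're not sure where your cap executable lives (mine is an rbenv shim), which will tell you:

which cap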