Backup Your Ghost Blog to Amazon S3

There are plenty of reasons why you should make regular backups of your blog. Your server could get hacked. Your hard drive might fail. You might accidentally delete some files. Who knows what might happen, right? That’s why I automatically back up my site’s data to the cloud.

The cloud storage provider that I recommend is Amazon. Amazon offers a service called Simple Storage Service (S3), which allows you to remotely store your data and retrieve it at a later time.

There’s even a free tier for Amazon S3, which gives you 5GB of storage and 15GB of monthly data transfer out.

And with the help of s3cmd, a command line tool for uploading and managing your data, the whole process is extremely simple.

If you’re unfamiliar with Amazon S3, I recommend reading an introductory guide on creating an AWS account and an S3 bucket first.

Requirements

  • Amazon Web Services account
  • Amazon S3 bucket
  • S3cmd tool (installation shown below)
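
If you don’t have the s3cmd tool installed yet, it’s available from most package managers. On a Debian or Ubuntu server, for example:

sudo apt-get install s3cmd

It’s also published on PyPI, so pip install s3cmd works as well.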

Configure S3cmd

If you haven’t already done so, configure the S3cmd tool.

s3cmd --configure
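
The configure step walks you through entering your AWS access key and secret key. Once it finishes, you can confirm that s3cmd can reach your bucket (substitute your own bucket name):

s3cmd ls s3://my-bucket-name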

Bash Script

Next we’ll create the file for our bash script.

nano amazon-s3-ghost-backup.sh

Paste the following script into it and modify the variables according to your server’s specifics:

#!/bin/bash

# Minimal pushd replacement: record the directory on a stack and
# change into it, aborting the script if the directory is missing
pushd ()
{
    dirname=$1
    DIR_STACK="$dirname ${DIR_STACK:-$PWD}"
    cd "${dirname:?missing directory name.}" || exit 1
    echo "$DIR_STACK"
}

# Amazon S3 bucket information
#
AWS_BUCKET_NAME="my-bucket-name"
AWS_BUCKET_FOLDER="/ghost/mywebsite.com"

# MySQL database credentials
SQL_HOST="127.0.0.1"
SQL_USER=""
SQL_PASS=""
SQL_DB=""

# Location of blog directory
#
# Example values if your Ghost blog is located in
# /var/www/ericmathison.com/blog
#
GHOST_PARENT_DIR="/var/www/ericmathison.com"
GHOST_CHILD_DIR="/blog"

# Local directory to store backups
BACKUP_PATH="/backups/ghost"

# Timestamp format
TIMESTAMP=$(date +"%F_%T")

#######################################################
# Do not edit below this line
#######################################################

# Make sure the backup directory exists, then delete
# local backup files older than 30 days
mkdir -p "$BACKUP_PATH"
find "$BACKUP_PATH" -type f -mtime +30 -exec rm -f {} \;

# Set default file permissions
umask 117

# Back up the MySQL database to a temporary file
mysqldump --host="$SQL_HOST" --user="$SQL_USER" --password="$SQL_PASS" "$SQL_DB" > "/tmp/$SQL_DB-$TIMESTAMP.sql"

# Create a zip archive of the Ghost blog directory and MySQL database
pushd "$GHOST_PARENT_DIR"
zip -rq "$BACKUP_PATH/website-$TIMESTAMP.zip" ".$GHOST_CHILD_DIR"

pushd /tmp
zip -q "$BACKUP_PATH/website-$TIMESTAMP.zip" "./$SQL_DB-$TIMESTAMP.sql"
rm -f "/tmp/$SQL_DB-$TIMESTAMP.sql"

# Upload the latest backup to Amazon S3
s3cmd put "$BACKUP_PATH/website-$TIMESTAMP.zip" "s3://$AWS_BUCKET_NAME$AWS_BUCKET_FOLDER/$TIMESTAMP.zip"
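
Note that the script only prunes the local copies; uploads will accumulate in S3 indefinitely. If you want S3 to clean up after itself too, s3cmd can set a lifecycle expiration rule on the bucket. A minimal sketch, assuming 30-day retention and the example bucket and folder names from above:

s3cmd expire s3://my-bucket-name --expiry-days=30 --expiry-prefix=ghost/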

Make the script executable.

chmod +x amazon-s3-ghost-backup.sh

Test the Script

You can go ahead and test the script out now:

./amazon-s3-ghost-backup.sh

If everything was successful, you’ll see the backup zip file in your Amazon S3 bucket.
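
You can also verify from the command line. List the uploaded backups and pull one back down with s3cmd (the bucket, folder, and timestamp below are the example values from the script):

s3cmd ls s3://my-bucket-name/ghost/mywebsite.com/
s3cmd get s3://my-bucket-name/ghost/mywebsite.com/<timestamp>.zip

To restore, unzip the archive into your web root and import the .sql dump back into MySQL.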

Schedule Cron Job

Create a cron job and schedule the script to run automatically every morning at 6:00 AM.

crontab -e

0 6 * * *       /<path to script>/amazon-s3-ghost-backup.sh > /dev/null 2>&1
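
The > /dev/null 2>&1 at the end discards all output. If you’d rather keep a record of each run for troubleshooting, redirect to a log file instead (the log path here is just an example):

0 6 * * *       /<path to script>/amazon-s3-ghost-backup.sh >> /var/log/ghost-backup.log 2>&1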
