Ghost multi-blog backup bash script to Minio (S3 Compatible)

A simple bash script to automatically back up multiple Ghost blogs hosted on the same server to a remote AWS S3-compatible server.

Inspired by Jerry Ng’s Ghost backup script, which backs up a single blog to remote storage using rclone, I wrote this script to automatically back up multiple Ghost blogs on the same server to a remote AWS S3-compatible server (in this case Minio) using the Minio Client (mc).

What’s backed up?

  • The config.production.json and package-lock.json files.
  • Everything under the content/ directory.
  • A MySQL database dump.

Structure of the generated backups on the remote backup location (S3): [BUCKET_NAME]/ghost/[WEBSITE_NAME]/[YEAR]/[MONTH]/.
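For example, with the bucket and folder names used in the script below, a backup of a site with the shortname example_blog1 taken in May 2024 would land at paths like these (the date, shortname, and database name are illustrative):

backups/ghost/example_blog1/2024/05/2024-05-01-03-00-example_blog1.tar.gz
backups/ghost/example_blog1/2024/05/2024-05-01-03-00-blog1_db.sql.gz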

Requirements

  • Access to the Linux user used to administer Ghost.
  • A configured Minio Client (mc).
  • An S3-compatible storage server (in this case Minio).
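If mc is not configured yet, a remote alias can be registered roughly like this (the endpoint URL and credentials are placeholders for your own values):

# register an alias named "myminio" pointing at your Minio endpoint
mc alias set myminio https://minio.example.com YOUR_ACCESS_KEY YOUR_SECRET_KEY
# quick sanity check: list buckets reachable through the alias
mc ls myminio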

Script

#!/bin/bash
# Backup Ghost website(s) to Minio
# Inspired by https://jerrynsh.com/backing-up-ghost-blog-in-5-steps/
#
# This script also needs the Minio Client (mc) configured, see:
# https://min.io/docs/minio/linux/reference/minio-mc.html
# Or edit and adapt the S3_SECTION below to your favorite S3 client.

set -e

MINIO_REMOTE_ALIAS="myminio" # your mc `alias` name
MINIO_BUCKET="backups"
MINIO_FOLDER="ghost/"        # Mandatory, don't forget the trailing slash at the end

# Array of websites, fields separated by `|`
# Field 1 = website shortname, used to organize the backup folder location on S3
# Field 2 = Ghost website directory
GHOST_WEBSITES=(
    "example_blog1|/path/to/blog1"  # 1st website
    "example_blog2|/path/to/blog2"  # 2nd website
)

##### End basic config #####

SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )

for WEBSITE in "${GHOST_WEBSITES[@]}"
do
    IFS='|' read -ra WEBPARAMS <<< "$WEBSITE"
    if [ ! -d "${WEBPARAMS[1]}" ]; then
        echo "Folder does not exist. Skipping ${WEBPARAMS[0]}"
    else
        BACKUPDATE=$(date +%Y-%m-%d-%H-%M)
        echo "Performing backup of ${WEBPARAMS[0]}"
        cd "${WEBPARAMS[1]}"

        ### ARCHIVE ###
        tar -czf "$SCRIPT_DIR/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz" content/ config.production.json package-lock.json

        ### DATABASE SECTION ###
        # pull DB credentials from the site's Ghost configuration (requires ghost-cli)
        db_user=$(ghost config get database.connection.user | tail -n1)
        db_pass=$(ghost config get database.connection.password | tail -n1)
        db_name=$(ghost config get database.connection.database | tail -n1)
        mysqldump -u"$db_user" -p"$db_pass" "$db_name" --no-tablespaces | gzip > "$SCRIPT_DIR/$BACKUPDATE-$db_name.sql.gz"

        ### S3_SECTION ###
        # adapt to your env
        mc cp "$SCRIPT_DIR/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz" "$MINIO_REMOTE_ALIAS/$MINIO_BUCKET/$MINIO_FOLDER${WEBPARAMS[0]}/$(date +%Y)/$(date +%m)/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz"
        mc cp "$SCRIPT_DIR/$BACKUPDATE-$db_name.sql.gz" "$MINIO_REMOTE_ALIAS/$MINIO_BUCKET/$MINIO_FOLDER${WEBPARAMS[0]}/$(date +%Y)/$(date +%m)/$BACKUPDATE-$db_name.sql.gz"

        # REMOVE LOCAL BACKUP
        rm -f "$SCRIPT_DIR/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz"
        rm -f "$SCRIPT_DIR/$BACKUPDATE-$db_name.sql.gz"
        cd "$SCRIPT_DIR"
    fi
done

exit 0

You can also find it at https://gist.github.com/ditatompel/dc1d13259df3b945a633f8c0b789bd80.
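If you prefer a different S3 client, only the S3_SECTION needs to change. Here is a minimal sketch of the same two uploads using the AWS CLI instead of mc (this assumes aws is installed, credentials for your S3-compatible endpoint are configured, and the endpoint URL below is a placeholder):

### S3_SECTION (alternative, AWS CLI) ###
aws s3 cp "$SCRIPT_DIR/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz" \
    "s3://$MINIO_BUCKET/$MINIO_FOLDER${WEBPARAMS[0]}/$(date +%Y)/$(date +%m)/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz" \
    --endpoint-url https://minio.example.com
aws s3 cp "$SCRIPT_DIR/$BACKUPDATE-$db_name.sql.gz" \
    "s3://$MINIO_BUCKET/$MINIO_FOLDER${WEBPARAMS[0]}/$(date +%Y)/$(date +%m)/$BACKUPDATE-$db_name.sql.gz" \
    --endpoint-url https://minio.example.com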

How to use

Edit MINIO_REMOTE_ALIAS, MINIO_BUCKET, MINIO_FOLDER, and the website(s) in the GHOST_WEBSITES array. Then you can execute the script with cron.
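For example, a crontab entry along these lines runs the backup every day at 03:00 and appends output to a log file (the script path, log path, and schedule are placeholders; install it in the crontab of the Ghost admin user):

# m h dom mon dow  command
0 3 * * * /home/ghost-mgr/ghost-backup.sh >> /home/ghost-mgr/ghost-backup.log 2>&1

Afterwards, you can verify that the uploads arrived with something like mc ls --recursive myminio/backups/ghost/.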