
How to create automated database backups on GitHub & S3.

To set up automated backups from a server to GitHub or S3, we’ll use a shell script that runs periodically to create database backups and push them off the server.

First, we’ll create the shell file, which can be named anything with the .sh extension, using nano:

nano /var/www/db-backups.sh
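
Optionally, make the script executable so it can also be run directly (not strictly required, since the cron entry later invokes it through a shell):

chmod +x /var/www/db-backups.sh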

Then we’ll declare and initialize the variables that will be used across multiple commands in the script.

#!/bin/bash

export PATH=/bin:/usr/bin:/usr/local/bin
set -o pipefail   ## so $? later reflects a mysqldump failure, not just gzip's exit status
TODAY=$(date +"%d%b%Y")

DB_BACKUP_PATH='/var/www/backups'
MYSQL_HOST='localhost'
MYSQL_PORT='3306'
MYSQL_USER='user'
MYSQL_PASSWORD='password'
DATABASE_NAME='db'
BACKUP_RETAIN_DAYS=30   ## Number of days to keep local backup copy

DB_BACKUP_PATH is where the backups of the MySQL database will be created.

MYSQL_HOST, MYSQL_PORT, MYSQL_USER, MYSQL_PASSWORD, and DATABASE_NAME are the connection details of the MySQL database to be backed up. They let the script connect to the database and create .sql backups at the defined path.

BACKUP_RETAIN_DAYS sets how long a backup is kept on the server after it is created. Once that period has passed, the backup is deleted automatically.

The TODAY variable contains the date the backup is started (e.g. 05Mar2024 for the %d%b%Y format) and will be used to name the directories and files.
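
As a side note, passing the password on the command line makes it visible to other users via ps. A common alternative, sketched here and not required by this script, is a MySQL option file that mysql and mysqldump read automatically:

## ~/.my.cnf -- restrict access with: chmod 600 ~/.my.cnf
[client]
host=localhost
port=3306
user=user
password=password

With this in place, the -h/-P/-u/-p flags could be dropped from the commands below.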

Next, we’ll create a directory with the date as its name:

mkdir -p ${DB_BACKUP_PATH}/${TODAY}

Then we’ll use the mysqldump command to create the backup in that directory, using the variables defined above.

echo "Backup started for database - ${DATABASE_NAME}"
 
 
mysqldump -h ${MYSQL_HOST} \
   -P ${MYSQL_PORT} \
   -u ${MYSQL_USER} \
   -p${MYSQL_PASSWORD} \
   ${DATABASE_NAME} | gzip > ${DB_BACKUP_PATH}/${TODAY}/${DATABASE_NAME}-${TODAY}.sql.gz

This creates a dump of the MySQL database and pipes it through gzip, producing a compressed .sql.gz file that we can then push to GitHub or upload to S3.
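As a quick sanity check, such a backup can later be restored by piping it back into mysql. A sketch using the same variables (the target database must already exist):

gunzip < ${DB_BACKUP_PATH}/${TODAY}/${DATABASE_NAME}-${TODAY}.sql.gz | \
   mysql -h ${MYSQL_HOST} -P ${MYSQL_PORT} -u ${MYSQL_USER} -p${MYSQL_PASSWORD} ${DATABASE_NAME}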

Pushing automated backup to GitHub

For pushing the created backups to GitHub, we’ll use the --work-tree option to tell git which directory to operate on and --git-dir to tell it where the git environment is set up.

We’ll run the following three commands, assuming a repository has already been set up in the backup directory with the git init command, with a remote named origin and (here) a master branch:

/usr/bin/git --work-tree="${DB_BACKUP_PATH}" --git-dir="${DB_BACKUP_PATH}/.git" add .
/usr/bin/git --work-tree="${DB_BACKUP_PATH}" --git-dir="${DB_BACKUP_PATH}/.git" commit -m "$(date +%F)"
/usr/bin/git --work-tree="${DB_BACKUP_PATH}" --git-dir="${DB_BACKUP_PATH}/.git" push origin master

This creates a commit with the date as the commit message and pushes it to GitHub.
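
If the repository hasn’t been initialized in the backup directory yet, the one-time setup could look like this (the remote URL is a placeholder for your own repository):

cd ${DB_BACKUP_PATH}
git init
git remote add origin git@github.com:user/db-backups.git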

Uploading automated backup to S3

For uploading to S3, we’ll first need to set up the AWS CLI on the server using the aws configure command, and then add an aws s3 cp command to upload the database backup files to S3.

## $? holds the exit status of the mysqldump | gzip pipeline above
if [ $? -eq 0 ]; then
  echo "Database backup successfully completed"
  aws s3 cp ${DB_BACKUP_PATH}/${TODAY}/${DATABASE_NAME}-${TODAY}.sql.gz s3://bucket/${TODAY}/
else
  echo "Error found during backup"
  exit 1
fi
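
The one-time aws configure step prompts for credentials and a default region; the values below are placeholders:

aws configure
## AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
## AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
## Default region name [None]: us-east-1
## Default output format [None]: json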

Deleting the created backups

Now, if we keep creating backups daily, or even multiple times a day, they’ll eat up server space and keep getting pushed to GitHub. It’s better to delete old backups after some time. That’s what the retention variable BACKUP_RETAIN_DAYS is for: once that period has passed, the old backups are deleted automatically.

DBDELDATE=$(date +"%d%b%Y" --date="${BACKUP_RETAIN_DAYS} days ago")

if [ -n "${DB_BACKUP_PATH}" ]; then
      cd "${DB_BACKUP_PATH}"
      if [ -n "${DBDELDATE}" ] && [ -d "${DBDELDATE}" ]; then
            rm -rf "${DBDELDATE}"
      fi
fi
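
Note that this only removes the directory created exactly BACKUP_RETAIN_DAYS ago; if the script ever skips a day, that day’s backup is never cleaned up. A sketch of a more thorough alternative using find, assuming nothing but these backups lives under DB_BACKUP_PATH:

find "${DB_BACKUP_PATH}" -mindepth 1 -maxdepth 1 -type d -mtime +${BACKUP_RETAIN_DAYS} -exec rm -rf {} +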

Crontab

This is the complete script to push the MySQL database backups to GitHub or upload them to S3. But it needs to run periodically to create and upload fresh backups daily, or at whatever interval we specify. For this we’ll use crontab, which will call the script automatically on a schedule. For example, to run it every day at 2 AM:

0 2 * * * /bin/bash /var/www/db-backups.sh >> /var/www/crontab.logs 2>&1
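
To install the entry, edit the current user’s crontab and verify it afterwards:

crontab -e   ## opens the crontab in an editor; paste the schedule line above and save
crontab -l   ## lists the installed entries to confirm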

Bonus

The above script creates and uploads backups of the single database defined in it. If you want to back up all the databases in MySQL automatically, without listing each one in the script, you’ll need to tweak it a little.

We’ll get the list of all databases and loop over them, creating the backup files one by one. All the later steps stay the same: compressing and uploading to the desired location.

Instead of the single mysqldump command, add the loop below. The -N flag suppresses the column header so it isn’t read as a database name, and MySQL’s internal schemas are skipped:

mysql -h ${MYSQL_HOST} -P ${MYSQL_PORT} -u ${MYSQL_USER} -p${MYSQL_PASSWORD} -N -e "SHOW DATABASES;" | while read DATABASE_NAME; do
   ## skip MySQL's internal schemas
   case ${DATABASE_NAME} in
      information_schema|performance_schema|mysql|sys) continue ;;
   esac
   mysqldump -h ${MYSQL_HOST} \
      -P ${MYSQL_PORT} \
      -u ${MYSQL_USER} \
      -p${MYSQL_PASSWORD} \
      ${DATABASE_NAME} | gzip > ${DB_BACKUP_PATH}/${TODAY}/${DATABASE_NAME}-${TODAY}.sql.gz
done

Also, read about unzipping a zip file in place on S3 using Lambda.
