Creating WordPress backups is necessary and easy – a simple bash script that tars htdocs and runs mysqldump is sufficient. The more interesting question is where to store these backups. Since these blogs don’t contain any sensitive private information and are all served from public web domains, storing them in the cloud is a good fit.

Since I have plenty of Google Drive storage available, it’s reasonable to upload the most recent backups there. While looking for possible CLI options, GDrive caught my attention. The client is written in Go without any further dependencies – download the Linux amd64 binary, make it executable, and run it once.


$ wget https://drive.google.com/uc?id=0B3X9GlR6EmbnTjk4MGNEbEFRRWs -O drive-linux-amd64

It generates an authorization URL for the Google API; open it in a browser, authorize the application, and paste the verification code back into the CLI. Done. Now you’re able to upload/download files using the binary.


$ chmod +x drive-linux-amd64
$ ./drive-linux-amd64
Go to the following link in your browser:
https://accounts.google.com/o/oauth2/auth?client_id=...

Enter verification code:


GDrive stores the token and config below the ~/.gdrive directory.

While GDrive can upload whole folders, it always generates a new id for a fresh folder of the same name below its target. That’s not something I want to keep, so each file is uploaded on its own, preserving the fixed parent folder id (backup/wordpress in my Google Drive tree).

You can easily extract the parent folder id by looking at the Google Drive URL:


https://drive.google.com/drive/#folders/<firstlevel(backup)>/<secondlevel(wordpress)>

which means only the second-level id is needed for the backup script.
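For illustration, the second-level id can be pulled out of such a URL with plain shell parameter expansion (the URL below uses made-up placeholder ids, not real ones):

```shell
# Made-up example URL; the real ids come from your own Google Drive tree.
url="https://drive.google.com/drive/#folders/0B3Xfirstlevel/0B3Xsecondlevel"

# Strip everything up to and including the last slash,
# leaving the second-level (wordpress) folder id.
parent_id="${url##*/}"
echo "$parent_id"
```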


#!/bin/bash
# requires sudo for mysqldump

BACKUP_PATH=/home/michi/backup/wordpress
GDRIVE_BIN=/home/michi/backup/drive-linux-amd64

GDRIVE_BACKUP_PARENT_ID=YOURPARENTTOKEN

declare -A WEBS
# vhostname = dbname
WEBS=(["www.legendiary.at"]="wp_legendiary_at")

WEB_DIR=/var/www
WEB_SUBDIR=htdocs
WEB_LOGDIR=logs

# clear local backups older than 10 days
find "$BACKUP_PATH" -type f -mtime +10 | xargs rm -f

# start backup and upload to google drive
cd "$BACKUP_PATH" || exit 1

for web in "${!WEBS[@]}"
do
    timestamp=$(date +%Y-%m-%d-%H%M%S)
    web_path="$WEB_DIR/$web/$WEB_SUBDIR"
    web_tar_gz="$web-$timestamp-$WEB_SUBDIR.tar.gz"
    db_name=${WEBS[$web]}
    db_sql_gz="$web-$timestamp.sql.gz"

    echo "Creating $web_tar_gz backup for $web..."
    tar czf "$web_tar_gz" "$web_path"

    echo "Uploading $web_tar_gz backup to GDrive..."
    $GDRIVE_BIN upload -f "$web_tar_gz" -p "$GDRIVE_BACKUP_PARENT_ID"

    echo "Creating $db_sql_gz backup for $web..."
    sudo mysqldump --databases "$db_name" | gzip > "$db_sql_gz"

    echo "Uploading $db_sql_gz backup to GDrive..."
    $GDRIVE_BIN upload -f "$db_sql_gz" -p "$GDRIVE_BACKUP_PARENT_ID"
done


Add a cron job that runs the script every day at midnight.


0 0 * * * /home/michi/backup/backup_wordpress.sh > /dev/null 2>&1
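If you’d rather keep a record of each run instead of discarding the output, a crontab variant that appends to a log file (the log path here is just an example) could look like:

```
0 0 * * * /home/michi/backup/backup_wordpress.sh >> /home/michi/backup/backup.log 2>&1
```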

Keep in mind to secure access to the GDrive binary and its configuration – it is able not only to list but also to manipulate your Google Drive data. If exploited, backups won’t be necessary in Google Drive anymore 😉
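Tightening file permissions is a minimal first step. The snippet below is a sketch that uses a scratch directory to illustrate the target permission bits; on the real host you’d apply the same chmod calls to ~/.gdrive (and the binary itself), and the file names inside are only stand-ins:

```shell
# Sketch: owner-only permissions for the credentials directory.
# A scratch directory stands in for ~/.gdrive here, so the commands
# are safe to run anywhere.
dir=$(mktemp -d)
touch "$dir/config.json" "$dir/token.json"   # stand-ins for GDrive's files

chmod 700 "$dir"      # only the owner may enter/list the directory
chmod 600 "$dir"/*    # only the owner may read the token/config

stat -c '%a' "$dir"
```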