If you have a Google Apps account (a business account with more than 5 users, or one provided by your school), you are given an “unlimited” amount of storage on Google Drive. In addition, there are great clients like Rclone that add encryption and a full rsync-like feature set. This makes Google Apps a great platform for storing nightly database dumps.
Installation
First, we need to install Rclone. After heading over to Rclone’s documentation on installation, all we need to do is:
wget https://rclone.org/install.sh
Open the install.sh file to make sure it is not malicious, and then run it as root. This will add rclone to your PATH so you can invoke it from any directory.
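For example, inspecting and running the installer might look like this (running the script via sudo is one way to execute it as root):

```bash
less install.sh       # review the script before running anything as root
sudo bash install.sh  # run the installer with root privileges

rclone version        # confirm rclone is now on your PATH
```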
Configuring Rclone
We need to add our Google Drive account as a remote endpoint. In your terminal, run `rclone config`. Press `n` to create a new remote. For the name, you can use anything you like; I will use `gdrive`. For the type, we are using Google Drive, which is `drive`. For the application ID and secret, you can fill in your own API keys from the Google API Console, but this is optional; just press return to skip these. Service Account Credentials is also an optional advanced field. I always choose not to use auto config, especially if I am setting up rclone over SSH. All this means is that you will have to copy and paste a link from the terminal into your web browser.
After opening the link, you will need to log in to your Google account and give rclone access. Copy and paste the access token from Google into your terminal window. Choose no on the team drive prompt. If all looks good, type `y` to save your config.
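To confirm the new remote works, you can list the top-level folders of your Drive (using the remote name `gdrive` from above):

```bash
# List the top-level directories of the Drive remote
rclone lsd gdrive:

# Or list every file (can be slow on a large account)
rclone ls gdrive:
```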
Configuring an Encrypted Remote
We now have our Google Drive account configured, but we should also set up encryption. To do this, run `rclone config` and create a new remote with the type `crypt`. You will see this prompt:
Remote to encrypt/decrypt. Normally should contain a ':' and a path, eg "myremote:path/to/dir", "myremote:bucket" or maybe "myremote:" (not recommended). remote>
This is asking where you want your encrypted container to go. “crypt” is only a container for encryption and needs to live inside another remote (the Google Drive remote we set up earlier). I will be using `gdrive:encrypted`, where the first part is the name of the remote from earlier and the second part is the folder where I want this container to be located.
I also encrypt filenames using the standard option and encrypt directory names as well. I let rclone generate a password and salt; be sure to write down both values before finishing the process.
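A quick round trip confirms the crypt remote is working. Here I assume it was named `gdrive-crypt` (the name is hypothetical; use whatever you picked during `rclone config`):

```bash
# Copy a test file through the crypt remote
echo "hello" > test.txt
rclone copy test.txt gdrive-crypt:

# Reading it back through the crypt layer returns the plaintext
rclone cat gdrive-crypt:test.txt

# Looking at the same folder through the plain remote shows
# encrypted file names and contents
rclone ls gdrive:encrypted
```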
Setting Up the Backup
What we need now is a tool that can:
- Log in to our database
- Quickly dump the tables without locking
- Upload these dumps to our encrypted rclone remote
- Delete the dump locally
I have already created a backup script for this purpose and have uploaded it to my GitHub. This script will not lock tables while backing up data, but keep in mind that writes made to other tables while the dump is running will not be reflected in the backup. If this is important to you, take a look at some hot-backup solutions for your database.
Configure the variables at the top of the script and give it a run. If your database files appear in your rclone crypt, then you have set everything up correctly.
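If you would rather roll your own, the sketch below shows the same steps in a minimal form. It is only an illustration, not the script from GitHub; it assumes MySQL/MariaDB with credentials in `~/.my.cnf` and the hypothetical `gdrive-crypt` remote name used above.

```bash
#!/bin/bash
# Minimal backup sketch (illustration only, not the script linked above).
# Assumes: MySQL/MariaDB, credentials in ~/.my.cnf, and a crypt remote
# named gdrive-crypt - adjust these to match your own setup.
set -euo pipefail

DB_NAME="mydatabase"
REMOTE="gdrive-crypt:"
DUMP_DIR="/tmp/sql-backups"
STAMP="$(date +%Y-%m-%d_%H%M)"
DUMP_FILE="${DUMP_DIR}/${DB_NAME}_${STAMP}.sql.gz"

mkdir -p "$DUMP_DIR"

# --single-transaction dumps InnoDB tables from a consistent snapshot
# without locking them
mysqldump --single-transaction "$DB_NAME" | gzip > "$DUMP_FILE"

# Upload the compressed dump to the encrypted remote
rclone copy "$DUMP_FILE" "$REMOTE"

# Remove the local copy once the upload has finished
rm -f "$DUMP_FILE"
```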
Using Crontab to Automatically Backup Our Database
Using the same user you have configured rclone with, open up the crontab using `crontab -e`. Scroll to the bottom of the file and add this rule:
00 06 * * * /bin/bash /home/me/sql/full.sh
Where /home/me/sql/full.sh is the path to the script you used earlier. This will run a full backup to your rclone remote at 06:00 local server time every day.
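If you want a record of each run, you can also redirect the script's output to a log file of your choosing, for example:

```bash
00 06 * * * /bin/bash /home/me/sql/full.sh >> /home/me/sql/backup.log 2>&1
```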
I hope this tutorial has been helpful, and of course be sure to check and test your backups regularly. =)