When dealing with a simple project with very few concurrent users, staying with SQLite in production can help drive costs down. Unfortunately, that means you cannot use convenient services like Amazon RDS to manage automated backups for you.

In this tutorial, we will see how to back up your SQLite database to common cloud storage providers. Some of them even give you a few gigabytes for free, which is probably a good fit for the usual hobby project. There are dozens of open-source packages that can back up your files to cloud providers. I wanted one that is lightweight, easy to configure and available as a binary. This ensures it does not depend on any language/interpreter, making it suitable for a wide range of projects. One of them caught my interest: rclone. It does not only meet my requirements, it also uses an interface very similar to rsync's.

The first step is to download rclone. At the time of writing, the --backup-dir option is not yet available in the stable version, so we'll download the rclone-v1.35-40-gb6848a3β build. This feature is very useful for incremental backups, as it moves files that would otherwise be overwritten.

wget http://beta.rclone.org/v1.35-40-gb6848a3/rclone-v1.35-40-gb6848a3%CE%B2-linux-amd64.zip

or get the latest one from beta.rclone.org.
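Unpacking and installing the build can be sketched as follows. The install location /usr/local/bin is an assumption; adapt it to your system and the official installation instructions:

```shell
# Unpack the beta build downloaded above
unzip "rclone-v1.35-40-gb6848a3β-linux-amd64.zip"
cd "rclone-v1.35-40-gb6848a3β-linux-amd64"

# Install the binary somewhere on your PATH (assumed: /usr/local/bin)
sudo cp rclone /usr/local/bin/
sudo chmod 755 /usr/local/bin/rclone

# Sanity check: print the installed version
rclone --version
```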

Then follow the installation instructions from the official website. Once rclone is properly installed, you should add a remote. rclone supports Amazon Drive, Amazon S3, Backblaze B2, Dropbox, Google Cloud Storage, Google Drive, Hubic, Microsoft OneDrive, SSH/SFTP connections and a few others. Setup steps may differ depending on your provider, so go ahead and create a new remote using rclone config. I created a Google Drive remote named drive1, and rclone config displays it:

Current remotes:

Name                 Type
====                 ====
drive1               drive
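For reference, here is a sketch of the interactive session that produced this remote. The exact prompts vary between rclone versions and providers, so treat it as an outline rather than a verbatim transcript:

```
rclone config
n) New remote
name> drive1
Storage> drive        (pick the entry matching Google Drive)
client_id>            (leave blank to use rclone's defaults)
client_secret>        (leave blank)
(a browser window opens so you can authorize rclone with your account)
y) Yes this is OK
```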

To copy a file, use rclone copy, as in rclone copy source:sourcepath dest:destpath. You can test whether your remote was properly configured by creating a dummy.txt file and running rclone copy dummy.txt drive1:backup (replace drive1 with the name of your remote). The dummy.txt file should then be available on your cloud storage under the backup folder.
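The whole round trip can be sketched as follows (drive1 is the remote from my config; substitute yours):

```shell
# Create a throwaway file
echo "hello backup" > dummy.txt

# Upload it to the backup folder on the remote
rclone copy dummy.txt drive1:backup

# List the remote folder to confirm the file arrived
rclone ls drive1:backup
```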

The next step is to configure a cron job for your project to automate backups. A common example is to back up the database daily at 3 a.m. This is where we can take advantage of --backup-dir. E.g., rclone copy db.sqlite3 drive1:backup/latest --backup-dir drive1:backup/daily/$(date +"%Y_%m_%d") will copy db.sqlite3 to backup/latest on your remote and move the previous backup (if any) to backup/daily/YYYY_MM_DD.

Let's move on and add the complete command to your user crontab using crontab -e. The following entry runs the job every day at 3 a.m. (note that % has a special meaning in crontab and must be escaped as \%).

PATH=/usr/sbin:/usr/bin:/sbin:/bin

0 3 * * * rclone copy /home/user/project/db.sqlite3 drive1:backup/latest --backup-dir drive1:backup/daily/$(date +"\%Y_\%m_\%d")

If you are not familiar with cron, have a look at the excellent CronHowto wiki to configure it according to your requirements.