Creating backups of server data in Dropbox.

I recently faced the task of setting up backups of server data on my VPS, which hosts several websites. My own sites are built on more or less reasonable technologies/engines, so all their code lives in a version control system; for those, only the database and uploaded files need to be backed up. Some third-party sites run on WordPress, so their source code has to be backed up as well.

So the task is as follows: at a certain interval, back up the databases and the contents of certain directories, and keep the backups for a limited period of time to save space.

I also want not only to create the backups on the server and ship them off to the cloud, but to receive them locally on my hard drive with minimal effort; in other words, data synchronization is needed.

I decided to use Dropbox for this. (For reference, the server is running Debian)

But first, some preparations.

Basic idea: Create a user (I named it backupper), install Dropbox for that user, and create a directory for synchronization.

1. Installing Dropbox on the server.

I assume Dropbox is already installed and configured on your local machine; we won't cover that process here. Let's move on to installing it on the server.

To start, let's create a new user (the -m flag tells useradd to create a home directory, which Dropbox will need):

useradd -m backupper

Log in as the new user.
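
For example, from a root shell you can switch to it like this:

su - backupper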

Next, follow the instructions from https://www.dropbox.com/install?os=lnx:

cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86" | tar xzf -
~/.dropbox-dist/dropboxd

But I'll say right away that there's an easier way: download the dropbox.py CLI script recommended by Dropbox (you'll need it anyway to manage the Dropbox daemon):

wget https://www.dropbox.com/download?dl=packages/dropbox.py -O dropbox.py

Next, move the script to /usr/local/bin/dropbox.py; alternatively, you can keep it in the backupper user's home directory and add that directory to the PATH, like this:

mkdir -p ~/bin
mv dropbox.py ~/bin/
chmod u+x ~/bin/dropbox.py

Add that directory to the PATH and re-read .bashrc:

nano ~/.bashrc
export PATH=$PATH:$HOME/bin
source ~/.bashrc

Now

dropbox.py start -i

will download and install the Dropbox daemon and start it; on the first start it prints a link you need to open in a browser to connect the server to your Dropbox account.

Next, simply

dropbox.py start

You can add it to autostart:

dropbox.py autostart y
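
To make sure the daemon is really running and syncing, you can check its status at any moment:

dropbox.py status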

2. Backing up databases (MySQL).

I used mysqldump for this.

#!/bin/bash

# Config
db_user=root
db_pass=pass
target_dir="/home/backupper/backup/"
dbs=("folkprog" "site1" "site2" "site3")
date_string=`date +%Y-%m-%d-%H-%M`

# DBs backup
for db in "${dbs[@]}"; do
    mysqldump -u"$db_user" -p"$db_pass" "$db" | gzip -c > "${target_dir}${db}_${date_string}.sql.gz"
done
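
To restore such a dump later, unpack it and feed it back into mysql (the file name and date below are just an example; use the real one from your backup directory):

# Restore a gzipped dump into an existing database
gunzip -c /home/backupper/backup/folkprog_2015-01-01-04-00.sql.gz | mysql -uroot -p folkprog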

3. Backing up site(s) data and configs

I used tar for this.

#!/bin/bash

# Config
target_dir="/home/backupper/backup/"
folkprog_assets_base_dir="/home/deploy/projects/folkprog.net/web/assets/"
deploy_others_project_dir="/home/deploy-others/projects/"
nginx_config_dir="/etc/nginx/"

# Folkprog files backup
tar -czf "${target_dir}folkprog-files.tar.gz" "${folkprog_assets_base_dir}"

# 3rd-party sites files backup
tar -czf "${target_dir}deploy-others.tar.gz" "$deploy_others_project_dir"

# Nginx config backup
tar -czf "${target_dir}nginx.gz" "$nginx_config_dir"

4. Cleaning up old files

#!/bin/bash

# Config
target_dir="/home/backupper/backup/"
keep_files_days=60

# Clean old backup files
find "$target_dir" -type f -mtime +"$keep_files_days" -exec rm -f {} \;

5. Packing and sending files to the sync directory:

#!/bin/bash

# Packing files to Dropbox sync dir
tar -czf "${target_sync_dir}backup_${date_string}.tar.gz" "$target_dir"

The final version of the script /usr/local/bin/backup, which will be executed as root:

#!/bin/bash

# Config
db_user=root
db_pass=pass
target_dir="/home/backupper/backup/"
target_sync_dir="/home/backupper/Dropbox/"
folkprog_assets_base_dir="/home/deploy/projects/folkprog.net/web/assets/"
deploy_others_project_dir="/home/deploy-others/projects/"
nginx_config_dir="/etc/nginx/"
dbs=("folkprog" "site1" "site2" "site3")
keep_files_days=60
# End Config

date_string=`date +%Y-%m-%d-%H-%M`

# DBs backup
for db in "${dbs[@]}"; do
    mysqldump -u"$db_user" -p"$db_pass" "$db" | gzip -c > "${target_dir}${db}_${date_string}.sql.gz"
done

# Folkprog files backup
tar -czf "${target_dir}folkprog-files.tar.gz" "${folkprog_assets_base_dir}"

# 3rd-party sites files backup
tar -czf "${target_dir}deploy-others.tar.gz" "$deploy_others_project_dir"

# Nginx config backup
tar -czf "${target_dir}nginx.gz" "$nginx_config_dir"

# Clean old backup files
find "$target_dir" -type f -mtime +"$keep_files_days" -exec rm -f {} \;

# Packing files to Dropbox sync dir
tar -czf "${target_sync_dir}backup_${date_string}.tar.gz" "$target_dir"

6. Adding a cron task.

crontab -u root -e
0 4 * * 4     /usr/local/bin/backup

The script /usr/local/bin/backup will be executed every Thursday at 4 AM.
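
Before leaving it to cron, it makes sense to run the script once by hand and check that the archive lands in the sync directory and gets picked up (assuming dropbox.py is on the backupper user's PATH):

/usr/local/bin/backup
ls -lh /home/backupper/Dropbox/
su - backupper -c "dropbox.py status"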

That's basically it. Backups will come to you automatically.

For further exploration: the same task can also be solved with duplicity.
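
As a very rough sketch of that alternative (the target URL is a placeholder, not part of this setup), a single duplicity invocation can encrypt and upload a directory to a remote host:

duplicity /home/backupper/backup/ sftp://user@example.com/backups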