Rafael Reggiani Manzo edited this page Oct 12, 2016 · 5 revisions

On the database machine we place the following script in /etc/cron.daily:

#!/bin/bash

now=$(date +"%Y.%m.%d-%H.%M.%S")
dest_dir=/root/pg-backup/$now
dbs="kalibro_configurations_production kalibro_processor_production mezuro"
ret=0

mkdir -p "$dest_dir"

for db in $dbs; do
    echo "Creating backup of $db in $dest_dir"
    if ! sudo -u postgres pg_dump "$db" > "$dest_dir/$db.sql"; then
        echo "ERROR: FAILED TO BACKUP $db!!!!"
        ret=1
    fi
done

if (cd "$dest_dir" && tar czf "backup.tar.gz" *.sql); then
    rm "$dest_dir"/*.sql
    if ! drive upload --name "mezuro-db-backup-$now.tar.gz" "$dest_dir/backup.tar.gz"; then
        echo "ERROR: FAILED TO UPLOAD BACKUP!!!!"
        ret=1
    fi
else
    echo "ERROR: FAILED TO COMPRESS BACKUP!!!!"
    ret=1
fi

exit $ret
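For the script to actually run, it must be executable, and on Debian-based systems it must be named without a dot, since run-parts silently skips file names containing one. A minimal sketch, with a temporary directory standing in for /etc/cron.daily and a hypothetical file name pg-backup:

```shell
# Sketch (paths and names assumed): install the backup script for cron.daily.
# run-parts on Debian skips entries with dots, so drop any .sh extension.
set -e
cron_dir=$(mktemp -d)                      # stand-in for /etc/cron.daily here
printf '#!/bin/bash\nexit 0\n' > pg-backup.sh
install -m 755 pg-backup.sh "$cron_dir/pg-backup"
```

On the real machine the destination would be /etc/cron.daily/pg-backup, owned by root.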

Notice that the script uses the drive upload command, which you can get from https://github.com/prasmussen/gdrive#downloads. It needs to be configured by creating a .gdrive folder in the root user's home directory and placing the following files inside it:

  • config.json holding the client_id and client_secret for the Google account:
{
    "ClientId": "<CLIENT_ID>",
    "ClientSecret": "<CLIENT_SECRET>"
}

The drive executable will then generate and update a token.json.
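For completeness, a restore sketch. The archive layout (flat *.sql files inside backup.tar.gz) comes from the backup script above; the psql step is an assumption based on pg_dump's plain-SQL output, and mezuro.sql mirrors one of the script's database names:

```shell
# Sketch of the restore path, assuming the archive layout produced by the
# backup script (flat *.sql dumps inside backup.tar.gz).
set -e
work=$(mktemp -d)
cd "$work"
echo '-- dump placeholder' > mezuro.sql    # stands in for a real pg_dump file
tar czf backup.tar.gz *.sql
rm ./*.sql
tar xzf backup.tar.gz                      # recovers mezuro.sql
# On the database machine, each dump would then be loaded back with psql:
#   sudo -u postgres psql mezuro < mezuro.sql
```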
