Question :
I am new to PostgreSQL, so my question is: what is the easiest way to make regular backups and send them to Google Drive? Is there perhaps a tool that can do this, or some other approach?
Answer :
You could try:
https://github.com/andreafabrizi/Dropbox-Uploader
to send your pg_dump output to Dropbox. I found it by Googling "script load file dropbox" – you may find some of the other scripts that search turns up more to your liking.
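As a sketch of how that would fit together (the function name, the `/PGBAK` remote folder and the `/tmp` staging path are my assumptions – Dropbox-Uploader's own syntax is `dropbox_uploader.sh upload <local> <remote>`):

```shell
# Sketch: compress a pg_dump and push it to Dropbox with Dropbox-Uploader.
# Assumes dropbox_uploader.sh is installed and configured for your account.
backup_db_to_dropbox() {
    local db="$1"
    local dump="/tmp/${db}_$(date +%F).sql.xz"
    # dump the database and compress it before uploading
    pg_dump "$db" | xz -9 > "$dump"
    ./dropbox_uploader.sh upload "$dump" "/PGBAK/$(basename "$dump")"
    rm -f "$dump"
}

# usage (not run here): backup_db_to_dropbox mydb
```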
Remember the DBA’s motto – backup or f*ckup – and test, test and test again. Personally, I would recommend both a local copy (for speed of recovery) and a remote copy (to be doubly sure) – as the Americans say, “belt and suspenders”.
It is a bash tool, but then I always assume PostgreSQLers are on a decent operating system :-).
Update – the bash shell can now be used natively on Windows as well, since Microsoft appears to be doing more embracing and less smothering of *nix tools!
I use gdrive (follow the installation and set-up guidelines from the link).
Once gdrive is installed, I dump my own databases to a /bak folder daily, and then upload the files to Google Drive’s PGBAK folder using gdrive. I use the following script in /etc/cron.daily/:
#!/bin/bash
PATH=$PATH:/usr/local/bin
BAK=/bak
# get the id of the PGBAK folder
PGBAK="`gdrive list --no-header -q "name = 'PGBAK' and trashed = false" | grep -o '^[^ ]*'`"
if [ -z "$PGBAK" ]; then
    PGBAK="`gdrive mkdir PGBAK | head -n 1 | sed 's/^Directory \([^ ]*\) created$/\1/'`"
    if [ -z "$PGBAK" ]; then
        echo "Can't create the PGBAK folder in Google Drive" >&2
        exit 1
    fi
fi

# backup globals (database names, roles, tablespaces, etc.)
sudo -iu postgres pg_dumpall -g | xz -9 > "$BAK"/globals.sql.xz
# get the id of the existing remote file, if any
GF="`gdrive list --no-header -q "'$PGBAK' in parents and trashed = false" | grep '\sglobals\.sql' | head -n 1 | grep -o '^\w*'`"
if [ -z "$GF" ]; then
    # upload a new file
    gdrive upload --no-progress -p "$PGBAK" "$BAK"/globals.sql.xz > /dev/null
else
    # update the existing file (revisions are stored for 30 days);
    # the revision list can be queried with:
    #   gdrive revision list --no-header "$GF"
    gdrive update --no-progress "$GF" "$BAK"/globals.sql.xz > /dev/null
fi
# backup all databases individually, in plain SQL script mode
# (skip template databases, which cannot normally be connected to)
sudo -iu postgres psql -Atc 'select datname from pg_database where not datistemplate order by 1' |
while read db; do
    # echo "Backing up $db..."
    DBF="$BAK/$db.sql"
    sudo -iu postgres pg_dump -C "$db" | xz -9 -T 4 > "$DBF.xz"
    # match on the remote file name (gdrive list shows names, not local paths)
    GF="`gdrive list --no-header -q "'$PGBAK' in parents and trashed = false" | grep "\s$db\.sql" | head -n 1 | grep -o '^\w*'`"
    if [ -z "$GF" ]; then
        # upload a new file
        gdrive upload --no-progress -p "$PGBAK" "$DBF.xz" > /dev/null
    else
        # update the existing file (revisions are stored for 30 days);
        # the revision list can be queried with:
        #   gdrive revision list --no-header "$GF"
        gdrive update --no-progress "$GF" "$DBF.xz" > /dev/null
    fi
done
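On the “test, test and test again” point: at a minimum, check that each uploaded archive is actually readable. A minimal sketch (the helper name is mine; a real test would go further and restore into a scratch database):

```shell
# Verify that every compressed dump in a backup folder is a valid xz archive.
# xz -t decompresses in memory and exits non-zero on corruption.
verify_backups() {
    local bak="$1" f
    for f in "$bak"/*.xz; do
        xz -t "$f" || return 1
    done
}

# A full restore test would look something like (not run here):
#   xz -dc /bak/mydb.sql.xz | sudo -iu postgres psql -f -
```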