Encrypted, incremental PostgreSQL backups to S3 (pg_dump + rclone + cron)

New configuration (How To)

Situation

We'll take daily logical backups with pg_dump (each run is a full dump; only the rclone transfer to S3 is incremental), compress them, encrypt them with gpg, sync them to S3 via rclone, and keep a rotating retention policy both locally and on S3.

Solution

Prerequisites

  • A Linux server with access to the postgres system user (for running pg_dump).

  • pg_dump available.

  • rclone installed and configured with an S3 remote via rclone config (a sample config stanza follows this list).

  • gpg installed (for symmetric encryption).

  • A shell user with cron access.
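
If the S3 remote does not exist yet, rclone config creates it interactively; the resulting stanza in ~/.config/rclone/rclone.conf looks roughly like this sketch (the remote name s3, the region, and the key values are examples / placeholders):

[s3]
type = s3
provider = AWS
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
region = eu-central-1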

Files & variables (examples)

BACKUP_DIR=/var/backups/pg
PG_USER=postgres
DB_LIST="db1 db2" # or build the list dynamically to dump all DBs (see the one-liner below)
RCLONE_REMOTE="s3:my-bucket/pg-backups"
GPG_PASSPHRASE_FILE=/root/.backup_gpg_pass
RETENTION_DAYS=14
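
To dump every database instead of a fixed list, you can build DB_LIST at runtime from the server catalog; a minimal sketch, assuming peer authentication for the postgres user:

DB_LIST=$(sudo -u postgres psql -At -c "SELECT datname FROM pg_database WHERE NOT datistemplate;")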

Steps

  1. Create backup dir & passphrase file

sudo mkdir -p $BACKUP_DIR
sudo chown $(whoami) $BACKUP_DIR
echo "your-very-strong-passphrase" | sudo tee $GPG_PASSPHRASE_FILE >/dev/null
sudo chmod 600 $GPG_PASSPHRASE_FILE
  2. Install rclone & test the remote

# on Debian/Ubuntu
sudo apt update && sudo apt install -y rclone gnupg postgresql-client
rclone ls $RCLONE_REMOTE # confirm it can list the target bucket
  3. Create the backup script /usr/local/bin/pg_backup_to_s3.sh

#!/usr/bin/env bash
set -euo pipefail
# config
BACKUP_DIR=/var/backups/pg
PG_USER=postgres
DB_LIST="db1 db2"
RCLONE_REMOTE="s3:my-bucket/pg-backups"
GPG_PASSPHRASE_FILE=/root/.backup_gpg_pass
RETENTION_DAYS=14
TODAY=$(date +%F)

mkdir -p "$BACKUP_DIR/$TODAY"

for DB in $DB_LIST; do
  FILE="$BACKUP_DIR/$TODAY/${DB}_${TODAY}.sql.gz"
  # dump and compress
  sudo -u "$PG_USER" pg_dump -Fc "$DB" | gzip > "$FILE"
  # encrypt to $FILE.gpg, then remove the plaintext dump
  gpg --batch --yes --passphrase-file "$GPG_PASSPHRASE_FILE" -c "$FILE"
  rm "$FILE"
done

# sync today's dumps to S3
rclone sync "$BACKUP_DIR/$TODAY" "$RCLONE_REMOTE/$TODAY" --transfers=4 --checksum

# local retention: remove day-directories older than RETENTION_DAYS
find "$BACKUP_DIR" -mindepth 1 -maxdepth 1 -type d -mtime +$RETENTION_DAYS -exec rm -rf {} \;

# retention on S3 (optional)
rclone delete --min-age ${RETENTION_DAYS}d "$RCLONE_REMOTE" --rmdirs

Make it executable:

sudo chmod +x /usr/local/bin/pg_backup_to_s3.sh
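
Before scheduling it, run the script once by hand and check that both the local day-directory and the S3 prefix contain the expected .gpg files:

sudo /usr/local/bin/pg_backup_to_s3.sh
ls /var/backups/pg/$(date +%F)/
rclone ls s3:my-bucket/pg-backups/$(date +%F)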
  4. Schedule cron job
    Open the root crontab (or that of a dedicated backup user) with sudo crontab -e, then add:

# run daily at 02:30
30 2 * * * /usr/local/bin/pg_backup_to_s3.sh >> /var/log/pg_backup.log 2>&1
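
If you provision servers with scripts rather than editing crontabs by hand, the same entry can be appended non-interactively; a minimal sketch:

( sudo crontab -l 2>/dev/null; echo '30 2 * * * /usr/local/bin/pg_backup_to_s3.sh >> /var/log/pg_backup.log 2>&1' ) | sudo crontab -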
  5. Restore example

# download encrypted file from S3
rclone copy s3:my-bucket/pg-backups/2025-11-01/db1_2025-11-01.sql.gz.gpg /tmp/
# decrypt
gpg --batch --yes --passphrase-file /root/.backup_gpg_pass -o /tmp/db1_2025-11-01.sql.gz /tmp/db1_2025-11-01.sql.gz.gpg
gzip -d /tmp/db1_2025-11-01.sql.gz
# restore (example using pg_restore for custom format)
sudo -u postgres pg_restore -d db1 --clean /tmp/db1_2025-11-01.sql
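
To verify a backup without touching the production database, restore it into a throwaway database first (db1_restore_test is just an illustrative name):

sudo -u postgres createdb db1_restore_test
sudo -u postgres pg_restore -d db1_restore_test /tmp/db1_2025-11-01.sql
sudo -u postgres dropdb db1_restore_test # drop it once you have checked the data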

Troubleshooting & notes

  • Use pg_dump -Fc (custom format) for selective and faster pg_restore; note that custom format is already compressed, so the extra gzip step above is optional. If you prefer plain SQL, use pg_dump dbname > file.sql and restore it with psql.

  • Test restores regularly (e.g., monthly) to ensure backups are actually usable; the scratch-database example above is a safe way to do this.

  • gpg -c is symmetric encryption; for team setups, use public-key encryption (gpg -e -r KEYID) instead so no shared passphrase file is needed. On GnuPG 2.1+ you may also need to add --pinentry-mode loopback for --passphrase-file to work in batch mode.

  • Monitor the cron log and rclone exit codes, and integrate the job with your alerting; a minimal heartbeat example follows below.
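
A lightweight way to wire this into alerting is to ping a heartbeat URL only when the script exits successfully; the URL below is a placeholder for whatever monitoring endpoint you use:

30 2 * * * /usr/local/bin/pg_backup_to_s3.sh >> /var/log/pg_backup.log 2>&1 && curl -fsS --retry 3 https://hc.example.com/ping/your-check-id >/dev/null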

Solution type

Permanent
