
How to daily back up PostgreSQL into S3

Updated
1 min read

Create a Bash script (for example, backup_script.sh) as shown below:

#!/bin/bash
# Directory containing temporary backup files
# (a tilde is not expanded inside quotes, so use $HOME instead of ~)
BACKUP_DIR="$HOME/temp_backup"

# Format for backup file names (Ex: bk_2025-03-01.tar)
FILE_NAME="bk_$(date +%Y-%m-%d).tar"
FILE_PATH="$BACKUP_DIR/$FILE_NAME"

# S3 Bucket
S3_BUCKET="s3://your-bucket-name"

# PostgreSQL
PG_HOST=localhost
PG_PORT=5432
PG_USERNAME=postgres
PG_PASSWORD=<PGPASSWORD>
DB_NAME=postgres
DB_SCHEMA_NAME=public

# Run `pg_dump` inside a Docker container to create the backup.
# `--network host` lets `localhost` inside the container reach the host's
# PostgreSQL; without it, pg_dump would try to connect to the container itself.
docker run --rm --network host -v "$BACKUP_DIR":/temp_backup --user root postgres \
    bash -c "PGPASSWORD=$PG_PASSWORD pg_dump --verbose --host=$PG_HOST --port=$PG_PORT \
    --username=$PG_USERNAME --format=t --encoding=UTF-8 \
    --file /temp_backup/$FILE_NAME -n $DB_SCHEMA_NAME $DB_NAME"

# Check that the backup file was created, then upload it to S3.
if [ -f "$FILE_PATH" ]; then
    echo "Uploading to S3..."
    aws s3 cp "$FILE_PATH" "$S3_BUCKET/$FILE_NAME"

    # If the upload succeeded (exit code 0), remove the local temporary file (optional)
    if [ $? -eq 0 ]; then
        echo "Upload to S3 succeeded. Removing local temporary file"
        rm -f "$FILE_PATH"
    else
        echo "Failed to upload backup file to S3"
    fi
else
    echo "Could not find the backup file: $FILE_PATH"
fi
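To restore one of these backups, a `pg_restore` sketch along the same lines can be used. It assumes the same postgres Docker image and connection settings; the file name, password, and target database below are placeholders, not values from the article:

```shell
#!/bin/bash
# Restore a tar-format dump produced by the backup script above.
BACKUP_DIR="$HOME/temp_backup"
FILE_NAME="bk_2025-03-01.tar"   # placeholder: the backup you want to restore
PG_HOST=localhost
PG_PORT=5432
PG_USERNAME=postgres
PG_PASSWORD="your-password"     # placeholder
DB_NAME=postgres

# --clean drops existing objects before recreating them; --network host lets
# `localhost` inside the container reach the host's PostgreSQL.
docker run --rm --network host -v "$BACKUP_DIR":/temp_backup postgres \
    bash -c "PGPASSWORD=$PG_PASSWORD pg_restore --verbose --clean \
    --host=$PG_HOST --port=$PG_PORT --username=$PG_USERNAME \
    --dbname=$DB_NAME /temp_backup/$FILE_NAME"
```

Because the dump was taken with `-n public`, only the public schema is restored.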

Make the script executable, then create a crontab entry to run it daily (or at any time you prefer):

chmod +x /path/to/backup_script.sh
crontab -e

Add the following line to run the backup at midnight every day, logging output to a file:

0 0 * * * /path/to/backup_script.sh >> /var/log/backup_script.log 2>&1
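If you prefer not to open an editor, the same entry can be appended non-interactively. This is a sketch using the standard `crontab -l` / `crontab -` round trip; the script path is the same placeholder as above:

```shell
#!/bin/bash
# Append the backup job to the current user's crontab without an editor.
# `crontab -l` exits non-zero when no crontab exists yet, hence 2>/dev/null.
SCRIPT=/path/to/backup_script.sh
( crontab -l 2>/dev/null; \
  echo "0 0 * * * $SCRIPT >> /var/log/backup_script.log 2>&1" ) | crontab -
```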

Check the cron log

Ubuntu/Debian

grep CRON /var/log/syslog

CentOS/RedHat

grep CRON /var/log/cron
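Finally, note that when an upload to S3 fails, the temporary file is kept on disk. A small cleanup sketch keeps the local directory from growing; the directory name and the 7-day window are assumptions, and retention on the S3 side is better handled with a bucket lifecycle rule:

```shell
#!/bin/bash
# Delete leftover local backups older than 7 days.
BACKUP_DIR="$HOME/temp_backup"
mkdir -p "$BACKUP_DIR"
# -mtime +7 matches files last modified more than 7 days ago.
find "$BACKUP_DIR" -name 'bk_*.tar' -mtime +7 -print -delete
```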