How to Automate Linux Server Backups: Complete Guide (2026)

If you are not backing up your Linux servers automatically, you are one hardware failure away from losing everything. Manual backups are unreliable because humans forget, make mistakes, and skip weekends. This guide covers every aspect of automating Linux server backups, from simple file copies to enterprise-grade multi-location strategies.

Backup Strategy: The 3-2-1 Rule

Before writing any scripts, understand the 3-2-1 backup rule:

  • 3 copies of your data (original + 2 backups)
  • 2 different storage media (local disk + remote/cloud)
  • 1 copy offsite (different physical location)

1. File System Backups with rsync

  # rsync is the gold standard for file synchronization
  # Transfers only data that changed since the last run (delta transfer)

  # Basic local backup (-z compression only helps over a network
  # and can be dropped for disk-to-disk copies)
  rsync -avz --delete /var/www/ /backup/www/

  # Backup to remote server via SSH
  rsync -avz --delete -e "ssh -p 22" /var/www/ backup@remote:/backup/www/

  # Exclude unnecessary files
  rsync -avz --delete \
    --exclude='*.log' \
    --exclude='node_modules/' \
    --exclude='.git/' \
    --exclude='cache/' \
    /var/www/ /backup/www/

  # Bandwidth-limited backup (--bwlimit is in KiB/s, so ~10 MB/s here; useful on production servers)
  rsync -avz --bwlimit=10000 /var/www/ backup@remote:/backup/www/

  # Dry run (preview what would be transferred)
  rsync -avzn --delete /var/www/ /backup/www/

2. Compressed Archives with tar

  # Full system backup (excluding virtual filesystems)
  sudo tar czf /backup/full-backup.tar.gz \
    --exclude=/backup \
    --exclude=/proc \
    --exclude=/sys \
    --exclude=/dev \
    --exclude=/tmp \
    --exclude=/run \
    /

  # Web application backup
  tar czf /backup/webapp-backup.tar.gz /var/www/myapp/

  # Configuration files backup
  tar czf /backup/etc-backup.tar.gz /etc/

  # List contents without extracting
  tar tzf /backup/webapp-backup.tar.gz | head -20

  # Restore from archive to specific directory
  tar xzf /backup/webapp-backup.tar.gz -C /restore/
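For large trees, GNU tar can also take incremental archives: a snapshot (`.snar`) file records what was already backed up, so later runs store only what changed since the previous level. A minimal sketch with mktemp stand-in directories:

```shell
#!/bin/bash
# Incremental backups with GNU tar's --listed-incremental (-g).
# Paths are mktemp stand-ins; in real use DATA would be e.g.
# /var/www/myapp/ and the .snar file would live beside the archives.
set -euo pipefail

DATA=$(mktemp -d); OUT=$(mktemp -d)
echo one > "$DATA/a.txt"

# Level-0 (full) backup; state.snar records what was archived
tar czf "$OUT/full.tar.gz" -g "$OUT/state.snar" -C "$DATA" .

# Add a file, then take a level-1 (incremental) backup
echo two > "$DATA/b.txt"
tar czf "$OUT/incr.tar.gz" -g "$OUT/state.snar" -C "$DATA" .

# The incremental archive contains only what changed
tar tzf "$OUT/incr.tar.gz"
```

To restore, extract the full archive first and then each incremental in order, passing `-g /dev/null` so tar replays the recorded changes without updating a snapshot file.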

3. Database Backups

  # PostgreSQL single database backup
  pg_dump -U postgres mydb > /backup/mydb.sql
  pg_dump -U postgres mydb | gzip > /backup/mydb.sql.gz

  # All PostgreSQL databases
  pg_dumpall -U postgres | gzip > /backup/all-databases.sql.gz

  # MySQL/MariaDB backup (-p prompts for a password; for unattended
  # runs, store credentials in ~/.my.cnf instead)
  mysqldump -u root -p mydb > /backup/mydb.sql
  mysqldump -u root -p --all-databases | gzip > /backup/all-databases.sql.gz

  # Restore PostgreSQL
  psql -U postgres mydb < /backup/mydb.sql
  gunzip -c /backup/mydb.sql.gz | psql -U postgres mydb

  # Restore MySQL
  mysql -u root -p mydb < /backup/mydb.sql
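Whichever database you run, a half-written dump is worse than none: a crashed `pg_dump` can leave a truncated file that looks like a valid backup. A common safeguard is to dump to a temporary name, verify, then rename atomically. A sketch, with the dump command simulated by `echo`:

```shell
#!/bin/bash
# Write dumps atomically: dump to a temp name, verify, then rename.
# The dump itself is simulated with echo here; in practice it would be
# `pg_dump -U postgres mydb | gzip`.
set -euo pipefail

OUT=$(mktemp -d)    # stand-in for /backup
TARGET="$OUT/mydb.sql.gz"

echo "SELECT 1;" | gzip > "$TARGET.part"   # 1. dump to a temp name
gzip -t "$TARGET.part"                     # 2. verify before trusting it
mv "$TARGET.part" "$TARGET"                # 3. atomic rename
```

Because `mv` within one filesystem is atomic, anything that reads `/backup` (including the rclone sync below) only ever sees complete dumps.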

4. Cloud Backups with rclone

  # Install rclone (Debian/Ubuntu; other distros: see rclone.org/install)
  sudo apt install rclone

  # Configure cloud storage provider
  rclone config
  # Supports: AWS S3, Google Drive, Backblaze B2, Wasabi, SFTP, 40+ providers

  # Sync backups to S3-compatible storage
  rclone sync /backup/ s3remote:my-backup-bucket/server01/

  # Sync to Backblaze B2 (one of the cheapest cloud storage options)
  rclone sync /backup/ b2:my-backups/server01/

  # Encrypted backup via a crypt remote set up with rclone config
  rclone sync /backup/ crypt-remote:server01/

  # With bandwidth limit (50 MB/s)
  rclone sync --bwlimit 50M /backup/ s3remote:my-backup-bucket/
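A crypt remote like the one used above is layered on top of another remote in rclone's configuration. An illustrative `rclone.conf` (the bucket and remote names are placeholders; `rclone config` writes the obscured password entry for you, so don't paste one in by hand):

```ini
# ~/.config/rclone/rclone.conf (names illustrative)
[s3remote]
type = s3
provider = AWS
env_auth = true
region = us-east-1

[crypt-remote]
type = crypt
remote = s3remote:my-backup-bucket/encrypted
filename_encryption = standard
; the password line is added (obscured) by `rclone config`
```

With this layering, `rclone sync /backup/ crypt-remote:server01/` encrypts both file contents and names before anything reaches the bucket.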

5. Complete Automated Backup Script

  #!/bin/bash
  # /opt/scripts/daily-backup.sh
  # Automated daily backup: files + database + cloud sync

  set -euo pipefail

  BACKUP_DIR="/backup"
  RETENTION_DAYS=30
  TIMESTAMP=$(date +%Y%m%d_%H%M%S)
  LOG_FILE="/var/log/backup.log"
  DB_NAME="myapp"
  DB_USER="postgres"
  REMOTE="s3remote:my-backup-bucket/$(hostname)"

  log() {
      echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
  }

  log "=== Starting daily backup ==="

  # Step 1: Database backup
  log "Backing up database..."
  pg_dump -U "$DB_USER" "$DB_NAME" | gzip > "$BACKUP_DIR/db-$TIMESTAMP.sql.gz"
  log "Database backup complete"

  # Step 2: Application files
  log "Backing up application files..."
  tar czf "$BACKUP_DIR/app-$TIMESTAMP.tar.gz" \
    --exclude='node_modules' \
    --exclude='*.log' \
    --exclude='.git' \
    /var/www/myapp/
  log "File backup complete"

  # Step 3: Configuration backup
  log "Backing up configuration..."
  tar czf "$BACKUP_DIR/config-$TIMESTAMP.tar.gz" \
    /etc/nginx/ \
    /etc/postgresql/ \
    /etc/ssh/sshd_config \
    /etc/crontab 2>/dev/null
  log "Config backup complete"

  # Step 4: Sync to cloud (note: sync mirrors deletions, so the
  # retention cleanup below propagates to the cloud copy on the next run)
  log "Syncing to cloud..."
  rclone sync "$BACKUP_DIR/" "$REMOTE/" 2>> "$LOG_FILE"
  log "Cloud sync complete"

  # Step 5: Clean up old backups
  log "Cleaning backups older than $RETENTION_DAYS days..."
  find "$BACKUP_DIR" -name "*.gz" -mtime +$RETENTION_DAYS -delete

  # Step 6: Verify integrity
  gzip -t "$BACKUP_DIR/db-$TIMESTAMP.sql.gz" && log "DB verify: OK"
  gzip -t "$BACKUP_DIR/app-$TIMESTAMP.tar.gz" && log "App verify: OK"
  gzip -t "$BACKUP_DIR/config-$TIMESTAMP.tar.gz" && log "Config verify: OK"

  log "=== Backup complete ==="

6. Schedule with Cron

  # Make script executable
  chmod +x /opt/scripts/daily-backup.sh

  # Edit crontab
  crontab -e

  # Daily full backup at 2 AM
  0 2 * * * /opt/scripts/daily-backup.sh

  # Database-only backup every 6 hours (overwrites the same file each run)
  0 */6 * * * pg_dump -U postgres mydb | gzip > /backup/db-6h.sql.gz

  # Weekly full system backup on Sundays
  0 3 * * 0 /opt/scripts/weekly-full-backup.sh
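On systemd-based distros, a timer unit is an alternative to cron with better logging (`journalctl -u daily-backup`) and, via `Persistent=true`, catch-up runs for jobs missed while the machine was down. An illustrative pair of units for the script above (file paths and unit names are this guide's examples):

```ini
# /etc/systemd/system/daily-backup.service
[Unit]
Description=Daily server backup

[Service]
Type=oneshot
ExecStart=/opt/scripts/daily-backup.sh

# /etc/systemd/system/daily-backup.timer
[Unit]
Description=Run daily-backup.service at 02:00

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `sudo systemctl enable --now daily-backup.timer`, and check upcoming runs with `systemctl list-timers`.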

7. Backup Verification and Restore Testing

  # CRITICAL: An untested backup is NOT a backup!
  # Schedule monthly restore drills

  # Test database restore
  createdb -U postgres myapp_restore_test
  gunzip -c /backup/db-latest.sql.gz | psql -U postgres myapp_restore_test
  psql -U postgres myapp_restore_test -c "SELECT count(*) FROM users;"
  dropdb -U postgres myapp_restore_test

  # Test file restore
  mkdir -p /tmp/restore-test
  tar xzf /backup/app-latest.tar.gz -C /tmp/restore-test
  ls -la /tmp/restore-test/var/www/myapp/
  rm -rf /tmp/restore-test

  # Verify archive integrity (echo only runs if every archive passes)
  gzip -t /backup/*.gz && echo "All archives passed integrity check"
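Checksum manifests complement `gzip -t`: recording SHA-256 sums at backup time lets you re-verify archives after an rclone transfer or a restore from cloud storage, not just on the disk where they were created. A sketch with a mktemp stand-in for /backup:

```shell
#!/bin/bash
# Record SHA-256 checksums when backups are created, then re-verify
# later to catch silent corruption or incomplete transfers.
set -euo pipefail

BACKUP=$(mktemp -d)   # stand-in for /backup
echo data | gzip > "$BACKUP/app-demo.tar.gz"

# At backup time: write a manifest alongside the archives
( cd "$BACKUP" && sha256sum *.gz > SHA256SUMS )

# At verify time: check every archive against the manifest
( cd "$BACKUP" && sha256sum -c SHA256SUMS )
```

Sync the SHA256SUMS file to the cloud with the archives; after any restore, one `sha256sum -c` confirms every file came back intact.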

Backup Strategy Decision Guide

  Scenario                     Tool                  Frequency
  ---------------------------  --------------------  -------------
  Web files to remote server   rsync                 Every 6 hours
  Database                     pg_dump / mysqldump   Every 6 hours
  Full system archive          tar                   Weekly
  Config files                 tar + git             Daily
  Cloud offsite copy           rclone                Daily


About the Author

Dorian Thorne is a cloud infrastructure specialist and technical author focused on the design, deployment, and operation of scalable cloud-based systems.