TL;DR — Quick Summary

Rclone syncs files to 70+ cloud storage providers including S3, Google Drive, and Backblaze B2. Learn setup, encrypted backups, automation, and cron scheduling.

Rclone is the Swiss Army knife for cloud storage — a command-line tool that syncs, copies, and manages files across 70+ cloud providers. It handles everything from simple backups to complex multi-cloud sync workflows with built-in encryption and bandwidth control. This guide covers setup, encrypted backups, and production automation.

Prerequisites

  • Linux, macOS, or Windows
  • Account on a cloud storage provider (S3, B2, Google Drive, etc.)
  • Basic command-line familiarity

Installation

# Debian/Ubuntu
sudo apt install rclone

# Fedora/RHEL
sudo dnf install rclone

# Official installer (always latest)
curl https://rclone.org/install.sh | sudo bash

# Verify installation
rclone version

Configuration

Add a Storage Remote

rclone config

# Interactive wizard:
# n) New remote
# name> backblaze-b2
# storage> b2
# account> your-account-id
# key> your-application-key

Add Encrypted Wrapper

rclone config

# n) New remote
# name> encrypted-backup
# storage> crypt
# remote> backblaze-b2:my-bucket/encrypted
# filename_encryption> standard
# directory_name_encryption> true
# password> (enter strong password)
# password2> (enter salt password)

Now the encrypted-backup: remote transparently encrypts everything it uploads to backblaze-b2:my-bucket/encrypted, including file and directory names.
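The crypt wizard can also be scripted. Passwords stored in rclone's config must first be run through `rclone obscure` (note: obscured, not hashed, so protect the config file too). A sketch with placeholder passwords:

```shell
# Non-interactive crypt remote creation; passwords must be obscured first.
PASS=$(rclone obscure 'example-main-password')   # placeholder password
SALT=$(rclone obscure 'example-salt-password')   # placeholder salt
rclone config create encrypted-backup crypt \
    remote backblaze-b2:my-bucket/encrypted \
    password "$PASS" \
    password2 "$SALT"

# Compare the two views of the same data:
rclone ls backblaze-b2:my-bucket/encrypted   # obfuscated names
rclone ls encrypted-backup:                  # cleartext names
```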

Core Commands

# Sync local to cloud (mirror — deletes extra files on destination)
rclone sync /home/user/documents encrypted-backup:documents --progress

# Copy only (never deletes destination files)
rclone copy /home/user/documents encrypted-backup:documents --progress

# List files on remote
rclone ls encrypted-backup:documents

# Check integrity (compare checksums)
rclone check /home/user/documents encrypted-backup:documents

# Mount cloud storage as local filesystem
rclone mount encrypted-backup:documents /mnt/cloud --daemon
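Because sync deletes destination files, it pays to rehearse it before running for real. A cautious pattern using two standard rclone flags (the trash path is illustrative):

```shell
# Preview what sync would transfer and delete, without touching anything
rclone sync /home/user/documents encrypted-backup:documents --dry-run

# Move files that sync would delete into a dated trash folder
# instead of losing them outright
rclone sync /home/user/documents encrypted-backup:documents \
    --backup-dir encrypted-backup:trash/$(date +%F)
```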

Automated Backup Script

#!/bin/bash
# /usr/local/bin/backup-to-cloud.sh

REMOTE="encrypted-backup"
LOG="/var/log/rclone-backup.log"
LOCKFILE="/tmp/rclone-backup.lock"

# Prevent concurrent runs
if [ -f "$LOCKFILE" ]; then
    echo "Backup already running" >> "$LOG"
    exit 1
fi
trap 'rm -f "$LOCKFILE"' EXIT
touch "$LOCKFILE"

echo "=== Backup started $(date) ===" >> "$LOG"

# Sync with bandwidth limit and retries
rclone sync /home/user/documents "$REMOTE:documents" \
    --bwlimit 50M \
    --retries 3 \
    --low-level-retries 10 \
    --log-file "$LOG" \
    --log-level INFO \
    --stats 30s \
    --transfers 4 \
    --checkers 8

echo "=== Backup finished $(date) ===" >> "$LOG"
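The lockfile test above has a small race window between the `-f` check and the `touch`, where two runs started at the same moment can both proceed. `flock` makes acquiring the lock atomic. A minimal sketch of the same guard:

```shell
#!/bin/bash
# flock holds an exclusive lock on file descriptor 9 for the length of
# the subshell; -n makes a second concurrent run fail immediately
# instead of waiting for the lock.
LOCKFILE="/tmp/rclone-backup.lock"
(
    flock -n 9 || { echo "Backup already running"; exit 1; }
    echo "backup would run here"   # replace with the rclone sync call
) 9>"$LOCKFILE"
```

The lock is released automatically when the subshell exits, so no trap or cleanup is needed.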

Schedule with cron:

# Run backup daily at 2 AM
0 2 * * * /usr/local/bin/backup-to-cloud.sh
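On systemd machines, a timer is an alternative to cron: with Persistent=true, a run missed while the machine was off fires at the next boot. A sketch assuming the script path above; unit file names are illustrative:

```ini
# /etc/systemd/system/rclone-backup.service
[Unit]
Description=Encrypted rclone backup

[Service]
Type=oneshot
ExecStart=/usr/local/bin/backup-to-cloud.sh

# /etc/systemd/system/rclone-backup.timer
[Unit]
Description=Nightly rclone backup at 2 AM

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now rclone-backup.timer`.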

Comparison

Feature            | Rclone       | rsync    | aws s3 sync | restic
-------------------|--------------|----------|-------------|-------------
Cloud providers    | 70+          | SSH only | S3 only     | Local + cloud
Encryption         | Built-in     | No       | Server-side | Built-in
Deduplication      | No           | No       | No          | Yes
Versioning         | Via provider | No       | Via S3      | Snapshots
Bandwidth control  | Yes          | Yes      | No          | Yes
FUSE mount         | Yes          | No       | No          | Yes

Real-World Scenario

You manage 500 GB of project files across a team. You need nightly encrypted backups to Backblaze B2 (inexpensive, pay-per-GB object storage), with bandwidth limited to 50 Mbps to avoid saturating the office connection. Rclone handles this with a simple cron job: encrypted sync with bandwidth control, automatic retry on network errors, and logging for monitoring.

Gotchas

  • sync vs copy: rclone sync deletes files on the destination that don’t exist on the source. Use rclone copy if you only want to add new files
  • API rate limits: Some providers (Google Drive, OneDrive) have strict API limits. Use --drive-pacer-min-sleep and --transfers 1 to stay within limits
  • Large file handling: Rclone switches to multipart uploads for large files automatically; for S3-compatible remotes the threshold is --s3-upload-cutoff (200 MiB by default). Tune --s3-chunk-size for throughput on big files
  • Crypt password recovery: There is no way to recover files if you lose the crypt passwords. Store them in a password manager and keep an offline copy of the passwords

Summary

  • Rclone syncs files to 70+ cloud providers with a single, consistent interface
  • Built-in crypt encryption protects data at rest with XSalsa20 + Poly1305
  • Use rclone sync for exact mirrors, rclone copy for additive backups
  • Automate with cron or systemd timers using bandwidth limiting and retry logic
  • Mount cloud storage as local filesystem with FUSE for transparent access
  • Always test restores from encrypted backups to verify password correctness