Command Line, Cloud Storage, Backup

Rclone: Finally, Cloud Storage Sync That Doesn't Suck

Sync files between Google Drive, Dropbox, S3, and local storage. Mount cloud storage as a drive. Automate backups with cron. Here's what I actually use.

March 11, 2026 · 10 min read

How I Got Here

Needed a way to sync files between my servers and cloud storage. Tried a bunch of options; none of them stuck.

Then someone mentioned Rclone. One tool for all cloud storage. Can sync, mount, serve files. Command-line only. Sounded exactly like what I needed.

Getting It Installed

Installation is straightforward:

# Linux/macOS
curl https://rclone.org/install.sh | sudo bash

# Or with Homebrew (macOS/Linux)
brew install rclone

# Verify installation
rclone version

That's it. No dependencies, no configuration files yet. Just a binary that works.

Setting Up Cloud Storage

Rclone uses "remotes" - configuration profiles for each cloud storage service. Setting up a remote is interactive:

rclone config

This starts an interactive setup. Here's what it looks like for Google Drive:

No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n

name> gdrive

Type of storage to configure.
Choose a number from below, or type in your own value
 1 / 1Fichier
   \ "fichier"
 2 / Alias for an existing remote
   \ "alias"
...
17 / Google Drive
   \ "drive"
...
Storage> 17

Google Application Client Id - leave blank normally.
client_id>

OAuth Client Secret - leave blank normally.
client_secret>

Scope that rclone should use when accessing the remote.
Choose a number from below, or type in your own value
 1 / Full access all files, except Application Data Folder.
 2 / Read-only access to file metadata and file contents.
 3 / Read-only access to file metadata.
...
scope> 1

root_folder_id>

service_account_file>

Edit advanced config? (y/n)
y) Yes
n) No (default)
y/n> n

Remote config

Use auto config?
 * Say Y if not sure
 * Say N if you are working on a remote or headless machine
y) Yes (default)
n) No
y/n> n

If your browser doesn't open automatically please go to the following link:
https://accounts.google.com/o/oauth2/auth?...

Log in and authorize rclone for access

Enter verification code> 4/1AXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Configure this as a team drive?
y) Yes
n) No (default)
y/n> n

--------------------
[gdrive]
type = drive
scope = drive
token = {"access_token":"XXX","token_type":"Bearer"}
--------------------
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y

Key points:

- Leave client_id and client_secret blank unless you've created your own Google OAuth app (your own app avoids rclone's shared, rate-limited client ID).
- Answer N to "Use auto config?" on headless servers; you'll get a URL you can open in a browser on any machine.
- The OAuth token is stored in rclone's config file, so protect it (you can encrypt the config via rclone config > "Set configuration password").

After setup, you can list remotes:

rclone listremotes
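Once you have a few remotes configured, a quick loop over `listremotes` is handy for auditing them. A sketch (assumes each remote's root is listable; some backends want a bucket or path after the colon):

```shell
# Report the total size of every configured remote
for remote in $(rclone listremotes); do
    echo "== ${remote} =="
    rclone size "${remote}"
done
```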

Basic Sync Commands

Once configured, syncing is straightforward:

# List files in remote
rclone ls gdrive:backups

# Copy files (local to remote)
rclone copy /path/to/local gdrive:backups

# Sync files (makes target match source)
rclone sync /path/to/local gdrive:backups

# Copy from remote to local
rclone copy gdrive:backups /path/to/local

# Check what would change without doing it
rclone sync --dry-run /path/to/local gdrive:backups

💡 Copy vs Sync

copy copies files that don't exist at destination. sync makes destination match source (deletes files at destination that don't exist at source). Be careful with sync!
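If the delete behavior of sync makes you nervous, rclone's `--backup-dir` flag gives you a safety net: files that would be deleted or overwritten get moved aside instead of destroyed. A sketch of how I'd use it (the `backups-trash` path is just a name I picked):

```shell
# Instead of hard-deleting, move deleted/overwritten files into a
# dated "trash" directory on the same remote
rclone sync /path/to/local gdrive:backups \
  --backup-dir gdrive:backups-trash/$(date +%Y%m%d)
```

Note that `--backup-dir` must be on the same remote as the destination, but must not overlap it.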

Mounting Cloud Storage as a Drive

This is the feature that made me stick with Rclone. You can mount cloud storage as a local directory:

# First, install FUSE (if not already installed)
# Ubuntu/Debian (the package is fuse3 on newer releases):
sudo apt install fuse3

# macOS with Homebrew:
brew install macfuse

# Create mount point
mkdir ~/gdrive

# Mount Google Drive
rclone mount gdrive: ~/gdrive --daemon

Now you can access Google Drive like a local folder:

ls ~/gdrive
cp file.txt ~/gdrive/backups/
cd ~/gdrive/documents

Mount options that are useful:

# Mount with caching (faster for frequent access)
rclone mount gdrive: ~/gdrive \
  --daemon \
  --vfs-cache-mode full \
  --cache-dir /tmp/rclone-cache

# Allow other users (e.g. root or a web server) to access the mount;
# requires user_allow_other in /etc/fuse.conf
rclone mount gdrive: ~/gdrive \
  --daemon \
  --allow-other

# Unmount when done
fusermount -u ~/gdrive  # Linux
umount ~/gdrive         # macOS

The --vfs-cache-mode full flag is important. It caches files locally for faster access and better compatibility with applications.
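One caveat: with full caching enabled, the cache can eat disk if left unbounded. These are the limits I set alongside it (the values are just examples; tune them for your disk and workload):

```shell
# Cache tuning to pair with --vfs-cache-mode full:
#   --vfs-cache-max-size  cap the on-disk cache size
#   --vfs-cache-max-age   evict cached files after this long
#   --dir-cache-time      how long directory listings are cached
rclone mount gdrive: ~/gdrive \
  --daemon \
  --vfs-cache-mode full \
  --vfs-cache-max-size 10G \
  --vfs-cache-max-age 24h \
  --dir-cache-time 5m
```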

Automating Backups with Cron

This is what I actually use Rclone for - automated daily backups:

#!/bin/bash
# backup-script.sh

# Local directories to backup
SOURCE_DIRS=("/var/www" "/home/user/documents" "/etc")

# Backup destination (Google Drive)
REMOTE="gdrive:server-backups"

# Date stamp
DATE=$(date +%Y%m%d_%H%M%S)

# Create backup directory
BACKUP_DIR="${REMOTE}/${DATE}"

# Sync each directory
for DIR in "${SOURCE_DIRS[@]}"; do
    DIR_NAME=$(basename "$DIR")
    echo "Backing up $DIR to ${BACKUP_DIR}/${DIR_NAME}"
    rclone sync "$DIR" "${BACKUP_DIR}/${DIR_NAME}" \
        --progress \
        --log-file=/var/log/rclone-backup.log \
        --log-level INFO
done

# Keep only the last 30 days of backups
echo "Cleaning up old backups..."
rclone delete "$REMOTE" --min-age 30d
rclone rmdirs "$REMOTE" --leave-root  # remove the now-empty date dirs

echo "Backup completed: $DATE"

Add to crontab for daily execution:

# Edit crontab
crontab -e

# Add this line for daily backup at 2 AM
0 2 * * * /path/to/backup-script.sh >> /var/log/backup.log 2>&1
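One gotcha with cron-driven syncs: if a run ever takes longer than the interval (easy on a first full upload), cron starts a second copy on top of it. A small `flock` wrapper avoids that; this is my own sketch, not an rclone feature, and `run_exclusive` is a name I made up (requires util-linux `flock`):

```shell
#!/usr/bin/env bash
# run_exclusive: run a command under a non-blocking exclusive lock,
# so a job that is still running isn't started a second time.
run_exclusive() {
    local lockfile="$1"
    shift
    # Open the lock file on fd 200; flock -n fails fast if already held
    exec 200>"$lockfile"
    if ! flock -n 200; then
        echo "previous run still in progress; skipping" >&2
        return 0
    fi
    "$@"    # run the real job while the lock is held
}

# Crontab usage (source this file, then run the backup under the lock):
#   0 2 * * * . /usr/local/bin/run-exclusive.sh; run_exclusive /tmp/backup.lock /path/to/backup-script.sh
```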

โš ๏ธ First Run Tip

First backup will take a long time if you have lots of files. Test with --dry-run first to see what will happen without actually transferring anything.

Backing Up to S3-Compatible Storage

I also use Rclone to backup to S3-compatible storage (like MinIO or AWS S3):

# Configure S3 remote
rclone config

# Choose the S3 storage type (the numbers shift between rclone
# versions, so type the value instead)
Storage> s3

provider> AWS

access_key_id> YOUR_ACCESS_KEY

secret_access_key> YOUR_SECRET_KEY

region> us-east-1

endpoint> https://s3.amazonaws.com

location_constraint>

acl>

Edit advanced config? y/n> n

--------------------
[s3]
type = s3
provider = AWS
... (config details) ...
--------------------
y/e/d> y

Now you can sync to S3 just like Google Drive:

# Sync to S3
rclone sync /path/to/backups s3:my-bucket/backups

# List S3 buckets
rclone lsd s3:

# Check S3 storage usage
rclone about s3:

For self-hosted S3 (like MinIO), just use your endpoint:

# MinIO setup
endpoint> https://minio.example.com

# Then use it normally
rclone sync /data minio:my-bucket/data

Serving Files Over HTTP

Rclone can also serve files over HTTP/S. Great for quick sharing:

# Serve current directory over HTTP
rclone serve http /path/to/files --addr 0.0.0.0:8080

# Serve remote (like Google Drive) over HTTP
rclone serve http gdrive:public-files --addr 0.0.0.0:8080

# With authentication
rclone serve http /path/to/files \
  --addr 0.0.0.0:8080 \
  --user admin \
  --pass securepassword

# Serve with HTTPS (needs cert)
rclone serve http /path/to/files \
  --addr 0.0.0.0:443 \
  --cert /path/to/cert.pem \
  --key /path/to/key.pem

I use this to quickly share files from my servers without setting up nginx or Apache.

What Actually Works for Me

After using Rclone for a while, here's my setup:

Daily Backup Script

#!/bin/bash
# /usr/local/bin/daily-backup.sh

# Backup database
mysqldump -u root -p"$MYSQL_ROOT_PASSWORD" \
  --all-databases | gzip > /tmp/mysql_backup.sql.gz

# Backup to Google Drive (copyto, because the source is a single file;
# sync would delete everything else in the destination directory)
rclone copyto /tmp/mysql_backup.sql.gz \
  "gdrive:server-backups/mysql/$(date +%Y%m%d).sql.gz" \
  --progress

# Backup to S3 (redundancy)
rclone copyto /tmp/mysql_backup.sql.gz \
  "s3:my-bucket/mysql-backups/$(date +%Y%m%d).sql.gz" \
  --progress

# Cleanup local temp file
rm /tmp/mysql_backup.sql.gz

echo "Backup completed: $(date)"
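One thing worth guarding in the script above: in `mysqldump | gzip`, the pipeline's exit status is gzip's, so a failed dump still produces a small, valid .gz file that gets uploaded. A bash-specific sketch using PIPESTATUS (`dump_to` is a hypothetical helper name, not an rclone or MySQL tool):

```shell
#!/usr/bin/env bash
# dump_to: run a dump command, gzip its output to a file, and fail
# if the *dump* failed, even though gzip itself exited 0.
dump_to() {
    local outfile="$1"
    shift
    "$@" | gzip > "$outfile"
    # PIPESTATUS[0] is the dump command's exit code, not gzip's
    if [ "${PIPESTATUS[0]}" -ne 0 ]; then
        echo "dump failed; removing $outfile and aborting upload" >&2
        rm -f "$outfile"
        return 1
    fi
}

# Usage in the script above:
#   dump_to /tmp/mysql_backup.sql.gz \
#     mysqldump -u root -p"$MYSQL_ROOT_PASSWORD" --all-databases || exit 1
```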

Monitor Sync Status

# Check sync status
rclone check gdrive:backups /local/backups

# Get size of remote
rclone size gdrive:backups

# Monitor in real-time
rclone sync /local gdrive:backups --stats 1s
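Since `rclone check` exits non-zero when the two sides differ, it drops neatly into cron for alerting; you only hear about it when something is wrong. A sketch (the mail command and address are placeholders, not part of rclone):

```shell
# Cron-friendly integrity check: only failures produce noise
if ! rclone check /local/backups gdrive:backups --one-way --log-level ERROR; then
    echo "backup verification failed on $(hostname)" \
      | mail -s "rclone check failed" admin@example.com
fi
```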

Mount for Continuous Access

# /etc/fstab entry (Linux; needs rclone's mount helper symlinked first:
#   sudo ln -s /usr/bin/rclone /sbin/mount.rclone)
# gdrive: /mnt/gdrive rclone rw,noauto,nofail,_netdev,x-systemd.automount,allow_other,vfs_cache_mode=full 0 0

# Or systemd service (for boot-time mount)
# /etc/systemd/system/rclone-gdrive.service
[Unit]
Description=Rclone Google Drive Mount
After=network-online.target
Wants=network-online.target

[Service]
Type=notify
ExecStart=/usr/bin/rclone mount gdrive: /mnt/gdrive \
  --vfs-cache-mode full \
  --allow-other
Restart=on-failure
RestartSec=5s

[Install]
WantedBy=default.target

Problems I Hit

Mount Fails with "Device or Resource Busy"

Happens when mount point is already in use. Fix:

# Check what's mounted
mount | grep gdrive

# Force unmount
fusermount -uz ~/gdrive

# Try mounting again
rclone mount gdrive: ~/gdrive --daemon

Sync is Slow

Large number of files can slow down sync. Solutions:

# Use --fast-list (faster for large directories)
rclone sync /local gdrive:backups --fast-list

# Raise parallel transfers (the default is 4)
rclone sync /local gdrive:backups --transfers 8

# Compare by size only, skipping modtime checks (faster, less accurate)
rclone sync /local gdrive:backups --size-only

OAuth Token Expires

After some time, Google Drive OAuth tokens expire. Re-run config:

# Reauthorize existing remote
rclone config reconnect gdrive:

# Or re-run the interactive config and choose Edit for the remote
rclone config

Final Thoughts

Rclone is one of those tools that just works. Once configured, it's reliable and fast.

The combination that works for me:

- Nightly cron job dumping databases and copying them to both Google Drive and S3
- A systemd-managed rclone mount for files I need continuous access to
- rclone serve http for quick ad-hoc sharing

Not claiming it's perfect - sometimes API limits cause issues, and occasionally it needs reauthorizing. But compared to desktop apps or other CLI tools, Rclone is by far the most reliable option I've found.

📚 Recommended Reading