How to back up Mongo to S3 or R2 using a shell script

The familiar problem of backup size is one I’ve recently run into. My spare VPS, which I was using for daily backups, would often freak out as its disk filled up with data. So the next logical step is to dump the data, compress it, and store it in an S3-compatible provider such as CloudFlare R2.

The script below is the one I’m now using, and it seems to be working well. It connects to the Mongo instance and runs mongodump, writing the output straight to a gzip-compressed archive, which is then uploaded to CloudFlare R2.

After this, we delete the folder and its contents from the server, and unset any keys.

All very simple and efficient. Note that the script assumes the MongoDB Database Tools (for mongodump) and the AWS CLI are installed on the server.

#!/bin/bash

DB_BACKUP_PATH='./backup'
TODAY=$(date +"%d%b%Y")
MONGO_PASS='<mongo pass>'   # reference this inside the connection URI below if needed

# Dump the database straight into a gzip-compressed archive
mkdir -p "${DB_BACKUP_PATH}/${TODAY}"
mongodump --uri="your-uri-here" --archive="${DB_BACKUP_PATH}/${TODAY}/${TODAY}.gz" --gzip

# R2 credentials and endpoint
export AWS_ACCESS_KEY_ID='<access key here>'
export AWS_SECRET_ACCESS_KEY='<secret key here>'
export AWS_REGION=auto
AWS_S3_ENDPOINT='https://<your-account-number-here>.eu.r2.cloudflarestorage.com'

# Upload the archive to the R2 bucket
aws s3 cp "${DB_BACKUP_PATH}/${TODAY}/${TODAY}.gz" s3://<your r2 bucket> --endpoint-url "${AWS_S3_ENDPOINT}"

# Unset the keys
unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY

# Remove the local copy
rm -rf "${DB_BACKUP_PATH}/${TODAY}"
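
It’s worth confirming the archive actually restores before relying on it. A minimal sketch, assuming you pull the archive back down from R2 first (the target URI and file name here are placeholders, same as above):

# Download the archive back from R2 (same endpoint as the upload)
aws s3 cp s3://<your r2 bucket>/<backup file>.gz ./restore.gz --endpoint-url "${AWS_S3_ENDPOINT}"

# Restore it into a Mongo instance; --drop replaces any existing collections
mongorestore --uri="your-uri-here" --archive=./restore.gz --gzip --drop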

Next, run crontab -e and add the following line to execute the script daily at midnight:

0 0 * * * /bin/bash /<user>/backup.sh
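
If you want cron’s output captured somewhere (handy for spotting a failed dump or upload), redirect it to a log file; the path here is just an example:

0 0 * * * /bin/bash /<user>/backup.sh >> /var/log/mongo-backup.log 2>&1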