Automatic CyberPanel website backup to S3 Storage without CyberPanel Cloud

How To's Feb 5, 2023

In the previous article, I wrote about how to automate CyberPanel git push without its default Git Manager feature. Today I'd like to share a way to automatically back up CyberPanel websites to S3-compatible storage without CyberPanel Cloud.

CyberPanel actually ships with automatic backup to S3 storage, but to use the "official" S3 Storage backup you need to connect your CyberPanel instance to CyberPanel Cloud.

The method in this article uses a bash script, so it can be executed automatically via cron without connecting your CyberPanel instance to CyberPanel Cloud.

This article is also available in Indonesian, titled "Cara Backup Otomatis CyberPanel Website ke S3 tanpa menggunakan CyberPanel Cloud".


Before starting, there is one prerequisite that must be met to use this method: we need an S3 client. There are many options, such as the official AWS S3 client or the Minio CLI. In this article, I'll use the Minio CLI as my S3 client.

Install and configure S3 Client (Minio CLI)

On Linux distributions such as Arch Linux, the Minio client can be installed using the package manager by running pacman -S minio-client, and the minio-client binary will be installed as mcli.

On other distributions such as Ubuntu, minio-client can be installed by downloading its binary. Follow the official documentation on the Minio CLI page.

Installation and configuration example on Ubuntu

curl https://dl.min.io/client/mc/release/linux-amd64/mc \
  --create-dirs \
  -o $HOME/minio-binaries/mc
chmod +x $HOME/minio-binaries/mc
export PATH=$PATH:$HOME/minio-binaries/

Then add export PATH=$PATH:$HOME/minio-binaries/ to the $PATH variable in the config file of the shell you use (i.e., ~/.bashrc if you use bash or ~/.zshrc if you use zsh).

Create alias for S3-Compatible service on Minio CLI

Execute this command to create an alias on Minio CLI:

  • Replace ALIAS with the name related to your S3 service.
  • Replace HOSTNAME with your S3 endpoint URL.
  • Replace ACCESS_KEY and SECRET_KEY with your S3 access and secret key.


mc alias set ALIAS HOSTNAME ACCESS_KEY SECRET_KEY

For example (the endpoint URL here is a placeholder):

mc alias set backup https://s3.example.com SomERanDomAcceSsKey SomERanDomSeCreTKey

Bash script for CyberPanel backup

After the S3 alias is configured, create a bash script to do the backup job for CyberPanel websites to S3.

#title           :
#description     : Simple script to backup CyberPanel websites to S3 Storage.
#author          : Christian Ditaputratama <[email protected]>
#date            : 2023-02-05
#last update     : 2023-02-05
#version         : 0.0.1
#usage           : bash
#notes           : This script needs an S3 client (minio-cli) installed and
#                  configured.
#                  Please read
#                  for more information.

set -e

##### Basic config #####

MINIO_REMOTE_ALIAS="backup" # your mc `alias` name
MINIO_BUCKET="your-bucket-name" # your S3 bucket name
MINIO_FOLDER="path/to/remote/folder/"  # Mandatory, don't forget the trailing slash at the end
BACKUP_RETENTION_DAY=7 # how long (in days) remote backups are kept
PID_FILE="/tmp/cyberpanel-s3-backup.pid" # lock file to prevent overlapping runs

##### End basic config #####
# stop editing here


# prevent multiple backups running at the same time
if [ -f "$PID_FILE" ]; then
    echo "Process is running! Exiting..."
    exit 0
fi
touch "$PID_FILE"
# remove the lock file when the script exits, even on error
trap 'rm -f "$PID_FILE"' EXIT

# listWebsitesJson prints a JSON-encoded string, so decode it with fromjson
LIST_WEBSITES=$(cyberpanel listWebsitesJson | jq -r '. | fromjson')

for WEBSITE in $(echo "${LIST_WEBSITES}" | jq -r '.[].domain'); do
    echo "Backing up ${WEBSITE}"
    cyberpanel createBackup --domainName ${WEBSITE}

    echo "Uploading to S3..."
    mc mirror /home/${WEBSITE}/backup/ $MINIO_REMOTE_ALIAS/$MINIO_BUCKET/$MINIO_FOLDER${WEBSITE}/ --overwrite

    echo "Removing old local backup..."
    find /home/${WEBSITE}/backup -type f -name "backup-${WEBSITE}-*.tar.gz" -delete

    echo "Removing remote backups older than ${BACKUP_RETENTION_DAY} days..."
    mc rm $MINIO_REMOTE_ALIAS/$MINIO_BUCKET/$MINIO_FOLDER${WEBSITE}/ --recursive --dangerous --force --older-than ${BACKUP_RETENTION_DAY}d
done
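The `fromjson` step above is needed because `cyberpanel listWebsitesJson` prints a JSON-encoded string rather than raw JSON. You can see how the parsing works with a standalone sketch (the sample payload below is hypothetical; the real output contains more fields per website):

```shell
#!/bin/bash
# Hypothetical sample of what `cyberpanel listWebsitesJson` prints:
# a JSON string that itself encodes an array of website objects.
SAMPLE='"[{\"domain\": \"example.com\"}, {\"domain\": \"demo.org\"}]"'

# fromjson decodes the inner JSON, then .[].domain pulls out each domain.
echo "$SAMPLE" | jq -r 'fromjson | .[].domain'
# prints:
# example.com
# demo.org
```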

Change the script's file permission so it can be executed, using the chmod +x path/to/ command.

Adjust the variable values to suit your environment:

  • MINIO_REMOTE_ALIAS : the alias name we configured previously.
  • MINIO_BUCKET : the bucket name you use.
  • MINIO_FOLDER : the folder on S3 storage where the backups are saved. Don't forget the / at the end of the folder.
  • BACKUP_RETENTION_DAY : how long (in days) the backups on remote storage (S3) are kept.
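The local cleanup step in the script (the find ... -delete line) can be tried out safely with throwaway files before pointing it at real backups. The directory and filename below are made up for the demonstration; only the name pattern matches what the script uses:

```shell
#!/bin/bash
# Standalone demonstration of the local cleanup step: create a fake
# backup archive in a temp directory, then delete it with the same
# find pattern the backup script uses.
DEMO_DIR=$(mktemp -d)
WEBSITE="example.com"

touch "$DEMO_DIR/backup-${WEBSITE}-02.05.2023_10-00-00.tar.gz"
find "$DEMO_DIR" -type f -name "backup-${WEBSITE}-*.tar.gz" -delete

ls -A "$DEMO_DIR"   # prints nothing: the archive was removed
rmdir "$DEMO_DIR"
```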

Then, create a cron job, adjusting the schedule as needed:

0 * * * * /bin/bash /path/to/ >/dev/null 2>&1
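The line above runs the script at the start of every hour. If hourly is too frequent, a once-a-day schedule works the same way; for example, every day at 02:00:

```
0 2 * * * /bin/bash /path/to/ >/dev/null 2>&1
```

The five fields are minute, hour, day of month, month, and day of week; `>/dev/null 2>&1` silences the output so cron doesn't mail it to you.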