How to Use the Rsync Utility to Make a Remote Backup of a Linux System

Written by David Worthington on February 18, 2022


We cannot emphasize enough the importance of having a backup. Natural disasters, cyberattacks, or other devastating events can happen when you least expect them. To be on the safe side, it is always recommended to have round-the-clock system backups to ensure business continuity in case of service interruption.

One of the more useful backup utilities for Linux systems is the rsync utility. Rsync, short for remote sync, is a data transfer and synchronization tool that intelligently transfers and synchronizes files between directories or across networked computer systems. It does this by comparing file sizes and modification times; if either differs, it transfers the affected files from the source directory or system to the destination directory or remote system.

Rsync is configured to securely transfer and synchronize data over the SSH protocol. The file synchronization happens immediately, and with the proper backup testing process in place, you can rest assured that you have a safe, accurate backup. 
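For example, a minimal way to preview what rsync would transfer (the paths here are placeholders; substitute your own) is to combine the archive, dry-run, and itemize flags:

$ rsync -avn --itemize-changes /path/to/source/ /path/to/destination/

The -n flag tells rsync to report what it would do without copying anything, and --itemize-changes prints a per-file summary of the differences it detected.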

In an earlier tutorial, we covered how to make local backups using rsync. In this guide, we will go a step further and demonstrate how to make a remote backup — i.e., your data is stored in a separate machine — using the rsync utility.

Prerequisites

As you get started, ensure you have the following:

  1. SSH is installed and running on both the local and destination servers. Chances are the SSH daemon is already installed and no further action is required.

To check the version of SSH you are running, run the following command:

$ ssh -V

  2. In addition, you need two Linux servers — the source or local server and the remote server. Here is the lab setup we will use to demonstrate how rsync works:

Local Server IP: 173.82.120.115 (Ubuntu 20.04)

Remote Server IP: 173.82.255.207 (CentOS 8)

  3. Lastly, ensure you have a local user configured with sudo privileges.

Step 1: Install Rsync on the Local Server

To start off, ensure that rsync is installed. In this example, we will install rsync on the local server (Ubuntu 20.04) as follows:

$ sudo apt install rsync 

Once installed, start and enable the rsync service.

$ sudo systemctl start rsync 

$ sudo systemctl enable rsync 

To confirm that rsync is installed, run the command:

$ rsync --version

The output shows the installed version of rsync, confirming that the installation was successful.

Step 2: Install and Configure Rsync on the Destination Server

In addition to installing rsync on the source or local server, we also need to install it on the destination server or cloud server. To install rsync on CentOS 8, use the DNF package manager as follows:

$ sudo dnf install rsync rsync-daemon

Once installed, confirm the installation with the following:

$ rpm -qi rsync

Next, you need to configure rsync to allow remote connections from the source or local server. To do so, open the rsync daemon configuration file as follows:

$ sudo vim /etc/rsyncd.conf

Then append the following lines to the configuration file. The path directive specifies the path to the destination backup directory, while the hosts allow directive indicates the IP address of the source server that is allowed to connect.

# add to the end
pid file = /var/run/rsyncd.pid
log file = /var/log/rsyncd.log
max connections = 4
# log transfer results or not
transfer logging = yes
# any name you like
[backup]
# target directory to copy
path = /home/user/backup
# hosts you allow to access
hosts allow = 173.82.120.115
hosts deny = *
list = true
uid = root
gid = root
read only = false

Save and exit.
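Note that the target directory named by the path directive must exist before the daemon can write to it. Assuming the /home/user/backup path used in the sample configuration above, you can create it on the destination server as follows:

$ sudo mkdir -p /home/user/backup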

Then start and enable the rsync service.

$ sudo systemctl start rsyncd

$ sudo systemctl enable rsyncd

And confirm the rsync daemon is running.

$ sudo systemctl status rsyncd

If SELinux is enabled, configure the Boolean setting as follows:

$ sudo setsebool -P rsync_full_access on
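You can confirm that the Boolean took effect by querying it with getsebool:

$ getsebool rsync_full_access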

Next, configure the firewall to allow rsync service:

$ sudo firewall-cmd --add-service=rsyncd --permanent

$ sudo firewall-cmd --reload
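To confirm that the rsyncd service is now allowed through the firewall, you can list the services permitted in the active zone:

$ sudo firewall-cmd --list-services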

Now let’s put our setup to the test and see whether we can successfully back up data from the source server to the remote server.

Step 3: Test the Configuration

To test the file backup process, log back into the source server. We already have a directory in our home folder containing a few files that need to be backed up.

To save and sync files remotely, rsync takes the following syntax:

$ sudo rsync -avz -e ssh SOURCE_DIRECTORY user@DESTINATION_IP:DESTINATION_DIRECTORY

Where:

SOURCE_DIRECTORY is the directory to be backed up.

DESTINATION_IP is the IP address of the remote or destination server.

DESTINATION_DIRECTORY is the path on the remote server where the backup will be stored.

In our example, the full command will be:

$ sudo rsync -avz -e ssh /home/jumpcloud/data/ user@173.82.255.207:/home/user/backup

The directory to be backed up on the local or source server is the /home/jumpcloud/data/ folder and the destination backup directory is the /home/user/backup folder. You can create your own source and destination directories in different paths as you deem fit.
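If you want to preview exactly what will be copied before writing anything to the remote server, you can first run the same command with the --dry-run flag added:

$ sudo rsync -avz --dry-run -e ssh /home/jumpcloud/data/ user@173.82.255.207:/home/user/backup

Once the preview looks right, run the command without --dry-run to perform the actual transfer.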

After running the backup command, the output confirms that the files were transferred successfully to the remote server.

Step 4: Automate the Backup Process

Rsync does not have a built-in mechanism for backing up and syncing files without user intervention. Fortunately, we can automate the backup process by creating a shell script with the backup command and scheduling the script to run at specific times using a cron job.

But first, we need to configure passwordless SSH authentication between the local and remote server since rsync uses SSH to initiate a connection between the two and securely transfer files.

Therefore, generate an SSH keypair as follows:

$ ssh-keygen

This generates a public and private key pair, which is saved in the ~/.ssh directory.
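ssh-keygen prompts for a file location and an optional passphrase; for an unattended cron backup, you will typically accept the default location and leave the passphrase empty so the job can connect without prompting. As a non-interactive sketch (assuming you want an Ed25519 key at the default path), the same can be done in one line:

$ ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519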

Now, we need to copy the public key to the remote server to enable passwordless SSH authentication. To do so, we will use the ssh-copy-id command as follows:

$ ssh-copy-id user@173.82.255.207

When prompted, authenticate with the remote user’s password. The public key is appended to the authorized_keys file in the .ssh directory of the remote user’s home folder.

To verify that passwordless authentication is working, try logging in to the remote server as shown. You should be logged in without being prompted for a password.

$ ssh user@173.82.255.207

Now, we are going to create a shell script that will contain the backup command.

$ sudo vim backup.sh

The first line starts with a shebang header — a signature of all shell scripts — followed by the backup command.

#!/bin/bash

rsync -avz -e ssh /home/jumpcloud/data/ user@173.82.255.207:/home/user/backup

Save the script and exit. Next, make the shell script executable:

$ chmod +x backup.sh
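If you also want a record of each run, here is a slightly expanded sketch of the same script that appends a timestamped entry to a log file; the log path is an assumption, so place it wherever suits your setup:

#!/bin/bash
# Back up the local data directory to the remote server and log the result.
# The log path below is an assumed location; adjust it to suit your setup.
LOGFILE=/home/jumpcloud/backup.log
echo "$(date): starting backup" >> "$LOGFILE"
rsync -avz -e ssh /home/jumpcloud/data/ user@173.82.255.207:/home/user/backup >> "$LOGFILE" 2>&1
echo "$(date): backup finished with exit code $?" >> "$LOGFILE"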

The last step is to automate the running of the script using a cron job. To create a cron job, issue the command:

$ crontab -e

This opens the crontab file. At the very bottom of the file, add the following line: 

05 22 * * * /home/jumpcloud/backup.sh

The line stipulates that the script will run every day at exactly 10:05 p.m.
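The five fields are the minute, hour, day of the month, month, and day of the week. If your data changes frequently, you can schedule the job to run more often; for example, this hypothetical entry would run the same script at the top of every hour:

0 * * * * /home/jumpcloud/backup.sh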

Save the crontab file and exit. You should see output indicating that a new crontab is being installed.

You can list the cron jobs using the command:

$ crontab -l

To simulate the file synchronization, we added two additional files to the data directory on our source server. When the clock hit 10:05 p.m., the synchronization ran, as we confirmed by listing the contents of the backup directory on the remote server.

This time around, you can see that additional files were added to the backup directory.

We also verified that the cron job ran by viewing the /var/log/syslog log file on the source server.

$ cat /var/log/syslog
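Because cron activity on Ubuntu is written to /var/log/syslog, you can also filter the log for cron entries to confirm the job fired at the scheduled time:

$ grep CRON /var/log/syslog | tail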

Conclusion

We have successfully enabled remote file backup using rsync. With rsync, you can back up an entire home directory or any directories of your choice to a remote server. If you are working on a system with constantly changing data, it’s advisable to schedule backups at shorter intervals, as long as this does not degrade network performance or disrupt users during business hours.

Interested in learning about other strategies you can use to ensure the security of your Linux system? Check out one of the recommended tutorials here.
