Raspberry Pi for Backups with IDrive
I've previously written about my backups, the server they were running from, and the NAS drive problems I have had. I've also had issues getting the backups up to the cloud for a while, but I figured my problems were solved when the Raspberry Pi Foundation released the 8GB Raspberry Pi.
My initial cloud backup provider didn't provide an application for ARM processors, which I stupidly didn't factor into the purchase, so I found one with support for Linux, specifically mentioning Ubuntu. Great, I thought, I can run Ubuntu on my Pi and go from there.
Creating the Boot Drive
The Raspberry Pi Foundation provides a tool for creating bootable SD cards, which will do the hard work for you. However, the package wouldn't install the first time on Ubuntu 20.04 LTS when running
sudo dpkg -i imager_amd64.deb
You first need to install a load of QML modules it depends on. Simply run the following in your terminal:
sudo apt install -y qml-module-qtquick-controls2 qml-module-qt-labs-settings qml-module-qtquick-layouts qml-module-qtquick-templates2 qml-module-qtquick-window2 qml-module-qtgraphicaleffects
Then you can run the dpkg command and all should be fine. The Raspberry Pi Imager tool is then available to run from the applications menu. I tried using the snap package, but that wouldn't run for me, hence the long-winded approach.
With the imager running, tell it to write the Ubuntu 20.04 LTS Server image, select the card, and hit Write. Simple, really. It will write the image to the card and verify it. Once complete, just pop the MicroSD card into the Raspberry Pi, and go.
Mapping the Drive
My NAS drive is a Buffalo LinkStation, with RAIDed drives mirroring the contents to provide a nice level of protection if something bad happens. It means I have a wonderful 4TB of storage on it, and it's available on the network. To back it up to the cloud, it needs to be mapped as a drive (and my previous cloud backup provider didn't allow mapped drives on Windows, so I needed a Linux solution).
Fortunately, Linux is designed for these types of things, and after installing cifs-utils (`apt install -y cifs-utils`) it was a simple case of mapping the samba share:
mount //servername/share /folder/to/map/to -t cifs -o username=myuser,password=mypass,vers=samba-version
In the case of my NAS, vers needs to be 2.0 to work; any other value gives the following error:
mount error(2): No such file or directory
Refer to the mount.cifs(8) manual page (e.g. man mount.cifs) and kernel log messages (dmesg)
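For reference, a filled-in version of that command (with placeholder server, share and credential names, and a mount point you've already created) looks something like this:

```
# Create the mount point and mount the NAS share over SMB 2.0
# servername, share, myuser and mypass are placeholders for your own details
sudo mkdir -p /folder/to/map/to
sudo mount -t cifs -o username=myuser,password=mypass,vers=2.0 //servername/share /folder/to/map/to
```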
To get the drive to mount when the machine boots up, an entry needs to be made in /etc/fstab. The information is pretty much the same as the manual mount above, but misses out the mount command and the flags ahead of the options. My fstab entry for the drive above looks like:
//servername/share /folder/to/map/to cifs username=myuser,password=mypass,vers=samba-version 0 0
Save the changes, then run mount -a. The drive should mount to the relevant location. To fully test, reboot the machine and log in; it should be mounted and ready.
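A few generic checks to confirm the share really is mounted after a reboot (nothing IDrive-specific here):

```
# Confirm the CIFS share is mounted where expected
findmnt -t cifs                  # list mounted CIFS filesystems
df -h /folder/to/map/to          # show size and usage of the mounted share
ls /folder/to/map/to | head      # spot-check that the NAS contents are visible
```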
Configuring IDrive
My new chosen backup provider is IDrive, as I saw a link from TechRadar offering a year's backups (up to 5TB) for absolute peanuts. They also happened to have Linux backup support via Perl scripts. It was worth a try, I figured.
Unfortunately, it wasn't always that easy. Trying to run the account_setting.pl script ran into issues with "Unable to verify EVS Binary". There's not a lot of help out there for that, but I eventually hacked my way through the files to find areas to add logging. In the end I had to download the relevant ARM 64-bit binary from https://www.idrivedownloads.com/downloads/linux/download-options/IDrive_synology_aarch64bit.zip and manually extract it to the scripts/iDrivelib/dependencies folder, ensuring the idevsutil binary was in the IDrive_synology_aarch64bit folder.
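Roughly, that manual fix looks like the following. This is a sketch rather than gospel: it assumes the IDrive scripts were unpacked to ~/idrive, so adjust the paths to wherever your scripts folder actually lives.

```
# Fetch the ARM 64-bit EVS binaries and put them where the Perl scripts look for them
# (~/idrive is an assumed install location; use your own scripts folder)
cd ~/idrive/scripts/iDrivelib/dependencies
wget https://www.idrivedownloads.com/downloads/linux/download-options/IDrive_synology_aarch64bit.zip
unzip IDrive_synology_aarch64bit.zip
ls IDrive_synology_aarch64bit/                   # idevsutil should end up in this folder
chmod +x IDrive_synology_aarch64bit/idevsutil*   # make sure the binaries are executable
```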
Next was the error "Unable to download static Perl binary". Further hacking and digging showed that the scripts only try to download an x86_64 binary of Perl. IDrive have seen this before and have an FAQ for it: you need to download a zip file based on the machine architecture and pass it in as a command line argument. Using the ARM architecture one caused an issue, as the hardware platform registered as 64-bit, so it wanted the IDrive_Linux_64.zip file passed in. The problem here was that it didn't like the zip archive which was downloaded.
This was easier to fix than you might expect. Simply use the archive manager to extract the zip archive, rename the original archive out of the way, and zip the extracted folder back up. This packages the files in a way the account_setting.pl script can extract correctly, and everything then ran as expected.
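If you're working over SSH rather than with a desktop archive manager, a rough command line equivalent of those steps (assuming the downloaded IDrive_Linux_64.zip is in the current folder and zip/unzip are installed) would be:

```
# Extract the downloaded archive, set the original aside, and re-zip the extracted folder
unzip IDrive_Linux_64.zip                        # here assumed to extract to an IDrive_Linux_64 folder; adjust if yours differs
mv IDrive_Linux_64.zip IDrive_Linux_64.zip.orig  # keep the original out of the way
zip -r IDrive_Linux_64.zip IDrive_Linux_64       # zip the extracted folder back up
```

Then pass the new IDrive_Linux_64.zip to account_setting.pl as before.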
Configuring the Backup Set
For IDrive to work, it needs to know what to back up. The script for doing this is within the scripts folder of the IDrive installation. Just run ./edit_supported_files.pl and follow the instructions. This will open the configuration text file in vi, which can be daunting to those who don't know how to use it. If this worries you, and you're more used to nano or something else, edit the file BackupsetFile.txt in the location idriveIt/user_profile/MACHINE_USER/IDRIVE_USER/Backup/DefaultBackups. For this path, the following are important:
- the idriveIt folder is in the same folder as the scripts folder where all of the IDrive work takes place
- the MACHINE_USER is the user for logging into the machine
- the IDRIVE_USER is the email address for logging into IDrive with
I set this up to back up the folder /home/ubuntu/Alexandria (I have a slight Greek theme on my machines, generally. My old file server was called Hercules. My NAS folder is like the library at Alexandria.)
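As far as I can tell, the backup set file is just a list of absolute paths, one per line, so mine ended up looking something like this (the path reflects my setup; yours will differ):

```
/home/ubuntu/Alexandria
```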
Running the Backups
Once everything is configured, simply run ./Backup_Script.pl. It will read the locations from the BackupsetFile.txt file and upload them to IDrive. It makes sense to do this in its own window, as it outputs the progress and stats for the backups. In my case, I had 1.45TB of files to back up, which was going to take a long time.
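A backup that size will outlive any single SSH session, so I'd suggest running it inside tmux (or screen) so you can detach and come back later. A minimal tmux example, assuming the script runs happily unattended once configured:

```
sudo apt install -y tmux    # if it isn't already installed
tmux new -s idrive          # start a named session
./Backup_Script.pl          # kick off the backup from the IDrive scripts folder
# detach with Ctrl+B then D; reattach later with:
tmux attach -t idrive
```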
Finally, run ./scheduler.pl to configure the cron job. Set this to start later, and then modify the entry to specify a start and end time.
Because I set my cron job to run at a point earlier in the day, and wanted it to start immediately, I then logged in to the machine and re-ran ./Backup_Script.pl to get things going. I kept an occasional eye on it by connecting to the Raspberry Pi via SSH, navigating to the IDrive scripts folder, and running ./Status_Retrieval_Script.pl. This will guide you through a menu to find the running options, and then output what is happening on screen. Should you wish to exit, use the normal Ctrl+C. This only affects the ./Status_Retrieval_Script.pl script, not the Backup_Script running in the background.
Conclusion
Getting cloud backups running on a Raspberry Pi might not be easy, but it is certainly doable. And unless you need more than 5TB of storage, it's not going to cost too much either.