Duplicity – quick and easy backup
Many server administrators face the problem of backing up their files when designing their infrastructure. This is a very important task: if something goes wrong, backups let you quickly restore the system to its pre-crash state. The simplest tools for basic backups are tar, gzip and lftp. The first creates an archive of the required files, the second compresses it, and the third sends it to an external server designated for such copies. For maximum security, you can also add data encryption.
However, after some time it turns out that these backups keep growing: they take up a lot of space, take a long time to create, and take too long to transfer to other machines. What can we do about it? Look for a tool that lets us create incremental backups.
And here Duplicity comes in. It is an application that backs up selected directories as encrypted archive volumes and saves them in a chosen location, either remote or local. It uses the rsync algorithm (via librsync), which makes it easy to create incremental archives that store only the data that has changed in each file. Archives can be encrypted and transferred to a remote server using protocols such as FTP, IMAP, RSYNC, S3, SSH/SCP, WEBDAV and many others.
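As an illustration of the remote transports, a directory can be pushed to another machine over SSH/SCP like this (the host name, user and paths are hypothetical examples, not from the original text):

```shell
# Full, GPG-encrypted backup of /var/www sent over SSH/SCP.
# backup.example.com, the "backup" user and the target path are placeholders.
duplicity full /var/www scp://backup@backup.example.com//srv/backups/www
```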
To download and install Duplicity, we will need the EPEL repository:
yum install epel-release
Next, issue the following command:
yum install duplicity
After a while the application will be ready to use in the system.
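You can confirm that the installation succeeded by asking for the version (the exact version string will differ from system to system):

```shell
# Prints the installed duplicity version, e.g. "duplicity 0.7.x"
duplicity --version
```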
Below is a set of example commands for performing all of these activities locally.
duplicity full --no-encryption [path to the directory to make a copy] file://[path to where to upload the backup file]
Makes a full backup without encrypting the archive.
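A concrete version of the command above, with sample paths (the source directory and backup target are hypothetical):

```shell
# Full, unencrypted backup of /etc into a local backup directory
duplicity full --no-encryption /etc file:///mnt/backups/etc
```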
Full backup with conditions
duplicity --full-if-older-than [number of days]D --no-encryption [path to the directory or file to make a copy] file://[path to where to upload the backup file]
Makes a full backup, without encryption, if the last full backup is older than the specified number of days.
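For example, with the same hypothetical paths as before, this forces a fresh full backup once a month, and falls back to an incremental one otherwise:

```shell
# Full backup only if the newest full backup is older than 30 days;
# otherwise an incremental backup is created instead
duplicity --full-if-older-than 30D --no-encryption /etc file:///mnt/backups/etc
```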
duplicity --no-encryption [path to the directory or file to make a copy] file://[path to where to upload the backup file]
Makes an incremental backup, without encryption, provided a full backup already exists in the target location.
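The same command with the sample paths used above (hypothetical):

```shell
# Incremental backup: only data changed since the last backup is stored
duplicity --no-encryption /etc file:///mnt/backups/etc
```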
Deleting full backups
duplicity remove-all-but-n-full [number] --no-encryption --force file://[backup path]
Deletes all full backups except the most recent ones, keeping as many as the number entered. Entering a value of 1 deletes all full backups except the latest one.
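A concrete example with a hypothetical backup path:

```shell
# Keep only the most recent full backup chain, delete all older ones
duplicity remove-all-but-n-full 1 --no-encryption --force file:///mnt/backups/etc
```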
Removing incremental copies
duplicity remove-all-inc-of-but-n-full [number] --no-encryption --force file://[backup path]
Deletes the incremental backups belonging to full backups, keeping the incrementals of the newest [number] full backups. The --force option is required to actually delete the files instead of just listing them on the screen.
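For example (hypothetical path), this keeps the incrementals of only the newest full backup:

```shell
# Delete incremental backups of all but the newest full backup
duplicity remove-all-inc-of-but-n-full 1 --no-encryption --force file:///mnt/backups/etc
```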
Restore the entire backup
duplicity -t [number of days, e.g. 5D] --force --no-encryption file://[backup path] [path to where to unpack the backup]
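A concrete example, restoring the state from five days ago into a scratch directory (both paths are hypothetical):

```shell
# Restore the archive as it looked 5 days ago into /tmp/etc-restore
duplicity -t 5D --force --no-encryption file:///mnt/backups/etc /tmp/etc-restore
```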
Extracting a specific file from the backup
duplicity -t [number of days, e.g. 1D] --force --no-encryption --file-to-restore [file or directory] file://[backup path] [path to where to unpack the backup]
Restores a single file from the archive instead of the entire backup. With archives of several gigabytes or more, this saves a lot of time.
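For example (hypothetical paths; note that --file-to-restore takes a path relative to the backed-up directory, so /etc/hosts becomes just hosts):

```shell
# Restore only "hosts" (i.e. /etc/hosts inside the backup) from 1 day ago
duplicity -t 1D --force --no-encryption --file-to-restore hosts \
    file:///mnt/backups/etc /tmp/hosts-restore
```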
Deleting backups older than a given number of days or months
duplicity remove-older-than [enter days or months, e.g. 15D, 1M] --force file://[backup path]
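A concrete example with a hypothetical path (as with the other delete actions, --force is needed to actually remove the files rather than just list them):

```shell
# Delete all backup sets older than one month
duplicity remove-older-than 1M --force file:///mnt/backups/etc
```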
Listing the files contained in the backup archive
duplicity list-current-files [options] file:///[backup path]
This is a very useful option if you only want to extract specific files or directories. You can also pipe the output to grep to search for a particular phrase, or redirect the result of the command to a file.
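Both variants mentioned above, with a hypothetical backup path and search phrase:

```shell
# List all files in the newest backup and filter for "nginx"
duplicity list-current-files file:///mnt/backups/etc | grep nginx

# ...or save the full listing to a file for later inspection
duplicity list-current-files file:///mnt/backups/etc > /tmp/backup-files.txt
```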
Displaying the backup sets and related information
duplicity collection-status file:///[backup path]
Shows the chains of full and incremental backups stored in the archive, together with their dates and the number of volumes.
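The command that prints the backup chains and related information is collection-status; a concrete example with a hypothetical path:

```shell
# Show the chains of full and incremental backups stored in the archive
duplicity collection-status file:///mnt/backups/etc
```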
To illustrate the differences between Duplicity and tar backups, we ran some tests. We created a directory with 50 GB of randomly generated files and made a full backup with each tool. Then we added a further 3 GB of files. With Duplicity we made an incremental backup, while with tar we had to make another full backup.
As we can see on the chart, Duplicity needed only 3 minutes to detect what had changed and back up the differences, while tar had to archive everything from scratch.
Advantages:
- Good project documentation
- Simple syntax
- Fast incremental backups
- Light use of the Internet connection
- Easy backup management
- Very fast extraction of specific files from the archive
Disadvantages:
- Long full backup time
- Damage to a file in an incremental chain makes it impossible to recover data from that date onward.