Ubuntu: Comparison of backup tools



Question:

This question exists because it has historical significance, but it is not considered a good, on-topic question for this site, so please do not use it as evidence that you can ask similar questions here. While you are encouraged to help maintain its answers, please understand that "big list" questions are not generally allowed on Ask Ubuntu and will be closed per the help center.

Backup is incredibly important. Obviously there's no best backup tool, but a comparison of the options would be very interesting.

  • Graphical Interface? Command line?
  • Incremental backups?
  • Automatic backups?
  • Install method: In standard repositories? PPA?


Solution:1

Déjà Dup

Déjà Dup is installed by default (from Ubuntu 11.10 onwards). It is a GNOME tool intended for the casual desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".

It is a front end to duplicity that performs incremental backups, where only the changes since the prior backup are stored. It has options for encrypted and automated backups, and it can back up to local folders, Amazon S3, or any server to which Nautilus can connect.

Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.

Main Window Screenshot

Restore earlier version of file

Note that as of February 2016 this project appears to be largely ignoring bug reports, with only minor triage activity; the last bugfix dates back to 2014, though new releases with minor changes continue to appear.


Solution:2

Back in Time

I have been using Back in Time for some time, and I'm very satisfied.

All you have to do is configure:

  • Where to save snapshots
  • Which directories to back up
  • When backups should happen (manually, every hour, every day, every week, every month)

And forget about it.

To install (works on Ubuntu 16.04 with GNOME):

sudo add-apt-repository ppa:bit-team/stable
sudo apt-get update
sudo apt-get install backintime-gnome

The GUI can be opened by searching for "backintime" in the Ubuntu Dash.

Back In Time main window screenshot

Project is active as of May 2017.


Solution:3

rsnapshot vs. rdiff-backup

I often refer to this comparison of rsnapshot and rdiff-backup:

Similarities:

  • both use an rsync-like algorithm to transfer data (rsnapshot actually uses rsync; rdiff-backup uses the python librsync library)
  • both can be used over ssh (though rsnapshot cannot push over ssh without some extra scripting)
  • both use a simple copy of the source for the current backup

Differences in disk usage:

  • rsnapshot uses actual files and hardlinks to save space. For small files, storage size is similar.
  • rdiff-backup stores previous versions as compressed deltas to the current version similar to a version control system. For large files that change often, such as logfiles, databases, etc., rdiff-backup requires significantly less space for a given number of versions.

Differences in speed:

  • rdiff-backup is slower than rsnapshot

Differences in metadata storage:

  • rdiff-backup stores file metadata, such as ownership, permissions, and dates, separately.

Differences in file transparency:

  • For rsnapshot, all versions of the backup are accessible as plain files.
  • For rdiff-backup, only the current backup is accessible as plain files. Previous versions are stored as rdiff deltas.

Differences in backup levels made:

  • rsnapshot supports multiple levels of backup such as monthly, weekly, and daily.
  • rdiff-backup can only delete snapshots earlier than a given date; it cannot delete snapshots in between two dates.

Differences in support community:

  • Based on the number of responses to my post on the mailing lists (rsnapshot: 6, rdiff-backup: 0), rsnapshot has a more active community.
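
To make the comparison concrete, here is a minimal sketch of how each tool is typically driven (the paths and retention values below are illustrative assumptions, not part of the original comparison):

# rdiff-backup: mirror plus reverse increments in one step
rdiff-backup /home/user /mnt/backup/home
# restore a file as it existed 3 days ago
rdiff-backup -r 3D /mnt/backup/home/some/file /tmp/file-3-days-ago
# drop increments older than one month
rdiff-backup --remove-older-than 1M /mnt/backup/home

# rsnapshot: retention levels live in /etc/rsnapshot.conf, e.g.
#   retain  daily   7
#   retain  weekly  4
# then each run (normally from cron) rotates the snapshots:
rsnapshot daily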


Solution:4

rsync

If you're familiar with command-line tools, you can use rsync to create (incremental) backups automatically. It can mirror your directories to other machines, and there are lots of scripts available on the net showing how to do it. Set it up as a recurring task in your crontab. There is also a GUI front end for rsync called Grsync that makes manual backups easier.

In combination with hard links, it's possible to make backups in a way that preserves deleted files.
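
Here is a minimal sketch of that hard-link technique (the paths and the "current" symlink convention are assumptions for illustration); files unchanged since the last snapshot are hard-linked via --link-dest, so every snapshot looks complete but costs almost no extra space:

#!/bin/sh
# Illustrative snapshot backup: full-looking snapshots, hard-linked storage.
SRC=/home/username/
DEST=/mnt/backup
NEW="$DEST/$(date +%Y-%m-%d_%H%M%S)"
# Files unchanged since the last snapshot become hard links, not copies.
rsync -a --delete --link-dest="$DEST/current" "$SRC" "$NEW"
# Repoint "current" at the newest snapshot for the next run.
rm -f "$DEST/current"
ln -s "$NEW" "$DEST/current"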



Solution:5

Duplicity

Duplicity is a feature-rich command line backup tool.

Duplicity backs up directories by producing encrypted tar-format volumes and uploading them to a remote or local file server. It uses librsync to record incremental changes to files, gzip to compress them, and gpg to encrypt them.

Duplicity's command line can be intimidating, but there are many frontends to duplicity, from command line (duply), to GNOME (deja-dup), to KDE (time-drive).
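
For a taste of the raw command line (the host and paths are assumptions), an encrypted incremental backup over SFTP looks roughly like this:

# the first run creates a full backup; later runs are incremental automatically
duplicity /home/user sftp://user@example.com//srv/backup/home
# inspect and restore
duplicity list-current-files sftp://user@example.com//srv/backup/home
duplicity restore sftp://user@example.com//srv/backup/home /tmp/restored
# prune backup chains older than six months
duplicity remove-older-than 6M --force sftp://user@example.com//srv/backup/home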


Solution:6

Dropbox

A cross-platform (proprietary) cloud sync service for Windows, Mac, and Linux. 2 GB of online storage is free, with paid options. It is advertised as a way to "store, sync, and share files online", but it can be used for backup purposes too.

Note that even on paid accounts revision history is limited to one year and on free accounts it is only one month.

Dropbox in use on Ubuntu


Solution:7

luckyBackup

It hasn't been mentioned before, so I'll pitch in: luckyBackup is a superb GUI front end to rsync that makes taking simple or complex backups and clones a total breeze.

Note that this tool is no longer developed.

The all-important screenshots can be found on the project's website, with one shown below:

luckyBackup


Solution:8

BackupPC

If you want to back up your entire home network, I would recommend BackupPC running on an always-on server in your basement/closet/laundry room. From the backup server, it can connect via ssh, rsync, SMB, and other methods to any other computer (not just Linux computers), and back up all of them to the server. It implements incremental storage by merging identical files via hardlinks, even if the identical files were backed up from separate computers.

BackupPC runs a web interface that you can use to customize it, including adding new computers to be backed up, initiating immediate backups, and most importantly, restoring single files or entire folders. If the BackupPC server has write permissions to the computer that you are restoring to, it can restore the files directly to where they were, which is really nice.

BackupPC Web Interface - Server Status Page


Solution:9

CrashPlan

CrashPlan is an award-winning endpoint backup solution providing unlimited data protection on all laptops and desktops for businesses of any size.

Features

  • Triple destination data storage and protection
  • Silent and continuous
  • Generous retention and versioning
  • Deleted file protection

I had considered a bunch of options and configurations (using rdiff-backup, duplicity, backup-ninja, Amazon S3, a remote server). What it finally came down to was simplicity.

CrashPlan is cross platform, but not open source.

There is a charge if you use their servers to host your backup, but you can also back up to a folder (or drive), another computer you own, or a computer of someone you know. Or any combination of those.

It's also worth noting that with a (paid) CrashPlan Central 'family' plan you can back up all the computers you own.


Solution:10

Bacula

I used Bacula a long time ago. Although you would have to learn its architecture, it's a very powerful solution. It lets you do backups over a network and it's multi-platform. You can read here about all the cool things it has, and here about the GUI programs that you can use for it. I deployed it at my university. When I was looking for backup solutions I also came across Amanda.

One good thing about Bacula is that it uses its own implementation for the files it creates. This makes it independent of any native utility's particular implementation (e.g. tar, dump, ...).

When I used it there weren't any GUIs yet. Therefore, I can't say if the available ones are complete and easy to use.

Bacula is very modular at its core. It consists of three configurable, stand-alone daemons:

  • file daemon (takes care of actually collecting files and their metadata in a cross-platform way)
  • storage daemon (takes care of storing the data, be it on HDDs, DVDs, tapes, etc.)
  • director daemon (takes care of scheduling backups and central configuration)

There is also an SQL database involved for storing metadata about Bacula and its backups (PostgreSQL, MySQL, and SQLite are supported).

The bconsole binary is shipped with Bacula and provides a CLI for Bacula administration.
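
For a flavor of administration via bconsole, a session might look like the sketch below (the job and client names are hypothetical; only the commands themselves come from Bacula):

bconsole
*status dir
*run job=BackupHome yes
*list jobs
*restore client=myhost-fd
*quit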


Solution:11

bup

A "highly efficient file backup system based on the git packfile format. Capable of doing fast incremental backups of virtual machine images."

Highlights:

  • It uses a rolling checksum algorithm (similar to rsync) to split large files into chunks. The most useful result of this is you can backup huge virtual machine (VM) disk images, databases, and XML files incrementally, even though they're typically all in one huge file, and not use tons of disk space for multiple versions.

  • Data is "automagically" shared between incremental backups without having to know which backup is based on which other one - even if the backups are made from two different computers that don't even know about each other. You just tell bup to back stuff up, and it saves only the minimum amount of data needed.

  • Bup can use "par2" redundancy to recover corrupted backups even if your disk has undetected bad sectors.

  • You can mount your bup repository as a FUSE filesystem and access the content that way, and even export it over Samba.

  • A KDE-based front-end (GUI) for bup is available, namely Kup Backup System.
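
A minimal sketch of day-to-day use (the repository defaults to ~/.bup; the backup name "home" is an arbitrary choice):

# one-time repository setup
bup init
# index the tree, then save a deduplicated snapshot named "home"
bup index /home/username
bup save -n home /home/username
# browse all backups as a filesystem via FUSE
mkdir -p /tmp/bup
bup fuse /tmp/bup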


Solution:12

Simple Backup

Simple Backup is another tool to back up your files and keep a revision history. It is quite efficient (with full and incremental backups) and does not take up too much disk space for redundant data. So you can have historical revisions of files à la Time Machine (a feature Back in Time, mentioned earlier, also offers).

Features:

  • easy to set up with already pre-defined backup strategies
  • external hard disk backup support
  • remote backup via SSH or FTP
  • revision history
  • clever auto-purging
  • easy scheduling
  • user- and/or system-level backups

Simple Backup main window screenshot

As you can see, the feature set is similar to that offered by Back in Time.

Simple Backup fits well into the GNOME and Ubuntu desktop environment.


Solution:13

tar your home directory

open a terminal

  • cd /home/me
  • tar zcvf me.tgz .
  • mv me.tgz to another computer
    • via Samba
    • via NFS
    • via Dropbox
    • other

Do the same for /etc.
Do the same for /var if you're running servers in the default Ubuntu setup.
Write a shell script to do all three tars, as sketched below.
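
A minimal sketch of such a script (the destination path is an assumption; the archives are written outside the trees being tarred so they do not end up inside themselves):

#!/bin/sh
# Tar up /home/me, /etc and /var for moving to another machine.
DEST=/tmp/backup-$(date +%Y%m%d)
mkdir -p "$DEST"
tar zcvf "$DEST/home-me.tgz" -C /home/me .
tar zcvf "$DEST/etc.tgz" -C / etc
tar zcvf "$DEST/var.tgz" -C / var
# now move the contents of $DEST via Samba, NFS, Dropbox, etc.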

Backup your browser bookmarks

This is enough for 95% of folks.

  • Backing up applications is not worth the effort; just reinstall the packages.



To restore, move me.tgz back to /home/me, then right-click it and choose "Extract Here".


Solution:14

Spideroak

A Dropbox-like backup/syncing service with comparable features. Free accounts have unlimited revision history.

  • Access all your data in one de-duplicated location
  • Configurable multi-platform synchronization
  • Preserve all historical versions & deleted files
  • Share folders instantly in web ShareRooms with RSS
  • Retrieve files from any internet-connected device
  • Comprehensive 'zero-knowledge' data encryption
  • 2 GB free / $10 per 100 GB / unlimited devices

Listed supported systems: Debian Lenny, OpenSUSE, RPM-Based (Fedora, etc.), CentOS/RHEL, Ubuntu Lucid Lynx, Ubuntu Gutsy Gibbon, Ubuntu Karmic Koala, Ubuntu Maverick Meerkat, Ubuntu Intrepid Ibex, Debian Etch, Ubuntu Hardy Heron, Slackware 12.1, Ubuntu Jaunty Jackalope

More info at https://spideroak.com


Solution:15

DAR

DAR, the Disk ARchive program, is a powerful command line backup tool supporting incremental backups and restores. If you need to back up a lot of files, it may be considerably faster than rsync-like (rolling checksum) solutions.
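
A short sketch of a full-plus-differential cycle (archive names and paths are illustrative assumptions):

# full, gzip-compressed backup of /home/me
dar -z -c /mnt/backup/full -R /home/me
# later: differential archive holding only changes since the full one
dar -z -c /mnt/backup/diff1 -R /home/me -A /mnt/backup/full
# restore the full archive into an empty directory
dar -x /mnt/backup/full -R /tmp/restore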


Solution:16

FlyBack

Similar to Back in Time

Apple's Time Machine is a great feature in their OS, and Linux has almost all of the required technology already built in to recreate it. This is a simple GUI to make it easy to use.

FlyBack v0.4.0


Solution:17

Attic Backup

Attic is a deduplicating backup program written in Python. The main goal of Attic is to provide an efficient and secure way to backup data. The data deduplication technique used makes Attic suitable for daily backups since only the changes are stored.

Main Features:

  • Easy to use
  • Space efficient storage: Variable block size deduplication is used to reduce the number of bytes stored by detecting redundant data.
  • Optional data encryption: All data can be protected using 256-bit AES encryption and data integrity and authenticity is verified using HMAC-SHA256.
  • Off-site backups: Attic can store data on any remote host accessible over SSH
  • Backups mountable as filesystems: Backup archives are mountable as userspace filesystems for easy backup verification and restores.

Requirements:

Attic requires Python >=3.2. Besides Python, Attic also requires msgpack-python and OpenSSL (>= 1.0.0). In order to mount archives as filesystems, llfuse is required.
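
A minimal usage sketch (the repository path and archive name are assumptions):

# create a repository, then take a deduplicated archive of a tree
attic init /mnt/backup/attic.repo
attic create /mnt/backup/attic.repo::monday /home/username
# list the archives, or mount one for browsing and restores
attic list /mnt/backup/attic.repo
attic mount /mnt/backup/attic.repo::monday /tmp/attic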

Note:

There is also now a fork of Attic called Borg.


Solution:18

Areca Backup

Areca Backup is also a very decent GPL program for making backups easily.

Features

  • Archives compression (Zip & Zip64 format)
  • Archives encryption (AES128 & AES256 encryption algorithms)
  • Storage on local hard drive, network drive, USB key, FTP / FTPs server (with implicit and explicit SSL / TLS)
  • Source file filters (by extension, subdirectory, regular expression, size, date, status, with AND/OR/NOT logical operators)
  • Incremental, differential and full backup support
  • Support for delta backup (store only modified parts of your files)
  • Archive merges: you can merge contiguous archives into one single archive to save storage space.
  • As-of-date recovery: Areca allows you to recover your archives (or single files) as of a specific date.
  • Transaction mechanism: all critical processes (such as backups or merges) are transactional. This guarantees your backups' integrity.
  • Backup reports: Areca generates backup reports that can be stored on your disk or sent by email.
  • Post-backup scripts: Areca can launch shell scripts after backup.
  • Files permissions, symbolic links and named pipes can be stored and recovered. (Linux only)


Solution:19

Jungledisk (paid application)

Jungledisk is a winner as far as I'm concerned. It backs up remotely to an optionally-encrypted Amazon S3 bucket, it's customisable, and it can run in the background (there are various guides available for setting that up). There's a decent UI, or you can hack an XML file if you're feeling so inclined.

I back up all of my home machines with the same account, no problem. I can also remotely access my backed-up data via myjungledisk.com.

It's not free, but in US terms it's certainly cheap enough (I pay around $7 a month). I feel that's more than acceptable for an offsite backup where someone else deals with hardware and (physical) security issues.

I can't recommend it enough.

-- peter


Solution:20

I run a custom Python script which uses rsync to save my home folder (excluding trash etc.) onto a folder labelled "current" on a separate backup HDD (connected by USB), and then uses the copy (cp) command to copy everything from "current" onto a date-time stamped folder on the same HDD.

The beautiful thing is that each snapshot contains every file in your home folder as it was at that time, and yet the HDD doesn't just fill up unnecessarily. Because most files never change, there is only ever one actual copy of those files on the HDD; every other reference to it is a hard link. If a newer version of a file is added to "current", the older snapshots simply keep pointing at the single original copy; the filesystem takes care of that by itself.

Although there are all sorts of refinements in the script, the main commands are simple. Here are a few of the key ingredients:

exclusion_path = "/home/.../exclusions.txt"  # don't back up trash etc.
media_path = "/media/..."  # a long path with the HDD details and the "current" folder

rsync -avv --progress --delete --exclude-from=exclusion_path /home/username/ media_path

current = "..."  # the "current" folder on the HDD
dest = "..."     # the timestamped folder on the HDD

cp -alv current dest

I had some custom needs as well. Because I have multiple massive (e.g. 60GB) VirtualBox disk images, I only ever wish to have one copy of those, not snapshot versions. Even a 1 or 2 TB HDD has limits.

Here are the contents of my exclusions file. The file is very sensitive to missing trailing slashes and the like:

/.local/share/Trash/
/.thumbnails/
/.cache/
/Examples/


Solution:21

TimeVault

TimeVault is a tool for making snapshots of folders and comes with Nautilus integration. Snapshots are protected from accidental deletion or modification since they are read-only by default.

The application is currently in beta stage and can be downloaded from Launchpad.


Solution:22

Dirvish

Dirvish is a nice command-line snapshot backup tool which uses hardlinks to reduce disk space. It has a sophisticated way of purging expired backups.

Here is a nice tutorial for it: http://wiki.edseek.com/howto:dirvish


Solution:23

Duplicati

An open source, gratis backup application running on Linux, with a GUI, that "securely stores encrypted, incremental, compressed backups on cloud storage services and remote file servers. It works with Amazon S3, Windows Live SkyDrive, Google Drive (Google Docs), Rackspace Cloud Files or WebDAV, SSH, FTP (and many more)".

Version 1.0 is considered stable; a version 2 with considerable internal changes is in development and currently working (though I wouldn't use it for production). There are standard or custom filter rules for selecting the files to back up.

I have been using it for years, though infrequently, on both a Windows laptop and my Ubuntu 14.04 install. (I'm not connected to the project, but speaking as a developer I have considered looking at the API to add a backend.)

A fork of duplicity.


Solution:24

inosync

A Python script that offers a more-or-less real-time backup capability.

"I came across a reference to the “inotify” feature that is present in recent Linux kernels. Inotify monitors disk activity and, in particular, flags when files are written to disk or deleted. A little more searching located a package that combines inotify's file event monitoring with the rsync file synchronization utility in order to provide the real-time file backup capability that I was seeking. The software, named inosync, is actually a Python script, effectively provided as open-source code, by the author, Benedikt Böhm from Germany (http://bb.xnull.de/)."

http://www.opcug.ca/public/Reviews/linux_part16.htm


Solution:25

PING is a no-nonsense free backup tool that lets you make backups of entire partitions. It is a standalone utility that should be burned to a CD.

What I like about this program is that it copies the entire partition. Imagine this: while modifying your Ubuntu as a superuser, you change a vital part, and Ubuntu won't start up anymore.

You could format the hard disk and reinstall Ubuntu. While backup solutions such as Dropbox, Ubuntu One, etc. might be useful for retrieving the important files, they won't restore your wallpaper, Unity icons, and the other stuff that made your Ubuntu the way you liked it.

Another option is to ask for help on the internet. But why not just restore the whole system to the way it was a few days ago? PING will do exactly this for you.

Pros:

  • Will back up not only documents, but system files as well
  • It's easy to use
  • It is possible to back up other (non-Linux) partitions as well
  • It will compress the backup in gzip or bzip2 format, saving disk space

Cons:

  • The PC will have to be restarted before you can back up
  • PING will make a backup of an entire partition, even when only a few files have been modified
  • You'll need an external hard drive or some free space on your PC to put your backups

An excellent Dutch manual can be found here.


Solution:26

s3ql is a more recent option for using Amazon S3, Google Storage or OpenStack Storage as a file system. It works on a variety of Linux distros as well as Mac OS X.

Using it with rsync, you can get very efficient incremental offsite backups since it provides storage and bandwidth efficiency via block-level deduplication and compression. It also supports privacy via client-side encryption, and some other fancy things like copy-on-write, immutable trees and snapshotting.
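
The workflow, roughly (the bucket name and paths are assumptions), is to create and mount an S3QL file system, then rsync into it:

# one-time: create the file system inside the bucket
mkfs.s3ql s3://my-backup-bucket
# mount, sync a tree into it, unmount cleanly
mount.s3ql s3://my-backup-bucket /mnt/s3ql
rsync -a --delete /home/username/ /mnt/s3ql/home/
umount.s3ql /mnt/s3ql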

See Comparison of S3QL and other S3 file systems for comparisons with PersistentFS, S3FS, S3FSLite, SubCloud, S3Backer and ElasticDrive.

I've been using it for a few days, starting from s3_backup.sh (which uses rsync), and am quite happy. It is very well documented and seems like a solid project.


Solution:27

Obnam

Obnam is an easy, secure backup program. Backups can be stored on local hard disks, or online via the SSH SFTP protocol. The backup server, if used, does not require any special software on top of SSH.

Some features that may interest you:

  • Snapshot backups. Every generation looks like a complete snapshot, so you don't need to care about full versus incremental backups, or rotate real or virtual tapes.
  • Data de-duplication, across files, and backup generations. If the backup repository already contains a particular chunk of data, it will be re-used, even if it was in another file in an older backup generation. This way, you don't need to worry about moving around large files, or modifying them.
  • Encrypted backups, using GnuPG.

An old version can be found in the Ubuntu software sources; for the newest version refer to Chris Cormack's PPA or Obnam's website.
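
A small usage sketch (the repository path is an assumption):

# back up the home directory into a repository
obnam backup --repository /mnt/backup/repo $HOME
# list backup generations, then restore the latest one elsewhere
obnam generations --repository /mnt/backup/repo
obnam restore --repository /mnt/backup/repo --to /tmp/restored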


Solution:28

backup2l

From the homepage:

backup2l is a lightweight command line tool for generating, maintaining and restoring backups on a mountable file system (e.g. hard disk). The main design goals are low maintenance effort, efficiency, transparency and robustness. In a default installation, backups are created autonomously by a cron script.

backup2l supports hierarchical differential backups with a user-specified number of levels and backups per level. With this scheme, the total number of archives that have to be stored only increases logarithmically with the number of differential backups since the last full backup. Hence, small incremental backups can be generated at short intervals while time- and space-consuming full backups are only sparsely needed.

The restore function allows you to easily restore the state of the file system, or of arbitrary directories/files, as of previous points in time. The ownership and permission attributes of files and directories are correctly restored.

An integrated split-and-collect function allows you to comfortably transfer all or selected archives to a set of CDs or other removable media.

All control files are stored together with the archives on the backup device, and their contents are mostly self-explaining. Hence, in the case of an emergency, a user does not only have to rely on the restore functionality of backup2l, but can - if necessary - browse the files and extract archives manually.

For deciding whether a file is new or modified, backup2l looks at its name, modification time, size, ownership and permissions. Unlike other backup tools, the i-node is not considered in order to avoid problems with non-Unix file systems like FAT32.
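
In normal use the cron script does everything, but backups can also be triggered and inspected by hand; a minimal sketch (assuming the stock configuration in /etc/backup2l.conf and the -b/-l switches from the man page):

# make a new (differential) backup right now
sudo backup2l -b
# list the available backups
backup2l -l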


Solution:29

saybackup and saypurge

There is a nice script called saybackup which allows you to do simple incremental backups using hardlinks. From the man page:

This script creates full or reverse incremental backups using the rsync(1) command. Backup directory names contain the date and time of each backup run to allow sorting and selective pruning. At the end of each successful backup run, a symlink '*-current' is updated to always point at the latest backup. To reduce remote file transfers, the '-L' option can be used (possibly multiple times) to specify existing local file trees from which files will be hard-linked into the backup.

The corresponding script saypurge provides a clever way to purge old backups. From the home page of the tool:

Saypurge parses the timestamps from the names of this set of backup directories, computes the time deltas, and determines good deletion candidates so that backups are spaced out over time most evenly. The exact behavior can be tuned by specifying the number of recent files to guard against deletion (-g), the number of historic backups to keep around (-k) and the maximum number of deletions for any given run (-d). In the above set of files, the two backups from 2011-07-07 are only 6h apart, so they make good purging candidates...


Solution:30

faubackup

Another small tool which lets you do incremental backups with hardlinks is faubackup.

From the homepage:

This program uses a filesystem on a hard drive for incremental and full backups. All backups can easily be accessed by standard filesystem tools (ls, find, grep, cp, ...).

Later backups to the same filesystem will automatically be incremental, as unchanged files are only hard-linked with the existing version of the file.

It allows you to create different levels of backups. From the man page:

FauBackup may be configured to keep certain backups for a long time and remove others. Have a look at traditional backup systems. You have tapes for daily, weekly, monthly and yearly backups, and store them according to your local backup policy. FauBackup can do this for you on harddisks, too. That is, it can keep some yearly, weekly, etc. backups for you and automatically remove other obsoleted backups.

Four different backup types are recognized: daily, weekly, monthly and yearly. The first existing backup in such an interval will be considered as belonging to the corresponding type. Thus, the first backup in a month (e.g. 2000-12-01@06:30:00) will be a monthly backup; the first backup in 2001 will be of all four types, as January 1st, 2001 is a Monday.

The number of backups kept for each type is configurable (see faubackup.conf(5)). If a backup doesn't belong to such a type (e.g. the second backup in a day), or is too old for that type, it will be removed on faubackup --

