
What is your backup tool of choice?

I don’t mean system files, but your personal and work files. I have been using Mint for a few years; I use Timeshift for system backups, but I have archived my personal files by hand. This got me curious to see what other people use. When you daily-drive Linux, what are your preferred tools for keeping backups? I have thousands of pictures, family movies, documents, personal PDFs, etc. that I don’t want to lose. Some are cloud-backed, but rather haphazardly. I would like a more systematic approach, with a tool that is user-friendly and easy to set up and schedule.

pound_heap ,

At this moment I use too many tools.

For user data on my PC and on the home server I mostly use Duplicacy. It is fast and efficient. All data is backed up locally to a NAS box over SFTP, and a subset of that data is backed up to S3 cloud storage.
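
For illustration, a rough sketch of what a setup along those lines might look like with the Duplicacy CLI. Hostnames, snapshot IDs, storage names and bucket paths are made up, and the exact storage-URL syntax is worth double-checking against the Duplicacy docs:

```
cd ~/data                                                        # repository root to back up
duplicacy init -e my-pc sftp://backup@nas.local//srv/backups/duplicacy   # encrypted default storage on the NAS
duplicacy add -e s3 my-pc s3://us-east-1@amazon.com/my-bucket/duplicacy  # second storage in S3 (placeholder URL)
duplicacy backup -stats                                          # back up to the default (NAS) storage
duplicacy copy -from default -to s3                              # replicate snapshots to the S3 storage
```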

I have a Mac; this one uses Time Machine, storing data on the NAS, which is then synced to S3 cloud storage once a day.

And on top of that, VMs and containers from the home server are backed up to the NAS with Proxmox’s built-in backup tool. These mostly exclude user data.

test1 ,
@test1@calendario-lunar.com avatar

A hand-made combination of tar, rsync and rclone, to a set of portable drives and remote systems.
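
As a rough idea of what such a hand-rolled pipeline can look like (paths, the drive mount point and the rclone remote name are placeholders):

```
tar -czf /tmp/documents-$(date +%F).tar.gz ~/Documents         # archive a directory with a dated name
rsync -a --delete ~/Pictures/ /media/backup-drive/Pictures/    # mirror to a portable drive
rclone copy /tmp/documents-$(date +%F).tar.gz remote:backups/  # push the archive to a remote system
```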

After suffering computer breakages since the ’80s, I want the easiest possible way of restoring backups.

TheImpressiveX ,
@TheImpressiveX@lemmy.ml avatar

GNOME Disk Utility for backing up the whole hard drive. Otherwise, I use BackInTime.

Fryboyter ,

I have been using Borg for years. So far, the tool has not let me down. I store the backups on external hard drives that are only used for backups. In addition, I save really important data at rsync.net and in a Hetzner storage box. That is not a problem, because Borg encrypts locally and automatically, and for decryption in my case you need both a password and a key file.
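
For anyone curious, a minimal sketch of that kind of Borg setup (repo path, source directories and retention values are placeholders; with keyfile encryption Borg asks for a passphrase and keeps the key file under ~/.config/borg/keys):

```
borg init --encryption=keyfile /mnt/backup-drive/borg-repo       # create an encrypted repository
borg create --stats --compression zstd \
    /mnt/backup-drive/borg-repo::'{hostname}-{now}' ~/Documents ~/Pictures
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/backup-drive/borg-repo
```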

Generally speaking, you should always test whether you can restore data from a backup, no matter which tool you use. Only then do you have a real backup. And an up-to-date backup should always additionally be stored off-site (cloud, at a friend’s or relative’s house, etc.), because if the house burns down, the external hard drive with the backups sitting next to the computer is not much use.
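
One way to spot-check a restore, sticking with Borg as an example (repo path, archive name and the stored path are placeholders and depend on how the archive was created):

```
borg check /mnt/backup-drive/borg-repo                       # verify repository consistency
mkdir -p /tmp/restore-test && cd /tmp/restore-test
borg extract /mnt/backup-drive/borg-repo::myhost-2024-05-01 home/user/Documents
diff -r home/user/Documents ~/Documents                      # compare the restored files with the live copy
```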

By the way, I would advise against using just rsync because, as the name suggests, rsync only synchronizes, so you don’t have multiple versions of a file. Having versions can be useful if you only notice later that a file became corrupted at some point.

philipstorry ,

My local backups are handled by rdiff-backup to a mirror set of disks. That means my data is versioned but easily accessible for immediate restore, and now on three disks (my SSD, and two rotating rust drives). It also makes restores as simple as copying a file if I want the latest version, or an easy command if I want an older version. And testing backups is as easy as a diff command to compare the backup version with the live version.
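
Roughly what that looks like on the command line (paths are placeholders, and the exact invocation may vary with the rdiff-backup version):

```
rdiff-backup ~/data /mnt/mirror/data              # versioned, incremental backup
rdiff-backup --list-increments /mnt/mirror/data   # show the available versions
rdiff-backup -r 7D /mnt/mirror/data/report.odt /tmp/report-7-days-ago.odt   # restore as of 7 days ago
diff -r --exclude=rdiff-backup-data ~/data /mnt/mirror/data                 # latest versions are plain files
```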

Having your files just be files in your backup solution is very handy. At work I don’t mind having to use an application like Veeam, because I’m being paid to do that. At home I want to see my backups quickly and easily, because I’d rather be working on my files than wrestling with backup software…

Remote backups are handled by SpiderOak, which has been fine for me for almost a decade. I also use it to synchronise my desktop and laptop computers. On my desktop, SpiderOak also backs up some files in an archive area on the rotating rust mirror set - stuff that’s large and that I don’t access often, so it doesn’t need to be on my laptop, but I do want it backed up.

I also have a USB thumbdrive that’s encrypted and used when I’m travelling to back up changes on my laptop via a simple rsync copy - just in case I have limited internet access and SpiderOak can’t do its thing…
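
For what it’s worth, a sketch of that travel routine, assuming a LUKS-encrypted stick (device name, mapper name and paths are placeholders):

```
sudo cryptsetup open /dev/sdb1 travel-backup          # unlock the encrypted partition
sudo mount /dev/mapper/travel-backup /mnt/travel-backup
rsync -a --delete ~/work/ /mnt/travel-backup/work/    # copy the changed files
sudo umount /mnt/travel-backup
sudo cryptsetup close travel-backup
```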

I did also have a NAS in the mix once, but I realised that it was a waste of energy - both mine and electricity. In normal circumstances my data is in five places (desktop SSD, laptop SSD, the two disks of the desktop mirror set, and SpiderOak’s storage), and in the very worst case it’s in two (laptop SSD, USB thumbdrive). Rdiff-backup to the NAS was simply overkill once I’d added the local mirror set to my desktop, so I retired it.

I’d added the local mirror set because I was working with large files - data sets and VM images - and backups over the network to the NAS were taking an age. A local set of cheap disks in my desktop tower was faster and yet still fairly cheap.

Here’s my advice for your consideration:

  • Simple is better than complicated.
  • How you restore is more important than how you back up; perform test restores regularly.
  • Performance matters; backups that take ages are backups you won’t run.
  • Look to meet the 3-2-1 criteria; 3 copies, on 2 different storage systems, with at least 1 in a different geographic location. Cloud storage helps with this.

Good luck with your backup strategy!

Zucca ,

⬆️ for rdiff-backup since it keeps the last backup easily readable.

I used to have (and I think I’ll set it up again) a snapshot-capable filesystem to which I rsynced my stuff, then once a day took a snapshot of the backups. It has the advantage that all the backups stay easily readable, as long as your backup filesystem is intact and your kernel can mount it.
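
A minimal sketch of that approach, assuming the backup disk is Btrfs and the target directory is a subvolume (paths are placeholders):

```
rsync -a --delete ~/data/ /mnt/backup-disk/current/                # sync into the subvolume
sudo btrfs subvolume snapshot -r /mnt/backup-disk/current \
    /mnt/backup-disk/snapshots/$(date +%F)                         # read-only daily snapshot
```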

Lemmyin ,

I’ve recently started using proxmox-backup-client. Works well. Backups go to my backup server along with my VM image backups. It works nicely with full deduplication and such - quite good savings if you are backing up multiple machines.

I then rsync this up to the cloud once a day.
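
For reference, a minimal sketch of a host backup with proxmox-backup-client (the repository string user@realm@host:datastore and the archive name are placeholders):

```
proxmox-backup-client backup home.pxar:/home \
    --repository 'backup@pbs@backup-server.local:datastore1'   # deduplicated backup of /home to the PBS datastore
```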

Rooty ,

External hard drive, drag and drop.

tool ,
@tool@r.rosettast0ned.com avatar

At work/for business, you can’t beat Veeam. It’s the gold standard and there is literally nothing better.

At home, Duplicity. Set it up once and then just let it go, and it supports a million different backup targets you can ship your backups off to, including the local filesystem. Has auto-aging/removal rules, easy restores, incrementals, etc. Encrypts by default too.
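
A minimal sketch of that kind of Duplicity routine (target URL and retention period are placeholders; by default it GPG-encrypts and switches to incrementals after the first full run):

```
duplicity ~/data sftp://backup@remote-host/backups/data               # full the first time, incremental afterwards
duplicity remove-older-than 6M --force sftp://backup@remote-host/backups/data   # age out old backup sets
duplicity restore sftp://backup@remote-host/backups/data /tmp/restored-data     # restore the latest version
```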

Choctaw ,
@Choctaw@lemmy.radio avatar

Syncthing. I rotate different Linux operating systems daily, but all my data is synced with Syncthing between all my machines, smartphone and a server. Kind of like Chrome OS in that I can wipe or reinstall the OS and be instantly back up and running, or even install a new version of Linux to try out.

whiny9130 ,

If only restic deduplicated… But other than that it does okay.

Starfish ,

grsync, it’s easy to use

ISOmorph ,

I almost never see FreeFileSync mentioned in these threads. It’s the only GUI-based app I know of that also gives you the option not to propagate file deletions, for example. It can also be automated with crontab. Backups are not fragmented or repackaged, so you can browse them just fine. Encryption can be done with VeraCrypt.
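
For example, automation via crontab could look something like this, assuming a saved .ffs_batch job (the executable path varies by install method, the batch file path is a placeholder, and a graphical session may need to be available when the job runs):

```
# run the saved FreeFileSync batch job every night at 02:00
0 2 * * * /opt/FreeFileSync/FreeFileSync /home/user/backups/daily.ffs_batch
```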

joel_feila ,
@joel_feila@lemmy.world avatar

Timeshift and a USB drive

garam ,
@garam@lemmy.my.id avatar

I use this too, and then every two weeks rsync to my cold storage. For some data I also use rclone bisync to back it up to the cloud, in case I really need it while I’m on the road.
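
Roughly what that routine can look like (mount points and the rclone remote name are placeholders; bisync needs a one-time --resync on its first run):

```
rsync -a --delete ~/data/ /mnt/cold-storage/data/    # every two weeks, to cold storage
rclone bisync ~/Documents remote:Documents           # two-way sync of selected data with the cloud
```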

DJFart ,
@DJFart@lemmy.world avatar

Timeshift with rsync, and on occasion I Clonezilla the drive and save the image to my NAS.

omeara4pheonix ,

I use Timeshift for local backups, then Duplicati for backing up to Amazon Glacier monthly.
