
Tools and ideas for backup of many files

Hello everyone,

In a day or two, I am getting a motherboard with an integrated N100 CPU as a replacement for the Raspberry Pi 4 (2 GB model). I want to run Jellyfin, the *arr stack, and Immich on it. However, I have a lot of photos (for Immich) and movies (for Jellyfin), about 400 GB in total, that I want to back up, just in case something happens. I have two 1 TB drives: one will hold the original files, and the second will be my boot drive and hold the backup files.

How can I do that? Just copy the files? Do I need to compress them first? What tools do I need to use, and how would you do it?

Thanks in advance.

EDIT: I forgot to mention that I would prefer the backups to be local.

geography082 ,

Rclone crypt, cron, whatever cheap cloud or even free service.
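A minimal sketch of that combination, assuming a crypt remote has already been created with `rclone config`; the remote name (`encrypted:`) and paths are placeholders:

```shell
# Sync the media directory to an encrypted remote.
# "encrypted:" is a hypothetical crypt remote wrapping a cloud
# (or even local) remote; /mnt/media is a placeholder source path.
rclone sync /mnt/media encrypted:backups/media --progress

# Crontab entry (crontab -e) to run the same sync nightly at 03:00:
#   0 3 * * * rclone sync /mnt/media encrypted:backups/media --log-file /var/log/rclone-backup.log
```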

Decronym Bot , (edited )

Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

Fewer Letters | More Letters
LXC | Linux Containers
RAID | Redundant Array of Independent Disks for mass storage
SSD | Solid State Drive mass storage

3 acronyms in this thread; the most compressed thread commented on today has 12 acronyms.

[Thread for this sub, first seen 15th Jun 2024, 12:25] [FAQ] [Full list] [Contact] [Source code]

r0ertel ,

Restic and Borg seem to be the current favorites, but I really like the power and flexibility of Duplicity. I like that I can push to a wide variety of backends (I'm using the rsync backend), it can do symmetric or asymmetric encryption, and I like that it can do incremental backups with timed full backups. I don't like that it keeps a local cache of index files.
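The incremental-with-timed-fulls pattern over the rsync backend might look something like this; the host, user, and paths are placeholders:

```shell
# Duplicity uses GnuPG; with no key specified it encrypts
# symmetrically with this passphrase.
export PASSPHRASE='change-me'

# Incremental backup, forcing a fresh full backup once a month:
duplicity --full-if-older-than 1M /mnt/media \
    rsync://backupuser@backuphost//srv/backups/media

# Restore a single directory from the latest backup into a new path:
duplicity --file-to-restore photos \
    rsync://backupuser@backuphost//srv/backups/media /tmp/restored-photos
```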

I back up to a Pi Zero with a big local disk and rsync the whole disk to another Pi at a relative's house over Tailscale. I've never needed the remote, but it's there.

I’ve had to do a single directory restore once and it was pretty easy. I was able to restore to a new directory and move only the files that I clobbered.

Andrzej ,

If you have the RAM for it, I would recommend going the Proxmox route. I made the switch this year, and now running daily container image backups is a doddle.

VitabytesDev OP ,

Can you explain the setup more? What VMs would I need to run?

Andrzej ,

Sure. The hardware is a cheap little Beelink with an N100 and 16 GB of RAM. Proxmox can do VMs, but is primarily focused on LXCs, which are Linux containers. They share the kernel with the host, so they're very lightweight — you can spin up basically as many (say) Debian systems as you want. So I have Jellyfin on one container, Sonarr/Radarr on another (though you could put them on separate containers if you wanted), Transmission has a container, SABnzbd has a co- … you get the idea lol.

The cool thing is that it’s easy to mount drives/directories from the host, and have your containers share them that way.

Wrt backups, Proxmox has some built-in functionality you can run from the web UI. So I back up images of the LXCs to the external hard drive daily, then have a Borg container that backs up the backup directory to cloud storage.
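The same daily job the web UI schedules can be sketched with Proxmox's `vzdump` CLI; the container ID (101) and dump path are placeholders:

```shell
# Snapshot-mode backup of LXC 101, compressed with zstd, to an
# external drive. vzdump is Proxmox's built-in backup tool.
vzdump 101 --mode snapshot --compress zstd --dumpdir /mnt/external/pve-dumps

# Restoring into a new container (ID 120) from the resulting archive:
#   pct restore 120 /mnt/external/pve-dumps/vzdump-lxc-101-….tar.zst
```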

It’s also very convenient to make a quick backup before making any changes to a container — you can restore to a previous image with the click of a button.

mlaga97 ,

Restic and Borg are both sorta considered 'standard' for doing incremental backups beyond filesystem snapshotting.

I use restic and it automatically handles stuff like snapshotting, compression, deduplication, and encryption for you.
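A minimal restic workflow along those lines; the repository path and source directory are placeholders:

```shell
# restic encrypts every repository; the password can come from the env.
export RESTIC_PASSWORD='change-me'

restic init --repo /mnt/backup/restic-repo                # one-time repository setup
restic backup --repo /mnt/backup/restic-repo /mnt/media   # deduplicated, compressed, encrypted snapshot
restic snapshots --repo /mnt/backup/restic-repo           # list available snapshots

# Thin out old snapshots and reclaim space:
restic forget --repo /mnt/backup/restic-repo --keep-daily 7 --keep-weekly 4 --prune
```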

kylian0087 ,

For automated backups, definitely. For one-time use I often just use rsync. It is the simplest to use quickly.
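A one-time rsync copy can be as simple as the following; the source and destination paths are placeholders (note the trailing slashes, which copy the directory's contents rather than the directory itself):

```shell
# -a preserves permissions/timestamps, -H hard links, -A/-X ACLs and xattrs.
rsync -aHAX --info=progress2 /mnt/media/ /mnt/backup/media/

# Optional verification pass: compare by checksum without copying;
# an up-to-date backup should itemize no changes.
rsync -aHAXc --dry-run --itemize-changes /mnt/media/ /mnt/backup/media/
```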

mlaga97 ,

I do find rclone to be a bit more comprehensible for that purpose, rsync always makes me feel like I’m in xkcd.com/1168/

lemmyvore ,

If you literally mean one time then rsync is fine-ish… if you combine it with a checksum tool so you can verify it copied everything properly.

If you need to backup regularly then you need something that can do deduplication, error checking, compression, probably encryption too. Rsync won’t cut it, unless you mean to cover each of those points using a different tool. But there are tools like borg that can do all of them.
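Borg covers each of those points in one tool; a sketch, with the repository path as a placeholder:

```shell
export BORG_PASSPHRASE='change-me'

borg init --encryption=repokey /mnt/backup/borg-repo      # one-time, sets up encryption

# Create a compressed archive; deduplication against earlier
# archives in the repo happens automatically.
borg create --compression zstd --stats \
    /mnt/backup/borg-repo::media-{now} /mnt/media

borg check /mnt/backup/borg-repo                          # error checking / consistency
borg prune --keep-daily 7 --keep-weekly 4 /mnt/backup/borg-repo
```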

colifloro ,

I’d try to stick to the official recommendations to back up Immich: immich.app/docs/…/backup-and-restore/

CaptDust ,

I use rsync with a systemd timer. When I first installed the backup drive it took a while to build the file system, but now every Monday it runs, finds the difference between source and target drive, and pulls just the changes down for backup. It’s pretty quick, doesn’t do any compression or anything like that.
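That setup can be sketched as a systemd service/timer pair; the unit names, paths, and schedule here are hypothetical:

```
# /etc/systemd/system/media-backup.service
[Unit]
Description=rsync media to backup drive

[Service]
Type=oneshot
ExecStart=/usr/bin/rsync -a --delete /mnt/media/ /mnt/backup/media/

# /etc/systemd/system/media-backup.timer
[Unit]
Description=Run media-backup every Monday

[Timer]
OnCalendar=Mon *-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now media-backup.timer`.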

themachine ,

I prefer restic for my backups. There’s nothing inherently wrong with just making a copy if that is sufficient for you, though. Restic creates small point-in-time snapshots, as compared to just a file copy, so in the event that you made a mistake and accidentally deleted something from the “live” copy and managed to propagate that to your backup, it is a non-issue: you can simply restore from a previous snapshot.

These snapshots can also be compressed and deduplicated making them extremely space efficient.

scrubbles ,

rclone is my go-to for backups I run regularly. It is very nice and scriptable.

rsync might be what you’re looking for, a bit more verbose and… determined? for a large job like that.
