
YonatanAvhar ,

At the moment I’m doing primarily hopes and prayers

Llamajockey ,

I had to upgrade to Hopes&Prayers+ after I ran out of hope and my prayers kept getting returned to sender.

webjukebox ,
@webjukebox@mujico.org

I was in the same boat, until my prayers went unanswered and my hopes died.

I lost some important data from my phone a few days ago. My plan was to back it up that night, but chaos struck that same morning.

PlutoniumAcid ,
@PlutoniumAcid@lemmy.world

Ah yes, the ostrich algorithm.

BitSound ,

I’ve been using Borg to back my stuff up. It gets backed up to rsync.net, which has good support for Borg:

www.rsync.net/products/borg.html

If you’re good enough at computers, you can even set up a special borg account with them that’s cheaper and has no tech support.
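
For anyone curious what that looks like in practice, here is a rough, untested sketch of a Borg-over-SSH workflow; the username, repo path, and source directories are placeholders, not rsync.net specifics:

# Create an encrypted repository on the remote over SSH (run once)
borg init --encryption=repokey ssh://[email protected]/./backups/main

# Nightly archive, named after the host and date
borg create --stats --compression zstd \
    ssh://[email protected]/./backups/main::'{hostname}-{now:%Y-%m-%d}' \
    /home /etc /var/lib/docker/volumes

# Thin out old archives so the repo doesn't grow forever
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    ssh://[email protected]/./backups/main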

ErwinLottemann ,

I was an rsync.net user for many years and recently switched to BorgBase because of how easy it is to manage multiple backup targets.

tables ,

I'm in the same boat right now, borg and borgbase.

BitSound ,

That looks cool, and they’ve got some other nifty looking things like www.pikapods.com. Any idea how stable the company is? I partially like rsync.net because it’s pretty unlikely to just disappear someday.

neardeaf ,

Seconding this. On my unRAID host, I run a Docker container called “Vorta” that uses Borg as its backend mechanism to back up to my Synology NAS over NFS. Then on my Syno, I run two backup jobs using Hyper Backup: one goes to my cousin’s NAS, connected via a site-to-site OpenVPN connection on our edge devices (Ubiquiti UniFi Security Gateway Pro <-> UDM Pro), and the other goes to Backblaze B2 Cloud Storage.

OP, let me know if you need any assistance setting something like this up. Gotta share the knowledge over here on Lemmy that we’re still used to searching evil Reddit for.

PlutoniumAcid ,
@PlutoniumAcid@lemmy.world

My brother and I both run a USG. Would love to learn from you how to set up site-to-site VPN!

neardeaf ,

Niiiice, quick question, are both of y’all running the latest UniFi Controller version & using the new WebUI view layout?

PlutoniumAcid ,
@PlutoniumAcid@lemmy.world

His gear is v7 (UniFi and also Synology DSM) and I am still on v6 because I didn’t have a good reason to upgrade. If it works, don’t fix it, you know? Feature-wise they’re the same anyway, just a different UI. But sure, give me a good reason to upgrade, and I will :)

stridemat ,

Love Borg and the associated docker containers and the like. Really is set and forget!

0110010001100010 ,

Local backup to my Synology NAS every night, which is then replicated to another NAS at my folks’ house through a secure VPN tunnel. Pretty simple and easy to deploy.

Greidlbeere OP ,

Sounds good. What do you use for replication?

neardeaf ,

Most likely Hyper Backup & Hyper Backup Vault, two applications built into Synology’s DSM software that runs on their NAS devices.

0110010001100010 , (edited )

Just simple old rsync. The NAS at the far end is an old QNAP I had lying around.
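
A rough sketch of what that kind of rsync replication can look like, in case it helps anyone; hostnames and paths here are made up:

# Mirror the local backup share to the remote NAS over SSH,
# deleting files on the far end that no longer exist locally
rsync -aH --delete --partial --info=progress2 \
    /volume1/backups/ backupuser@remote-nas:/share/backups/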

kaotic , (edited )

I’ve had excellent luck with Kopia, backing up to Backblaze B2.

At work, I do the same to a local directory in my company-provided OneDrive account to keep company data on company resources.
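
For anyone who wants to try the same combo, a minimal sketch of the Kopia + B2 setup; the bucket name and paths are placeholders, and it's worth checking Kopia's docs for the exact flags on your version:

# Create (or later connect to) a repository backed by a B2 bucket
kopia repository create b2 \
    --bucket=my-backup-bucket \
    --key-id="$B2_KEY_ID" \
    --key="$B2_APP_KEY"

# Snapshot a directory; Kopia deduplicates and encrypts client-side
kopia snapshot create ~/Documents

# Keep a bounded history for that path
kopia policy set ~/Documents --keep-daily=7 --keep-weekly=4 --keep-monthly=6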

ComptitiveSubset , (edited )

For app data, Borg as backup/restore software. Backup data is then stored on Hetzner as an offsite backup - super easy and cheap to set up. Also add healthchecks.io to get notified if a backup fails.

Edit: Backup docker compose files and other scripts (without API keys!!!) with git to GitHub.
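
For reference, the healthchecks.io part can be as small as wrapping the Borg run in a script like this untested sketch; the check UUID, repository URL, and paths are placeholders:

#!/usr/bin/env bash
set -uo pipefail
HC_URL="https://hc-ping.com/your-check-uuid"

# Tell healthchecks.io the job started
curl -fsS -m 10 --retry 5 "$HC_URL/start" > /dev/null

# Run the actual backup to the offsite box over SSH
borg create --stats \
    ssh://[email protected]/./backups::'{hostname}-{now}' \
    /srv/appdata
exit_code=$?

# Report success (0) or failure (non-zero); a missed ping also triggers an alert
curl -fsS -m 10 --retry 5 "$HC_URL/$exit_code" > /dev/null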

lynny ,
@lynny@lemmy.world

Someone on lemmy here suggested Restic, a backup solution written in Go.

I back up to an internal 4TB HDD every 30 minutes. My most important files are also stored in encrypted file storage in the cloud.

Restic is good stuff.
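
If anyone wants a starting point for that kind of schedule, a rough sketch; repo path, password file, and folders are placeholders:

# One-time: create the repository on the internal drive
restic init --repo /mnt/backup/restic

# Every 30 minutes via cron:  */30 * * * *  /usr/local/bin/restic-backup.sh
restic -r /mnt/backup/restic --password-file /root/.restic-pass \
    backup ~/Documents ~/Pictures

# Occasionally trim old snapshots
restic -r /mnt/backup/restic --password-file /root/.restic-pass \
    forget --keep-hourly 48 --keep-daily 14 --prune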

mr_nEJC ,
@mr_nEJC@lemmy.world

Thanks for this tip. Seems interesting - watched this tutorial/presentation video. Will try it out asap 😅

francisco_1844 ,

Restic for backup - it can send backups to S3 and SFTP, amongst other target options.

There are S3-compatible (object storage) services, such as Backblaze’s B2, which are very affordable for backups.
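
As a rough illustration of those target options (bucket names, endpoints and hosts are placeholders; restic reads the credentials from environment variables):

# Backblaze B2 via restic's S3-compatible endpoint
export AWS_ACCESS_KEY_ID="your-b2-key-id"
export AWS_SECRET_ACCESS_KEY="your-b2-application-key"
restic -r s3:s3.us-west-002.backblazeb2.com/my-bucket init
restic -r s3:s3.us-west-002.backblazeb2.com/my-bucket backup /home

# Or plain SFTP to any box you have SSH access to
restic -r sftp:backup@backup-host:/srv/restic backup /home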

DataDreadnought ,
@DataDreadnought@lemmy.one

I doubt you’re using NixOS, so this config might seem useless, but at its core it is a simple systemd timer service and bash scripting.

To convert this to another OS you would use cron to call the script at the time you want. Copy the part between the script = '' ... ''; quotes and then change out variables like the location where docker-compose is stored, since it’s different on NixOS.

Let me explain the script. We start out by defining the backupDate variable; this will be the name of the zip file. As of now that variable would be 2023-07-12. We then go to each folder with a docker-compose.yml file and take it down. You could also replace down with stop if you don’t plan on updating each night like I do. I use rclone to connect to Dropbox, but rclone supports many providers, so check it out and see if it has the one you need. Lastly I use rclone to connect to my Dropbox and delete anything older than 7 days in the backup folder. If you end up going my route and get stuck, let me know and I can help out. Good luck.

systemd = {
  timers.docker-backup = {
    wantedBy = [ "timers.target" ];
    partOf = [ "docker-backup.service" ];
    timerConfig.OnCalendar = "*-*-* 3:30:00";
  };
  services.docker-backup = {
    serviceConfig.Type = "oneshot";
    serviceConfig.User = "root";
    script = ''
      backupDate=$(date +'%F')
      cd /docker/apps/rss
      ${pkgs.docker-compose}/bin/docker-compose down

      cd /docker/apps/paaster
      ${pkgs.docker-compose}/bin/docker-compose down

      cd /docker/no-backup-apps/nextcloud
      ${pkgs.docker-compose}/bin/docker-compose down

      cd /docker/apps/nginx-proxy-manager
      ${pkgs.docker-compose}/bin/docker-compose down

      cd /docker/backups/
      ${pkgs.zip}/bin/zip -r server-backup-$backupDate.zip /docker/apps

      cd /docker/apps/nginx-proxy-manager
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      cd /docker/apps/paaster
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      cd /docker/apps/rss
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      cd /docker/no-backup-apps/nextcloud
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      cd /docker/backups/
      ${pkgs.rclone}/bin/rclone copy server-backup-$backupDate.zip Dropbox:Server-Backup/
      rm server-backup-$backupDate.zip
      ${pkgs.rclone}/bin/rclone delete --min-age 7d Dropbox:Server-Backup/
    '';
  };
};
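
For non-NixOS folks, the same idea as a plain bash script plus a crontab entry would look roughly like this sketch; it assumes docker-compose, zip and rclone are on $PATH, and only two of the compose projects are shown:

#!/usr/bin/env bash
# /usr/local/bin/docker-backup.sh
backupDate=$(date +'%F')

# Stop the stacks so the zip gets consistent data
cd /docker/apps/rss && docker-compose down
cd /docker/apps/paaster && docker-compose down
# ...repeat for the other compose projects as above

# Zip everything up, named by date
cd /docker/backups/
zip -r server-backup-$backupDate.zip /docker/apps

# Pull new images and bring the stacks back up
cd /docker/apps/rss && docker-compose pull && docker-compose up -d
cd /docker/apps/paaster && docker-compose pull && docker-compose up -d

# Ship the archive to Dropbox and prune anything older than 7 days
cd /docker/backups/
rclone copy server-backup-$backupDate.zip Dropbox:Server-Backup/
rm server-backup-$backupDate.zip
rclone delete --min-age 7d Dropbox:Server-Backup/

# crontab entry (root), matching the 03:30 timer above:
# 30 3 * * * /usr/local/bin/docker-backup.sh
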
thejevans ,
@thejevans@lemmy.ml

Thanks! I just started setting up NixOS on my laptop and I’m planning to use it for servers next. Saving this for later!

bier ,

My 20 TB of storage is currently hosted by Hetzner on an SMB share with an accompanying server. The storage is accessible via NFS/SMB. I have a Windows 10 VPS running Backblaze Personal Backup for $7/month with unlimited storage, mounting the SMB share as a “physical drive” using Dokan, because Backblaze Personal doesn’t allow backing up network shares. If your storage is local you can use the Windows backup agent in a Docker container.

gobbling871 ,

I use restic (and Déjà Dup just to be safe), backing up to multiple cloud storage points. Among these are borgbase.com, Backblaze B2 and Microsoft’s cloud.

lnxtx ,
@lnxtx@feddit.nl

VM instances run on Proxmox VE with native integration with the Proxmox Backup Server (PBS).

For non-VM hosts, the little PBS client agent.
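
For what it's worth, that agent is the proxmox-backup-client CLI; a minimal sketch, with the hostname, datastore and user as placeholders:

# Credentials for the PBS user (or an API token), read by the client
export PBS_PASSWORD='your-password-or-token'

# Back up the root filesystem as a pxar archive to the PBS datastore
proxmox-backup-client backup root.pxar:/ \
    --repository 'backup@pbs@pbs.example.lan:datastore1'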

stown ,
@stown@sedd.it

I host everything on Proxmox VMs, so I just take daily snapshots to my NAS.

angrox ,

At home I have a Synology NAS for backup of the local desktops. Offsite backups are done with restic to Backblaze B2 and to another location.

SeeJayEmm ,
@SeeJayEmm@lemmy.procrastinati.org

Desktop: I was using Duplicati for years but I’ve recently switched to Restic directly to B2. I’m using this powershell script to run it.

Server: I’m also using restic to B2.

I also have a QNAP NAS. I’m synchronizing my replaceable data to a crappy old Seagate NAS locally. For the irreplaceable data, I’m using the QNAP backup client to B2.

giant_smeeg ,

Rsync to Backblaze B2. I only back up the stuff I can’t download again (photos/docs etc). It’s the cheapest option and does version control automatically for me on a 12-hourly basis.

I then also plug in my external SSD once a week and it auto-dumps changes to it via rclone.

Google Drive/Takeout exports etc. are all done periodically and put into a folder that will roll into the above backups.
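
For anyone setting up something similar, the rclone side of that can be as simple as the following sketch; remote names, bucket and paths are placeholders:

# Push the irreplaceable stuff to a B2 bucket (e.g. from cron every 12 hours)
rclone sync /home/me/photos b2:my-backup-bucket/photos --fast-list

# Weekly dump of changes to the external SSD when it's plugged in
rclone sync /home/me/photos /mnt/external-ssd/photos --progress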
