I just got my home server up and running and was wondering what you guys recommend for backups. I figure it will probably be worth having backups on external cloud servers. Are there any good services y'all use for that?

    • pacjo@lemmy.world · 1 year ago

      Not so much about testing, but one time when I really needed to get to my backups, I had lost the password to the repository (I’m using restic). Luckily a copy of it was stored in Bitwarden, but until I remembered that, it was perhaps one of the worst moments.

      Needless to say, please test your backups and store secrets in more than one place.
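
      A minimal sketch of how that "can I still open my repo?" test can be automated, assuming restic and the Bitwarden CLI ("bw") are installed and logged in; the repository path and the Bitwarden item name are placeholders:

      # Minimal sketch: verify a restic repository still opens with the stored password.
      import json
      import os
      import subprocess

      REPO = "/mnt/backup/restic-repo"          # hypothetical repository path
      BITWARDEN_ITEM = "restic-repo-password"   # hypothetical Bitwarden item name

      def repo_password() -> str:
          # "bw get password <item>" prints the password stored in Bitwarden.
          return subprocess.run(
              ["bw", "get", "password", BITWARDEN_ITEM],
              check=True, capture_output=True, text=True,
          ).stdout.strip()

      def check_repo() -> int:
          env = dict(os.environ, RESTIC_REPOSITORY=REPO, RESTIC_PASSWORD=repo_password())
          # "restic snapshots --json" fails if the password (or repo) is wrong.
          out = subprocess.run(
              ["restic", "snapshots", "--json"],
              check=True, capture_output=True, text=True, env=env,
          ).stdout
          return len(json.loads(out))

      if __name__ == "__main__":
          print(f"repository opened OK, {check_repo()} snapshot(s) found")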

    • witten@lemmy.world · 1 year ago

      Ehhh I would say then you have probabilistic backups. There’s some percent chance they’re okay, and some percent chance they’re useless. (And maybe some percent chance they’re in between those extremes.) With the odds probably not in your favor. 😄

  • Qu4ndo@discuss.tchncs.de · 1 year ago

    Borgbase, with borgmatic (Borg) as the software. As far as I know, the whole Borgbase service is run by a homelab guy (with our needs in mind).

    Also 3-2-1 rule!

    • Arrayrepairman@lemmy.world · 1 year ago

      That is great for hardware failures, but what about disasters? I would hate to lose my house to a fire and all the data (including things not replaceable, like family photos) I have on my server at the same time because my primary and backup were both destroyed.

    • raiun@lemmy.world · 1 year ago

      While I agree with you, hard drives do have a shelf life. How many years seems to be up for debate, but it does exist. If you don’t have multiple drives of different ages, you may be in a world of hurt one day.

      • Chadus_Maximus@lemm.ee · 1 year ago (edited)

        Why? If you check the drive once a month, and it fails once per 10 years on average, the time until both the backup drive and the main drive fail simultaneously is on average 2340 years. Of course they are much more likely to fail if they’re old, but the odds are very small.
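
        A rough back-of-the-envelope sketch of that kind of estimate, assuming independent failures, a 10-year average lifetime per drive, and a one-month window before a dead drive is noticed; the exact figure depends on the model, but it lands in the same "centuries, not years" ballpark:

        # Rough estimate: how long until the main drive and the backup drive
        # both fail within the same one-month check window? (Independence assumed.)
        CHECK_WINDOW_YEARS = 1 / 12      # the drives get checked once a month
        MEAN_LIFETIME_YEARS = 10         # each drive fails about once per 10 years

        # Chance that one given drive fails inside a single check window.
        p_fail_in_window = CHECK_WINDOW_YEARS / MEAN_LIFETIME_YEARS   # ~0.83%

        # Chance that both drives fail inside the same window.
        p_both = p_fail_in_window ** 2

        # Expected number of windows (and years) until that happens.
        years = (1 / p_both) * CHECK_WINDOW_YEARS
        print(f"~{years:,.0f} years until a simultaneous failure")    # ~1,200 years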

  • johntash@eviltoast.org · 1 year ago

    rsync.net is great if you need something simple and cheap. Backblaze B2 is also decent, but it does have the typical download and API usage costs.

  • kalleboo@lemmy.world · 1 year ago

    Backblaze B2 for automatic syncing of all the little files

    Glacier for long term archiving of old big files that never change
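
    One way to do the Glacier side is via the S3 Glacier storage classes with boto3; a minimal sketch, assuming an existing bucket and already-configured AWS credentials (bucket and file names are placeholders):

    # Minimal sketch: push a big, never-changing file straight into a Glacier
    # storage class. Bucket name and file path are placeholders.
    import boto3

    s3 = boto3.client("s3")  # credentials come from the usual AWS config/env
    s3.upload_file(
        Filename="/archive/photos-2019.tar",         # hypothetical local archive
        Bucket="my-archive-bucket",                  # hypothetical bucket
        Key="archives/photos-2019.tar",
        ExtraArgs={"StorageClass": "DEEP_ARCHIVE"},  # or "GLACIER" / "GLACIER_IR"
    )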

  • hollyberries@lemmy.blahaj.zone · 1 year ago

    I use Duplicati connected to Storj with data volumes that incrementally get backed up once per month. My files don’t change very often, so monthly is a good balance. Not counting my Jellyfin library, those backups are around 1 TB. With the Jellyfin library, almost 15 TB.

    Earlier this year, I recovered from a 100% data loss scenario, as I didn’t (and still don’t) have space for physical backups. I have a 25 TB allowance, so my actual cost was €0. If I had to pay, it would have been under €1.

    • gamer@lemm.ee · 1 year ago

      That looks like a cool setup, but I would never trust important data to some crypto shit (Storj) no matter what kind of track record they have.

      • hollyberries@lemmy.blahaj.zone · 1 year ago

        That’s fair. I’m 100% onboard the decentralisation train, and do my hardest to practice what I preach. In the event that the service does go bust, I can make a backup on a different S3 compatible service immediately as long as my working copy is intact. The likelihood of the backup service AND the working copy dying at the exact same time would be my cue to take up knitting.
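
        That portability mostly comes down to pointing the same S3 client code at a different endpoint. A minimal sketch with boto3, where the endpoints, bucket, keys, and object name are all placeholders:

        # Minimal sketch: the same boto3 code can target any S3-compatible
        # service just by swapping the endpoint. Names and keys are placeholders.
        import boto3

        def make_client(endpoint_url: str, key: str, secret: str):
            return boto3.client(
                "s3",
                endpoint_url=endpoint_url,
                aws_access_key_id=key,
                aws_secret_access_key=secret,
            )

        storj = make_client("https://gateway.storjshare.io", "KEY", "SECRET")
        other = make_client("https://s3.example-provider.com", "KEY", "SECRET")

        # Copy one backup object from the first provider to the second.
        obj = storj.get_object(Bucket="backups", Key="duplicati/block-0001.zip")
        other.put_object(Bucket="backups", Key="duplicati/block-0001.zip", Body=obj["Body"].read())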

      • hollyberries@lemmy.blahaj.zone · 1 year ago (edited)

        Definitely 25 TB. I’ve used the service for a long time, since before they accepted credit cards. I attached my credit card one day and got a bump to 25 TB. Since that happened, I pay basically nothing and my account is still 100% storj token funded.

        Edit: I dug up screenshots I sent someone recently

        • Deebster@lemmyrs.org · 1 year ago (edited)

          “my account is still 100% storj token funded”

          That seems to be the key bit, since everyone can use up to 25TB (if they can pay for it). Are you also hosting a node to earn tokens?

  • kennyboy55@feddit.nl · 1 year ago

    I have an Unraid server which hosts a Docker image of Duplicacy (the web interface is paid, though). It backs up to Backblaze B2; I have roughly 175 GB backed up, for which I pay $0.87 a month.
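
    That figure matches simple arithmetic, assuming B2's storage price at the time of roughly $0.005 per GB-month:

    # Back-of-the-envelope check of the monthly B2 storage bill,
    # assuming ~$0.005 per GB-month (B2's rate at the time).
    stored_gb = 175
    price_per_gb_month = 0.005
    print(f"${stored_gb * price_per_gb_month:.2f} / month")  # ~$0.88, close to the $0.87 above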

    • lal309@lemmy.world · 1 year ago

      Do you have other clients backing up to your Unraid? I’m looking for a complete solution for backing up end-user workstations (Windows, Mac, and Linux) to my Unraid server, then backing up my Unraid server to something like Wasabi, Amazon, Backblaze, etc. Preferably a single solution.

      • kennyboy55@feddit.nl · 1 year ago

        Yes, I have another server automatically rsyncing important config files to an NFS share. And my PC has a Samba share that I manually back up files to.

  • cctl01@feddit.nl · 1 year ago

    Duplicati to Backblaze B2 for the important stuff. As far as the media library goes, no backup, just a local RAID setup…

  • ErwinLottemann@feddit.de · 1 year ago

    Borg with an external hard drive and Borgbase as a remote. I use the 2-2-1 rule (🙈), as I struggle to find a good way to do another backup, and RAID does not count 😬

  • dsemy@lemm.ee · 1 year ago (edited)

    Git Annex.

    Took me a while to wrap my head around it, but nothing comes close to it once you set it up.

    Edit: I should have read the post more carefully. I use Git Annex both locally and on a VPS I rent from openbsd.amsterdam for off-site backups.

    • Rakn@discuss.tchncs.de · 1 year ago

      Somehow “took me a while to wrap my head around it” doesn’t make me feel comfortable. Apart from git-annex themselves saying that they aren’t a backup system, just a building block to maybe create one, a backup system should IMHO be dead simple and easy to understand.

      • dsemy@lemm.ee · 1 year ago

        Once you actually start using it, it is dead simple and integrates extremely well with stuff you (might) already do.

        I have a Git repo which contains my dotfiles + every “large” (annexed) file I want to back up under my home directory.

        Git annex automatically tracks where all annexed files are, how many copies there are on various repos, etc.

        I add and modify files using mostly standard git commands.

        It supports pretty much anything as a “remote”.

        It’s extremely simple to restore backups locally or remotely.

        Basically, Git Annex is the Git of backup solutions IME, allowing you extreme flexibility to do exactly what you want, provided you take the time to learn how.
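
        A minimal sketch of what one backup round can look like with that kind of setup; the commands are standard git-annex, but the wrapper, the repo path, and the remote name "offsite" are illustrative placeholders:

        # Minimal sketch of a git-annex "backup round": annex new large files,
        # push their content to an offsite remote, then show where copies live.
        import subprocess

        REPO = "/home/me/annex"     # hypothetical repository (dotfiles + large files)
        REMOTE = "offsite"          # hypothetical remote (e.g. a VPS or S3 special remote)

        def annex(*args: str) -> None:
            subprocess.run(["git", "-C", REPO, "annex", *args], check=True)

        annex("add", ".")                    # annex any new large files
        annex("copy", "--to", REMOTE, ".")   # send their content offsite
        annex("sync")                        # commit and sync the git metadata
        annex("whereis", ".")                # list which repos hold each file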

        • Rakn@discuss.tchncs.de · 1 year ago

          Features that are important to me are things like an easy overview of all backup jobs (ideally via a web UI), daily snapshots going back a week with monthly ones after that (a sketch of that retention rule follows below), backups to providers like Backblaze or AWS, and the ability to browse those backups and individual snapshots.

          I’d assume that you can build all of this with git annex in some way. But I really want something that works out of the box, e.g. install the backup software, give it some things to back up and a B2 bucket, and then go.

          What I’m curious about is that the git-annex site explicitly says that they aren’t a backup system, yet you describe it as such.
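
          The retention part of that wish list ("daily for a week, monthly after that") is easy to state as a rule, whatever tool ends up enforcing it; a minimal sketch of the selection logic:

          # Minimal sketch of the retention rule "keep daily snapshots for a week,
          # then one snapshot per month", applied to a list of snapshot dates.
          from datetime import date, timedelta

          def snapshots_to_keep(snapshots: list[date], today: date) -> set[date]:
              keep: set[date] = set()
              week_ago = today - timedelta(days=7)
              newest_per_month: dict[tuple[int, int], date] = {}
              for snap in snapshots:
                  if snap >= week_ago:
                      keep.add(snap)                   # daily copies for the last week
                  else:
                      month = (snap.year, snap.month)  # keep the newest copy per older month
                      if month not in newest_per_month or snap > newest_per_month[month]:
                          newest_per_month[month] = snap
              keep.update(newest_per_month.values())
              return keep

          # Example: thin out a year of daily snapshots.
          today = date(2024, 6, 1)
          all_snaps = [today - timedelta(days=n) for n in range(365)]
          print(len(snapshots_to_keep(all_snaps, today)))  # 8 dailies + 12 monthlies = 20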

          • dsemy@lemm.ee · 1 year ago (edited)

            I don’t care about stuff working OOTB - half the fun is messing around with things IMO.

            I also don’t care about web UIs and similar features (I always got the impression from selfhosting communities that this is considered important but I never really understood why - I don’t spend all day staring at statistics, and when I need some info I can get it through the terminal usually).

            Also, first sentence on Git Annex’s website:

            git-annex allows managing large files with git, without storing the file contents in git. It can sync, backup, and archive your data, offline and online. Checksums and encryption keep your data safe and secure.

            Not sure why you’re saying it’s not a backup solution.

            Edit: I guess the "what git-annex is not" page says this.

            To quote a comment by the creator on the same page:

            It’s definitely possible to use git-annex in backup-like ways, but what I want to discourage is users thinking that just putting files into git-annex means that they have a backup. Proper backups need to be designed, and tested. It helps to use software that is explicitly designed as a backup solution. git-annex is more about file distribution, and some archiving, than backups.

            So basically he says this just so people won’t yell at him when they fail to use it as a backup solution correctly.

            • Rakn@discuss.tchncs.de · 1 year ago

              “I don’t care about stuff working OOTB - half the fun is messing around with things IMO.”

              I generally agree. Backups are just something I don’t want to tinker with. It’s important to me that they work OOTB, are easy to grasp, and give me a good overview.

              The web interface is important to me because it gives me that overview from any device I’m currently using, without needing to type anything into a terminal. The OOTB part is important because I want to be able to easily set this all up again, even without access to my Ansible setup or previous configuration.

              To each their own. I’m not saying your way of doing this is wrong. It’s just not for me. This is just my reasoning / preferences. It’s also the reason something like borg wasn’t my chosen solution, even though it’s generally considered great.

              • dsemy@lemm.ee · 1 year ago

                I understand your position, though I pretty much always have access to a terminal, so I still don’t see the point of a web UI.

                Though I realize I’m in the minority here.