I just got my home server up and running and was wondering what you guys recommend for backups. I figure it will probably be worth having backups on cloud servers that are external. Are there any good services y'all use for that?
Removed by mod
Backblaze B2, borgbase.com. There are also programs like Deja Dup that will let you back up to popular cloud drives. The alternatives are limitless.
Regardless of service, if you don’t test your backups, you have none.
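A test restore is the only proof a backup exists. A minimal sketch of what "testing" means: back a file up, restore it somewhere else, and compare checksums. The copy calls here are just stand-ins for whatever tool you actually use (restic, borg, etc.).

```python
# Verify a restore by comparing checksums of the source and restored file.
# shutil.copy is a stand-in for a real backup/restore tool.
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(source: Path, restored: Path) -> bool:
    # A backup you can't restore bit-for-bit is not a backup.
    return sha256(source) == sha256(restored)

with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "data.txt"
    src.write_text("family photos go here")

    backup = Path(tmp) / "backup.txt"
    shutil.copy(src, backup)        # stand-in for the real backup job

    restored = Path(tmp) / "restored.txt"
    shutil.copy(backup, restored)   # stand-in for the real restore

    print(verify_restore(src, restored))  # -> True
```

The point is the habit, not the code: schedule an actual restore now and then, not just the backup job.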
Not so much about testing, but one time when I really needed to get to my backups, I had lost the password to the repository (I'm using restic). Luckily a copy of it was stored in Bitwarden, but until I remembered that, it was one of my worst moments.
Needless to say, please test your backups and store your secrets in more than one place.
Ehhh I would say then you have probabilistic backups. There’s some percent chance they’re okay, and some percent chance they’re useless. (And maybe some percent chance they’re in between those extremes.) With the odds probably not in your favor. 😄
Schrodinger’s backups.
Exactly.
Borgbase with Borgmatic (Borg) as the software. As far as I know, the whole Borgbase service is run by a homelab guy (with our needs in mind).
Also 3-2-1 rule!
Tears… Natural, salty, wet tears…
Removed by mod
That is great for hardware failures, but what about disasters? I would hate to lose my house to a fire and all the data (including things not replaceable, like family photos) I have on my server at the same time because my primary and backup were both destroyed.
Removed by mod
While I agree with you, hard drives do have a shelf life. How many years seems to be up for debate but it does exist. If you don’t have multiple drives that are of different ages you may be in a world of hurt one day.
Why? If you check the drive once a month, and it fails once per 10 years on average, the average time until both the backup drive and the main drive fail simultaneously is thousands of years. Of course they are much more likely to fail if they're old, but the odds are still very small.
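The back-of-envelope behind that kind of figure can be sketched like this. It's a toy model (independent failures, 10-year mean lifetime per drive, monthly check with immediate replacement), and different assumptions move the number around, but it always lands in the "centuries to millennia" range:

```python
# Toy model: how long until the main drive and the backup drive both
# fail within the same check interval? Assumes independent failures,
# a 10-year mean lifetime per drive, and a monthly check/replace cycle.
# Illustrative only -- not a real reliability model.
mtbf_months = 10 * 12            # mean months between failures, per drive
p_fail_month = 1 / mtbf_months   # rough chance one drive dies in a given month

p_both_same_month = p_fail_month ** 2          # both die before the next check
expected_years = (1 / p_both_same_month) / 12  # mean wait for that coincidence

print(f"~{expected_years:.0f} years")  # -> ~1200 years
```

Which is exactly why the real risk isn't simultaneous random failure, it's correlated failure: fire, theft, a power surge taking out both drives at once. Hence the offsite copy.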
rsync.net is great if you need something simple and cheap. Backblaze B2 is also decent, but does have the typical download and API usage cost.
I had never heard of rsync.net until now. I like the idea but it seems more expensive than B2. $15/TB vs $5/TB. Am I doing the math wrong or reading it wrong?
I’ve never heard of it either, but I came to the same conclusion as you
Yeah rsync.net has always been pricey.
When I researched what to use for my backup I found rsync.net. They have some nice features nobody else seems to support, like ZFS send/receive: https://www.rsync.net/products/zfsintro.html
But in the end the price made me go with borgbase.com
I don’t see it on their website right now, but they offer a discount if you’re using something like restic/borg and only need scp/sftp access. Their support is also super friendly. I’ve had an account forever and got moved to the 100+ TB pricing even though I have < 50TB stored. YMMV but it doesn’t hurt to ask if they have any additional discounts.
Also keep in mind that B2 charges for bandwidth too. It’s $5/TB for storage, but $10/TB to download that same data.
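Quick arithmetic on the prices quoted in this thread ($5/TB/month stored, $10/TB downloaded). These figures are taken from the comments here, not from Backblaze's current price list, so check before relying on them:

```python
# Cost sketch using the B2 prices quoted in this thread (assumed figures).
storage_per_tb_month = 5.00   # $/TB/month stored
download_per_tb = 10.00       # $/TB downloaded

tb = 2                                    # example: 2 TB backed up
monthly_bill = storage_per_tb_month * tb  # ongoing storage cost
full_restore = download_per_tb * tb       # one complete disaster recovery

print(monthly_bill, full_restore)  # -> 10.0 20.0
```

So a single full restore of your dataset costs about two months of storage. Fine for disaster recovery, worth knowing before you plan on pulling data down regularly.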
deleted by creator
How are you using rsync with B2? Are you mounting the bucket locally?
deleted by creator
Backblaze B2 for automatic syncing of all the little files
Glacier for long term archiving of old big files that never change
I use Duplicati connected to Storj with data volumes that incrementally get backed up once per month. My files don’t change very often, so monthly is a good balance. Not counting my Jellyfin library, those backups are around 1 TB. With the Jellyfin library, almost 15 TB.
Earlier this year, I recovered from a 100% data loss scenario, as I didn’t (and still don’t) have space for physical backups. I have a 25 TB allowance, so my actual cost was €0. If I had to pay, it would have been under €1.
That looks like a cool setup, but I would never trust important data to some crypto shit (Storj) no matter what kind of track record they have.
That’s fair. I’m 100% onboard the decentralisation train, and do my hardest to practice what I preach. In the event that the service does go bust, I can make a backup on a different S3 compatible service immediately as long as my working copy is intact. The likelihood of the backup service AND the working copy dying at the exact same time would be my cue to take up knitting.
Do you mean 25TB as the storj site says 25gb? Did some promotion give you that much free?
Definitely 25 TB. I’ve used the service for a long time, since before they accepted credit cards. I attached my credit card one day and got a bump to 25 TB. Since that happened, I pay basically nothing and my account is still 100% storj token funded.
Edit: I dug up screenshots I sent someone recently
my account is still 100% storj token funded
That seems to be the key bit, since everyone can use up to 25 TB (if they can pay for it). Are you also hosting a node to earn tokens?
I have an Unraid server which hosts a Docker container of Duplicacy. The web interface is paid, though. And it backs up to Backblaze B2. I have roughly 175 GB backed up, for which I pay $0.87 a month.
Do you have other clients backing up to your Unraid? I'm looking for a complete solution to back up end-user workstations (Windows, Mac and Linux) to my Unraid server, then back up my Unraid server to something like Wasabi, Amazon, Backblaze, etc. Preferably a single solution.
Yes, I have another server automatically rsyncing important config files to an NFS share. And my PC has a Samba share that I manually back up files to.
Look into Veeam. The free version should be enough for this workflow.
Duplicati to Backblaze B2 for the important stuff. As far as the media library goes, no backup, just a local RAID setup…
What’s the 2-2-1 rule?
2 different copies of the data in 2 different locations is 1 actual backup. (It's actually 3-2-1: 3 copies of your data, on 2 different types of media, with 1 copy offsite…)
AWS Glacier. I use the Synology plugin that does it automatically on a schedule.
Their prices are ridiculous once you add the cost of outbound traffic.
But if you never download (disaster recovery only), it is pretty cheap. Like $1/TB/month.
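The same arithmetic for Glacier-style archiving, using the ~$1/TB/month figure mentioned above. The egress price is my assumption for illustration; AWS's actual cost varies by retrieval tier and region, so treat these as rough orders of magnitude:

```python
# Glacier-style cost sketch: cheap to store, pricey to pull back out.
# Both prices are assumed figures -- check AWS's current pricing.
storage_per_tb_month = 1.00   # ~$1/TB/month, as mentioned above
egress_per_gb = 0.09          # assumed internet egress price per GB

tb = 2
three_years_storage = storage_per_tb_month * tb * 36  # 3 years of storage
one_full_restore = egress_per_gb * tb * 1000          # one complete restore

print(three_years_storage, one_full_restore)
```

Under these assumptions, one full restore costs more than years of storage, which is why Glacier only makes sense if restores are rare, true-disaster events.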
Git Annex.
Took me a while to wrap my head around it, but nothing comes close to it once you set it up.
Edit: should have read the post more carefully, I use Git Annex both locally and on a VPS I rent from openbsd.amsterdam for off-site backups.
Somehow “took me a while to wrap my head around it” doesn’t make me feel comfortable. Apart from git-annex themselves saying that it isn’t a backup system, just a building block you could maybe create one from, a backup system should IMHO be dead simple and easy to understand.
Once you actually start using it it is dead simple and integrates extremely well with stuff you (might) already do.
I have a Git repo which contains my dotfiles + every “large” (annexed) file I want to back up under my home directory.
Git annex automatically tracks where all annexed files are, how many copies there are on various repos, etc.
I add and modify files using mostly standard git commands.
It supports pretty much anything as a “remote”.
It’s extremely simple to restore backups locally or remotely.
Basically Git annex is the Git of backup solutions IME, allowing you extreme flexibility to do exactly what you want, provided you take the time to learn how to do what you want.
Features that are important to me are things like an easy overview of all backup jobs (ideal via a web UI), snapshots going back every day for a week and after that every month. Backup to providers like Backblaze or AWS and the ability to browse these backups and individual snapshots.
I’d assume that you can build all of this with git-annex in some way. But I really want something that works out of the box. E.g. install the backup software, give it some things to back up and a B2 bucket, and then go.
What I’m curious about is that the git-annex site explicitly says that it isn’t a backup system, but you describe it as such.
I don’t care about stuff working OOTB - half the fun is messing around with things IMO.
I also don’t care about web UIs and similar features (I always got the impression from selfhosting communities that this is considered important but I never really understood why - I don’t spend all day staring at statistics, and when I need some info I can get it through the terminal usually).
Also, first sentence on Git Annex’s website:
git-annex allows managing large files with git, without storing the file contents in git. It can sync, backup, and archive your data, offline and online. Checksums and encryption keep your data safe and secure.
Not sure why you’re saying it’s not a backup solution.
Edit: I guess the “what git-annex is not” page says this.
To quote a comment by the creator on the same page:
It’s definitely possible to use git-annex in backup-like ways, but what I want to discourage is users thinking that just putting files into git-annex means that they have a backup. Proper backups need to be designed, and tested. It helps to use software that is explicitly designed as a backup solution. git-annex is more about file distribution, and some archiving, than backups.
So basically he says this just so people won’t yell at him when they fail to use it as a backup solution correctly.
I don’t care about stuff working OOTB - half the fun is messing around with things IMO.
I generally agree. Backups for me are just something I don’t want to tinker with. It’s important to me that they work OOTB, are easy to grasp and I have a good overview.
The web interface is important to me because it gives me that overview from any device I’m currently using without needing to type anything into a terminal. The OOTB is important to me since I want to be able to easily set this all up again even without access to my Ansible setup or previous configuration.
To each their own. I’m not saying your way of doing this is wrong. It’s just not for me. This is just my reasoning / preferences. It’s also the reason something like borg wasn’t my chosen solution, even though it’s generally considered great.
I understand your position, though I always have access to a terminal pretty much so I still don’t see the point of a web UI.
Though I realize I’m in the minority here.