Anyone using HashBackup?


Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
I found HashBackup a couple of months ago via the b2 integrations page. I decided to take a closer look yesterday. It's an interesting approach to onsite and offsite backup.

I'm interested to know what others think of it, ideally from direct experience.
 

fracai

Guru
Joined
Aug 22, 2012
Messages
1,212
I've looked at HashBackup before and it does look very interesting. The primary downside for me, though, is the price. Currently it's a free beta, but the license FAQ suggests that at some point in the future it will switch to $250-450 / year / server. That's far too expensive for something that doesn't even provide the backup space. I'm hesitant to invest much in the product for fear that it starts charging after I've already switched over to using it.
 

Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
Yes, I noticed that. There's talk of a free version for "small backups", and the possibility of per-TB pricing, but nothing firm. There's also a promise that existing backups will always be readable with an existing release, even after it expires and will no longer create new backups.

I can't quite make up my mind about the local backup repository aspect. I suppose every offsite backup solution needs some local storage, and at least with HashBackup it's transparent and controllable.

Anyway, I have it installed on my desktop machine with a repository on an external drive, just to get some hands-on time with it.
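In case it helps anyone trying the same thing, the setup is roughly the two commands below (the paths are placeholders, not my actual layout; syntax per the HashBackup docs):

Code:
# create a new, empty backup directory on the external drive
hb init -c /mnt/external/hbrepo
# run a first backup of my home directory into it
hb backup -c /mnt/external/hbrepo /home/robert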
 

fta

Contributor
Joined
Apr 6, 2015
Messages
148
I've been using it for over a year now, and I've been backing up to B2 since support for it was added to hashbackup. It works great, and the author is very responsive to bug reports and feature requests. I'm also concerned about pricing, but if it ends up being too expensive, I'll just switch to something else.
 

Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
What did you decide to do with:
  • cache-size-limit
  • dedup-mem
  • pack-remote-archives
  • any other settings you tweaked?
 

fta

Contributor
Joined
Apr 6, 2015
Messages
148
These are my settings:

Code:
arc-size-limit 100mb
audit-commands
backup-linux-attrs False
cache-size-limit 1gb
copy-executable true
dedup-mem 0
disable-commands
enable-commands
hfs-compress False
no-backup-ext
no-backup-tag .nobackup
no-compress-ext
no-dedup-ext
pack-age-days 30
pack-bytes-free 1MB
pack-percent-free 50
pack-remote-archives False
remote-update normal
simulated-backup False
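For reference, that listing is what the hb config command prints for a backup directory, and individual settings are changed with the same command (the repo path below is a placeholder):

Code:
# show current settings for the backup directory
hb config -c /mnt/tank/hashbackup
# change a single setting, e.g. cap the local cache at 1 GB
hb config -c /mnt/tank/hashbackup cache-size-limit 1gb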
 

Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
I was looking at whether to leave pack-remote-archives False. I see that downloading 1GB costs the same as storing 1GB for 10 months, but there's a free daily download allowance of 1GB. I figure there's a good chance that a daily hb retain won't have to download more than 1GB, so I set mine to True.
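To put rough numbers on that (the rates are B2's published prices at the time, so treat them as approximate), along with the one-line config change (repo path is a placeholder):

Code:
# ~$0.05 to download 1 GB vs ~$0.005/month to store 1 GB, i.e. about 10 months of storage,
# and the first 1 GB of downloads per day is free
hb config -c /mnt/external/hbrepo pack-remote-archives True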

Any reason you don't let hb do deduplication?
 

fta

Contributor
Joined
Apr 6, 2015
Messages
148
Robert Trevellyan said:
I was looking at whether to leave pack-remote-archives False. I see that downloading 1GB costs the same as storing 1GB for 10 months, but there's a free daily download allowance of 1GB. I figure there's a good chance that a daily hb retain won't have to download more than 1GB, so I set mine to True.

My backups are not particularly volatile, so for the amount of space this would save me, it's simply not worth it.

Robert Trevellyan said:
Any reason you don't let hb do deduplication?

For the amount of space it would save me (not much), it's not worth the RAM hit when running hb.
 

Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
Is there anything in particular you're unsure of? I hesitate to commit to a tutorial because a) I'm sure I can't do a better job than the author's documentation and b) I'm not running it on FreeNAS. However, I'd be happy to answer specific questions, and I'm not ruling out a tutorial yet.
 

fta

Contributor
Joined
Apr 6, 2015
Messages
148
I've been using it with B2 since hashbackup first supported it. I put hashbackup in a dataset on my pool and tell it to exclude that dataset. Follow hashbackup's docs and set up a cron job. That's really all there is to it.
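A rough sketch of what that looks like (dataset names, paths, and schedule are examples rather than my exact setup; the exclude rule for the backup dataset lives in the backup directory's inex.conf, if I'm remembering the docs right):

Code:
# system crontab entry: nightly at 01:30, back up the pool, then prune per the retention schedule
30 1 * * * root /usr/local/bin/hb backup -c /mnt/tank/hashbackup /mnt/tank && /usr/local/bin/hb retain -c /mnt/tank/hashbackup -s7d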
 

ninjabilly

Cadet
Joined
Jun 6, 2015
Messages
6
If you are using the retain function of HashBackup, are the 'arc' files on B2 being removed according to your backup schedule?

 

Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
That depends on how you configure HashBackup, and in particular the pack-remote-archives setting. Or do you mean "is it working as intended"?
 

Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
I assume you've set pack-remote-archives to true.

What is your setting for:
  • pack-age-days?
  • pack-bytes-free?
  • pack-percent-free?
What options are you using for hb retain?

What behavior are you seeing that doesn't meet expectations?

Have you contacted Jim for support?
 

ninjabilly

Cadet
Joined
Jun 6, 2015
Messages
6
Pack-remote-archives was not set to True.

pack-age-days 30
pack-bytes-free 1MB
pack-percent-free 50

My expectation was to have a full backup plus a total of 7 incremental backups. The retain setting I use is a simple -s7d. If I have the wrong expectation of the retain function, is there anything I can do to somewhat meet those expectations?

 

Robert Trevellyan

Pony Wrangler
Joined
May 16, 2014
Messages
3,778
Your retain option of -s7d is what you should use if your goal is to keep one backup from each of the last 7 days. However, with pack-remote-archives set to False, nothing would have been removed from B2. Once you set it to True, there is at least a possibility of files on B2 being removed, but the other options will come into play.

In particular, pack-age-days = 30 means a backup file won't be packed until at least 30 days after the backup that created it. Then pack-bytes-free and pack-percent-free will also come into play.

Are you using de-duplication in HashBackup? If so, older arc files might contain blocks that are reused in recent backups, so behavior would not be as simple as a discard of the oldest arc file after 7 days.

Here are the highlights of my hb config:
Code:
cache-size-limit 1000
copy-executable True
dedup-mem 1GB
pack-age-days 0
pack-bytes-free 1MB
pack-percent-free 50
pack-remote-archives True

On hb retain I use -s safe and -x 1y.
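Spelled out as a command (the repo path is a placeholder; see the hb retain docs for the exact semantics of the options):

Code:
# -s safe is the retention schedule, -x 1y limits how long deleted files are kept (my reading of the docs)
hb retain -c /mnt/external/hbrepo -s safe -x 1y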
 

ninjabilly

Cadet
Joined
Jun 6, 2015
Messages
6
I've changed my config to somewhat reflect your settings. I'll update in a few days to confirm whether the older backups are being removed.

 