Quickest way to backup everything?

profzelonka

Explorer
Joined
Mar 29, 2020
Messages
67
I've been trying to find a quick way to backup all the FreeNAS necessities.

From what I've found, that would be:
- FreeNAS config file
- Jails
- Datasets set outside of iocages

Issues:
- Rsyncing files to a cloud is not secure, and encrypting them isn't necessary for a local backup.
- Replicating a dataset to an SMB folder causes file permission issues, as the files aren't accessible to other local users.
- Backing up each jail individually requires stopping the jail, and the zip path can't be, for example, directed to an SMB dataset.

Seems odd that FreeNAS doesn't have quick backup and restore options. Of course the pools aren't part of "everything" as they are already covered by RAID, but also, how *does* one back up a pool if need be?
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
Well, jails are on a pool. So are datasets. So if your pools are set up with adequate redundancy, so are your jails and datasets. But, of course, RAID isn't a substitute for a backup.
Backing up each jail individually requires stopping the jail, and the zip path can't be, for example, directed to an SMB dataset.
I have no idea why you'd think any of this is true, or what method of "backing up each jail individually" you have in mind.
Seems odd that FreeNAS doesn't have quick backup and restore options.
It seems more odd that you would write this without considering the tools that FreeNAS does have available. No, there's no single "click here to backup" button, and that wouldn't make sense anyway--there's too much variety in people's setups. But there are a number of options:
  • You've already mentioned rsync, which can sync to pretty much any Unix-y system that you can reach on the network (LAN or WAN)
  • Then there's ZFS replication, which can sync to any system using ZFS that you can reach via SSH--this would be most straightforward to set up with another FreeNAS box (see the sketch after this list). Neither this nor rsync encrypts the data on the receiving system, so either that would need to be a system you also control, that system would need to have encrypted storage, or you'd need to be OK with that.
  • The Cloud Sync feature will sync your data to a variety of cloud storage providers, optionally encrypted.
  • Your config file is already backed up daily to your .system dataset; if you like, there are scripts floating around (check the Resources section) to send backups somewhere else (like to your email) on a regular basis.
  • Then when you get into the plugins, there's more. Duplicati will also sync your data to a variety of cloud (and other remote) storage providers, and it will be encrypted there. I believe Syncthing will as well.
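A minimal sketch of what the replication option boils down to under the hood (pool, dataset, and host names here are just examples; the GUI's Replication Tasks wrap all of this for you):
Code:
# take a snapshot of the dataset you want to protect
zfs snapshot tank/mydata@backup-20200401

# push it to a second ZFS box over SSH (the receiving dataset need not exist yet)
zfs send tank/mydata@backup-20200401 | ssh backupbox zfs recv -u backup/mydata

# later runs can send only the changes since the last common snapshot
zfs send -i tank/mydata@backup-20200401 tank/mydata@backup-20200408 | \
    ssh backupbox zfs recv -u backup/mydata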
 

garm

Wizard
Joined
Aug 19, 2017
Messages
1,556
Disk redundancy isn’t backup!!!

My backup plan is rclone of datasets to various locations. Jails are never backed up as a whole, although you can do that with iocage. I use whatever backup procedures are outlined by each application, plus snapshots and ZFS send, to package each backup.

As an example, let's take Nextcloud, which consists of the database, the web application, the data location, and potentially various external storage locations. Each location containing data is simply sent to backup with rclone. The database is dumped and copied to a dedicated backup dataset together with a copy of the web application. Snapshots are used to keep versions of the backups in that dataset. Basically, any dataset can be backed up in a variety of ways; I don't really see the issue here.
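Roughly what the database piece looks like (a sketch only - the paths, dataset names, and rclone remote name are made up, and I'm assuming a MySQL/MariaDB backend):
Code:
# dump the Nextcloud database into a dedicated backup dataset
# (run inside the Nextcloud jail, or via iocage exec)
mysqldump -u root -p'secret' nextcloud > /mnt/tank/backup/nextcloud/nextcloud-db.sql

# keep a copy of the web application alongside it
tar -czf /mnt/tank/backup/nextcloud/webapp.tar.gz -C /usr/local/www nextcloud

# push the backup dataset to a remote already configured in rclone
rclone copy /mnt/tank/backup/nextcloud remote:nextcloud-backup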
 

profzelonka

Explorer
Joined
Mar 29, 2020
Messages
67
It seems more odd that you would write this without considering the tools that FreeNAS does have available.
The issues I listed are actually responses to the available tools. There is no easy way to back up the necessary files without additional tools and technical knowledge. I'm not being critical of the product; rather, I wish to make a thread for other users to find who want a quick and simple way to back up manually.

For a beginner user who is likely running a Windows computer and has set up a NAS with FreeNAS:
  1. Where is this daily backup of system dataset?
  2. And how does one obtain it in a zip on their Windows hdd?
  3. How does one restore it if things go wrong?
  4. How does one backup jails and restore them?
  5. How about pools entirely?
I'm sorry if I sound arrogant; I don't mean to. I'm still new myself and just trying to understand what FreeNAS expects of me, as the user experience is very unclear about how one can back up FreeNAS properly, in a way that can be restored easily if anything goes wrong.

For example, I'm about to update FreeNAS and NextCloud. Instead of relying on snapshots, I'd much rather just download everything, then update. If something goes wrong, and then something bigger goes wrong like a power failure, I could then restore my manual backup.
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
Where is this daily backup of system dataset?
There isn't a "daily backup of system dataset." There's a daily backup of the config file, on the .system dataset. That's a dataset that ordinarily lives on your pool, while the "live" config file lives on your boot pool.
And how does one obtain it in a zip on their Windows hdd?
You use one of the scripts that I referred to in my earlier post to upload it to your desired cloud storage location, or email it to you. Well, they don't .zip it (I'm not sure why you'd care about that).
How does one restore it if things go wrong?
Not too easily, unfortunately--the devs have had this daily backup in place for 5+ years, but still haven't gotten around to providing a straightforward way to retrieve it when you set up a new system. As it is, you import the pool, find the latest backup in the .system dataset, copy it to your client machine, then upload it through the GUI. If you want something simpler, once again, there are scripts in the Resources section that will help.
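For reference, retrieving it from a shell looks something like this (the configs-* directory name includes a system-specific identifier, and /mnt/tank/backup is just an example destination - adjust both to what you actually have):
Code:
# the .system dataset is normally mounted at /var/db/system
ls /var/db/system/configs-*/

# copy the most recent daily config backup to a dataset you share out over SMB
cp "$(ls -t /var/db/system/configs-*/*/*.db | head -1)" /mnt/tank/backup/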
  1. How does one backup jails and restore them?
  2. How about pools entirely?
Using any of the tools I mentioned above--rsync, ZFS replication, Cloud Sync, Duplicati, Syncthing, etc. If you'd define your requirements a bit more clearly, I or others might be able to give more specific answers.
 

profzelonka

Explorer
Joined
Mar 29, 2020
Messages
67
That's the .db file right? Yeah the UI works for that - that's easy.

What's a good way to back up iocage jails? I mean, I can sync/replicate them to another place, but when it comes time to restore the backup, what's the easiest way to set the data up for a quick restore?

I'm a little confused on datasets - where is dataset info stored? If I push the entire iocage dataset somewhere, how do I restore it? And if there are datasets within the iocage dataset, will FreeNAS recognize them with their settings?

And as far as pools go, there's no such thing as "backing up a pool", right? One can just replicate/sync the pool data into another pool, thereby "backing up the pool".
 

Heracles

Wizard
Joined
Feb 2, 2018
Messages
1,401
Hey profzelonka,

There is TrueCommand, which you can use to take backups remotely. As of now, I've created a ticket about the feature not working when connecting over SSL with local certificates; it is supposed to be fixed in the next release. Still, if you go clear text or use public certificates like Let's Encrypt, it should work.

As for backups, see my complete strategy in my signature...
 

profzelonka

Explorer
Joined
Mar 29, 2020
Messages
67
Well, I can't imagine everyone having cash for purchasing and running 3 systems let alone 3 equally-sized pools of HDDs.

The question is simple and doesn't appear to have a good answer.
How do I efficiently back up locally without a cloud or a second Unix-like system on the network?

If that's not specific enough, pretend you're a non-technical home user who simply made a NAS for light use. You have a Windows system, and something else running FreeNAS. You want to back up your config, jails/datasets, and maybe even all the pool data entirely. You don't use cloud storage, or the cloud is too small for your backups, or you refuse to use encryption but also don't want to put unencrypted files on some cloud service. How do you efficiently back up locally without a cloud or a second Unix-like system on the network?
 

garm

Wizard
Joined
Aug 19, 2017
Messages
1,556
Let's back up a bit: what do you want to back up, and what is your intended target? Backing up a NAS to a client PC seems silly - just use the client PC and back up to your NAS?
You have basically four options for backups in FreeNAS,
or any script you like using anything else (rclone can do local sync, but it's not in the GUI I think)
 

profzelonka

Explorer
Joined
Mar 29, 2020
Messages
67
I feel like we're speaking two languages here. I'm very aware of the Tasks features and ability to pull and push data with the choice of sync, copy/replicate, and choice of cloud, SSH, or local pool copy.

What to backup, again:
- Config (covered via GUI's export ability)
- Datasets. For example nextcloud dataset folders for db and config.
- Jails. For example a jail that has an Apache installation with your own website on it, or just a configured Sonarr install.
- Pools. Yeah, this is a wild idea to back up, but say you had a small pool you just wanted to back up too.

Your target: A folder on your Windows computer or an external drive or a USB drive, something that is local and not a Linux/Unix box. In fact, what if you want a zip backup of each jail and maybe a zip backup of a select dataset.

The idea is to back up and restore easily, so that in case your NAS system physically dies entirely, you (or someone) can just restore the above-listed things without having to rebuild everything.
 

garm

Wizard
Joined
Aug 19, 2017
Messages
1,556
Datasets. For example nextcloud dataset folders for db and config.
Make a snapshot, tar.gz the data in the snapshot, and pull/push it to Windows using anything really: SSH, SMB, NFS.
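Something like this, assuming a dataset at tank/nextcloud/db (pool, dataset, and path names are only examples):
Code:
# snapshot the dataset, then archive the read-only copy exposed under .zfs/snapshot
snap="backup-$(date +%Y%m%d)"
zfs snapshot "tank/nextcloud/db@${snap}"
tar -czf "/mnt/tank/backup/nextcloud-db-${snap}.tar.gz" \
    -C "/mnt/tank/nextcloud/db/.zfs/snapshot/${snap}" .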

Jails. For example a jail that has an Apache installation with your own website on it, or just a configured Sonarr install.
Back up the application, and restore by spinning up a new jail and restoring the backup; this will be application specific. In the case of Apache, take a snapshot of the jail, make a tar.gz of the Apache config, webapp, and any other resources needed, and repeat the fetch method above.
Pools. Yeah, this is a wild idea to back up, but say you had a small pool you just wanted to back up too.
There is no such thing as backing up a pool; a pool is one or more vdevs on which you create datasets. See above.

If you have a second ZFS pool, you can zfs send a snapshot to that pool; this is probably the safest and most effective method of backing up a complete dataset.
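For example (pool and dataset names are placeholders):
Code:
# replicate a snapshot of a dataset from the main pool to a second local pool
zfs snapshot tank/nextcloud@backup-20200401
zfs send tank/nextcloud@backup-20200401 | zfs recv -u backup/nextcloud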
A jail consists of more than your application; it's a complete copy of FreeBSD, and backing that up is in the vast majority of cases overkill. Just follow whatever backup process is documented for your application, then take a snapshot and make a tar.gz of whatever content that process generates.

When it comes to large datasets such as a photo collection taking up a large fraction of a pool, it's not practical to package the content in an archive first. You need to copy the data itself; that is done either with one of the tasks available or by pulling it. If you want to use your Windows machine as a backup target, just set up a scheduled robocopy script to pull the data using the same mechanism as above.
 

garm

Wizard
Joined
Aug 19, 2017
Messages
1,556
That said, a local Windows machine is far from ideal as a backup target, and a secondary pool doesn't have to be expensive at all.
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
There is no such thing as backing up a pool,
"Backing up the root dataset of a pool" is functionally equivalent to "backing up a pool", and is (theoretically at least) entirely possible. But the answer is, of course, the same as given above: cloud sync, rsync, and/or replication. Or third-party software.
I'm very aware of the Tasks features and ability to pull and push data with the choice of sync, copy/replicate, and choice of cloud, SSH, or local pool copy.
How, in your mind, does this differ from a "backup"?
 

garm

Wizard
Joined
Aug 19, 2017
Messages
1,556
"Backing up the root dataset of a pool" is functionally equivalent to "backing up a pool", and is (theoretically at least) entirely possible.
Sure, but it’s still a dataset ;)
 

Jailer

Not strong, but bad
Joined
Sep 12, 2014
Messages
4,977
Your target: A folder on your Windows computer or an external drive or a USB drive, something that is local and not a Linux/Unix box. In fact, what if you want a zip backup of each jail and maybe a zip backup of a select dataset.
You can do this and automate it so it runs without any user interaction. I have a jail that gets the database backed up and copied to another dataset outside the jail. The files in the jail also get copied to that same dataset as the database backup. Every 5 days those files get zipped with a date and time stamp. That dataset is set up as an SMB share and mapped to my desktop. I use SyncToy on my desktop to copy the share contents to a folder on my desktop that is synced with Google Drive.
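The archiving step itself is nothing fancy - something along these lines from a cron job works (paths here are placeholders, and tar.gz is shown as a stand-in for the zip step):
Code:
# archive the staged backup files with a date/time stamp in the name
# (schedule roughly every 5 days, e.g. "0 3 */5 * *" in cron)
stamp="$(date +%Y%m%d-%H%M)"
tar -czf "/mnt/tank/backup/archives/nextcloud-${stamp}.tar.gz" \
    -C /mnt/tank/backup/nextcloud .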

That's my way of doing it based on my needs and skill level. As others have stated you'll have to figure out what way works best for you and your skill level. But to say there aren't options is a bit ridiculous as there are many.
 

NasKar

Guru
Joined
Jan 8, 2016
Messages
739
I'm not sure if this is the best way, but I'll share what I do. I've written sh scripts to automate the process, mainly because I find it impossible to get the syntax of various commands correct even a week later. The data gets written to a directory, in my case located at /mnt/<yourpool>/backup. Then I use SyncBackFree to copy it to my Windows computer. In addition I use replication to a second FreeNAS, but a lot of people, myself included, find the concepts confusing. I'm still working on it. Having a method or multiple methods to restore after a disaster is very important.

The config file. I use this script to back it up. It creates a directory with the FN version and stores the last 3 days of config files.
Code:
#!/bin/sh
#
# Set these variables - location of backup files and max number of files to keep
backuploc="/mnt/v1/backup/config"
maxnrOfFiles=3

#
# Create a directory named after the FreeNAS version and build the backup name
versiondir="$(cut -d' ' -f1 /etc/version)"
backupname="$(date +%Y%m%d)_${versiondir}.db"
echo "Directory to put config in ${versiondir}"
echo "Name of backup file ${backupname}"
mkdir -p "${backuploc}/${versiondir}"

#
# Copy config file and rename
cp /data/freenas-v1.db "${backuploc}/${versiondir}/${backupname}"
echo "cp /data/freenas-v1.db ${backuploc}/${versiondir}/${backupname}"

#
# Delete old backups
#
# initialize variables
nrOfFiles=0

backupMainDir="${backuploc}/${versiondir}"
echo "Number of files to keep ${maxnrOfFiles}"
if [ "${maxnrOfFiles}" -ne 0 ]
then
     echo "maxnrOfFiles is not 0"
     # count the regular files currently in the backup directory
     nrOfFiles="$(ls -l "${backupMainDir}" | grep -c "^-")"
     echo "nrOfFiles=" $nrOfFiles
     nFileToRemove="$((nrOfFiles - maxnrOfFiles))"
     echo "nFileToRemove=" $nFileToRemove
     # remove the oldest files until only maxnrOfFiles remain
     while [ "$nFileToRemove" -gt 0 ]
     do
          echo
          echo "number of files to remove=" $nFileToRemove
          fileToRemove="$(ls -t "${backupMainDir}" | tail -1)"
          echo "Removing file ${fileToRemove}"
          nFileToRemove="$((nFileToRemove - 1))"
          rm "${backupMainDir}/${fileToRemove}"
     done
fi
echo "done"

Similarly, I back up all the jails with a script that uses the iocage export jail_name command, as well as Nextcloud (files and MySQL database) and my jail data, which is stored outside the jail. You are correct that the jails need to be stopped and then restarted, but the script/task does it in the middle of the night. I look forward to hearing feedback on this process.
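The jail portion is basically a loop around iocage export. A trimmed-down sketch of the idea (the backup path and iocage pool location follow my layout above but are otherwise just examples, and you should check the column layout of iocage list -H on your system):
Code:
#!/bin/sh
# Stop, export, and restart each jail, then move the exported archives
# to the backup dataset that gets copied off the box.
backupdir="/mnt/v1/backup/jails"

for jail in $(iocage list -H | awk '{print $2}'); do
     iocage stop "$jail"
     iocage export "$jail"
     iocage start "$jail"
done

# iocage writes exports under <iocage pool>/iocage/images by default
mv /mnt/v1/iocage/images/* "$backupdir"/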
 

profzelonka

Explorer
Joined
Mar 29, 2020
Messages
67
You can do this and automate it so it runs without any user interaction. I have a jail that gets the database backed up and copied to another dataset outside the jail. The files in the jail also get copied to that same dataset as the database backup. Every 5 days those files get zipped with a date and time stamp. That dataset is set up as an SMB share and mapped to my desktop. I use SyncToy on my desktop to copy the share contents to a folder on my desktop that is synced with Google Drive.

That's my way of doing it based on my needs and skill level. As others have stated you'll have to figure out what way works best for you and your skill level. But to say there aren't options is a bit ridiculous as there are many.
Do you mind sharing how you have your database copied to another dataset, the jail files copied, and what kind of commands/cron you have for zipping with time stamp? This is pretty much exactly what I'm looking to do.

No doubt manual backup is possible; I meant that there's no easy UI way to back up without figuring out the correct commands, which may also change between versions. For bigger setups I see that there may not be a need for such backup solutions, of course, as having the bulk data in a zip on an HDD wouldn't be practical. (Not in my case, of course.)


Make a snapshot, tar.gz the data in the snapshot, and pull/push it to Windows using anything really: SSH, SMB, NFS.

Back up the application, and restore by spinning up a new jail and restoring the backup; this will be application specific. In the case of Apache, take a snapshot of the jail, make a tar.gz of the Apache config, webapp, and any other resources needed, and repeat the fetch method above.
How do you pull snapshots and tar.gz them?

It seems excessive to manually back up MySQL exports, Apache configs, etc. for each jail if one can export the entire jail; the file size difference can't be *that* big, right? With iocage, I can stop the jail, export a tar.gz of it, and move it to an SMB dataset, which I can also move back and import the same way - which is great, except that the jail needs to be stopped before exporting. Would compressing/zipping the entire iocage folder with all jails in it be a workable backup to import? How does one import a dataset?
 