Quickest way to backup everything?

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
the file size difference can't be *that* big, right?
To pick an example mostly at random, my Plex jail is ~7.5 GB, not including the configuration and metadata (those are stored outside the jail). That jail can be recreated with about five minutes' work at the CLI, or less running my script, or less yet installing the plugin. Why would you want to back up that much worthless data?
 

Jailer

Not strong, but bad
Joined
Sep 12, 2014
Messages
4,977
Do you mind sharing how you have your database copied to another dataset, the jail files copied, and what kind of commands/cron you have for zipping with time stamp?


The database is dumped nightly using the mysqldump command via a script set to run from cron. The script is named backup.sh. I have two directories in the dataset where the backups are saved, named daily and weekly. The daily directory is where the daily backups get saved and the weekly directory is where the compressed files get saved. In the scripts and commands below I've replaced my file paths and usernames with uppercase placeholders; substitute values that match your own system structure.

The cron task for running the backup script is below.
Code:
iocage exec YOURJAILNAME csh /backup.sh

The code for the backup.sh script is below. This goes in the root of your jail.
Code:
/usr/local/bin/mysqldump --login-path=local --databases YOURDATABASENAME > /YOURBACKUPNAME.sql

You'll also have to provide credentials for mysqldump to run. You do this in your jail via the CLI.
Code:
mysql_config_editor set --login-path=local --host=localhost --user=YOURDATABASEUSERNAME --password

The database dump is copied via rsync.
Code:
rsync -av --delete /mnt/YOURPOOL/jails/YOURJAILNAME/YOURBACKUPNAME.sql /mnt/YOURPOOL/PATH/TO/YOUR/DATASET/daily

The files are also copied via rsync.
Code:
rsync -av --delete /mnt/YOURPOOL/jails/SOURCE/OF/FILES /mnt/YOURPOOL/PATH/TO/YOUR/DATASET/daily

The files and database backup are tarred up every 5 days via a script called weekly.sh, located in the root of the backup dataset, that is also run from cron. The script is below.
Code:
tar -jcf "/mnt/YOURPOOL/PATH/TO/YOUR/DATASET/`date '+%Y-%m-%d_%H-%M-%S.bz2'`" -C "/mnt/YOURPOOL/PATH/TO/YOUR/DATASET" "daily"
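
For illustration, laid out as plain crontab entries the schedule could look something like this; the run times and the */5 interval are arbitrary examples, not values from this thread:
Code:
# hypothetical schedule -- adjust times, jail name and paths to your system
# nightly database dump inside the jail
0 2 * * * iocage exec YOURJAILNAME csh /backup.sh
# nightly copy of the dump and the files to the backup dataset
30 2 * * * rsync -av --delete /mnt/YOURPOOL/jails/YOURJAILNAME/YOURBACKUPNAME.sql /mnt/YOURPOOL/PATH/TO/YOUR/DATASET/daily
45 2 * * * rsync -av --delete /mnt/YOURPOOL/jails/SOURCE/OF/FILES /mnt/YOURPOOL/PATH/TO/YOUR/DATASET/daily
# compress the daily directory every 5th day of the month (roughly every 5 days)
0 3 */5 * * sh /mnt/YOURPOOL/PATH/TO/YOUR/DATASET/weekly.sh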


Once you've got all that set up, the rest is just setting up an SMB share on the backup dataset so your favorite sync program can sync it to the location of your choice on your desktop. I use SyncToy and have it set up as a scheduled task in Windows that runs every day at 6 PM, since my computer is usually on at that time. Do note that I strongly encourage you to educate yourself a bit instead of just blindly setting things up, so you will be able to fix it if something ever breaks. Also practice your disaster recovery so you know it will work when you need it.
 

garm

Wizard
Joined
Aug 19, 2017
Messages
1,556
How does one import a dataset?
Using zfs send/receive. iocage is just a framework; it's using existing tools.
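
A minimal sketch of that, assuming made-up pool, jail and snapshot names:
Code:
# take a recursive snapshot of the jail's datasets, then replicate the tree to another pool
zfs snapshot -r tank/iocage/jails/myjail@manual-backup
zfs send -R tank/iocage/jails/myjail@manual-backup | zfs receive -u backup/jails/myjail
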
How do you pull snapshots and tar.gz them?
I never back up a "live" filesystem; there is no need with ZFS. My Nextcloud instances are backed up this way:
Cron on FreeNAS executes a backup script that 1) puts Nextcloud in maintenance mode with iocage exec, 2) dumps the database, 3) snapshots the jail and any other relevant datasets (for Nextcloud I have several datasets containing data for Nextcloud and one dataset for the database), 4) turns off maintenance mode (and from this point on Nextcloud is available to users again), and 5) clones the snapshot (or uses the .zfs folder) and archives the database dump, application files and any other relevant files to a dedicated backup dataset. With all the data to be backed up collected and snapshotted, the content of that dataset is sent off to the backup location. Depending on the content, snapshots are kept around for various ages, anything from just a few days to years.
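
A rough sketch of such a script follows; the pool, jail and dataset names, the occ path, and the --login-path credentials are all assumptions for illustration, not the actual setup described above:
Code:
#!/bin/sh
# hypothetical Nextcloud backup following the five steps above
SNAP="nightly-`date +%Y%m%d`"
JAILROOT="/mnt/tank/iocage/jails/nextcloud/root"
BACKUP="/mnt/tank/backup/nextcloud"

# 1) maintenance mode on
iocage exec nextcloud su -m www -c "php /usr/local/www/nextcloud/occ maintenance:mode --on"
# 2) dump the database (assumes mysql_config_editor credentials exist inside the jail)
iocage exec nextcloud mysqldump --login-path=local --databases nextcloud > ${BACKUP}/nextcloud.sql
# 3) snapshot the jail and the Nextcloud data/database datasets
zfs snapshot -r tank/iocage/jails/nextcloud@${SNAP}
zfs snapshot -r tank/nextcloud@${SNAP}
# 4) maintenance mode off -- users can work again while the archive is made from the snapshot
iocage exec nextcloud su -m www -c "php /usr/local/www/nextcloud/occ maintenance:mode --off"
# 5) archive the application files out of the read-only .zfs snapshot directory
tar -czf ${BACKUP}/nextcloud-files.tar.gz -C ${JAILROOT}/.zfs/snapshot/${SNAP}/usr/local/www nextcloud
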
Seems excessive to manually back up
No, you don't do anything manually; that just introduces an element of risk. Everything is tested and done via script.
the file size difference can't be *that* big, right?
My Nextcloud jail is a 5 GB dataset, a database dump is about 1% the size of the actual database, and the Nextcloud app itself is only a few hundred MB. In total, a complete backup of Nextcloud (excluding content) is less than 5% of the total jail size... I would say that is significant...
Would compressing/zipping the entire iocage folder with all jails in it be a workable backup to import?
That would be excessive, to say the least. Most of the content in iocage can be easily rebuilt from third-party sources (FreeNAS, FreeBSD), and it's trivial to reverse a backup script to restore a backup. Actually, unless you have successfully done so, it's not really a backup yet.
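
To make "reversing" the sketch above concrete, a restore would look roughly like this; again, every name and path here is hypothetical:
Code:
#!/bin/sh
# restore the archived application files into the jail's filesystem
tar -xzf /mnt/tank/backup/nextcloud/nextcloud-files.tar.gz -C /mnt/tank/iocage/jails/nextcloud/root/usr/local/www
# copy the dump into the jail and reload the database from inside it
cp /mnt/tank/backup/nextcloud/nextcloud.sql /mnt/tank/iocage/jails/nextcloud/root/tmp/
iocage exec nextcloud sh -c "mysql --login-path=local nextcloud < /tmp/nextcloud.sql"
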
 

Heracles

Wizard
Joined
Feb 2, 2018
Messages
1,401
How do I efficiently backup locally without a cloud or a second Unix-like system on the network?

If you do not go for a second system, your backups stay online and onsite, so there is a very high probability that they will be destroyed at the same time as your original.

By refusing to go to the cloud, you then need to host that second system that will back up your data yourself.

By refusing to go for a second Unix-like system, you are left with a last option: a second system that will run Windows.

That system will be a higher risk than your original in many ways: the Windows filesystem is terrible, the operating system is designed to crash, security holes are daily news, and more.

Should you use your Windows system as a gateway to offline storage like DVDs, you end up with a very limited capacity for storing your data. A DVD holds less than 5 GB of data...

So you are asking for something that will magically transfer, store and restore a high volume of data without any effort. That will not happen auto-magically.

So as @danb35 suggested, define your backup needs first:
what data?
how big?
how often do they change?
how long a history do you need?
do you want to protect yourself against physical threats?
do you want protection against logical threats?
what is the acceptable gap between two backups?

Once you've defined your needs, it will be possible to design a solution.
 

singlemalt8

Dabbler
Joined
Aug 15, 2020
Messages
11
I'm not sure if this is the best way, but I'll share what I do. I've written sh scripts to automate the process, mainly because I find it impossible to get the syntax of various commands right even a week later. The data gets written to a directory, in my case located at /mnt/<yourpool>/backup. Then I use SyncBackFree to copy it to my Windows computer. In addition, I use replication to a second FreeNAS, but a lot of people, myself included, find the concepts confusing. I'm still working on it. Having a method, or multiple methods, to restore after a disaster is very important.

The config file. I use this script to back it up. It creates a directory named after the FreeNAS version and stores the last 3 days of config files.
Code:
#!/bin/sh
#
# Set these Variables - location of backup files and max number of files to keep
backuploc="/mnt/v1/backup/config"
maxnrOfFiles=3

#
# Create directory for freenas version and backupname
versiondir=`cat /etc/version | cut -d' ' -f1`
backupname=`date +%Y%m%d`_`cat /etc/version | cut -d' ' -f1`.db
echo "Directory to put config in ${versiondir}"
echo "Name of backup file ${backupname}"
mkdir -p $backuploc/$versiondir

#
# Copy config file and rename
cp /data/freenas-v1.db ${backuploc}/${versiondir}/${backupname}
echo "cp /data/freenas-v1.db ${backuploc}/${backupname}"

#
# Delete old backups
#
# initialize variables
nrOfFiles=0

backupMainDir="${backuploc}/${versiondir}"
echo "Number of files to keep ${maxnrOfFiles}"
if [ ${maxnrOfFiles} -ne 0 ]
then
     echo "maxnrOfFiles is not 0"
     nrOfFiles="$(ls -l ${backupMainDir} | grep -c "^-.*")"
     echo "nrOfFiles=" $nrOfFiles
     nFileToRemove="$((nrOfFiles - maxnrOfFiles))"
     echo "nFileToRemove=" $nFileToRemove
while [ $nFileToRemove -gt 0 ]
do
     echo
     echo "number files to remove=" $nFileToRemove
     fileToRemove="$(ls -t ${backupMainDir} | tail -1)"
     echo "Removing file ${fileToRemove}"
     nFileToRemove="$((nFileToRemove - 1))"
     rm ${backupMainDir}/${fileToRemove}
done
fi
echo "done"

Similarly, I back up all the jails with a script that uses the iocage export jail_name command, as well as Nextcloud (files and MySQL database) and my jail data, which is stored outside the jail. You are correct that the jails need to be stopped and then restarted, but the script/task does it in the middle of the night. I look forward to hearing feedback on this process.
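
A minimal sketch of what such an export script can look like (the jail names and the destination path are made-up placeholders):
Code:
#!/bin/sh
# stop each jail, export it, then start it again; iocage writes the export
# archive under the pool's iocage/images dataset
for jail in nextcloud plex; do
    iocage stop ${jail}
    iocage export ${jail}
    iocage start ${jail}
done
# copy the resulting archives to the backup directory that gets synced off the box
rsync -av /mnt/v1/iocage/images/ /mnt/v1/backup/jails/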

Thanks for sharing your script. I am new to FreeNAS (and Linux scripts for that matter) and running into some issues. Wondering if you could advise how to troubleshoot? Any advice appreciated! I clearly need a little guidance :smile:

My Script Directory: /mnt/pool1/my_scripts
My Backup location: "/mnt/pool1/backups/freeNAS/config_files"
Script File Name: config_backup_3days.sh

Errors I am seeing:
root@freenas[~]# sh /mnt/pool1/my_scripts/config_backup_3days.sh
: not foundmy_scripts/config_backup_3days.sh:
Directory to put config in FreeNAS-11.3-U4.1
Name of backup file 20200902_FreeNAS-11.3-U4.1.db
: not foundmy_scripts/config_backup_3days.sh:
: No such file or directory.db/config_files/20200902_FreeNAS-11.3-

Your script that I modified:
Code:
#!/bin/sh
# run from shell: sh /mnt/pool1/my_scripts/config_backup_3days.sh
# Set these Variables - location of backup files and max number of files to keep
# backuploc="/mnt/v1/backup/config"
backuploc="/mnt/pool1/backups/freeNAS/config_files"
maxnrOfFiles=3

#
# Create directory for freenas version and backupname
versiondir=`cat /etc/version | cut -d' ' -f1`
backupname=`date \+%Y\%m\%d`_`cat /etc/version | cut -d' ' -f1`.db
echo "Directory to put config in ${versiondir}"
echo "Name of backup file ${backupname}"
mkdir -p $backuploc/$versiondir

#
# Copy config file and rename
cp /data/freenas-v1.db ${backuploc}/${versiondir}/${backupname}
echo "cp /data/freenas-v1.db ${backuploc}/${backupname}"

#
# Delete old backups
# initallize variables
nrOfFiles=0

backupMainDir="${backuploc}/${versiondir}"
echo "Number of files to keep ${maxnrOfFiles}"
if [ ${maxnrOfFiles} -ne 0 ]
then
     echo "maxnrOfFiles is not 0"
     nrOfFiles="$(ls -l ${backupMainDir} | grep -c "^-.*")"
     echo "nrOfFiles=" $nrOfFiles
     nFileToRemove="$((nrOfFiles - maxnrOfFiles))"
     echo "nFileToRemove=" $nFileToRemove
while [ $nFileToRemove -gt 0 ]
do
     echo
     echo "number files to remove=" $nFileToRemove
     fileToRemove="$(ls -t ${backupMainDir} | tail -1)"
     echo "Removing file ${fileToRemove}"
     nFileToRemove="$((nFileToRemove - 1))"
     rm ${backupMainDir}/${fileToRemove}
done
fi
echo "done"
 

NasKar

Guru
Joined
Jan 8, 2016
Messages
739
I would change to the directory you want to store the script in and create the script with nano config_backup_3days.sh and paste my code in. Then edit the 5th line in your version with backuploc="/mnt/pool1/backups/freeNAS/config_files"
Save the file with Control-X
Then make it executable with chmod +x config_backup_3days.sh
then run the script by typing ./config_backup_3days.sh
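In command form (using the paths from your post), that sequence is:
Code:
cd /mnt/pool1/my_scripts
nano config_backup_3days.sh        # paste the script, set backuploc, save with Ctrl-X
chmod +x config_backup_3days.sh
./config_backup_3days.sh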
 

singlemalt8

Dabbler
Joined
Aug 15, 2020
Messages
11
I would change to the directory you want to store the script in and create the script with nano config_backup_3days.sh and paste my code in. Then edit the 5th line in your version with backuploc="/mnt/pool1/backups/freeNAS/config_files"
Save the file with Control-X
Then make it executable with chmod +x config_backup_3days.sh
then run the script by typing ./config_backup_3days.sh
Thanks for trying to help! I tried to follow those steps but no luck. After editing the sh file and then trying to chmod and run it, I get "Operation not permitted". And then trying to run using ./ I get a "bad interpreter" error. See below:
[screenshot of the "Operation not permitted" and "bad interpreter" errors]
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
You're editing the file with a Windows editor that's putting the wrong line endings in place. Otherwise, use an editor that can deal with Unix line endings--Notepad++ is good and free. But definitely skip the GUI shell.

singlemalt8

Dabbler
Joined
Aug 15, 2020
Messages
11
Ah! I did not know that was an issue. I did try nano in the FreeNAS GUI, but only for editing an existing file. Will try again from scratch. I think I am beginning to see why others use SSH and not the GUI shell...
 

singlemalt8

Dabbler
Joined
Aug 15, 2020
Messages
11
You're editing the file with a Windows editor that's putting the wrong line endings in place.
Wow! Unbelievable. The whole issue was editing in Windows rather than Linux. Thanks for the tip!!
Ugh... learn something new every day...

Question: it's very hard to work in the FreeNAS GUI shell. Is there another editor available in FreeNAS, or a way to edit script files in Windows properly?
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
very hard to work in FreeNAS GUI shell.
Yes. Don't use it. Seriously, forget it exists and use SSH instead.
Is there another editor available in FreeNAS
I know of ee, vi, and nano; there may be others.
a way to edit script files in windows properly?
Yes, what I mentioned in my post above--Notepad++. No doubt there are others as well (I'm sure VSCode would do too, for example), but it works well and is free software.
 

singlemalt8

Dabbler
Joined
Aug 15, 2020
Messages
11
Otherwise, use an editor that can deal with Unix line endings--Notepad++ is good and free. But definitely skip the GUI shell.

I have been using Notepad++ in Windows for years, and that's what I used when I originally tried to edit the script from @NasKar - but that led to the problems (saved as "unix script file", UTF-8 encoding). If you know how to "save as" correctly with Notepad++, that would be a big help. Not sure SSH really helps me, as I would be using an editor in Windows anyway?

Thanks to all on the thread as it got me at least started :smile:
 

singlemalt8

Dabbler
Joined
Aug 15, 2020
Messages
11
Interesting find in Notepad++. Seems like you can change the EOL character to match Linux... Worked perfectly!!
[screenshot showing the EOL conversion setting in Notepad++]
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
Not sure SSH really helps me as I would be using an editor in windows anyway?
No, if you're ssh'd into the server, you'd use one of the editors on the server--ee, vi, or nano (but only use vi if you're a masochist). Or Notepad++ has a plugin to use SFTP, so it can save files directly to the server (even if they aren't in a shared directory), and that can be pretty handy too.
 

singlemalt8

Dabbler
Joined
Aug 15, 2020
Messages
11
No, if you're ssh'd into the server, you'd use one of the editors on the server--ee, vi, or nano (but only use vi if you're a masochist). Or Notepad++ has a plugin to use SFTP, so it can save files directly to the server (even if they aren't in a shared directory), and that can be pretty handy too.
Thanks @danb35. Now that the script is running, I may try SSH so I can use native editors more easily. I appreciate the help!
 

videopete

Dabbler
Joined
Sep 5, 2021
Messages
14
Are these still the only methods of backing up datasets today?
Synology has Hyper Backup, a simple backup program that allows you to back up any data (full, incremental, etc.) to an attached external USB drive and more. Simple and effective. TrueNAS needs to have a backup utility BUILT IN, especially a simple method of backing up all your important data to an externally attached USB drive. Simple yet effective.
 