Backup config file every night automatically!

CrossEye

Cadet
Joined
Oct 23, 2013
Messages
5
I love the fact that there are multiple ways to skin this cat. Personally, I like being able to log into the WebGUI, view Tasks / Cron Jobs, and see the contents of every script running on that NAS, versus calling a script from a cron job and not being able to see its contents from the WebGUI.

So, for those who would like a script that runs entirely inside the WebGUI, with a retention window (currently 90 days):

Code:
cp /data/freenas-v1.db /mnt/volume/backups/config-`hostname`-`cat /etc/version | awk -F'[- ()]' '{print $2 "_" $4}'`-`date "+%Y%m%d%H%M"`.db && find /mnt/volume/backups/* -mtime +90 -type f -delete

Outputting: config-hostname.internal.domain.com-9.10.1_d989edd-201608250252.db
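For anyone wondering what the awk part does: it splits the contents of /etc/version on '-', space, '(' and ')', so field 2 is the version and field 4 is the git hash. A quick check on a sample string (the sample is an assumption modelled on the output above):

```shell
#!/bin/sh
# Split a FreeNAS version string the same way the backup one-liner does.
# '[- ()]' is a regex field separator: '-', space, '(' and ')' all split.
echo 'FreeNAS-9.10.1 (d989edd)' | awk -F'[- ()]' '{print $2 "_" $4}'
# → 9.10.1_d989edd
```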
 

Xyrgh

Explorer
Joined
Apr 11, 2016
Messages
69
Just ran across this post and thought I'd throw my own solution in. I call this one from cron as Cyberjock outlined above.

This one will keep as many versions of the freenas-v1.db file as are specified by the VERSIONS variable. The script also sends an email of the backup results, optionally attaching a uuencoded copy of the db file.

I normally use mutt for this sort of thing because it can send MIME attachments, but I see it is not installed by default on FreeNAS.

Code:
#!/bin/bash

SCRIPT=${0##*/}
HOST=$(hostname -s)
SRCEFILE=/data/freenas-v1.db
DESTDIR=/mnt/share/admin/backups
DESTFILE=${DESTDIR}/freenas-v1_$(date +%Y%m%d%H%M%S).db
LOGDIR=/mnt/share/admin/logs
LOGFILE=${LOGDIR}/${SCRIPT}_$(date +%Y%m%d%H%M%S).log
RECIPIENT=me@mycompany.com
MAILCONFIG=n
VERSIONS=30

[ -d ${DESTDIR} ] || mkdir -p ${DESTDIR}

_log() { printf "$(date): ${*}\n" >>${LOGFILE} 2>&1; }

_mail() { mailx -s "${SUBJECT}" ${RECIPIENT} < ${LOGFILE}; }

if cp ${SRCEFILE} ${DESTFILE}; then
    _log "PASS: cp ${SRCEFILE} ${DESTFILE}."
else
    SUBJECT="FAIL: ${HOST}: ${SCRIPT}: cp ${SRCEFILE} ${DESTFILE}."
    _log "${SUBJECT}"
    _mail
    exit 1
fi

DATAFILES=$(ls -1rt ${DESTFILE%_*}*)
LOGFILES=$(ls -1rt ${LOGDIR}/${SCRIPT}_*)
for FILES in "${DATAFILES}" "${LOGFILES}"; do
    NUM=$(echo "${FILES}" | wc -l)
    if [ ${NUM} -gt ${VERSIONS} ]; then
        TODELETE=$((${NUM} - ${VERSIONS}))
        for file in $(echo "${FILES}" | head -${TODELETE}); do
            if rm -f ${file}; then
                _log "PASS: rm -f ${file}."
            else
                _log "FAIL: rm -f ${file}."
            fi
        done
    fi
done

if grep "FAIL" ${LOGFILE} >/dev/null 2>&1; then
    RESULT=FAIL
else
    RESULT=PASS
fi

if [ "${MAILCONFIG:=n}" = "y" ]; then
    printf "\n" >>${LOGFILE}
    uuencode ${DESTFILE} ${DESTFILE##*/} >>${LOGFILE}
fi

SUBJECT="${RESULT}: ${HOST}: ${SCRIPT}."
_mail

Hey mate, I'm using your script as it suits what I want, but can you tell me what I need to edit so it doesn't send emails? I don't care much for the emails, so I want to turn them off. Maybe even add a flag so that email is optional?
 

KevinM

Contributor
Joined
Apr 23, 2013
Messages
106
Hey mate, I'm using your script as it suits what I want, but can you tell me what I need to edit so it doesn't send emails? I don't care much for the emails, so I want to turn them off. Maybe even add a flag so that email is optional?
Just comment out the _mail call at the bottom of the script, like this: #_mail
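If you'd rather have the flag you asked about than a commented-out line, you could guard the call with a variable. A minimal sketch, assuming a new MAILON variable (not part of the original script; the real _mail body stays as-is):

```shell
#!/bin/sh
# Sketch: make email optional via a flag. MAILON is an assumed name.
MAILON=n    # set to "y" to re-enable mail

_mail() { echo "mail sent"; }    # stand-in for the real mailx call

# Guard the call at the bottom of the script instead of commenting it out
if [ "${MAILON}" = "y" ]; then
    _mail
else
    echo "mail skipped"
fi
```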
 

Jacopx

Patron
Joined
Feb 19, 2016
Messages
367
Yeah, I think the parentheses and/or spaces are causing problems. Try my script located earlier in this thread, it strips parentheses and replaces the spaces with a '-'.

EDIT: I've posted the script here on pastebin.

Thanks! I picked up your code and edited it a little! [emoji6]
 

Convict

Dabbler
Joined
Aug 3, 2016
Messages
20
I have a little problem here...

I just followed cyberjock's tutorial step by step, but nothing happens.
By "nothing happens" I mean:

  1. I created a file in a new dataset called Config_Bkp; the exact path is /mnt/Pool_A/Config_Bkp/bkpconfig.sh
  2. The line in bkpconfig.sh is: cp /data/FreeNAS-v1.db /mnt/Pool_A/Config_Bkp/`date +%Y%m%d`.db
  3. I created a cron job with user root and the command: sh /mnt/Pool_A/Config_Bkp/bkpconfig.sh
  4. After the job ran, no file was created in the dataset.
I also tried running the command directly from the shell, but it returns "No such file or directory".

My current build is FreeNAS-9.10.1-U4.

Any ideas?
 

Glorious1

Guru
Joined
Nov 23, 2014
Messages
1,211
Does the script run if you execute directly (not through cron)?
Does the command itself work if you execute it directly (not in a script)?
I would suggest cutting out the middleman, just put the working cp command straight into a cron task, forget the script.
 

Convict

Dabbler
Joined
Aug 3, 2016
Messages
20
Does the script run if you execute directly (not through cron)?
Does the command itself work if you execute it directly (not in a script)?
I would suggest cutting out the middleman, just put the working cp command straight into a cron task, forget the script.

Nope, same behavior even without the script.

Is the path to the FreeNAS-v1.db file correct?

It's cp /data/FreeNAS-v1.db, right? Or is it stored somewhere else? That would explain why it gives me back "No such file or directory".
 

Spearfoot

He of the long foot
Moderator
Joined
May 13, 2015
Messages
2,478
Nope, same behavior even without the script.

Is the path to the FreeNAS-v1.db file correct?

It's cp /data/FreeNAS-v1.db, right? Or is it stored somewhere else? That would explain why it gives me back "No such file or directory".
Case matters... in my backup script, the .db filename is all lower case, not /data/FreeNAS-v1.db.

You can determine the .db filename by pulling a listing of the /data directory contents: ls /data
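The difference is easy to see in isolation. A quick sketch, using a temp dir in place of /data (on FreeNAS's ZFS, file names are case-sensitive, so the two spellings are different paths):

```shell
#!/bin/sh
# Sketch: demonstrate why /data/FreeNAS-v1.db fails while
# /data/freenas-v1.db works. A temp dir stands in for /data.
DIR=$(mktemp -d)
touch "$DIR/freenas-v1.db"          # the real name is all lower case

[ -f "$DIR/freenas-v1.db" ] && echo "lower case: found"
# On a case-sensitive filesystem the mixed-case lookup fails:
[ -f "$DIR/FreeNAS-v1.db" ] || echo "mixed case: not found"

rm -rf "$DIR"
```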
 

Convict

Dabbler
Joined
Aug 3, 2016
Messages
20
Case matters... in my backup script, the .db filename is all lower case, not /data/FreeNAS-v1.db.

You can determine the .db filename by pulling a listing of the /data directory contents: ls /data

The path /data/freenas-v1.db is right.

In my case, the Case was the problem.
I never thought it could cause so much trouble, to be honest.
Works like a charm.
It was the only tweak missing from my setup: a daily backup of the config.

Your advice was valuable. Thank you.
 

Scharbag

Guru
Joined
Feb 1, 2012
Messages
620
So if you wanted to copy your configuration and then delete all but the most recent 5 days (120 hours to be exact) all on the same CRON job:

Code:
cp /data/FreeNAS-v1.db "/mnt/tank/`date \+%c`.db" && find /mnt/tank/ -mtime +5 -exec rm {} \;

Yes, nothing special: I just joined the two commands so that the first must finish successfully before the second runs. I could have used a semicolon instead of the double ampersands, but then if the copy fails you could end up with crap partial files while deleting the older good ones. It's personal preference. I also didn't care for a plain date code, so I used a different format for the file name: instead of "20150527.db" my file name is "Wed May 27 14:11:26.db". You can customize the name easily enough. And if, say, you ran this cron job more than once a day (lord knows why someone would), it will not overwrite the same file name.

If you want a different retention period, just change the "+5" value to the number of 24-hour periods you would like. You can also specify it in weeks; this gives you 1 week (168 hours from the time you run the command):
Code:
cp /data/FreeNAS-v1.db "/mnt/tank/`date \+%c`.db" && find /mnt/tank/ -mtime +1w -exec rm {} \;


I wanted to give you a true 5 backups rather than 120 hours, but my daughter is bugging me to make dinner, so I have to stop now to ensure she doesn't starve.

EDIT: After thinking about this overnight, I asked myself: why 5 days or 1 week, why not 30 days? Let's face it, by the time you realize something is wrong, a long time may have passed. I'd change the deletion to retain as many copies as you can handle. I'm going with 30 days for my system; someone else might elect for 60 or 90 days. The files are not that large to store.

Simpler syntax, IMHO, is:

find /path/to/db/backup/location/*.db -type f -mtime +30d -delete

Does the same thing but uses the built-in -delete capability of the find command :)
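Putting the copy and the 30-day prune together, the whole cron command boils down to one line. A runnable sketch, with temp paths standing in for the real ones (/data/freenas-v1.db and a dataset such as /mnt/tank/configs, both placeholders here):

```shell
#!/bin/sh
# Copy-then-prune in one command; temp paths stand in for the real ones.
SRC=$(mktemp)                       # stand-in for /data/freenas-v1.db
BACKUPS=$(mktemp -d)                # stand-in for the backup dataset
echo "fake config" > "$SRC"

# && ensures the prune only runs when the copy succeeded
cp "$SRC" "$BACKUPS/config-$(date +%Y%m%d%H%M).db" \
  && find "$BACKUPS" -name '*.db' -type f -mtime +30 -delete

ls "$BACKUPS"                       # today's backup survives the prune
rm -rf "$BACKUPS"; rm -f "$SRC"
```

Note that the unit suffixes (+30d, +1w) are FreeBSD find syntax; GNU find expects a bare number of days.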

Cheers,
 

diedrichg

Wizard
Joined
Dec 4, 2012
Messages
1,319
After upgrading to 9.10.2, I started receiving
Code:
cp: (a476f16)_.db is not a directory
and my database was not being saved.
The following fixes that.

I decided to implement the script from post #73 but changed a few things. I took out the random number, which was generating an error. I changed the delete line to 90 days, and commented out the hostname line to remove it from the output.

This script outputs: 2017-01-06-FreeNAS-9.10.2 (a476f16).db
Credit to @Tom_ & @Spearfoot
Code:
#!/bin/bash

SRC_FILE="/data/freenas-v1.db"

# name the backup file
DEST_DIR="/mnt/dvgmar/FreeNAS_Config_Backups"
# DEST_HOST=$(hostname -s)
DEST_VERSION=$(cat /etc/version)
DEST_DATE=$(date +%Y-%m-%d)
DEST_FILE="$DEST_DATE"-"$DEST_VERSION".db

# make a copy
cp "$SRC_FILE" "$DEST_DIR"/"$DEST_FILE"

# remove files older than 90 days ending in ".db"
find "$DEST_DIR" -type f -mtime +90d -name "*.db" -delete
 

Ryan Allen

Explorer
Joined
Oct 11, 2016
Messages
93
Probably a silly question, but I've just started messing around with scripts over the past few weeks. I have the ones that I want to use working for me, so that's good, but one question I still have:
When I make a directory via SSH ('mkdir') and then use it to store my scripts, e.g. 'mkdir /mnt/TestDrive/.scripts', I am only able to see that directory using SSH. I do not see it as a dataset in the web interface. I would like to be able to copy script files more easily from a PC via a shared dataset, not just using SSH.

Do I need to make this folder with the web interface and share it like the other datasets I access from a PC, or do scripts have to run in a directory that can only be accessed via the shell or SSH?

Thanks in advance everyone for the help!
 

Spearfoot

He of the long foot
Moderator
Joined
May 13, 2015
Messages
2,478
Probably a silly question, but I've just started messing around with scripts over the past few weeks. I have the ones that I want to use working for me, so that's good, but one question I still have:
When I make a directory via SSH ('mkdir') and then use it to store my scripts, e.g. 'mkdir /mnt/TestDrive/.scripts', I am only able to see that directory using SSH. I do not see it as a dataset in the web interface. I would like to be able to copy script files more easily from a PC via a shared dataset, not just using SSH.

Do I need to make this folder with the web interface and share it like the other datasets I access from a PC, or do scripts have to run in a directory that can only be accessed via the shell or SSH?

Thanks in advance everyone for the help!
Yes, you should use the web interface and create a dataset for your scripts. I use one called 'sysadmin'.

NOTE: Since you've been creating directories outside of the web interface, it would be a good idea to save off a copy of any files in these directories before 'recreating' them.
 

Ryan Allen

Explorer
Joined
Oct 11, 2016
Messages
93
Yes, you should use the web interface and create a dataset for your scripts. I use one called 'sysadmin'.

NOTE: Since you've been creating directories outside of the web interface, it would be a good idea to save off a copy of any files in these directories before 'recreating' them.
Ok great to hear!

And what do you mean by "save off a copy"? Like go to the script file, copy the script, and paste it somewhere on my PC before making a new one? I'm not good with command lines just yet... still learning.
 

Spearfoot

He of the long foot
Moderator
Joined
May 13, 2015
Messages
2,478
Ok great to hear!

And what do you mean by "save off a copy"? Like go to the script file, copy the script, and paste it somewhere on my PC before making a new one? I'm not good with command lines just yet... still learning.
I mean save a copy to your PC's hard drive. Or someplace else. Just so you'll have a copy in case something bad happens...
 

Xyrgh

Explorer
Joined
Apr 11, 2016
Messages
69
I save all my scripts to Google Drive and to my own github repo, purely in case I lose them. Some of the scripts I've had for years and would have no idea how to get them back.
 

Ryan Allen

Explorer
Joined
Oct 11, 2016
Messages
93
Thanks guys! I'll make some changes right away! I appreciate all of you helping out people like me that don't know much... :)
 

Glorious1

Guru
Joined
Nov 23, 2014
Messages
1,211
When I make a directory via SSH ('mkdir') and then use it to store my scripts, e.g. 'mkdir /mnt/TestDrive/.scripts', I am only able to see that directory using SSH. I do not see it as a dataset in the web interface. I would like to be able to copy script files more easily from a PC via a shared dataset, not just using SSH.
I have a dataset that is also my user's home directory. It is shared so that I can access it with my Mac GUI-wise. There is a subdirectory (not a dataset) called bin that has my scripts. You can make such a directory in your Windows GUI or in SSH, as you did before. Since it is in your shared directory, it will be accessible through your share.

For me this is very convenient. When I go in via SSH, I land in my home directory and have ready access to my scripts. Same thing when I access the share in the Finder: the scripts are right there, so I can copy them to my Mac or edit them with TextWrangler.
 

Ryan Allen

Explorer
Joined
Oct 11, 2016
Messages
93
Perfect! I will set it up that way as well! Thank you very much!
 

Faluzure

Explorer
Joined
Oct 9, 2014
Messages
67
Anyone know where the new DB is stored in Corral?
 