Lost Data in Upgrade to 11.2

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194

CheckYourSix

Dabbler
Joined
Jun 14, 2015
Messages
19
Well, add me to the list too. I upgraded from 11.1-U5 to 11.2-U2 via the web UI. It downloaded, did its thing, rebooted, and never came back up. The bootloader got messed up during the process. I pulled one of the USB sticks, mounted it to a VM, pulled my config DB from it, did a fresh install of 11.1-U7 on the other USB stick, and imported my config. Once I imported the pool, it showed 39.9 TiB free (the entire pool) and only 1 GB used. All of my data is gone. Luckily I have backups of the critical stuff, but I'll have to get the media and other things again.
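For anyone else who ends up in the same spot, here's a rough sketch of pulling the config off a boot stick from a VM or another FreeBSD box (device names and mount paths are assumptions; the boot device is a ZFS pool named freenas-boot, and the config database is named freenas-v1.db):
Code:
# import the boot pool read-only under an alternate root so nothing gets written to it
zpool import -f -o readonly=on -R /mnt/usbstick freenas-boot
# locate the config database inside the boot environment, copy it somewhere safe, then export
find /mnt/usbstick -name freenas-v1.db
zpool export freenas-boot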
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
Do I understand correctly that you did not have snapshots?
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
The fact that FreeNAS is wiping data on upgrades is troublesome, to say the least--more so since it doesn't seem that the cause has been isolated. But it boggles my mind that people don't set up snapshots on their servers--they're quick, they're simple, and they shouldn't take a great deal of storage space if you're managing things properly.
 

cods69

Explorer
Joined
Sep 11, 2016
Messages
50
The fact that FreeNAS is wiping data on upgrades is troublesome, to say the least--more so since it doesn't seem that the cause has been isolated. But it boggles my mind that people don't set up snapshots on their servers--they're quick, they're simple, and they shouldn't take a great deal of storage space if you're managing things properly.
Has anyone who has lost all data in these upgrades actually been able to recover everything via the snapshots?
Maybe a dumb question, but I'm wondering whether the (local) snapshots are vanishing as well.
 
Joined
Jan 4, 2014
Messages
1,644
it doesn't seem that the cause has been isolated
As a precaution, would it help to detach all volumes before attempting an update from legacy FreeNAS to 11.2? If it would, it might be sensible to issue an advisory to that effect.
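For what it's worth, detaching a volume in the legacy UI boils down to a zpool export, which from the shell would look something like this (pool name is an example):
Code:
zpool export mediapool
The pool can then be re-imported after the update, either from the UI or with zpool import mediapool.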
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
Has anyone who has lost all data in these upgrades actually been able to recover everything via the snapshots?
Yes, and I think that's been noted in this thread--anyone who had snapshots has been able to roll back to those.
would it help to detach all volumes before attempting an update from legacy FreeNAS to 11.2?
I'd expect it would, but I don't know where the downloaded data gets stored. OTOH, "do a recursive snapshot of your entire pool" seems like good advice.
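For anyone following along, that recursive snapshot is a one-liner (pool and snapshot names are examples):
Code:
zfs snapshot -r mediapool@pre-11.2-upgrade
The -r flag snapshots the named dataset and every dataset beneath it, so running it against the pool's root dataset covers everything.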
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
I don't know where the downloaded data gets stored.
System dataset, it turns out. That will probably mean "boot device", if the main pool is gone.
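If anyone wants to confirm which pool is hosting their system dataset before upgrading, it shows up as a .system dataset, so something like this should find it:
Code:
zfs list -o name | grep '\.system'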
 

ben-efiz

Cadet
Joined
Feb 21, 2019
Messages
9
Has anyone who has lost all data in these upgrades actually been able to recover everything via the snapshots?
Maybe a dumb question, but I'm wondering whether the (local) snapshots are vanishing as well.
Yes, I was able to recover from snapshots. The snapshot data was completely OK (in my case it was just not up to date, but yeah, that's a lesson learned about doing updates).
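For anyone checking their own pools, listing the snapshots is quick (pool name is an example):
Code:
zfs list -t snapshot -r mediapool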

EDIT: also see my other thread https://www.ixsystems.com/community...er-upgrade-to-11-2-u2-and-volume-import.74056
 

SynbiosVyse

Dabbler
Joined
May 9, 2016
Messages
17
As a precaution, would it help to detach all volumes before attempting an update from legacy FreeNAS to 11.2? If it would, it might be sensible to issue an advisory to that effect.

Is there an argument as to why this *isn't* the default behavior of the update scripts?
 

Daryle

Dabbler
Joined
Jan 26, 2017
Messages
13
Throw my hat into the ring. I upgraded last night and everything seemed to be OK. This morning when I sat down to tinker, there was a screen saying "Connecting to NAS.... Make sure the NAS system is powered on..." I rebooted, and my server console would just hang saying it couldn't find /etc/netcli.sh. Over the course of several hours I tried to fall back to 11.1-U7. I ended up having to install a fresh 11.2 via an ISO. Once that was complete, I imported the volumes, but nothing was in them. I have 2 pools and both show space being used.
 

nojohnny101

Wizard
Joined
Dec 3, 2015
Messages
1,478
This thread keeps growing, and that's alarming. I would think this would get top priority within the dev team to figure out why this is happening. Any response from iXsystems?
 

SynbiosVyse

Dabbler
Joined
May 9, 2016
Messages
17
Throw my hat into the ring. I upgraded last night and everything seemed to be OK. This morning when I sat down to tinker, there was a screen saying "Connecting to NAS.... Make sure the NAS system is powered on..." I rebooted, and my server console would just hang saying it couldn't find /etc/netcli.sh. Over the course of several hours I tried to fall back to 11.1-U7. I ended up having to install a fresh 11.2 via an ISO. Once that was complete, I imported the volumes, but nothing was in them. I have 2 pools and both show space being used.

If the zpools are showing space being used, then it might be a different problem.
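A quick way to sanity-check that at the pool level, independent of what's mounted where, is:
Code:
zpool list
The ALLOC column shows how much space each pool actually has allocated.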
 

Daryle

Dabbler
Joined
Jan 26, 2017
Messages
13
If the zpools are showing space being used, then it might be a different problem.

I'm at work and can't post the actual data output, but when I do a
Code:
df -h
the used space reports back as 0% for each folder. When I do a
Code:
zfs list
(I think that was the command) it displays the space as being used and the sizes are what I would expect.
I was taking snapshots and those are still present, but they show no significant data size.
I checked the
Code:
zpool history
and this didn't show anything weird. Admittedly, I am not familiar with this log, but what I did see during and after the upgrade was similar to previously logged events. I also checked whether the pools might have been mounted in a folder other than /mnt.
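For reference, the mountpoints can be checked per dataset (pool name is an example):
Code:
zfs get -r mountpoint mediapool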

What strikes me the most is that if the arrays had been formatted or somehow reconfigured, I would imagine all the datasets would also have been blitzed. I had 2 storage pools, both RAIDZ1, and both have the same issue. Could the recent update have changed some metadata in the disk headers?

I do have onsite replication, so my stuff is backed up. I'm very curious to understand what caused this issue and whether there's a way to recover. I ultimately performed a clean install yesterday but haven't touched much since. If there's anything I can upload to the case that will help the devs, let me know and I will do so.
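If it comes to restoring from that replica, a minimal sketch using send/receive (pool, dataset, and snapshot names are all examples; -F on the receiving side rolls the target back to match the incoming stream):
Code:
# run on the backup box, pushing the dataset back to the rebuilt server
zfs send -R backuppool/shared_movies@auto-latest | ssh root@freenas zfs receive -F mediapool/shared_movies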
 

SynbiosVyse

Dabbler
Joined
May 9, 2016
Messages
17
The general consensus is that the deletions come from a POSIX command (e.g., "rm -rf /"). Your snapshots are likely still taking up space on the zpool because they contain 100% of your data. If you roll back to a snapshot, you should find the data. But keep in the back of your mind that you may still need to restore from backups.

We have yet to encounter a person who has everything mirrored/backed up and can risk running the upgrade again so the issue can be duplicated.
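For the rollback itself, a minimal sketch (dataset and snapshot names are examples). Note that zfs rollback only goes to the most recent snapshot unless you add -r, which destroys any snapshots newer than the target:
Code:
zfs list -t snapshot -r mediapool/shared_movies
zfs rollback mediapool/shared_movies@auto-20190301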
 

Daryle

Dabbler
Joined
Jan 26, 2017
Messages
13
The general consensus is that the deletions come from a POSIX command (e.g., "rm -rf /"). Your snapshots are likely still taking up space on the zpool because they contain 100% of your data. If you roll back to a snapshot, you should find the data. But keep in the back of your mind that you may still need to restore from backups.

We have yet to encounter a person who has everything mirrored/backed up and can risk running the upgrade again so the issue can be duplicated.

Tonight when I get home I'll try and see if I can roll back to a snapshot.

I'd need a couple of days to copy the data back from my local replica, but once that's done, I could try the process again, if that would help?
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
I'm at work and can't post the actual data output, but when I do a
Code:
df -h
the used space reports back as 0% for each folder. When I do a
Code:
zfs list
(I think that was the command) it displays the space as being used and the sizes are what I would expect.
What you should be looking at is zfs list -o space -r poolname, where poolname is obviously the name of your pool. The space shortcut lists the most relevant space usage data.
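Per the zfs(8) man page, -o space is a shortcut for roughly the following:
Code:
zfs list -o name,avail,used,usedsnap,usedds,usedrefreserv,usedchild -t filesystem,volume -r poolname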

I could try the process again. If that would help?
Possibly; nobody's been able to reproduce this with any semblance of repeatability. I would like to ask that you capture the console output and, if possible, set up a syslog server to try to preserve the logs. The devs also have some requests in the bug ticket for this.
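On the syslog front, a minimal sketch of a receiver on another machine (this assumes rsyslog on the receiving end; FreeNAS just needs that machine's address entered as the remote syslog server in its system settings):
Code:
# /etc/rsyslog.conf on the receiving machine: accept syslog over UDP port 514
module(load="imudp")
input(type="imudp" port="514")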
 

Daryle

Dabbler
Joined
Jan 26, 2017
Messages
13
What you should be looking at is zfs list -o space -r poolname, where poolname is obviously the name of your pool. The space shortcut lists the most relevant space usage data.


Possibly; nobody's been able to reproduce this with any semblance of repeatability. I would like to ask that you capture the console output and, if possible, set up a syslog server to try to preserve the logs. The devs also have some requests in the bug ticket for this.

Here is what I see when running
zfs list -o space -r poolname
Code:
root@freenas[~]# zfs list -o space -r mediapool
NAME                                                        AVAIL   USED  USEDSNAP  USEDDS  USEDREFRESERV  USEDCHILD
mediapool                                                   3.26T  7.28T         0    166K              0      7.28T
mediapool/.bhyve_containers                                 3.26T   141K         0    141K              0          0
mediapool/.system                                           3.26T  61.8M         0    166K              0      61.7M
mediapool/.system-1eaf0a35                                  3.26T  1.03G         0   1.03G              0          0
mediapool/.system/configs-810048d7feed436fae88f4409435135f  3.26T  4.62M         0   4.62M              0          0
mediapool/.system/configs-9d613bc4d69d4caa9ab03b2439285b53  3.26T   141K         0    141K              0          0
mediapool/.system/configs-c2484c3f88124e51b79845e4fb993a70  3.26T   268K         0    268K              0          0
mediapool/.system/cores                                     3.26T  12.4M         0   12.4M              0          0
mediapool/.system/rrd-810048d7feed436fae88f4409435135f      3.26T   153K         0    153K              0          0
mediapool/.system/rrd-9d613bc4d69d4caa9ab03b2439285b53      3.26T  12.8M         0   12.8M              0          0
mediapool/.system/rrd-c2484c3f88124e51b79845e4fb993a70      3.26T  16.8M         0   16.8M              0          0
mediapool/.system/samba4                                    3.26T   390K         0    390K              0          0
mediapool/.system/syslog-810048d7feed436fae88f4409435135f   3.26T  13.2M         0   13.2M              0          0
mediapool/.system/syslog-9d613bc4d69d4caa9ab03b2439285b53   3.26T   454K         0    454K              0          0
mediapool/.system/syslog-c2484c3f88124e51b79845e4fb993a70   3.26T   377K         0    377K              0          0
mediapool/.system/webui                                     3.26T   141K         0    141K              0          0
mediapool/HomeDir                                           3.26T   141K         0    141K              0          0
mediapool/iocage                                            3.26T  4.82M         0   4.00M              0       844K
mediapool/iocage/download                                   3.26T   141K         0    141K              0          0
mediapool/iocage/images                                     3.26T   141K         0    141K              0          0
mediapool/iocage/jails                                      3.26T   141K         0    141K              0          0
mediapool/iocage/log                                        3.26T   141K         0    141K              0          0
mediapool/iocage/releases                                   3.26T   141K         0    141K              0          0
mediapool/iocage/templates                                  3.26T   141K         0    141K              0          0
mediapool/jails                                             3.26T   626M         0    153K              0       626M
mediapool/jails/.warden-template-pluginjail-11.0-x64        3.26T   625M      621M   3.68M              0          0
mediapool/jails/nextcloud_1                                 3.26T   396K         0    396K              0          0
mediapool/jails/plexmediaserver_1                           3.26T   435K         0    435K              0          0
mediapool/shared_3d                                         3.26T   122G      122G    179K              0          0
mediapool/shared_data                                       3.26T   378G      378G    153K              0          0
mediapool/shared_movies                                     3.26T  5.47T     5.47T    243K              0          0
mediapool/shared_music                                      3.26T  78.4G     78.4G    230K              0          0
mediapool/shared_pictures                                   3.26T   179G      179G    141K              0          0
mediapool/shared_tv                                         3.26T  1.06T     1.06T    192K              0          0
root@freenas[~]#


Code:
root@freenas[~]# zfs list -o space -r personalpool
NAME                                                           AVAIL   USED  USEDSNAP  USEDDS  USEDREFRESERV  USEDCHILD
personalpool                                                   6.88T   163G         0   26.6K              0       163G
personalpool/.system                                           6.88T  4.43M         0   32.0K              0      4.40M
personalpool/.system/configs-c2484c3f88124e51b79845e4fb993a70  6.88T  29.3K         0   29.3K              0          0
personalpool/.system/cores                                     6.88T   310K         0    310K              0          0
personalpool/.system/rrd-c2484c3f88124e51b79845e4fb993a70      6.88T  3.91M         0   3.91M              0          0
personalpool/.system/samba4                                    6.88T  66.6K         0   66.6K              0          0
personalpool/.system/syslog-c2484c3f88124e51b79845e4fb993a70   6.88T  65.3K         0   65.3K              0          0
personalpool/.system/webui                                     6.88T  29.3K         0   29.3K              0          0
personalpool/.vm_cache                                         6.88T   117K         0   29.3K              0      87.9K
personalpool/.vm_cache/boot2docker                             6.88T  87.9K         0   29.3K              0      58.6K
personalpool/.vm_cache/boot2docker/initrd                      6.88T  29.3K         0   29.3K              0          0
personalpool/.vm_cache/boot2docker/vmlinuz64                   6.88T  29.3K         0   29.3K              0          0
personalpool/DBCloud                                           6.88T  22.3G     22.3G   30.6K              0          0
personalpool/images                                            6.88T  29.3K         0   29.3K              0          0
personalpool/personal                                          6.88T   120G      120G   34.6K              0          0
personalpool/vm-storage                                        6.88T  20.3G         0   29.3K              0      20.3G
personalpool/vm-storage/DBixHDD                                6.89T  20.3G         0   3.51G          16.8G          0
root@freenas[~]#
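Worth noting in that output: USEDDS is near zero on the shared_* datasets while USEDSNAP holds essentially all the data, which fits the picture of the live filesystems being emptied while the snapshots kept everything. That means individual files should be readable straight out of the hidden snapshot directory, without a full rollback (snapshot and file names are examples):
Code:
ls /mnt/mediapool/shared_movies/.zfs/snapshot/
cp -a /mnt/mediapool/shared_movies/.zfs/snapshot/auto-20190301/SomeMovie.mkv /mnt/mediapool/shared_movies/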
 