9.10.2-U2 to 9.10.2-U3 -> Grub Rescue


cherup

Dabbler
Joined
Nov 21, 2015
Messages
20
I just tried to upgrade to 9.10.2-U3 and after that changed the boot environment to 9.10.2-U3.
Something has probably not updated correctly, because I can no longer boot. GRUB starts in rescue mode with
"i386-pc not found".
I have a RAID 1 installed on the USB boot drives and am not able to do anything (I cannot read or write grub.cfg because of the RAID volume...)

I would be very thankful if someone could help me out of this issue and show me how to boot the old 9.10.2-U2 configuration.
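
So far the only thing I can do is poke around from the grub rescue> prompt, roughly like this (a sketch - the (hd0,gpt2) numbering is a placeholder, use whatever ls actually reports):

ls                            # list the drives/partitions GRUB can see
ls (hd0,gpt2)/                # repeat per partition until one shows a grub directory
set prefix=(hd0,gpt2)/grub    # point GRUB at the modules it failed to find
set root=(hd0,gpt2)
insmod normal                 # load the module the rescue shell is missing
normal                        # brings up the menu, where the 9.10.2-U2 entry can be selected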
 

Kim Iskov

Cadet
Joined
Apr 22, 2017
Messages
4
I can confirm that I have an almost identical issue; however, GRUB shows "unknown filesystem".
In GRUB rescue mode it recognizes my two hard drives, hd0 and hd1, but it does not recognize the USB drive where FreeNAS is installed.
Yes, I boot from the USB drive.
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
cherup said:
I have a RAID 1
Please explain. If the answer is anything but "mirrored with ZFS", get rid of it and do it properly.
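
On a properly set up system this is a one-liner to verify - a sketch, assuming the default boot pool name:

zpool status freenas-boot    # a proper setup shows a mirror vdev with both USB sticks directly under it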

Kim Iskov said:
I can confirm that I have an almost identical issue; however, GRUB shows "unknown filesystem".
In GRUB rescue mode it recognizes my two hard drives, hd0 and hd1, but it does not recognize the USB drive where FreeNAS is installed.
Yes, I boot from the USB drive.
So it's broken, no big surprise there. Reinstall to new media and upload your config.
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
Kim Iskov said:
You seriously mean the USB drive is broken, even though the PC actually boots from it?
Clearly it did not get very far into the boot process, did it? I wouldn't call that "actually booting". Getting to a GRUB prompt probably takes less than 1 MB of loading.

USB flash drives are unreliable as hell.
 

Kim Iskov

Cadet
Joined
Apr 22, 2017
Messages
4
Ericloewe said:
Clearly it did not get very far into the boot process, did it? I wouldn't call that "actually booting". Getting a GRUB prompt needs probably less than 1MB of loading.

USB flash drives are unreliable as hell.

You are right, so far it did not get very far. Usually the MBR or GPT is located in the first 512 bytes of a drive, though it can be more. I assume FreeNAS does not use more than 512 bytes, as that would be more than enough. But stating that the drive is broken is a rather harsh claim without having checked anything. If a drive contains a corrupted partition, it might not be recognized by GRUB. The partition table is also located in those first 512 bytes.
I actually tend toward the view that the update to version 9.10.2-U3 corrupted the partition table, though I have not verified that yet.
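
For what it's worth, this is easy to check from any working FreeBSD/FreeNAS box - a quick sketch, assuming the suspect stick attaches as da0:

gpart show da0                                # prints the partition table; a damaged GPT is flagged CORRUPT
dd if=/dev/da0 bs=512 count=1 | hexdump -C    # raw dump of the first sector (protective MBR / boot code)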
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
Look, dubious USB flash drives are, by far, the number one cause of failed boots after updates.

The update worked fine for me and several others. It's not going to randomly destroy partition tables. However, USB devices are quite good at failing in interesting ways.

Fortunately, the solution is quick and easy. Install to new media and upload the config.
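
If there is no saved config and the old stick is still readable at all, the database can usually be pulled off it first. A rough sketch, not a guaranteed procedure - it assumes the usual 9.x layout with the config at /data/freenas-v1.db and a boot environment dataset named ROOT/default, and it should be run from a rescue shell rather than the running NAS, to avoid two pools named freenas-boot:

zpool import                                     # note the numeric ID of the old boot pool
zpool import -f -N -o readonly=on <ID> oldboot   # import it read-only under a new name (<ID> from above)
mkdir -p /mnt/old
mount -t zfs oldboot/ROOT/default /mnt/old       # mount the active boot environment (name may differ)
cp /mnt/old/data/freenas-v1.db /root/            # this is the file the GUI's config upload expects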

If you really don't want to throw out the flash drive, feel free to test it afterwards and see if it's usable.
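
Even a crude full read is often enough to expose a dying stick - a sketch, assuming it shows up as da0 on a FreeBSD box:

dd if=/dev/da0 of=/dev/null bs=1m conv=noerror   # read every sector; I/O errors will show up in dmesg

A destructive write-then-verify pass is a better test, but it wipes the stick.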
 

cherup

Dabbler
Joined
Nov 21, 2015
Messages
20
Now I have looked into my issue a bit more deeply and I am getting worried that my data might be lost (which I do not understand, because it is on another array).

As I cannot boot from my USB RAID devices, I have installed a new (third) USB stick with a fresh FreeNAS installation. After booting, it seems to have access to my two 'faulty' USB devices, and from the GUI everything looks OK - so /boot is healthy and my Data pool seems healthy... but all data from the Data pool seems to be deleted...

Once I take out the third USB stick I still cannot boot... The most important things for me are the data I have stored and access to the jails I had created... nothing is available right now... any ideas/tips?
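
In case it helps, this is what I plan to check next from the shell on the new stick (I can post the full output if needed):

zpool status -v Data            # pool health and any reported errors
zfs list -r Data                # every dataset and the space it reports used
zfs list -t snapshot -r Data    # any snapshots that might still hold the data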
 

Attachments

  • Boot_partition.PNG
  • Data_part.PNG

cherup

Dabbler
Joined
Nov 21, 2015
Messages
20
So for me the Data/Data/data part is the critical one, which now shows up as empty...
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
How do you figure?
 
Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
How about you post your hardware so that we have some clue of what you're doing?
 

cherup

Dabbler
Joined
Nov 21, 2015
Messages
20
Ericloewe said:
How do you figure?
Because I can access the drive but it is empty - and in the volumes I can see that no data is in there (see attachment). I have not done anything with the array since the crash... (no import, etc.)

Hardware: I have two USB drives in RAID 1 for /boot and two NAS HDDs (RAID 1) for the data.
 

Attachments

  • Data_Details.PNG

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
And the rest of your hardware...?
 

cherup

Dabbler
Joined
Nov 21, 2015
Messages
20
Anything specific?
CPU is an Intel(R) Core(TM) i5-3570T @ 2.30GHz
8 GB RAM...
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
Motherboard, disk controller.
 

William Grzybowski

Wizard
iXsystems
Joined
May 27, 2011
Messages
1,754
How much data did you have in the Data pool, in GB?

Please paste:

zpool history Data
zpool list Data
 

cherup

Dabbler
Joined
Nov 21, 2015
Messages
20
William Grzybowski said:
How much data did you have in the Data pool, in GB?

Please paste:

zpool history Data
zpool list Data
I guess about 2 TB...

zpool history Data (not all of it...):
2017-01-22.02:00:01 zfs snapshot Data/jails/emby_1@auto-20170122.0200-1m
2017-01-22.02:00:01 zfs destroy -r -d Data/jails/emby_1@auto-20161127.0200-1m
2017-01-22.02:00:02 zfs destroy -r -d Data/jails/emby_1@auto-20161218.0200-1m
2017-01-22.02:00:02 zfs destroy -r -d Data/jails/emby_1@auto-20161204.0200-1m
2017-01-22.02:00:07 zfs destroy -r -d Data/jails/emby_1@auto-20161211.0200-1m
2017-01-29.02:00:01 zfs snapshot Data/jails/emby_1@auto-20170129.0200-1m
2017-01-29.02:00:06 zfs destroy -r -d Data/jails/emby_1@auto-20161225.0200-1m
2017-02-05.00:00:09 zpool scrub Data
2017-02-05.02:00:09 zfs snapshot Data/jails/emby_1@auto-20170205.0200-1m
2017-02-05.21:26:30 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 14413126210015302932
2017-02-05.21:26:30 zpool set cachefile=/data/zfs/zpool.cache Data
2017-02-12.02:00:06 zfs snapshot Data/jails/emby_1@auto-20170212.0200-1m
2017-02-13.21:18:14 zfs clone Data/jails/.warden-template-standard@clean Data/jails/transcode
2017-02-13.21:21:09 zfs destroy -fr Data/jails/transcode
2017-02-19.02:00:06 zfs snapshot Data/jails/emby_1@auto-20170219.0200-1m
2017-02-26.02:00:01 zfs snapshot Data/jails/emby_1@auto-20170226.0200-1m
2017-02-26.02:00:06 zfs destroy -r -d Data/jails/emby_1@auto-20170122.0200-1m
2017-03-05.02:00:06 zfs snapshot Data/jails/freebsd@auto-20170305.0200-1m
2017-03-08.20:10:08 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 14413126210015302932
2017-03-08.20:10:23 zpool set cachefile=/data/zfs/zpool.cache Data
2017-03-08.20:14:13 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 14413126210015302932
2017-03-08.20:14:13 zpool set cachefile=/data/zfs/zpool.cache Data
2017-03-12.02:00:06 zfs snapshot Data/jails/freebsd@auto-20170312.0200-1m
2017-03-19.00:00:08 zpool scrub Data
2017-03-19.02:00:08 zfs snapshot Data/jails/freebsd@auto-20170319.0200-1m
2017-03-26.03:00:06 zfs snapshot Data/jails/freebsd@auto-20170326.0300-1m
2017-04-02.03:00:06 zfs snapshot Data/jails/freebsd@auto-20170402.0300-1m
2017-04-09.02:00:06 zfs destroy -r -d Data/jails/freebsd@auto-20170305.0200-1m
2017-04-09.03:00:06 zfs snapshot Data/jails/freebsd@auto-20170409.0300-1m
2017-04-14.21:04:39 zfs clone Data/jails/.warden-template-pluginjail@clean Data/jails/mineos_1
2017-04-16.02:00:06 zfs destroy -r -d Data/jails/freebsd@auto-20170312.0200-1m
2017-04-16.03:00:06 zfs snapshot Data/jails/freebsd@auto-20170416.0300-1m
2017-04-22.09:54:34 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 14413126210015302932
2017-04-22.09:54:34 zpool set cachefile=/data/zfs/zpool.cache Data
2017-04-22.09:59:39 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 14413126210015302932
2017-04-22.09:59:49 zpool set cachefile=/data/zfs/zpool.cache Data
2017-04-22.10:00:39 zfs set mountpoint=none Data/jails/.warden-template-pluginjail
2017-04-22.10:00:39 zfs rename -f Data/jails/.warden-template-pluginjail Data/jails/.warden-template-pluginjail--x86
2017-04-22.10:00:39 zfs set mountpoint=/Data/jails/.warden-template-pluginjail--x86 Data/jails/.warden-template-pluginjail--x86
2017-04-22.10:00:40 zfs set mountpoint=none Data/jails/.warden-template-standard
2017-04-22.10:00:40 zfs rename -f Data/jails/.warden-template-standard Data/jails/.warden-template-standard--x86
2017-04-22.10:00:45 zfs set mountpoint=/Data/jails/.warden-template-standard--x86 Data/jails/.warden-template-standard--x86
2017-04-22.15:31:51 zpool import -o cachefile=none -R /mnt -f 14413126210015302932
2017-04-22.15:32:01 zpool set cachefile=/data/zfs/zpool.cache Data
2017-04-22.15:44:41 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 14413126210015302932
2017-04-22.15:44:41 zpool set cachefile=/data/zfs/zpool.cache Data
2017-04-22.17:05:53 zpool import -o cachefile=none -R /mnt -f 14413126210015302932
2017-04-22.17:05:53 zpool set cachefile=/data/zfs/zpool.cache Data

zpool list Data:
[root@freenas] ~# zpool list Data
NAME SIZE ALLOC FREE EXPANDSZ FRAG CAP DEDUP HEALTH ALTROOT
Data 3.63T 32.7G 3.60T
 

SweetAndLow

Sweet'NASty
Joined
Nov 6, 2013
Messages
6,421
The upgrade you did had very few changes and did not cause your boot issues. Your system has a USB failure. You also keep saying RAID 1 - do you actually have RAID set up? You are also avoiding questions and not telling the whole story. You need to explain how you came to some of these conclusions.

 

Kim Iskov

Cadet
Joined
Apr 22, 2017
Messages
4
Ericloewe said:
Look, dubious USB flash drives are, by far, the number one cause of failed boots after updates.
A statement based on probability does not prove anything; it only indicates whether something could be right or wrong. Yes, USB drives are often dubious compared to HDDs and SSDs. I actually think that is the only conclusion that can be drawn from your statement.

Ericloewe said:
Fortunately, the solution is quick and easy. Install to new media and upload the config.
Let's assume the partition table is broken and not the drive. Could you describe how to read from and write to a drive with a broken partition table without doing anything else?
I know how to copy the config file from a working drive, so you can leave that part out!

UPDATE:
I can now confirm that the drive is working; only the GPT was corrupted. I don't know the reason, but I can recreate the issue by installing 9.10.2-U2 and upgrading it to U3. After U3 has been activated and FreeNAS reboots, the GPT gets corrupted. I even tried this on another working USB drive and the issue was the same. As I said, I don't know the reason, but at least I managed to get the USB drive up and running again.
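
For anyone else hitting this: on FreeBSD, a GPT that gpart flags as corrupt can often be rebuilt in place - a sketch, assuming the stick attaches as da0:

gpart show da0       # prints the table, marked CORRUPT if one of the GPT headers is damaged
gpart recover da0    # rebuilds the damaged primary or backup header from the surviving copy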
 

cherup

Dabbler
Joined
Nov 21, 2015
Messages
20
Ok, after some more investigation it seems that my data is gone... I am still not sure of the reason, but I would like to understand whether I made a mistake and how to behave in such a situation...

So here are the steps I took:

1. I updated my FreeNAS to the latest version, U3
2. Rebooted after the update was successful
3. I noticed that the boot environment was still on U2, so I switched the boot environment to the latest, U3
4. The system would no longer boot (GRUB rescue mode)
5. As I was not able to boot, I installed a fresh FreeNAS U3 installation on a new USB stick (the old ones are still in the system) and booted from the new stick
6. During boot an automatic "zpool import" was started. There seems to have been a problem here, because booting stopped and the console ended in a "db>" shell (as I remember)
7. After another reboot the system came up and my Data zpool and server configuration were available, but no data existed any longer (see the screenshots provided earlier)

zpool status shows "Healthy" for my Data pool - but as there is no data in it, that won't help me.
My understanding was that the data zpool is independent of any boot issues, so that even if there is a boot problem the data partitions could not be affected...
I am not sure what exactly went wrong here, so maybe someone can help me avoid such a problem if it comes up again in the future... of course, if anybody has information on how to get the old data back, that would be perfect, but I am not very optimistic here...
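
The one straw I can still grasp at: if any of the auto snapshots from the zpool history above survived, the data under them should be reachable without touching the live datasets - a sketch using a snapshot name taken from that history (whether it still exists is exactly the question):

ls /mnt/Data/jails/emby_1/.zfs/snapshot/auto-20170212.0200-1m/           # read a snapshot in place
zfs clone Data/jails/emby_1@auto-20170212.0200-1m Data/emby_recovered    # or clone it for recovery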
 