ESXi 7.0.b UEFI

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
I tried loading TrueNAS 12 beta 2 and 2.1 on a SuperMicro dual E-2690 v3 system running ESXi 7.0b, with the VM set to UEFI mode. The installation went smoothly, but the VM will never boot afterwards. If I assign 32 cores to the VM, it boots and runs fine.
In Legacy BIOS mode the VM boots with 4 cores.
Any suggestions?
 

Evertb1

Guru
Joined
May 31, 2016
Messages
700
I haven't had time to verify it, but I seem to recall that BIOS firmware is the default for VMs. You should be able to edit the .vmx file of that VM to enable UEFI. Open the .vmx file with an editor and add a line like:
Code:
firmware = "efi"

I am running ESXi 7.0 with a TrueNAS 12 beta VM as well and did not experience this problem, so I am not 100% sure this is the solution, but I have read something about it somewhere. Altering the .vmx file is not difficult, so it's worth a shot; you can always change it back. If you don't know how to open and edit the .vmx file, you can download a tool like vmtweaker. I've never used it, but it seems to do the trick.
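For reference, a rough sketch of how that edit looks from the ESXi shell; the datastore and VM names below are just placeholders, so adjust the path and VM ID to your own setup:

Code:
# From the ESXi shell, list VMs and note the Vmid and the .vmx path in the output
vim-cmd vmsvc/getallvms

# With the VM powered off, edit the file (path here is only an example) and
# add or change the line:  firmware = "efi"
vi /vmfs/volumes/datastore1/TrueNAS/TrueNAS.vmx

# Have ESXi re-read the config, using the numeric ID from getallvms
vim-cmd vmsvc/reload <vmid>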

Edit: The FreeNAS forum has a "FreeNAS & Third-party Software" sub-forum. There is a section there dedicated to hypervisors like VMware. If you are not able to solve this, you might try your luck there.
 

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
Thanks for your reply. I can install in both modes, UEFI and BIOS, but I have to specify EFI or BIOS while creating the VM. If I choose EFI boot for the VM, I have to assign a high core count (32 cores or more) in order for it to boot, while a Legacy BIOS VM boots fine with 4 cores.
The question is not how to boot EFI or BIOS, since I'm experienced with that; the issue is that the EFI VM requires 32 cores, which I find strange.
I hope I was able to explain myself.
 

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
I'm attaching the error message.
 

Attachments

  • Error.PNG

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,681
If I choose EFI boot for the VM, I have to assign a high core count (32 cores or more) in order for it to boot, while a Legacy BIOS VM boots fine with 4 cores.
The question is not how to boot EFI or BIOS, since I'm experienced with that; the issue is that the EFI VM requires 32 cores, which I find strange.

So if you do "Thing A," which almost everyone does, it runs with 4 cores.

And if you do "Thing B," which almost nobody does, it needs 32 cores.

If you're "experienced" with this, then why is there even a question here?

Virtualizing FreeNAS is not going to work 100% with any random settings you decide to pick. You actually do need to use compatible settings....
 

JFisher

Cadet
Joined
Dec 2, 2016
Messages
4
I experienced the same thing attempting to boot FreeNAS 12 on VMware with UEFI, while the same settings worked fine booting FreeNAS 11. I tried both VMware 6.7 and 7.0. I also tried taking a working FreeNAS 11 install and upgrading it to 12, which resulted in an unbootable state. Not sure what the deal was, but it's definitely something that wasn't an issue before.
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,176
Can you be more specific? I'm not sure I've seen a failure mode described in this thread. Does the boot process seemingly hang (if so, can you try the serial console to see if it's just the virtual VGA)? Do you get an error message? And adding cores magically resolves the issue?
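If it helps anyone trying that, output from the VM's serial port can be captured to a file by adding something like this to the .vmx while the VM is powered off (the log path is only an example):

Code:
serial0.present = "TRUE"
serial0.fileType = "file"
serial0.fileName = "/vmfs/volumes/datastore1/TrueNAS/serial.log"

On the FreeBSD/TrueNAS side, console="comconsole,vidconsole" in /boot/loader.conf (or the Serial Console option under System -> Advanced) sends the console to that port, though if the VM dies before the loader even runs there may not be much to capture.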
 

JFisher

Cadet
Joined
Dec 2, 2016
Messages
4
I'd have to rebuild the VM and try it again; at the moment I've switched back to a bare-metal install for testing. From what I recall of the behavior, the boot sequence would barely get started before the VM would error out and shut down.
 

vangoose

Cadet
Joined
Aug 31, 2020
Messages
4
I experienced the same thing attempting to boot FreeNAS 12 on VMware with UEFI, while the same settings worked fine booting FreeNAS 11. I tried both VMware 6.7 and 7.0. I also tried taking a working FreeNAS 11 install and upgrading it to 12, which resulted in an unbootable state. Not sure what the deal was, but it's definitely something that wasn't an issue before.

If you select FreeBSD 12 as the guest OS, it will automatically set the firmware to BIOS, and there is a reason for that, right?
 

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
If you select FreeBSD 12 as the guest OS, it will automatically set the firmware to BIOS, and there is a reason for that, right?
But you can still boot UEFI on bare metal, so the fact that it defaults to BIOS is irrelevant. As far as I know it is a FreeBSD issue; I also tried ESXi 7.0 U1 Beta with TrueNAS 12 RC1 and it's still the same issue.
 

irenedakota

Cadet
Joined
Dec 14, 2018
Messages
1
I can confirm that I saw the same thing with the TrueNAS 12 release. I had FreeNAS 11.3-U5 running perfectly on an ESXi 7.0 (ESXi-7.0b-16324942) VM with 4 cores in EFI mode (I'm running the free ESXi license, so I can't test at 32 cores); after upgrading to TrueNAS 12 it refused to boot. I tried a fresh install as well, with the same symptom.

Exact same output as https://www.truenas.com/community/threads/esxi-7-0-b-uefi.87396/post-605992

What happens is the VM begins booting, gets to that display, and then the VM switches off. There are no logs or other clues in ESXi pointing to a possible cause.

Switching to BIOS mode (in ESXi, and then reinstalling TrueNAS) works 100%.

Obviously not something critical, since there is a simple workaround, but it's still really weird.
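For anyone applying the same workaround, the firmware switch is the same one-line .vmx setting mentioned earlier in the thread, just set the other way (if I remember right, the host client also exposes it under VM Options -> Boot Options), followed by a reinstall of TrueNAS:

Code:
firmware = "bios"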
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,972
Yesterday I installed TrueNAS 12.0-U1 on my ESXi 7.0-U1c server (2 sockets w/1 core per socket, 16GB RAM) and of course used BIOS as I always do (UEFI has just been too problematic for me ever since it was introduced). No issues at all, not even a slight problem. TrueNAS reports that I have 2 threads. I'm not sure where the need for 4 cores came from.
 

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
Yesterday I installed TrueNAS 12.0-U1 on my ESXi 7.0-U1c server (2 sockets w/1 core per socket, 16GB RAM) and of course used BIOS as I always do (UEFI has just been too problematic for me ever since it was introduced). No issues at all, not even a slight problem. TrueNAS reports that I have 2 threads. I'm not sure where the need for 4 cores came from.
This issue is not related to BIOS-enabled VMs; it is specific to EFI-enabled VMs.
 

-cj-

Cadet
Joined
Apr 6, 2014
Messages
8
If anyone needs help working around the bug that TripleEmcoder referenced above, here's what I did:

First, I tried the TrueNAS nightlies, which apparently don't have the fixed EFI image yet. So what I did was install TrueNAS, select UEFI when prompted, and let it reboot when done. You'll notice that the VM powers off due to this issue.

I noticed that pfSense 2.5 installed just fine using EFI firmware in VMware, so I mounted the pfSense ISO in VMware and booted into it. When it loads, select Rescue / Shell.

From the command line, I ran the following commands. In my case, my drive is called da0, but yours might be ada0 or something similar.

# gpart show da0 <-- This shows your partition table. If you don't know what your drive is called, you can omit 'da0', but if you have a lot of drives, be prepared for a long listing...

You'll notice an 'EFI' partition; usually it's in the '1' spot. The following command will re-install the EFI loader, using the working copy from pfSense:

#gpart bootcode -p /boot/boot1.efifat -i1 da0

-i1 references the partition index; in my case it was '1'. da0 is my drive. "-p /boot/boot1.efifat" is the replacement image from the pfSense rescue disk that will get written to your TrueNAS drive.

That's it! I usually issue a 'sync' followed by a 'shutdown -h now'; this gives me time to unmount the CD image. The next time the VM boots, it should load straight into TrueNAS using EFI firmware under VMware. You might want to do this if you're playing around with SR-IOV or have some other requirement; otherwise, just stick to regular BIOS.

Hope this helps someone.
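For convenience, here's the whole sequence from the pfSense rescue shell in one block; 'da0' and partition index '1' are just what my setup had, so verify yours with the first command before writing anything:

Code:
# Identify the TrueNAS boot disk and find its EFI partition index
gpart show da0

# Rewrite the EFI boot partition using the working loader shipped on the pfSense disc
gpart bootcode -p /boot/boot1.efifat -i1 da0

# Flush writes, then halt so the ISO can be unmounted before the next boot
sync
shutdown -h now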
 