FN11.3_U1 - install of Ubuntu 18.04 server in VM fails, v16 works

gsrcrxsi

Explorer
Joined
Apr 15, 2018
Messages
86
So I was having some issues with my install of Ubuntu 18.04 Server running in a VM; it was running Pi-hole and nothing else, and it was occasionally locking up. So I wiped it out, deleted the zvol, recreated it, and reinstalled everything. The install goes fine, but after booting the first time, when it gets to the login prompt, it starts throwing all kinds of Input/Output errors. At first I thought this was a failing SSD, but I put the SSD into another system and cannot find a fault with it, and I've also replicated the issue on my other FreeNAS server, also running 11.3-U1. Upgrading to 11.3-U2 did not fix or change the issue in any way.

This is what I'm seeing:
[screenshots: VM console during boot, showing repeated Input/Output errors]


Hardware Server #1 (where I first saw the problem)
MB: Supermicro X9DAi (latest BIOS)
CPUs: 2x E5-2630L v2
RAM: 64GB Reg ECC DDR3-1066
Disk where VM installed: 500GB Samsung 970 EVO M.2 on a PCIe adapter (I tried a separate PCIe adapter as well, same results)

Hardware Server #2 (where I replicated the issue)
MB: Supermicro X9DRi-LN4F+ v1.20 (latest BIOS)
CPUs: 2x E5-2680 v2
RAM: 128GB Reg ECC DDR3-1600
Disk where VM installed: 1.6TB Intel DC P3600 Series SSDPEDME016T401
(I also tried installing a VM to a zvol on the main storage pool with SATA drives, but it made no difference; it failed in exactly the same way.)

I also tried installing Ubuntu 16.04 Server, and that installed fine, but as soon as I upgraded it to 18.04 it broke again.

What's the deal here? I have several Ubuntu 18.04 VMs running fine on Server #2 that were installed when it was still on 11.2 and then migrated to 11.3, but now new installs of 18.04 Server are totally broken on both systems. Only the old ones are still working.

Can anyone help, or even replicate this?
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
Why does everybody use AHCI hard disks in bhyve?
Use virtio with Linux or FreeBSD as the guest OS.
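To make the difference concrete: it comes down to which disk emulation bhyve presents to the guest. A rough sketch of the underlying bhyve flags (the slot number and zvol path are invented for illustration; FreeNAS builds the real command line from the disk mode you pick on the VM's disk device):

# AHCI mode: the guest sees an emulated SATA disk and drives it through its AHCI stack
bhyve ... -s 4,ahci-hd,/dev/zvol/tank/vms/ubuntu-disk0 ...

# VirtIO mode: the guest uses the paravirtualized virtio_blk driver instead
bhyve ... -s 4,virtio-blk,/dev/zvol/tank/vms/ubuntu-disk0 ...

Stock Ubuntu kernels include virtio_blk, so switching the disk mode is normally all that's needed on the guest side.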
 

gsrcrxsi

Explorer
Joined
Apr 15, 2018
Messages
86
That doesn’t explain why VMs created in 11.2 work perfectly fine. Or why Ubuntu 16.04 works fine in the exact same config.
 

tophee

Explorer
Joined
Oct 27, 2011
Messages
71
I seem to be having similar problems. I'm only just starting out with VMs on my FreeNAS build. I'm using Ubuntu Server 18.04 as the base for two VMs: one to be a Pi-hole and the other to run Folding@home.
After I found that I needed to use virtio, things have been a bit smoother: I can install Ubuntu (if I don't update the installer) and get it sort of working. However, since adding the virtual machines I have run into some other networking issues: VMs losing their internet connection after initially working, and being unresponsive to SSH requests (while appearing to work through the built-in noVNC console).
I've also had to reboot my FreeNAS box, and this has caused intermittent IRQ storms that knocked out my home network.
I did wonder if it was my old hardware, but posts like the OP's seem to suggest otherwise.
Running FreeNAS 11.3-U2.
 

0x4161726f6e

Dabbler
Joined
Jul 3, 2016
Messages
19
I have had similar networking issues that I fixed by adding 'promisc' to my network interfaces, both the lagg and its member interfaces.
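For anyone who wants to try the same workaround, roughly (the interface names lagg0/igb0/igb1 are just placeholders for your own lagg and its members; to make it persistent you can probably add "promisc" to the interface's Options field in the FreeNAS network settings instead of running ifconfig by hand):

ifconfig lagg0 promisc
ifconfig igb0 promisc
ifconfig igb1 promisc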

I'm getting the same I/O error; can anyone confirm that switching from AHCI to virtio fixes it?
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
That doesn’t explain why VMs created in 11.2 work perfectly fine. Or why Ubuntu 16.04 works fine in the exact same config.
Of course not. Sorry for having been a bit snarky. To elaborate:

1. I never saw any of these problems myself, neither the disk errors nor the "hanging" cursor, but
2. I used virtio from the start, so that might take care of the first problem.
3. I cannot say anything about the second one, because I don't use graphical environments on my servers, and my Linux VMs are all servers. They do not even have a VNC device; I remove it after installation. Serial console rules! ;)
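In case anyone wants to go the serial console route, this is roughly what it takes (the nmdm device number varies per VM, and ttyS0 at 115200 is an assumption for bhyve's default COM1 port):

# inside the Ubuntu guest: enable a login getty on the serial port
sudo systemctl enable --now serial-getty@ttyS0.service
# add console=ttyS0,115200 to GRUB_CMDLINE_LINUX in /etc/default/grub, then:
sudo update-grub

# on the FreeNAS host: attach to the VM's null-modem device (the number will differ)
cu -l /dev/nmdm1B
# disconnect with ~. (tilde, then dot)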

HTH somewhat,
Patrick
 

gsrcrxsi

Explorer
Joined
Apr 15, 2018
Messages
86
Changing to VIRTIO did help a little.

But then you get other anomalies and idiosyncrasies around mouse function in a GUI-enabled Linux VM when using VIRTIO that weren't present when using AHCI. This is somehow tied to how many cores are enabled in your VM (see the other thread about this).

AHCI, Ubuntu 18 and 16: mouse worked fine with any number of cores assigned to the VM
VIRTIO, Ubuntu 18: mouse only worked with 1 or 3 cores on the VM; anything else and the mouse freezes and is unresponsive
VIRTIO, Ubuntu 16: mouse only worked with 1 core assigned to the VM

It seems like a recent kernel update is what broke everything when using AHCI: my old VMs were working fine, but broke when they rebooted into a kernel update that had been installed. 4.15.0-96 seems to be the culprit. Ubuntu 16 works on the old kernel with AHCI and breaks if you update it; same with Ubuntu 18, which works if you stay on the earlier kernel.
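If anyone needs to stay on the last working kernel for now, something like this should do it on the Ubuntu side (the package names assume the stock 18.04 generic kernel; pick whichever older 4.15.0-xx you still have installed):

# see which kernels are installed and which one is running
dpkg --list 'linux-image-*' | grep '^ii'
uname -r

# keep apt from pulling in newer kernels until this is sorted out
sudo apt-mark hold linux-image-generic linux-headers-generic

# then boot the older kernel from GRUB's "Advanced options for Ubuntu" submenu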

I won't even pretend to understand how the number of cores on a VM can affect mouse functionality, but the bhyve implementation in FreeNAS seems quite buggy as a whole.

I use some VMs without a GUI, specifically the ones running Pi-hole, but I have a few other VMs that need a GUI. Luckily they don't need much horsepower, so I just moved them back to 1 or 3 cores and they are fine for now.
 

McD

Cadet
Joined
Apr 17, 2020
Messages
4
My VMs had issues after updating to a recent 5.5 kernel (I run Fedora 31), and were fine when booting the previous 5.4 kernel.

I was able to bisect the problem down to this kernel change:
https://lore.kernel.org/linux-pci/87imkr4s7n.fsf@nanos.tec.linutronix.de/

For 5.5 kernels this commit was added in stable release 5.5.3; I'm not sure which older kernels it was also backported to. Unfortunately, the commit author is not familiar with bhyve.
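In case it helps anyone else on Fedora, a rough way to pin the working kernel until a fix lands (the vmlinuz path is a placeholder; substitute whichever 5.4.x you still have installed):

# list installed kernels, then make the older one the default boot entry
sudo grubby --info=ALL | grep -E 'title|kernel'
sudo grubby --set-default /boot/vmlinuz-<your-installed-5.4-version>.fc31.x86_64

# and hold off on kernel updates for the moment
sudo dnf upgrade --exclude='kernel*'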
 