Virtual Machine and Disk location

Status
Not open for further replies.

steven6282

Dabbler
Joined
Jul 22, 2014
Messages
39
Hey everyone,
I'm building a new FreeNAS-based server. I was messing around earlier with FreeNAS 11.2 BETA, and the VM support in it seems a lot more fleshed out than in 11.1, so I guess that's one of the things being worked on for 11.2. However, I ran into a problem when trying to create a VM: a Python "index out of range" error. I figured this must be due to the beta state, so I installed 11.1 STABLE. Now in 11.1 STABLE I apparently have to manually create all the devices for the VM, but when I go to add a disk, it only gives me the option to select a ZVol. Why would this be limited to ZVols only?
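For context, the GUI's "add disk" step corresponds to backing the VM with a ZVol on a pool. A rough shell sketch of what that amounts to (the pool and volume names here are hypothetical, not from my system):

```shell
# Create a 20 GiB ZVol on an assumed pool "tank" to back a VM disk.
zfs create -V 20G tank/vpn-root

# Confirm the volume exists.
zfs list -t volume
```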

I configured my new system with a 240 GB M.2 NVMe drive as the boot drive, with the intention of running some VMs on it as well. The plan was to run the root file systems of the guest VMs on this drive, and any critical data stored in them would live on a mount from a share on my storage array. That way I benefit from the speed of the SSD while keeping the actual data secure. Some VMs simply don't need data redundancy, like a small VPN tunnel that has no data to store; there's no reason to use space on a ZVol for that VM. I suppose I could run FreeNAS from a USB drive so that it treats this SSD as a valid usable disk for other things, but it seems silly to put FreeNAS on a cheap, slow USB stick. I've also had 3 USB sticks fail on my current FreeNAS system before I moved it to a small SSD; USB sticks just aren't built to handle 24x7 reads and writes, in my experience.

So, is there any way to force the system to let me put a VM disk on the boot disk?
 

kdragon75

Wizard
Joined
Aug 7, 2016
Messages
2,457
You CANNOT use the boot drive for anything other than a boot drive. There is also no point in booting from fast storage; it will have NO effect on the daily performance of your system. Use mirrored USB drives or a small *cheap* SSD to boot, and the NVMe for VMs/SLOG. As for the small VMs like your VPN software, why not place them on your main pool? It's tiny.
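Attaching the NVMe device as a SLOG is a one-liner (pool and device names below are assumptions for your setup; note a SLOG only helps synchronous writes):

```shell
# Add the NVMe device as a separate log (SLOG) vdev to an assumed pool "tank".
# "nvd0" is FreeBSD's usual NVMe disk name; adjust for your system.
zpool add tank log nvd0

# Verify the log vdev shows up in the pool layout.
zpool status tank
```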
 

steven6282

Dabbler
Joined
Jul 22, 2014
Messages
39
You CANNOT use the boot drive for anything other than a boot drive. There is also no point in booting from fast storage; it will have NO effect on the daily performance of your system. Use mirrored USB drives or a small *cheap* SSD to boot, and the NVMe for VMs/SLOG. As for the small VMs like your VPN software, why not place them on your main pool? It's tiny.

I see now that I cannot, but that doesn't explain why I cannot. I don't know the inner workings of FreeNAS itself, but I'm pretty sure FreeBSD, which FreeNAS is built on, doesn't have any such restriction on using the boot drive for other things. It's not about any performance advantage from running FreeNAS on the SSD; it's about reliability. I don't want to use mirrored USB for the same reason: sure, it might keep the system from failing outright when a USB drive dies, but I'd still have to replace a USB drive occasionally. Ideally I won't have to replace anything for years. In order to put FreeNAS on a separate small SSD, I'd have to spend even more on an expansion card, because all 8 SATA ports on the MB are intended for storage drives. That is why having M.2 on the PCIe bus was so nice for a boot drive.

As for storing small VMs on the pool, it's not only about the space consumption but also just unnecessary IOPS on the pool. Even if it's a small amount, it's still wasteful.

Just seems like at the very least I should be able to partition the drive so that I can still use the non-boot partition for other things. Maybe there is a good reason for it being like that; that is what I would like to know.
 

kdragon75

Wizard
Joined
Aug 7, 2016
Messages
2,457
In order to put FreeNAS on a separate small SSD, I'd have to spend even more on an expansion card, because all 8 SATA ports on the MB are intended for storage drives. That is why having M.2 on the PCIe bus was so nice for a boot drive.
This is why it's so important to research and plan everything out before just buying parts.
As for storing small VMs on the pool, it's not only about the space consumption but also just unnecessary IOPS on the pool. Even if it's a small amount, it's still wasteful.
What is your pool doing that it can't spare 10 IOPS? I can't wrap my head around how people think jails need to be on SSD when it's just Syncthing or Plex and all the data resides on the main pool anyway. More to the point, wouldn't the NVMe be better used to augment the performance of your pool if you're that short on IOPS?
Ideally I won't have to replace anything for years.
Maybe... Did you burn in your drives? If not, do that now. It could save you some hassle down the road.
I'd have to spend even more on an expansion card, because all 8 SATA ports on the MB are intended for storage drives.
$50 USD tops.
That is why having M.2 on the PCIe bus was so nice for a boot drive.
The only time it makes sense to have an ultra-fast boot drive is on a desktop PC. Every server I have ever touched uses slow, cheap (but redundant) boot media. All other data/applications go on storage that is performance-optimised for its use case. Booting from NVMe on a server is a waste of a fast drive. If you have $100 to waste on something that can be done for $20, I guess go for it.
Just seems like at the very least I should be able to partition the drive so that I can still use the non-boot partition for other things. Maybe there is a good reason for it being like that; that is what I would like to know.
Yes. FreeNAS boots and updates like an appliance. That means updates can wipe the partition table of the boot drive. If you want the exact details, look into the update source code. Every appliance I have worked with does this. If you want the unlimited flexibility of FreeBSD, that could be a great option for you.
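You can see the appliance-managed layout yourself; a quick look (the device name below is an assumption for your hardware) would be something like:

```shell
# Show the GPT layout the installer created on the boot device.
# "nvd0" is a typical FreeBSD NVMe device name; yours may differ.
gpart show nvd0

# The boot pool that FreeNAS manages across updates:
zpool status freenas-boot
```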
 

steven6282

Dabbler
Joined
Jul 22, 2014
Messages
39
This is why it's so important to research and plan everything out before just buying parts.
Yeah, I don't think any research I did would have turned up this limitation. It just never even occurred to me that this would be a restriction, considering it's built on FreeBSD.

What is your pool doing that it can't spare 10 IOPS? I can't wrap my head around how people think jails need to be on SSD when it's just Syncthing or Plex and all the data resides on the main pool anyway. More to the point, wouldn't the NVMe be better used to augment the performance of your pool if you're that short on IOPS?
I didn't say I couldn't spare it; I said it would be wasteful. And it is. It's like saying that since I have a car with a full tank of gas, I should use it to drive 50 feet to the end of my driveway and back, because that short distance shouldn't matter.

Maybe... Did you burn in your drives? If not, do that now. It could save you some hassle down the road.
They are currently in the process of doing a zero fill, and then some other read/write tests will be run on them, followed by another zero fill, before they go in the server. If one of them is going to fail early, it will hopefully happen during this period. It'll be a few days before all this finishes, as they are 8 TB drives. I was trying to get some of the VMs that didn't need a lot of space or redundancy set up while waiting, which is one reason I ran into this problem to begin with.

$50 USD tops.

The only time it makes sense to have an ultra-fast boot drive is on a desktop PC. Every server I have ever touched uses slow, cheap (but redundant) boot media. All other data/applications go on storage that is performance-optimised for its use case. Booting from NVMe on a server is a waste of a fast drive. If you have $100 to waste on something that can be done for $20, I guess go for it.
Again, it's not about the performance. I was stating what type of drive/connector it is because it's nice that it doesn't need a SATA connection, which can instead be used for more storage disks. Also, this particular NVMe drive was only $50; otherwise I wouldn't have gotten an NVMe for it.

Yes. FreeNAS boots and updates like an appliance. That means updates can wipe the partition table of the boot drive. If you want the exact details, look into the update source code. Every appliance I have worked with does this. If you want the unlimited flexibility of FreeBSD, that could be a great option for you.

Yeah, if I wanted to do everything manually I would use FreeBSD directly. But I don't, so I was just trying to understand why the limitations were in place. I'm not sure I understand why an update would need to wipe the partition table, but at least that gives me a better understanding of why it's problematic to use the boot drive for other things.
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
I don't think any research I did would have turned up this limitation.
Well, anything other than reading the manual:
[screenshot: excerpt from the User Guide]

...or looking at the installer screens themselves:
[screenshot: installer media-selection screen]
 

steven6282

Dabbler
Joined
Jul 22, 2014
Messages
39
Well, anything other than reading the manual:
[screenshot: excerpt from the User Guide]
...or looking at the installer screens themselves:
[screenshot: installer media-selection screen]

Nah, that doesn't really work either. All that says is that flash media is preferred to installing on a hard drive. It doesn't say you can't use that media for other things. And when I saw that in the installer, I did in fact search for why flash media is preferred, and found this:

https://www.reddit.com/r/homelab/comments/3i709u/why_is_installing_freenas_on_a_flash_drive_usb/

Where the top-voted response says your best option for any appliance-style application is a SATA DOM or SSD. I opted for the SSD option.

Again, neither this nor anything in the manual would have led me to discover that I can't use the boot media for anything else before I picked the components for my build. This is honestly the first time I've run into this limitation in the numerous servers of various kinds I've built in my lifetime. Granted, most of those were some type of web or application hosting server, so take it for what it is.
 

kdragon75

Wizard
Joined
Aug 7, 2016
Messages
2,457
All that says is that flash media is preferred to installing on a hard drive.
The User Guide literally says "device that is separate from storage disks." o_O
There is nothing technically wrong with installing to NVMe, assuming the board supports booting from it, and yours clearly does. It's still an odd setup.
They are currently in the process of doing a zero fill, and then some other read/write tests will be run on them, followed by another zero fill, before they go in the server. If one of them is going to fail early, it will hopefully happen during this period. It'll be a few days before all this finishes, as they are 8 TB drives. I was trying to get some of the VMs that didn't need a lot of space or redundancy set up while waiting, which is one reason I ran into this problem to begin with.
On a more important subject, have you run badblocks? This is much more than a zero fill, as it writes patterns and reads them back to verify the data was actually written and can be read across the entire surface of the disk. It's more like memtest86+ for a hard drive, not just a mechanical stress test. ;)
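A typical destructive burn-in pass with badblocks (from the e2fsprogs tools) looks something like this; the device name is an example, and this WIPES the disk:

```shell
# Four-pattern destructive write/read test over the whole disk.
# -w: write-mode test, -s: show progress, -b 4096: 4 KiB blocks.
# WARNING: destroys all data on /dev/ada1 -- double-check the device name!
badblocks -ws -b 4096 /dev/ada1

# Follow up with a long SMART self-test and review the results.
smartctl -t long /dev/ada1
smartctl -a /dev/ada1
```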
 