Locating the bottleneck


kabonacha | Cadet | Joined: Feb 11, 2013 | Messages: 2
Hi all, I've been experimenting with FreeNAS for a few days now and I'm pleasantly surprised by its functionality!

My setup:
RAM: Corsair Vengeance DDR3-1866 CL9, 2x4GB DIMM kit (240-pin, 1.5V)
MOBO: Gigabyte GA-F2A75M-D3H (FM2, AMD A75, 4x DDR3, SATA 600/USB 3.0)
CPU: AMD A10-5800K Black Edition, FM2, 3.8GHz quad-core
HDD: 2x Western Digital Red 3TB 7200rpm (I still have three spare drives of 1.5TB, 320GB, and 160GB, but I'll add those later on)
SSD: SanDisk 128GB
PSU: Antec 520W (continuous power)

I know the CPU is overpowered, but I initially didn't plan on building a file server with FreeNAS.
The setup described is actually my HTPC, which sits under my TV and is pretty quiet!

I just installed Windows 8 on the 128GB SSD and started my experiments with FreeNAS using VirtualBox.
With some minor tweaks here and there I should be able to enable raw disk access for the VM, but Windows isn't letting me at the moment.
So I started off with simple virtual hard disks; neither VirtualBox nor VMware can handle virtual drives as large as 3TB.
So I split each drive into 1TB parts and made those available to FreeNAS (roughly as shown below).
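For reference, creating and attaching one of those 1TB parts looks roughly like this (the VM name "FreeNAS", the controller name "SATA", and the file name are just my examples):

# create a 1TB virtual disk (--size is in MB: 1024 x 1024 = 1TB)
VBoxManage createhd --filename wd3tb-a-part1.vdi --size 1048576 --format VDI
# attach it to the FreeNAS VM's SATA controller
VBoxManage storageattach "FreeNAS" --storagectl "SATA" --port 1 --device 0 --type hdd --medium wd3tb-a-part1.vdi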

So I created a RAIDZ2 volume from those six disks and ended up with 3TB of space.
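I built the pool through the web GUI, but in ZFS terms it boils down to the following (the pool name and the ada1..ada6 device names are just what the virtual disks show up as in my VM):

# 6-disk RAIDZ2: the pool survives losing any two of the virtual disks
zpool create tank raidz2 ada1 ada2 ada3 ada4 ada5 ada6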
I then created my Windows CIFS share and started experimenting/benchmarking.

What I'm seeing now is that I'm only getting 35MB/s when writing to the file server.
Reads are a little better, ranging from 40 to 45MB/s.
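To figure out whether CIFS/networking or the pool itself is the slow part, the CIFS numbers can be compared against a local test on the box. A crude version (my pool is mounted at /mnt/tank; note that /dev/zero compresses to almost nothing if compression is enabled on the dataset):

# sequential write straight to the pool, bypassing CIFS
dd if=/dev/zero of=/mnt/tank/ddtest bs=1M count=2048
# and the read back
dd if=/mnt/tank/ddtest of=/dev/null bs=1M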

I know that a VM can't beat a dedicated file server, but I'm just wondering what kinds of speeds other people are getting with a virtual file server.

Playing around with the ZFS configuration of the drives doesn't help either. So far I've tried (see the commands sketched after this list):
- striping all six virtual hard drives, and mirroring them; every layout gives the same result
- enabling the host I/O cache
- adding a ZIL (a virtual drive on the SSD) for logging, and marking that virtual drive as an SSD in VirtualBox
- following the 1GB of RAM per TB of storage rule
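For reference, those last tweaks boil down to something like this (VM, controller, port, and pool/device names are from my setup):

# host I/O cache on the controller the virtual disks hang off
VBoxManage storagectl "FreeNAS" --name "SATA" --hostiocache on
# tell the guest the SSD-backed virtual disk is non-rotational
VBoxManage storageattach "FreeNAS" --storagectl "SATA" --port 7 --device 0 --nonrotational on
# inside FreeNAS: add that disk as a separate log device (ZIL)
zpool add tank log ada7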

So I'm thinking VirtualBox is really the bottleneck here...
The next thing I'm going to try is enabling raw disk access, but I have to solve a few errors first; right now I'm getting VERR_ACCESS_DENIED.
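From what I've read, the raw disk VMDK is created along these lines (the path and disk number are just examples for my setup), and VERR_ACCESS_DENIED usually means VirtualBox isn't allowed to open the physical disk:

# run from an elevated (Administrator) command prompt
VBoxManage internalcommands createrawvmdk -filename C:\VMs\rawdisk1.vmdk -rawdisk \\.\PhysicalDrive1

If Windows itself still has the disk mounted, taking it offline in diskpart (select disk 1, then offline disk) should make it let go.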

Looking forward to your thoughts, statistics, or benchmarks!
 

ben | FreeNAS GUI Developer | Joined: May 24, 2011 | Messages: 373
I'd guess the crazy disk setup. Try to get the hard disk pass-through working.
 

cyberjock | Inactive Account | Joined: Mar 25, 2012 | Messages: 19,526
I don't recommend pass-through on a Windows host. In my personal experience, pass-through under desktop virtualization fails horribly and has always caused data loss. I'm not sure what is actually accessing the disk, but something is writing to it and corrupting things.

Not to mention the fact that ZFS needs direct disk access, which virtualization only provides with ESXi and disk passthrough.
 

kabonacha | Cadet | Joined: Feb 11, 2013 | Messages: 2
cyberjock said:
"I don't recommend pass-through on a Windows host. [...] ZFS needs direct disk access, which virtualization only provides with ESXi and disk passthrough."

I tried VMware ESXi, but there I had the problem that I couldn't dedicate my GPU to my VM running Windows 8 with XBMC.
Then again, VMware ESXi doesn't support drives over 2TB anyway...

So I scrapped ESXi and tried Citrix XenClient, which is also a type 1 hypervisor but does a few neat tricks.
The problem I then faced with XenClient was that my GPU, which is embedded in the CPU (an AMD APU), wasn't supported.
The installation went smoothly, but after booting I got a lot of artifacts.

So that's how I ended up with VirtualBox on a Windows host. I'm going to give raw disk access another try, though.
If it really is that unstable and doesn't perform the way I want, I'll just buy some more components and build a dedicated server.
 