jgreco
Okay so this isn't exactly FreeNAS networking, but I thought I'd post a few results here since it is interesting and related.
On a nice new E5-2697 machine, 2.7GHz, I set up two FreeNAS instances to experiment with network performance. Since both are on the same machine, the network isn't limited to 1Gbps, and I was curious to see how Intel EM stacked up against VMXNET 2 and VMXNET 3, along with other variables like MTU. The tests were all unidirectional, 20 seconds each, run with iperf; most were run several times and I eyeball-picked the average. The same settings were applied on each side. The goal was something resembling the use model of a fileserver serving a single file now and then. I've pruned a lot of superfluous noise and am including just the "statistically interesting" results.
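The test harness was just stock iperf between the two instances. I didn't keep the exact invocations, so this is only a sketch of the usual shape of such a run; the 10.0.0.2 address is a placeholder, not from the actual setup:

```shell
# On the receiving FreeNAS instance: start an iperf server
iperf -s

# On the sending instance: one-way, 20-second test to the receiver
# (10.0.0.2 is a placeholder address for illustration)
iperf -c 10.0.0.2 -t 20
```

iperf reports the achieved throughput on both ends when the run completes.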
Other observations:
VMXNET 2 has some problem with ESXi 5.5. I don't know what, but it failed with both the built-in FreeNAS driver (which I believe is derived from the Open VM Tools VMXNET2 driver) and the VMware-supplied version. It would work for a bit, then crap out.
With an MTU of 1500, it basically worked out that EM capped out around 2Gbps and VMXNET3 around 3Gbps, regardless of window size. With a larger MTU, a 256K window seems to provide the best throughput; going larger seems to degrade it.
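For anyone wanting to try a given combination: the MTU goes on the interface and the window goes on iperf. Roughly like this (interface name and address are placeholders for whatever your setup uses):

```shell
# Bump the interface to jumbo frames -- do this on BOTH instances,
# and the virtual switch between them has to allow 9000-byte frames too
ifconfig em0 mtu 9000

# Then run the test with a 256K window set on both ends
iperf -s -w 256k                  # on the receiver
iperf -c 10.0.0.2 -t 20 -w 256k   # on the sender (placeholder address)
```

If only one side gets the larger window, the smaller side's buffer tends to govern, so apply the setting symmetrically.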
Code:
em0,  mtu 1500, default window  ~2Gbps
em0,  mtu 9000, default window  ~1.7Gbps
em0,  mtu 9000, 192K window     ~5.3Gbps
em0,  mtu 1500, 256K window     ~2Gbps
em0,  mtu 9000, 256K window     ~5.6Gbps
em0,  mtu 9000, 384K window     ~4.4Gbps
em0,  mtu 9000, 512K window     ~3.9Gbps
vmx3, mtu 1500, default window  ~3Gbps
vmx3, mtu 9000, default window  ~2Gbps
vmx3, mtu 9000, 192K window     ~2.3Gbps
vmx3, mtu 9000, 256K window     ~2.5Gbps
vmx3, mtu 1500, 256K window     ~3Gbps
vmx3, mtu 9000, 384K window     ~2.4Gbps
The unexpected surprise was that while VMXNET3 is more efficient than Intel EM at 1500 MTU, VMXNET3's performance drops substantially with a large MTU (3Gbps down to 2Gbps at the default window), whereas Intel EM drops only a little (2Gbps down to 1.7Gbps) and then explodes with a 256K window.
So, hey, if you're doing jumbo frames, try the Intel EM driver and a 256K window size.
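The window sweet spot is at least consistent with a bandwidth-delay product effect: the TCP window has to cover bandwidth times round-trip time to keep the pipe full. As a rough sanity check (the 0.4ms RTT here is an assumed figure for a VM-to-VM path on the same host, not something measured in these tests):

```shell
# Window needed to sustain ~5.6Gbps at an ASSUMED 0.4ms RTT:
# bandwidth (bits/s) * RTT (s) / 8 bits-per-byte, shown in KB
awk 'BEGIN { bw = 5.6e9; rtt = 0.0004; printf "%.0f KB\n", bw * rtt / 8 / 1024 }'
# prints "273 KB" -- in the same ballpark as the 256K sweet spot
```

That's only a back-of-envelope check; the degradation at 384K and 512K suggests something beyond simple BDP math is going on at the larger windows.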