I am not using FreeNAS but "just" FreeBSD (I am actually considering switching to make my life easier). There are knowledgeable ZFS people on this forum, so I hope I can get help here.
Before 10.2 (9.2, 9.3, then 10.0), my pools' sizes were as follows:
Code:
Filesystem    1K-blocks        Used      Avail Capacity  Mounted on
data         8887411603  8226012836  661398767      93%  /data
main        13375108618 12275755567 1099353051      92%  /main
After the 10.2 upgrade, the same df gives:
Code:
Filesystem    1K-blocks        Used      Avail Capacity  Mounted on
data         8746341243  8226012836  520328407      94%  /data
main        13162804746 12275755583  887049163      93%  /main
This is consistent with the zfs list command (below for 10.2):
Code:
NAME   USED  AVAIL  REFER  MOUNTPOINT
data  7.66T   496G  7.66T  /data
main  11.4T   854G  11.4T  /main
The total size (and, as a result, the available space) is smaller after the upgrade: roughly 135 GiB on data and roughly 202 GiB on main.
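For reference, here is a quick sketch of how the per-pool shrink works out from the df numbers above (1K-blocks; GiB rounded down):

```shell
#!/bin/sh
# Per-pool shrink, computed from the "1K-blocks" totals in the two df outputs above.
for pool in "data 8887411603 8746341243" "main 13375108618 13162804746"; do
  set -- $pool
  delta=$(( $2 - $3 ))                     # 1K-blocks lost after the upgrade
  echo "$1 shrank by ${delta} 1K-blocks (~$(( delta / 1048576 )) GiB)"
done
```

That gives about 134 GiB for data and 202 GiB for main.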
Both pools are raidz2 (seven 2 TB disks for data, seven 3 TB disks for main). I have a mix of internal SATA ports and M1015 controllers (driver v20, firmware P20). The M1015s were previously on firmware P15; I updated them to P20 to rule that out as the cause.
Any help would be appreciated.