I am sure there are going to be a lot of questions, but I will start with the general setup.
This has happened on two different systems that I have. The hardware configuration differs a little between them, but the symptoms are the same. Both systems run a 12-drive RAID 10 array, both have around 128 GB of memory, and each array has one Intel SSD (100 GB in one system, 250 GB in the other) that I am using as a cache device. I created the zpools through the GUI and also added the SSD as a cache device through the GUI. When the ZFS status check runs and emails me the report, both systems show that a device has been removed. In both cases the removed device is the SSD cache disk. Both systems are running FreeNAS 9.3.
It may be important to note that the SSD cache disk is connected to a separate onboard SATA controller, not the HBA that runs the rest of the drives that make up the array. Please let me know what screenshots and diag outputs you need me to post.
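In the meantime, here is the sort of thing I can run from the shell to gather more detail. This is just a sketch: the pool name `tank` and the device name `/dev/ada0` are placeholders, not necessarily what my systems actually use.

```shell
# Show pool layout and the current state of the cache vdev
# ("tank" is a placeholder pool name)
zpool status -v tank

# List all disks the system currently sees, to confirm whether
# the SSD is still attached to the SATA controller
camcontrol devlist

# SMART health of the SSD (adjust /dev/ada0 to the actual cache device)
smartctl -a /dev/ada0

# Kernel messages mentioning the SATA devices/controller, in case
# the drive dropped off the bus
dmesg | grep -iE 'ada|ahci'
```

If the SSD itself checks out healthy, I understand the cache vdev can also be removed and re-added from the CLI (`zpool remove tank <device>` followed by `zpool add tank cache <device>`), though I have only used the GUI so far.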