Detaching hard drive doesn't work and wrong number of devices in pool

Status
Not open for further replies.

PeterCGS

Cadet
Joined
Aug 4, 2017
Messages
6
Long-time user of FreeNAS and loving it, but last week I stumbled on a problem I simply cannot get around.

I've got a zpool consisting of ten RAIDZ2 vdevs with six 3 TB drives in each, so it's a quite large 126.8 TB zpool I'd rather not lose :)

I've been through floods and electrical surges during the years I've expanded the pool, but lately all the old Seagate drives (40,000+ hours) started generating a hell of a lot of S.M.A.R.T. errors, so I started replacing them with WD Red drives, which I love.

Now, after replacing more than 50% of the 60 drives in the pool, one RAIDZ2 vdev gave me trouble. It was the one in which I had replaced more or less all my Seagate drives. When I selected to replace a Seagate drive, it just added the new WD Red but didn't release the Seagate. So now this vdev looks like it consists of eight devices instead of six. The thing is, I have one spot where the unavailable drive is impossible to detach. I've tried several times to detach it in the FreeNAS GUI and the process generates no errors, but when looking at the volume status the device is still there. Then, of course, I have one drive that's reporting that it's being replaced, but nothing happens.

Any ideas on how to detach the drives that shouldn't be in the pool anymore?!
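
For reference, this is the CLI route I assume would be the equivalent of the GUI detach (a sketch only; 'Fileserver' is my pool, and the long number is the GUID zpool shows for the missing device):

[root@freenas] ~# zpool status -v Fileserver   # find the stale device's numeric GUID
[root@freenas] ~# zpool detach Fileserver 15310076402626339188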

-----------------------------------------------------------------------------------------------

Build FreeNAS-9.10.2-U5 (561f0d7a1)

Platform Intel(R) Xeon(R) CPU E31270 @ 3.40GHz

Memory 32716MB

System Time Fri Aug 04 09:36:03 CEST 2017

Uptime 9:36AM up 3 days, 22:24, 1 user

Load Average 0.03, 0.17, 0.16

------------------------------------------------------------------------------------------------------


Raidz2-4
da58p2 ONLINE
15310076402626339188 UNAVAILABLE
da21p2 ONLINE
da51p2 ONLINE
da57p2 ONLINE
da27p2 ONLINE
da20p2 ONLINE
da32p2 ONLINE

raidz2-3 ONLINE 0 0 0
gptid/4f61e235-c525-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/46fe1c4b-5da9-11e7-ab56-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/bb534030-dc7a-11e6-b35d-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/3951d766-8dbc-11e4-addb-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/8402a40c-d5e9-11e4-addb-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/83ce4b2c-b681-11e3-80f2-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native

raidz2-4 DEGRADED 0 0 2
gptid/39b6747c-bd24-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/2c1a4ece-4afe-11e7-8fcd-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
replacing-2 ONLINE 0 0 0
gptid/2deee1c6-565a-11e2-a391-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/da80b257-50dd-11e7-8fcd-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/836c3a45-c15a-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/1f700822-4b9d-11e7-8fcd-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
replacing-5 DEGRADED 0 0 0
15310076402626339188 UNAVAIL 0 0 0 was /dev/gptid/375c4838-565a-11e2-a391-002590576f25
gptid/47396353-5aaf-11e7-be00-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native

raidz2-5 ONLINE 0 0 0
gptid/79e6ffbf-c358-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/2af2d9d6-bbc0-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/6e101e8f-3284-11e3-a461-002590576f25 ONLINE 0 0 0
gptid/6ef0985b-3284-11e3-a461-002590576f25 ONLINE 0 0 0
gptid/91e9785c-c068-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/70ba9262-3284-11e3-a461-002590576f25 ONLINE 0 0 0

raidz2-6 ONLINE 0 0 0
gptid/d1d39e3a-d6a1-11e6-8137-002590576f25 ONLINE 0 0 0
gptid/e2e7b41b-82c1-11e3-901a-002590576f25 ONLINE 0 0 0
gptid/e3bd7211-82c1-11e3-901a-002590576f25 ONLINE 0 0 0
gptid/e29946ab-417c-11e4-b169-002590576f25 ONLINE 0 0 0
gptid/e8aeebc6-bdd2-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/1c84d069-ce87-11e6-8853-002590576f25 ONLINE 0 0 0

----------------------------------------------------------------------------------------------------------------

/Regards Peter K
 

Attachments

  • Freenas_Strange.jpg (189.6 KB)

Stux

MVP
Joined
Jun 2, 2016
Messages
4,419
How long has it been replacing the drives for?

Have you tried rebooting?
 

PeterCGS

Cadet
Joined
Aug 4, 2017
Messages
6
The problem has not been resolved. I've rebooted a couple of times, but the same thing happens. Every time I reboot, the resilvering process starts again (resilvered 15.8T in 60h23m).

So everything works, but I cannot get past the detachment of the non-existent device, and I don't dare start a ZFS upgrade etc. Any tips would be welcome!
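
For what it's worth, this is how I'm checking whether the resilver is actually done; the scan: line flips from "resilver in progress" to "resilvered ... with N errors" when it finishes:

[root@freenas] ~# zpool status Fileserver | grep -A 1 scan: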
 

rs225

Guru
Joined
Jun 28, 2014
Messages
878
You're showing 2 checksum errors on one of the raidz2 vdevs, which shouldn't happen. What is the status for the top level of the pool?
 

PeterCGS

Cadet
Joined
Aug 4, 2017
Messages
6
Here is my full info; let me know if there is any other info that might be of interest, or any commands I should run:

[root@freenas] ~# zpool status -v
pool: Fileserver
state: DEGRADED
status: One or more devices has experienced an error resulting in data
corruption. Applications may be affected.
action: Restore the file in question if possible. Otherwise restore the
entire pool from backup.
see: http://illumos.org/msg/ZFS-8000-8A
scan: resilvered 15.8T in 60h23m with 1 errors on Wed Aug 2 23:37:15 2017
config:

NAME STATE READ WRITE CKSUM
Fileserver DEGRADED 0 0 1
raidz2-0 ONLINE 0 0 0
gptid/3a0e642b-7076-11e1-9ae2-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/c24f3364-6492-11e7-ab56-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/416d1859-be62-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/e95771b5-dfcb-11e6-a372-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/5356804a-d4c6-11e4-addb-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
da2p2 ONLINE 0 0 0 block size: 512B configured, 4096B native
raidz2-1 ONLINE 0 0 0
gptid/949976c1-4ea1-11e7-8fcd-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/711bf816-6494-11e7-ab56-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/77a1a344-7076-11e1-9ae2-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/c30f6b90-d019-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/b832eb15-bf85-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/4deb4a53-e1b7-11e6-9c6c-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
raidz2-2 ONLINE 0 0 0
gptid/4af80e5c-711f-11e1-ad68-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/6b728a00-bc4b-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/4c39da0c-711f-11e1-ad68-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/59320201-8cf5-11e4-addb-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/4d6c74c1-711f-11e1-ad68-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/d9f06704-e31d-11e6-8637-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
raidz2-3 ONLINE 0 0 0
gptid/4f61e235-c525-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/46fe1c4b-5da9-11e7-ab56-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/bb534030-dc7a-11e6-b35d-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/3951d766-8dbc-11e4-addb-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/8402a40c-d5e9-11e4-addb-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/83ce4b2c-b681-11e3-80f2-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
raidz2-4 DEGRADED 0 0 2
gptid/39b6747c-bd24-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/2c1a4ece-4afe-11e7-8fcd-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
replacing-2 ONLINE 0 0 0
gptid/2deee1c6-565a-11e2-a391-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/da80b257-50dd-11e7-8fcd-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/836c3a45-c15a-11e6-8853-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
gptid/1f700822-4b9d-11e7-8fcd-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
replacing-5 DEGRADED 0 0 0
15310076402626339188 UNAVAIL 0 0 0 was /dev/gptid/375c4838-565a-11e2-a391-002590576f25
gptid/47396353-5aaf-11e7-be00-002590576f25 ONLINE 0 0 0 block size: 512B configured, 4096B native
raidz2-5 ONLINE 0 0 0
gptid/79e6ffbf-c358-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/2af2d9d6-bbc0-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/6e101e8f-3284-11e3-a461-002590576f25 ONLINE 0 0 0
gptid/6ef0985b-3284-11e3-a461-002590576f25 ONLINE 0 0 0
gptid/91e9785c-c068-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/70ba9262-3284-11e3-a461-002590576f25 ONLINE 0 0 0
raidz2-6 ONLINE 0 0 0
gptid/d1d39e3a-d6a1-11e6-8137-002590576f25 ONLINE 0 0 0
gptid/e2e7b41b-82c1-11e3-901a-002590576f25 ONLINE 0 0 0
gptid/e3bd7211-82c1-11e3-901a-002590576f25 ONLINE 0 0 0
gptid/e29946ab-417c-11e4-b169-002590576f25 ONLINE 0 0 0
gptid/e8aeebc6-bdd2-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/1c84d069-ce87-11e6-8853-002590576f25 ONLINE 0 0 0
raidz2-7 ONLINE 0 0 0
gptid/08d97821-d58c-11e6-96bf-002590576f25 ONLINE 0 0 0
gptid/cfdb12bb-b77b-11e3-abcb-002590576f25 ONLINE 0 0 0
gptid/9d083481-4c53-11e7-8fcd-002590576f25 ONLINE 0 0 0
gptid/bb6e1856-4a93-11e7-8fcd-002590576f25 ONLINE 0 0 0
gptid/81601bae-6f79-11e7-b468-002590576f25 ONLINE 0 0 0
gptid/2c85da18-d19a-11e6-9087-002590576f25 ONLINE 0 0 0
raidz2-8 ONLINE 0 0 0
gptid/16a60d56-82d2-11e5-a1dc-002590576f25 ONLINE 0 0 0
gptid/574ee1e4-8252-11e5-a1dc-002590576f25 ONLINE 0 0 0
gptid/baf272a7-8e82-11e4-addb-002590576f25 ONLINE 0 0 0
gptid/bbcca4db-8e82-11e4-addb-002590576f25 ONLINE 0 0 0
gptid/bcacec05-8e82-11e4-addb-002590576f25 ONLINE 0 0 0
gptid/bd870af5-8e82-11e4-addb-002590576f25 ONLINE 0 0 0
raidz2-9 ONLINE 0 0 0
gptid/74488c5b-8e83-11e4-addb-002590576f25 ONLINE 0 0 0
gptid/75205757-8e83-11e4-addb-002590576f25 ONLINE 0 0 0
gptid/76063c9d-8e83-11e4-addb-002590576f25 ONLINE 0 0 0
gptid/76e725fd-8e83-11e4-addb-002590576f25 ONLINE 0 0 0
gptid/35ffd13c-c206-11e6-8853-002590576f25 ONLINE 0 0 0
gptid/784b7831-8e83-11e4-addb-002590576f25 ONLINE 0 0 0

errors: Permanent errors have been detected in the following files:

Fileserver@auto-20161201.0334-2w:/_Sagafilm/Millicom/MillicomCulturalMovie/2TBdisk/Senegal/Konverterade Gopro filer/GOPR0471.mov

pool: freenas-boot
state: ONLINE
scan: scrub repaired 0 in 0h1m with 0 errors on Mon Jul 17 03:46:01 2017
config:

NAME STATE READ WRITE CKSUM
freenas-boot ONLINE 0 0 0
gptid/c477c3db-dc06-11e6-9b20-002590576f25 ONLINE 0 0 0

errors: No known data errors
[root@freenas] ~#

/Regards Peter K
 

rs225

Guru
Joined
Jun 28, 2014
Messages
878
You have corruption that should not happen. This may be preventing the resilver from completing.

You can try copying that file to see if it succeeds. If not, the file may need to be deleted. You can try deleting the snapshot Fileserver@auto-20161201.0334-2w; however, there is a chance the problem will just migrate until all snapshots containing the damaged file are deleted. Consider carefully whether deleting all snapshots is a good idea, since even that might not fix the problem. A rough sketch of the shell steps is below.

You should not trust this hardware until you identify a cause. A memory test is a good idea. Is the RAM type ECC?
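
Roughly, the shell steps would look like this (a sketch: the snapshot name comes from your zpool status output, and the .zfs path assumes the dataset is mounted at the default /mnt/Fileserver; adjust as needed):

# snapshots are read-only; try copying the damaged file out first
[root@freenas] ~# cp "/mnt/Fileserver/.zfs/snapshot/auto-20161201.0334-2w/_Sagafilm/Millicom/MillicomCulturalMovie/2TBdisk/Senegal/Konverterade Gopro filer/GOPR0471.mov" /tmp/
# if the copy fails, destroy the snapshot that pins the corrupt blocks
[root@freenas] ~# zfs destroy Fileserver@auto-20161201.0334-2w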
 

PeterCGS

Cadet
Joined
Aug 4, 2017
Messages
6
Oh, I thought the resilvering was completed?

However, the snapshot has now been deleted. I tried to detach the unavailable device after that (even though FreeNAS tells me it did this successfully, the device is still there). So now I will run a scrub and see what the results will be. Thank you for the help!
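
For the record, I'm kicking off the scrub like this, and will watch the scan: line for progress:

[root@freenas] ~# zpool scrub Fileserver
[root@freenas] ~# zpool status Fileserver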
 

PeterCGS

Cadet
Joined
Aug 4, 2017
Messages
6
Btw, ECC RAM is being used.

Hostname freenas.local
Build FreeNAS-9.10.2-U5 (561f0d7a1)

Platform Intel(R) Xeon(R) CPU E31270 @ 3.40GHz

Memory 32716MB

System Time Wed Aug 09 08:30:10 CEST 2017

Uptime 8:30AM up 8 days, 21:18, 1 user

Load Average 0.07, 0.39, 0.47
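
In case anyone wants to double-check ECC from the shell: dmidecode is bundled with FreeNAS, and the memory table shows the correction type (exact wording varies by board; ECC systems typically report something like "Single-bit ECC"):

[root@freenas] ~# dmidecode -t memory | grep -i "error correction"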
 

PeterCGS

Cadet
Joined
Aug 4, 2017
Messages
6
Thank you! Deleting the snapshot did the trick :) All is well now after a resilver!
 