volume status DEGRADED

Status
Not open for further replies.

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Hi all,
I am running a FreeNAS 8.0.4-x64 install on a home machine, with the boot system on a no-name 4 GB USB drive and 2 HDDs for data. After a pause of 6-8 months when the machine was not started, I tried it today and got noisy beeps right from the start of the OS. After some investigation it turned out I had a problem with a memory slot, which I figured out and solved. But after restarting FreeNAS the shares were not available anymore. In the Storage area I saw both volumes reported as DEGRADED, with errors reported for used space and free space.
I had 2-3 problems before with a broken FreeNAS system but always fixed them with fsck; I hadn't faced a problem like this one before, and after some searching on the Internet it looked like a damaged HDD - but I didn't want to believe that, so I mounted the first HDD (NAS0) on a Linux system, opened some files, and it seems to be OK. Back on the FreeNAS system, I ran df -h, which shows details only about the 4 GB USB drive, nothing about the 2 installed drives.
After reading some more I found about:
zpool status (which shows "no pools available")
zpool import -D to show destroyed pools, but none were listed
gpart show (also showed info only about the 4 GB USB drive)
Anyway, long story short, nothing worked, so I decided to stop and ask for help; before that I found another post (https://forums.freenas.org/index.ph...e-after-upgrade-from-8-3-1-to-9-1-beta.13734/) and decided to run " sysctl vfs.zfs.vdev.larger_ashift_disable=1 "; it also didn't produce any effect, so I decided to shut down and really stop.

Now, when asking for help and trying to describe the problem exactly, I restarted the NAS, and the NAS0 volume is correctly reporting HDD size, used space, free space, and a HEALTHY status. The other volume, NAS1, is still in DEGRADED status. I have restored the SMB share of NAS0 but it is still not reachable from the network - it may be a setup issue.
Anyway, what is this DEGRADED issue, how can I fix it, and how do I restore the machine to its original use?
 

Mirfster

Doesn't know what he's talking about
Joined
Oct 2, 2015
Messages
3,215
2 HDDs for data
I restarted the NAS, and the NAS0 volume is correctly reporting HDD size, used space, free space, and a HEALTHY status. The other volume, NAS1, is still in DEGRADED status.
Can you clarify on how you have two Volumes when you only have two hard drives? Are you saying that each Volume is composed of a 1 drive vDev?
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,994
Are you using UFS formatted hard drives or ZFS? Also please post the exact error messages you are seeing (screen shots are okay), don't try to simplify it at all.
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
Please don't open new threads in this section. There's a big message asking you not to do that right at the top.
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,994
Thread moved. Thanks, I didn't even notice it. Maybe we can get someone to lock this section down.
 

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Please don't open new threads in this section. There's a big message asking you not to do that right at the top.
Sorry about that. At first I understood that we should not post new threads in your thread; but now I see it is about the complete section. The problem is that I didn't find a section to suit my situation.
Again, my bad, sorry for that.
 

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Can you clarify on how you have two Volumes when you only have two hard drives? Are you saying that each Volume is composed of a 1 drive vDev?
Yes, this is the case.
HDD1 - 1 TB, 1 partition - Volume NAS0
HDD2 - 2 TB, 1 partition - Volume NAS1
 

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Are you using UFS formatted hard drives or ZFS?
The first HDD is for sure UFS - because I was mounting it under a Linux OS; the second may be NTFS, but I am not 100% sure.
As far as ZFS is concerned, I am pretty sure I avoided that option at installation time, since I didn't exactly understand what it stands for or how I would be able to mount the filesystem under other OSes in case something went wrong; so for sure I was using a pretty standard filesystem to be sure I could get the data back off the drives.
I will come back later with info on the commands.
 

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Are you using UFS formatted hard drives or ZFS? Also please post the exact error messages you are seeing (screen shots are okay), don't try to simplify it at all.

Code:
[admin@freenas] ~> gpart show
=>     63  8028090  da0  MBR  (3.8G)
       63  1930257    1  freebsd  [active]  (943M)
  1930320       63       - free -  (32K)
  1930383  1930257    2  freebsd  (943M)
  3860640     3024    3  freebsd  (1.5M)
  3863664    41328    4  freebsd  (20M)
  3904992  4123161       - free -  (2.0G)

=>        34  1953525101  ada0  GPT  (932G)
          34          94        - free -  (47K)
         128     4194304     1  freebsd-swap  (2.0G)
     4194432  1949330703     2  freebsd-ufs  (930G)

=>      0  1930257  da0s1  BSD  (943M)
        0       16         - free -  (8.0K)
       16  1930241      1  !0  (943M)

As you can see, HDD2 is not present, but it is there . . .
Code:
[admin@freenas] ~> zpool show
internal error: failed to initialize ZFS library
[admin@freenas] ~> zpool status
internal error: failed to initialize ZFS library
[admin@freenas] ~> zpool status -D
internal error: failed to initialize ZFS library
[admin@freenas] ~> zpool import
internal error: failed to initialize ZFS library
[admin@freenas] ~> df -h
Filesystem             Size    Used   Avail Capacity  Mounted on
/dev/ufs/FreeNASs1a    927M    379M    474M    44%    /
devfs                  1.0K    1.0K      0B   100%    /dev
/dev/md0               4.6M    1.9M    2.3M    44%    /etc
/dev/md1               824K    2.0K    756K     0%    /mnt
/dev/md2               149M    7.7M    130M     6%    /var
/dev/ufs/FreeNASs4      20M    390K     18M     2%    /data
/dev/ufs/NAS0          915G    835G    7.1G    99%    /mnt/NAS0

Again, NAS1 (HDD2) is not present.
 
Last edited:

wblock

Documentation Engineer
Joined
Nov 14, 2014
Messages
1,506
gpart only shows drives with what it thinks is a valid partition table. What does diskinfo -v ada1 show?
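(Note that diskinfo reads the raw device, so it typically needs root privileges; a sketch of running it, assuming the drive shows up as ada1:)

```shell
# diskinfo fails with "Permission denied" for a regular user;
# run it as root (e.g. via su, or log in as root on the console)
su root -c 'diskinfo -v ada1'
```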
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,994
Buddy, you do not have a ZFS pool; you are using UFS-formatted drives, so all those zpool and zfs commands will not help you. I suspect your second hard drive is dead. Remove your first hard drive and place it on a shelf for safekeeping, then power on your system. Can you hear the hard drive spin up? Does it make clicking noises? Is it recognized by the BIOS? If your drive is being reported in the BIOS, then boot up FreeNAS - is the drive being reported as ada0? If you don't see anything working, then try Ubuntu Live or some other OS. What you want to do is see if the drive is recognized by the OS. If you get this far, in my tagline below is a link to Hard Drive Troubleshooting; click it, read it, do it. Report back when done.

If your drive is running but still not accessible then try it in another physical machine for the heck of it.
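A rough sketch of those checks from the FreeNAS shell (run as root; the device name ada0 here is an assumption and depends on which SATA port the drive is on):

```shell
# List the devices the kernel sees on the ATA/SCSI buses
camcontrol devlist

# Check the boot log for drive detection or error messages
dmesg | grep -i ada

# Pull SMART health data from the drive (smartmontools ships with FreeNAS)
smartctl -a /dev/ada0
```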
 

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
gpart only shows drives with what it thinks is a valid partition table. What does diskinfo -v ada1 show?
Code:
[admin@freenas] ~> diskinfo -v ada1
diskinfo: ada1: No such file or directory
[admin@freenas] ~> diskinfo -v ada0
diskinfo: ada0: Permission denied
 

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Buddy, you do not have a ZFS pool; you are using UFS-formatted drives, so all those zpool and zfs commands will not help you. I suspect your second hard drive is dead. Remove your first hard drive and place it on a shelf for safekeeping, then power on your system. Can you hear the hard drive spin up? Does it make clicking noises? Is it recognized by the BIOS? If your drive is being reported in the BIOS, then boot up FreeNAS - is the drive being reported as ada0? If you don't see anything working, then try Ubuntu Live or some other OS. What you want to do is see if the drive is recognized by the OS. If you get this far, in my tagline below is a link to Hard Drive Troubleshooting; click it, read it, do it. Report back when done.

If your drive is running but still not accessible then try it in another physical machine for the heck of it.
Lots of questions, man - thanks for pointing everything out; I'll proceed accordingly and post feedback. I do not think the HDD is dead; it is running, it is spinning, and it is heating up, so it should be OK. I'll put it into a Linux machine as well, to be sure.
 

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Yes, for sure I am using UFS-formatted drives. I do not have a ZFS pool.
The second HDD is not dead; I mounted it on Linux read-only and was able to see the files on it, play some movies, etc.; in summary, there are no HDD-related problems. I suspect the only problems are with the UFS filesystem, which got corrupted somehow.

@joeschmuck - in your signature, the first and second links are joined together somehow, and the URL points to the RAID Capacity Calculator instead of the Hard Drive Troubleshooting guide.

Please excuse me in advance for the next statement; I am not an IT specialist - maybe just a regular PC user with below-average intelligence and computer knowledge. But my personal conclusion is that FreeNAS is poorly designed; my only argument: every time I had a power drop, or rebooted my system from the power supply and not from the OS, the UFS filesystem got corrupted. This is beyond my understanding (but I've already admitted I am stupid). For me, it is just like cutting the power on a machine running Windows or Ubuntu and then, on the next restart, the system will not boot and you are forced to troubleshoot; this is not good, this is not correct, this is not good design.

Later report:
Both HDDs are reported in the BIOS:
SATA 0 - WDC - 1 TB
SATA 1 - WDC - 2 TB

Inside FreeNAS, both volumes are now shown as:
Volume1: NAS0 / mnt/NAS0 / Status: Healthy (1 TB) / disk: ada0p2
Volume2: NAS1 / mnt/NAS1 / Status: Healthy (2 TB) / disk: ada1p1
I think now I only have to redo the network paths to make it work . . . I'll give it a try and report back . . .
 
Last edited:

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
Later edit:
So, just to summarize, I ended up putting both HDDs back in place and restarted the system (I didn't even have to press F1 like before); the system started, and after that:
a. logged in to the FreeNAS web interface
b. went to Sharing / NFS Shares
c. defined share nas0 as /mnt/NAS0
d. defined share nas1 as /mnt/NAS1
e. went to the Ubuntu system and installed
Code:
apt-get install nfs-common portmap

f. on the Ubuntu machine, created the folders nas0 and nas1 in /home/user/
g. mounted the shares as:
Code:
sudo mount 192.168.1.150:/mnt/NAS0 /home/user/nas0
sudo mount 192.168.1.150:/mnt/NAS1 /home/user/nas1

h. went back to /home/user/nas0 - and all the files are mounted on the Linux system now . . .

So again, who was the beautiful engineer who designed such a wonderful system? I actually didn't perform any complicated recovery commands or actions - the only thing done was to disconnect both drives, mount them on a Linux system, put them back into the NAS machine, and redo the shares; I didn't even have to restore/re-import the volumes . . .
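For what it's worth, to make those NFS mounts survive a reboot on the Ubuntu side, the usual approach is /etc/fstab entries (IP and paths taken from the mount commands above):

```
# /etc/fstab on the Ubuntu client - mount the NFS shares at boot
192.168.1.150:/mnt/NAS0  /home/user/nas0  nfs  defaults  0  0
192.168.1.150:/mnt/NAS1  /home/user/nas1  nfs  defaults  0  0
```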
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,994
I'll have to fix that link. So you are good to go I guess?
 
  • Like
Reactions: jjq

Stux

MVP
Joined
Jun 2, 2016
Messages
4,419
So again, who was the beautiful engineer who designed such a wonderful system? I actually didn't perform any complicated recovery commands or actions - the only thing done was to disconnect both drives, mount them on a Linux system, put them back into the NAS machine, and redo the shares; I didn't even have to restore/re-import the volumes . . .

ZFS with Redundancy protects against corruption issues like this.

UFS and No redundancy. HDs fail, systems reboot, simple FSs corrupt.

You'll lose your data eventually.
 
  • Like
Reactions: jjq

jjq

Dabbler
Joined
Mar 10, 2012
Messages
24
ZFS with Redundancy protects against corruption issues like this.
UFS and No redundancy. HDs fail, systems reboot, simple FSs corrupt.
You'll lose your data eventually.

OK, got that warning. So what should I do? Choose ZFS pools on the next install? Of course I cannot do that now with 2 drives full of data - can I?
 

Stux

MVP
Joined
Jun 2, 2016
Messages
4,419
OK, got that warning. So what should I do? Choose ZFS pools on the next install? Of course I cannot do that now with 2 drives full of data - can I?

You'd need additional drives anyway.

So, yes, next time you reinstall, you won't have the option to use anything but ZFS.

You have 3 TB of data.

You could use 2 4 TB drives in a mirror.

Or 3 3 TB drives in RAIDZ1 (Z1 is not really recommended).

Since you basically need at least two disks, each with a capacity greater than 3 TB, why not decide how much capacity and how much redundancy you want? Let us know what you have, and we can work out how to get there.

And then you can use your existing disks for backup.
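For reference, the CLI form of creating a mirrored pool looks roughly like this (the pool name "tank" and the device names are placeholders; in practice you'd use the FreeNAS GUI's Volume Manager so the configuration database stays in sync):

```shell
# Two-way mirror: each drive holds a full copy of the data,
# so the pool survives one drive failing
zpool create tank mirror /dev/ada2 /dev/ada3

# Confirm the pool is ONLINE and redundant
zpool status tank
```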
 