2TB of Data hangs in the balance - HELP! created dataset

Status
Not open for further replies.

Ruiner

Cadet
Joined
Nov 14, 2014
Messages
3
Hi there,
So I'm sure I'm not the first panic post, and I do apologize. A friend convinced me away from buying a Synology, so I put together an ASRock board in a Silverstone DS380 case with 6x3TB drives. I got it up and running today; he taught himself FreeNAS and helped me set it up.

Unfortunately, the first few directories we created were made through the command line (rather than by creating datasets). After watching some of the tutorial videos, and being up way later than I should have been, I finished transferring all the data from my 2TB drive and got the bright idea to follow the video, and without thinking typed "main" as the dataset name. That was also the name of the directory I had originally created in the CLI, which had multiple subdirectories.

Now the data is still there, because Plex can play the videos, but I don't know how to get it back so Windows can see it. I'm deathly afraid that if I delete the "main" dataset I created, the data will get black-holed. Can anyone help? If it's easier, I can WebEx/GoToMeeting/phone tomorrow.

Below you can see that 1.8TB is used on nsa-bdf, but I can't see any folders from my computer when I log into the NAS (192.168.1.118).

nasbreakdown.JPG


Plex can see the folder because I had presented that storage to the plugin earlier.
plex.JPG


Mass Transfers & Sort is what I desperately need back. I can recover movies and TV shows, but the rest is years' worth of work on a variety of stuff, plus a lot of personal pictures, videos, etc.
 

Ruiner

Cadet
Joined
Nov 14, 2014
Messages
3
I found on the wiki, "NOTE: If you create a dataset in a location where a file or folder exists that file or folder will no longer be accessible. It has not been deleted however and simply deleting the dataset will restore access to the previously inaccessible file or folder."

Can anyone confirm? I'm deathly afraid to click delete on that dataset.


Also found this alternative:
https://forums.freenas.org/index.ph...reating-a-dataset-need-consoling-advice.8681/

Thank you for any input. I feel really terrible about such a stupid mistake.
 

andyclimb

Contributor
Joined
Aug 17, 2012
Messages
101
Yes, I can confirm that this happens.

I got a bit confused once when I created folders via the command line and then somehow ended up with a dataset of the same name; I was writing to one or the other, not sure which, and the files just weren't showing up! I would create a new dataset, main2. If I remember correctly, if you cd into /mnt/main/ you get the dataset. Copy everything out of that directory into main2, then delete the dataset in the GUI, and you should be left with the underlying folder that was there. The ZFS dataset is just mounted on top of the /mnt/main folder. I did this and then rediscovered all the test folders I had made before creating the new dataset.

I believe the jail still has access, because (I learnt this the confusing way) mount points in jails DO NOT cross filesystem boundaries, and a ZFS dataset is effectively a separate filesystem that is then mounted. I had to mount each dataset into the jail separately, as mounting the top-level volume into a jail does not pull in the datasets beneath it; the jail follows the underlying filesystem, which in your case still contains the folder you are looking for!

Hope that helps..

If you are super paranoid and want to be uber safe: make another dataset, share it with the jail, go into the jail, and copy your data from the directory you want to save into this new one. Then you can't lose it.
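The belt-and-braces copy-out described above might look roughly like this from the shell. All names here are assumptions for illustration (pool nsa-bdf, a rescue dataset, and a Plex jail called plexmediaserver_1 that still sees the hidden folder at /media/main); adjust them to your system, and this only works while the jail can still see the data:

```shell
# 1. Create a rescue dataset from the FreeNAS shell (or in the GUI).
#    "nsa-bdf" and "rescue" are assumed names - adjust to your pool.
zfs create nsa-bdf/rescue

# 2. In the GUI, add the rescue dataset as storage for the jail,
#    e.g. mounted at /media/rescue inside the jail.

# 3. From inside the jail, copy the still-visible data across.
#    -a preserves ownership, permissions and timestamps.
jexec plexmediaserver_1 cp -a /media/main/. /media/rescue/

# 4. Sanity-check the copy size before touching the original.
du -sh /mnt/nsa-bdf/rescue
```

Only after verifying the copy would you go near the shadowing dataset itself.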

A
 

pschatz100

Guru
Joined
Mar 30, 2014
Messages
1,184
I think you could rename the dataset so that it is not the same as the folder. No risk of losing anything that way.
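If the dataset's mountpoint is inherited, a rename like this should move it out of the way without touching any data. The pool/dataset names are assumptions (nsa-bdf/main); check yours with `zfs list` first:

```shell
# Rename the dataset so it no longer mounts on top of the "main" folder.
# "nsa-bdf/main" is an assumed name - verify with: zfs list
zfs rename nsa-bdf/main nsa-bdf/main-dataset

# The dataset now mounts at /mnt/nsa-bdf/main-dataset, and the original
# folder that was hidden underneath should be visible again:
ls /mnt/nsa-bdf/main
```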
 

Ruiner

Cadet
Joined
Nov 14, 2014
Messages
3
We got it all back. I tried to delete the dataset and it said it was in use; after forcing an unmount, everything came back. Thanks a bunch, guys!
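For anyone landing here later, the fix above can be sketched from the shell roughly as follows. The dataset name is an assumption (nsa-bdf/main); confirm it with `zfs list` before running anything:

```shell
# Force-unmount the empty dataset that is shadowing the folder.
# "nsa-bdf/main" is an assumed name - check with: zfs list
zfs umount -f nsa-bdf/main

# The underlying folder, with all the transferred data, is visible again:
ls /mnt/nsa-bdf/main

# Once you have confirmed the data is safe (and backed up), the
# now-unmounted dataset can be destroyed in the GUI or with:
zfs destroy nsa-bdf/main
```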
 