I want to physically move some data on a regular schedule.
These drives are NTFS formatted so I have been using the Import Disk option to import to a dedicated dataset.
I can't find information on what it does exactly, specifically when I am importing newer data over an existing dataset. Is the dataset entirely replaced? Are new files copied over the top while now-nonexistent files are kept? Now that I think about it, I guess I could look at the source, which I will get on to.
The problem I have with this is that it doesn't seem to copy everything. The only progress I can see is the progress bar (without a percentage) when going back to the Import Disk option (I selected background beforehand). I can't seem to find anything in the logs about this - if anyone knows what I should be looking for, please let me know.
The data being "synced" is close to 1 TB and takes a long time (lots of small files), so I never really know when it's finished so I can check the logs.
I'm not entirely sure this is the best way to go for a sneakernet dump. I guess I could clear the dataset myself each time, and while I don't mind this being a manual process (ideas for automating it welcome), the time it takes to clear the dataset might make it more of a pain.
Any ideas? I'm starting to think about just mounting it myself and setting up an rsync task to check that mount regularly.