I love some of the suggestions that appeared between composing and posting my last post. However:

1) It's at minimum a RAID 5 array, quite possibly utilising the RAID controller built into the server's motherboard. Pull out the hard drive? What, one hard drive containing 60m files? Or even a server, and it has ONE hard drive?!? This isn't the 70s, Rus.

2) You don't just 'pull out the hard drive' on a server. The RAID controller is integral to keeping the data readable.

In all likelihood, that machine is still live and hosting a myriad of roles and services, so I'll give you 10/10 for optimism there. Xcopy would have failed just like the other command-line tools Trevor had tried.

You wander away for lunch (it's going to take several hours to do the copy) and come back to see the error "Could not copy 'report.doc'", helpfully NOT printing the source or destination directory. Which report.doc file is it talking about? There are a few thousand files of the same name in assorted directories. The copy has stopped and cannot be continued. Even if you figure that out, there is no way to restart the operation without overwriting the existing files - there is no "Skip all" option on the overwrite confirmation dialog box.

Most of the other copy tools will keep going with the remaining files when an error occurs. You can then review the log and copy the failed files by hand. Some tools can check while copying that the destination is already the same as the source and skip it - so you can just rerun the copy command for the uncopied files.
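The "skip what's already copied, keep going on errors, log full paths" behaviour described above can be sketched in a few lines of Python. This is a hypothetical illustration of the technique, not the code of any tool mentioned here; the function name and size-comparison heuristic are my own assumptions.

```python
import os
import shutil

def resumable_copy(src_root, dst_root, log):
    """Copy a directory tree, skipping files already copied and
    logging failures with their full paths instead of stopping.
    Re-running it only copies what's still missing."""
    failures = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        dst_dir = os.path.join(dst_root, rel)
        os.makedirs(dst_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dst_dir, name)
            # Skip files that already exist with the same size
            # (a crude "destination matches source" check).
            if os.path.exists(dst) and \
                    os.path.getsize(dst) == os.path.getsize(src):
                continue
            try:
                shutil.copy2(src, dst)
            except OSError as exc:
                # Log the FULL source path - unlike Explorer's
                # anonymous "Could not copy 'report.doc'".
                log.write(f"FAILED {src}: {exc}\n")
                failures.append(src)
    return failures
```

A second invocation after fixing whatever caused the failures acts as the "mop up" pass: everything that copied cleanly the first time is skipped.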
Strangely, not a lot of sysadmins know about DOpus. I learnt of it during my Amiga days, in what seems like a lifetime ago. I always have a copy installed on my workstation, and on at least one of my servers.

Explorer? You've got to be kidding (or trolling). If you try it with 60 million files (probably around 6TB of data), Explorer will sit there for ages "Preparing to copy" as it calculates the total data size and how long it will take.
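That "Preparing to copy" stall happens because the whole tree is walked up front to total sizes before a single byte moves. A tool can instead stream the traversal and copy files as they are discovered. A minimal sketch of the streaming approach (the function names are my own, not any real tool's API):

```python
import os
import shutil

def iter_files(root):
    """Lazily yield file paths using os.scandir, so the first file
    can be copied immediately - no up-front scan of millions of
    entries just to estimate a total."""
    stack = [root]
    while stack:
        path = stack.pop()
        with os.scandir(path) as entries:
            for entry in entries:
                if entry.is_dir(follow_symlinks=False):
                    stack.append(entry.path)
                else:
                    yield entry.path

def streaming_copy(src_root, dst_root):
    """Copy files as they are discovered instead of pre-counting."""
    copied = 0
    for src in iter_files(src_root):
        dst = os.path.join(dst_root, os.path.relpath(src, src_root))
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.copy2(src, dst)
        copied += 1
    return copied
```

The trade-off is obvious: you lose the "time remaining" estimate, but with 60 million files that estimate was never going to be worth the hours-long wait before the copy even started.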
Large file transfers are always a challenge. Personally, I swear by Directory Opus, by GPsoftware. I can attest to its incredibly reliable performance, error handling and insanely flexible advanced features.

Aside from being able to copy vast quantities of data, handle errors, log all actions, migrate NTFS properties, automatically unprotect restricted files and re-copy files if the source is modified, it also has built-in FTP, an advanced synchronisation feature (useful for mopping up failed files after you've fixed the problem that stopped them being copied), and a truly unparalleled batch renaming system which, among other things, can use regular expressions. It also has tabbed browsing (you can save groups of tabs), duplicate file finding, built-in Zip management, custom toolbar command creation, and file/folder listing and printing.
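To make the regular-expression batch rename concrete: the idea is to apply one regex substitution across every filename in a directory, with a dry-run preview before committing. This is a rough sketch in the spirit of that feature, not Directory Opus's actual implementation; the helper name and parameters are assumptions.

```python
import os
import re

def regex_rename(directory, pattern, replacement, dry_run=True):
    """Batch-rename files whose names match a regex.
    With dry_run=True, only report the planned renames;
    with dry_run=False, actually perform them."""
    renames = []
    for name in sorted(os.listdir(directory)):
        new_name = re.sub(pattern, replacement, name)
        if new_name != name:
            renames.append((name, new_name))
            if not dry_run:
                os.rename(os.path.join(directory, name),
                          os.path.join(directory, new_name))
    return renames
```

For example, `regex_rename(d, r"report_v(\d+)", r"draft-\1")` previews renaming `report_v1.doc` to `draft-1.doc` while leaving non-matching files untouched; the preview step matters when a pattern could touch thousands of files at once.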