Messages - questorfla

31
Here is the problem: On a Windows 10 system, Folder B should be an exact duplicate of Folder A, but a check of each shows that B is about 0.3 TB larger than A.

A is 1.43 TB; B is 1.73 TB. I need a way to quickly find out why. Both A and B contain the same number of second-level folders, so the difference is something deeper. Until I know what it is, it is possible that B has some files or folders that need to be added to A, or maybe it has duplicates that need to be deleted. I don't want to use an automated process that would make that choice for me.

A program that could quickly list only the folders/files that don't match, showing file and folder names and sizes, would be perfect. Does anyone have any suggestions? I know I have seen many over the years, but none come to mind right away, and this is a bit of a rush job.
:(  Even Beyond Compare is reporting at least 45 minutes to complete the comparison run.
Guess I'll just check back tomorrow.
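In the meantime, the closest thing I have to a plan is robocopy's list-only mode, which is supposed to report mismatched and EXTRA files (ones in the destination but not the source) without copying anything. A rough sketch, with placeholder folder paths:

  # Compare B against A without copying anything. /L = list only,
  # /E = walk all subfolders, /X = report extra files found in the
  # destination, /FP = show full pathnames, /NP = no progress spam.
  # Mismatches and *EXTRA File entries end up in diff.txt for review.
  robocopy 'D:\FolderA' 'D:\FolderB' /L /E /X /FP /NP /LOG:diff.txt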

32
Merry Christmas and Happy New Year to everyone on the board. I was offline for the last part of the month (restrictions by the Wife in Charge) due to family all being together for the first time in years. :)

To answer all the great replies though:
 
Shades: You are always the one to point out the logical, straightforward solutions. To be honest, this is what the person who is doing this SHOULD be doing; that was why I agreed to let them do it.
Instead, they created about 5 MORE top-level folders with LONG names and used drag and drop to put all the other folders into those five. Things immediately went from "Worst to Worser."  :(  I had explained the problem and almost cried when I saw her solution. Fortunately, I only allowed her to work with a copy of the original mess, so...

Wraith: Thanks for the comments on OneDrive.
I will take another look; it has been a long time. Every now and again, I peek at what has "gathered itself" into my own OneDrive. The other day, I tried to copy what was in it to another location and ran into a bunch of errors that basically said the files were not "available at that time." Most of the errors were on larger files, and none of these were anything I deliberately put into OneDrive, so I fully admit that I have not given it a fair test. If it works for you, I should see what happens on another trial.
The biggest problem we would have is constant flux. We have a TON of storage with our Business OneDrive, and the files would be constantly added, changed, moved, etc. That is going to need a ton of bandwidth. All of that is currently handled in-house at gigabit speeds (or close), and I'm not sure how much of a hit we can take on our 30 MB (up) cable pipe to the web. Worth looking at if it works for you, though.

And lastly, 4WD!
😊 I knew I could count on you having some great suggestions. Seems like you must do a lot of the same things I have to do, but you usually know all the best ways to do them. I'm looking forward to somebody coming out with an AI that can look at the problem and come up with the best answer (one that can be afforded by the little guys, anyway; I'm sure Siri or Alexa could handle it with ease!).
At least you are patient enough to read the blasted HowTos on Robocopy. Some of your suggestions look VERY interesting, though, and I plan on trying them ALL out ASAP.

Wishing everyone a Great High-Tech New Year for 2019.

Again, thanks to all for the great suggestions


34
You mean you Trust OneDrive?  :tellme:
That would be asking a lot of me, though I am sure it has gotten better since it first came out.

One question about Syncovery: do you know how it will handle cases where the files are buried beyond the MAX_PATH point? I run into this a lot on these exact folders, and I am sure that a number of them are going to be like that. Many backup programs either choke or return error codes for those files, yet the files really are there and can be opened, read, modified, etc. by the programs (and people) that put them there.
 
Files exactly like that are one of the main reasons for this project. If I don't shorten the paths soon, every file in those folders will be in paths that cannot be backed up. I have heard that Microsoft has promised to open up the allowed path lengths to some really extreme number in a soon-to-be-released version of Win 10, and I think it can even be enabled now with some registry edits. But as far as I can tell, the original MAX_PATH of 260 characters total is still the rule at this time.
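From what I have read, the registry edit in question is the LongPathsEnabled value, available since Windows 10 version 1607. A minimal sketch, run from an elevated PowerShell prompt:

  # Enable NTFS long-path support (Windows 10 1607 and later).
  # Needs admin rights, and individual applications still have to
  # declare themselves long-path aware in their manifest, so older
  # backup tools may keep choking even with this set.
  Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' -Name 'LongPathsEnabled' -Type DWord -Value 1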
 
Besides, the whole point is that people should be more careful, and if I don't enforce some limiting factor, they end up copying a whole folder and all its paths under yet another top level of folders, adding even more complexity. Searching for specific files in this mess can't be easy on the file indexer service, either. This whole effort is to try to remove as many of those ridiculously long file and folder names as possible.
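For the cleanup itself, something like the following is how I would flag the worst offenders, longest paths first. The D:\Shares root is a placeholder, and stock PowerShell may itself skip paths past the limit unless long-path support is enabled:

  # List every file whose full path is at or beyond the classic
  # 260-character MAX_PATH limit, longest first.
  Get-ChildItem -LiteralPath 'D:\Shares' -Recurse -Force -ErrorAction SilentlyContinue |
      Where-Object { $_.FullName.Length -ge 260 } |
      Sort-Object { $_.FullName.Length } -Descending |
      ForEach-Object { '{0,4}  {1}' -f $_.FullName.Length, $_.FullName }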

35
Wraith, I finally had time to test Syncovery, and it does work as advertised. :) The robocopy script I came up with does about half the job. I will probably buy Syncovery because it is the most well-thought-out process I have seen in some time, and I can see where it (especially the scheduling part!) would be GREAT to have. I might use it for what I need now as well, if I can't get the robocopy switches in the correct order to work properly; so far my script creates the empty directory tree, but I can't get it to move specific files. I was hoping to find a simple way to manage this one task, and Syncovery feels almost like using a shotgun to kill a fly. :)
Still, as you say, I think it will do the trick after I run a couple more tests, so that I am sure I know what is going on. I can see many uses for it on much larger tasks I have to perform daily.
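For anyone else fighting the same switches: the part that kept tripping me up is that robocopy cares about positional order more than switch order; any file filter has to come after the source and destination paths. A rough sketch, with made-up paths and a made-up *.bak filter:

  # Recreate the folder tree under the destination and MOVE only the
  # matching files into it. /S = recurse into non-empty subfolders,
  # /MOV = delete each source file after it is copied.
  robocopy 'D:\Source' 'E:\Dest' *.bak /S /MOV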
