Author Topic: Windows directory not up to date?  (Read 2582 times)


Windows directory not up to date?
« on: January 18, 2014, 07:55 PM »
So I have a process (a SharePoint farm backup run in several pieces) that hangs weekly, some time in the wee hours.  I haven't found any useful information yet on why this is happening, and what's worse, I can't even determine the exact time it halts (other than a 3- to 4-hour window, more or less).  When I check in the morning the job is still running; it just isn't doing anything.

My thought was that if I could pin down when this hangup happens, I'd have a chance of figuring out what is causing it.  If I RDP into the server, watch the progress of the backups, and refresh the target folder (with an F5), I can see the file growing as the backup gets written.  So I thought I'd install one of those file change monitors and that would answer my question (at least the "when does it quit" part).  Unfortunately, that didn't work; all the monitors I found hook into the notification chain and see when the files are created and closed, but there's no info on the size change as the files grow.

So I thought, to heck with it, I'll just write a script to pull a directory listing every so often, compare the results with the previous interval, and report on any changes.  I did so over the last few days, and the script works great; all my testing shows that it properly reports creation, deletion, size changes, time stamp changes, and file renames.  I started it running last night before the site collection backups started.
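For reference, a minimal sketch of that poll-and-diff approach in Python (the names `snapshot`, `diff`, and `watch` are my own; the original script's details aren't shown here):

```python
import os
import time

def snapshot(folder):
    """Map each file name to (size, mtime) from one directory scan."""
    result = {}
    with os.scandir(folder) as entries:
        for entry in entries:
            if entry.is_file():
                st = entry.stat()
                result[entry.name] = (st.st_size, st.st_mtime)
    return result

def diff(before, after):
    """Compare two snapshots and list creations, deletions, and changes."""
    events = []
    for name in after.keys() - before.keys():
        events.append(("created", name))
    for name in before.keys() - after.keys():
        events.append(("deleted", name))
    for name in after.keys() & before.keys():
        if before[name] != after[name]:
            events.append(("changed", name))
    return events

def watch(folder, interval=5.0):
    """Poll the folder and print any change events each interval."""
    previous = snapshot(folder)
    while True:
        time.sleep(interval)
        current = snapshot(folder)
        for event, name in diff(previous, current):
            print(event, name)
        previous = current
```

Each pass is a single directory scan, so the polling interval can be fairly short without putting much load on the server.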

Problem is, the script isn't reporting file size changes either.  I have the script window and the target folder window open, and the script reports the same three items for each backup: the file creation; a size and time stamp change when the backup finishes, the file is closed, and it starts copying off; and the delete when it's done.  No interim data on the file size as it grows.

Except -- if I hit F5 then I DO see the file size changes... in the Windows folder window, and my script sees those same changes too.  Looking a little closer at what is actually happening: if I do successive "dir"s at a command prompt, I can see the free space dropping, but the reported size of the file doesn't change until I hit F5 to refresh the display again!
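That matches the difference between the two ways of asking for a size: a directory scan returns the size recorded in the directory entry, which NTFS may not update while the file is still open for writing, while opening the file itself and querying the handle returns the current size. A small comparison, assuming CPython 3 on Windows, where `os.scandir` surfaces the directory-entry data without touching the file and `os.stat` opens a handle and asks it directly (on a file that's already closed, the two simply agree):

```python
import os

def cached_and_actual_sizes(folder):
    """For each file, pair the size from the directory listing with the
    size from a per-file query that opens the file itself."""
    rows = []
    with os.scandir(folder) as entries:
        for entry in entries:
            if entry.is_file():
                listed = entry.stat().st_size       # directory-entry data; may be stale on NTFS
                actual = os.stat(entry.path).st_size  # opens the file; current size
                rows.append((entry.name, listed, actual))
    return rows
```

While a backup is still being written, `listed` would lag behind `actual` until the file is closed or the directory entry gets refreshed.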

F5 updates the reported size for everything.  Closing the file does too.  Nothing else.

I did a little googling on what's happening, and I'm sure this is all the result of NTFS trying to be efficient.  But what I really want to know is: is there a simple way to make it optimize for accuracy instead of efficiency?  It seems such a waste of cycles to have to request the file size explicitly for each file...
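For what it's worth, that explicit per-file request is only a few lines in Python: open the file and ask the handle for its size, bypassing the possibly stale directory entry. This is a sketch under the assumption that the backup process opened the file with read-share access; if it didn't, the open will fail and you'd want a try/except around it.

```python
import os

def live_size(path):
    """Return the file's current size by opening it and querying the
    handle directly, rather than trusting the directory listing."""
    with open(path, "rb") as f:   # needs read-share access on Windows
        return os.fstat(f.fileno()).st_size
```

Calling this once per interval on the file being backed up would give the growth curve the directory listing withholds.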
vi vi vi - editor of the beast