Hi Folks,
If you look at the thread on DiskTune, you find the likely explanation .. most of these programs do all their actual defragging (the copying and moving of files) using the same Windows API. This is also discussed on the MyDefrag page.
Thus the programs distinguish themselves on issues like file placement (optimization), timing of the activity (e.g. dedicated or background), user interface, auxiliary functions, and whether or not they handle a few of the ultra-techie aspects -- they do not write any new code for the actual movement and relocation of the files.
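Just to illustrate what "the same Windows API" means in practice, here is a rough sketch (not any particular product's code .. the file path is a made-up placeholder) of how a defragger asks Windows for a file's fragment map:

```c
/* Rough sketch only: reading a file's fragment map through the standard
   Windows defrag interface. The path below is made up for illustration. */
#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main(void)
{
    HANDLE hFile = CreateFileW(L"C:\\example\\somefile.dat",
                               FILE_READ_ATTRIBUTES,
                               FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                               OPEN_EXISTING, 0, NULL);
    if (hFile == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "open failed: %lu\n", GetLastError());
        return 1;
    }

    STARTING_VCN_INPUT_BUFFER in;
    in.StartingVcn.QuadPart = 0;      /* start from the file's first cluster */

    union {
        RETRIEVAL_POINTERS_BUFFER rpb;
        BYTE raw[4096];               /* room for a few dozen extents;
                                         a real tool loops on ERROR_MORE_DATA */
    } buf;
    DWORD bytes;

    if (DeviceIoControl(hFile, FSCTL_GET_RETRIEVAL_POINTERS,
                        &in, sizeof(in), &buf, sizeof(buf), &bytes, NULL)) {
        printf("fragments (extents): %lu\n", buf.rpb.ExtentCount);
    } else {
        fprintf(stderr, "FSCTL_GET_RETRIEVAL_POINTERS failed: %lu\n",
                GetLastError());
    }

    CloseHandle(hFile);
    return 0;
}
```

Every tool mentioned in these threads goes through this same ioctl plumbing underneath; the cleverness is all in what they do with the answers.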
Now there are times defragging messes up, I gather from reading forums. One category would be software that places files or markers below the Windows OS; we know there are categories of software that are defrag-sensitive, perhaps in the virtualization or sandbox world, perhaps with programs that write directly to disk, perhaps .. conceivably .. with hidden markers like serial #'s that are ultra-security sensitive (I am guessing a bit). Different programs may handle those possibilities differently .. e.g. they could actually check whether such-and-such is on the system, and/or they could be trained to play nice with certain files by not moving them. I have yet to see a cogent list of the types of apps and usages that qualify here. However, your common garden-variety user probably won't get hit by this.
Perhaps there are other issues, like recovery (power going off mid-move) or handling bad sectors (remember how the Windows defragger would often simply refuse to run until it got a clean chkdsk), where there are differences in how defraggers work. Again, I have yet to see a cogent list of the special situations. And it is possible that your new beta is weaker in some of them than a shaken-out program. Possibly.
There is also the special situation of system files that might only be touched in a dedicated boot-time defrag. If that is not a concern in your garden-variety setup, the issue does not even come up. Or you might be using a special program, like NTREGOPT, for such functions.
All in all, the primary point is that the file movement code (something like .. copy the clusters .. write the new file pointer .. confirm .. switch the pointer in a mini-nanosecond .. reconfirm) is not app-specific. That, the most sensitive part, is a Windows API, apparently done reasonably well for a Redmond operation.
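For the curious, that sensitive part looks roughly like this at the API level (again, a sketch only .. the volume, path, and target cluster are made-up placeholders, and a real defragger would first consult the volume's free-space bitmap before picking a target):

```c
/* Rough sketch only: asking Windows to relocate a file's clusters with the
   standard FSCTL_MOVE_FILE ioctl. NTFS itself does the copy and the atomic
   pointer switch; the defrag program never shuffles the data by hand.
   Volume, path, and target cluster here are placeholders. */
#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main(void)
{
    /* Needs admin rights to open the volume. */
    HANDLE hVolume = CreateFileW(L"\\\\.\\C:", GENERIC_READ | GENERIC_WRITE,
                                 FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                                 OPEN_EXISTING, 0, NULL);
    HANDLE hFile = CreateFileW(L"C:\\example\\somefile.dat", GENERIC_READ,
                               FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                               OPEN_EXISTING, 0, NULL);
    if (hVolume == INVALID_HANDLE_VALUE || hFile == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "open failed: %lu\n", GetLastError());
        return 1;
    }

    MOVE_FILE_DATA mfd;
    DWORD bytes;
    mfd.FileHandle           = hFile;
    mfd.StartingVcn.QuadPart = 0;       /* first cluster of the file         */
    mfd.StartingLcn.QuadPart = 100000;  /* hypothetical free cluster on disk */
    mfd.ClusterCount         = 16;      /* how many clusters to relocate     */

    if (!DeviceIoControl(hVolume, FSCTL_MOVE_FILE, &mfd, sizeof(mfd),
                         NULL, 0, &bytes, NULL)) {
        fprintf(stderr, "FSCTL_MOVE_FILE failed: %lu\n", GetLastError());
    }

    CloseHandle(hFile);
    CloseHandle(hVolume);
    return 0;
}
```

Whichever defragger you pick, it is this same plumbing doing the dangerous work underneath.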
If anyone can explain the real deficiencies better, maybe we can have a little thread (deficiencies and pitfalls of the garden-variety defragger) .. or do it here. I would like to learn a bit about the techie pitfalls.
As for JKDefrag/MyDefrag, I look forward to giving it a whirl; it is spoken of very highly. Apparently it does the optimization better than most other programs, the possible exceptions being the ultra-high-end like PerfectDisk. You can have dueling algorithms! Generally I just use the simple background products, Auslogics and Defraggler (Piriform/CCleaner), on the theory that the real issue is taking care of the 30K fragments that can develop -- and that the file placement struggle is more nuance than substance (ducks .. YMMV).
Granted, on the other hand, there is nothing wrong with very intelligent file placement, especially one that looks ahead to forthcoming new temp files, file extensions, and so on (as discussed on the MyDefrag page). One thing I liked about the DiskTune fella was that, to a certain extent, he seemed willing to speak about this bluntly: there is only so much that all the optimization in the world will ever accomplish on a Windows system, and that "so much" is not huge.
One way to look at it is that on our systems file I/O ranks so far below memory usage, CPU exhaustion, internet connections, and other bottlenecks as a cause of actual noticeable slowdowns .. that tweaking file I/O a bit faster, while nice, will make little practical difference. Of course, the unnecessary and cumbersome 50K fragments people accumulate would do some harm .. since every fragment can mean an extra read on frequently-touched files, and one file can have hundreds of fragments. Just as significantly, fragmented files can be a bear in a techie recovery operation, disrupting any hope of piecing things back together. So you try to strike a balance. In my case, I have yet to be convinced that placement optimization is very relevant.
This thread has even triggered my defrag scheduler! (Whenever I read a post about defragging, I do a defrag.) ... OK, before my reinstall of XP I had a weekly defrag set up in the Splinterware system scheduler program I use, which I haven't yet reinstalled. A good minor chore for now.
Shalom,
Steven Avery