On the subject of measuring differences between defraggers:
First, a bit of (simplified) background. On Windows it is quite safe to say that NTFS is the file system used the most. On Linux that is commonly an heir of the EXT2 file system (EXT3/EXT4). The foundations of both file systems have existed for a long, long time. By design, NTFS places files on a hard disk in close groups; the EXT2/3/4 design leaves a lot more "distance" between files on a hard disk.
Imagine a fresh installation of Windows and Linux, each on its own PC, identical in every other hardware aspect. Both run fine as long as not many files are added or removed, and NTFS will have a minor speed advantage over EXT2/3/4 in this particular scenario.
However, after installing programs and the ongoing processing of data/temporary files, you will see that files get fragmented much more quickly and more severely on NTFS than on EXT2/3/4. Leaving "expansion room" between files has beneficial effects on a computer that is actively doing the task(s) it is set up to do. It also gives the user the impression that there is no fragmentation on EXT2/3/4 file systems (which is not true at all). In this scenario the speed advantage goes to EXT2/3/4.
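The effect of that "expansion room" is easy to see in a toy Python sketch. This is a made-up block allocator, not NTFS's or ext4's real policy: files are laid out left to right, `gap` free blocks are left after each one (gap=0 mimics tight packing, gap>0 mimics spreading), then random files are appended to and we count how many end up in more than one extent:

```python
import random

def simulate(gap, n_files=30, size=8, rounds=5, seed=1):
    """Toy block allocator: gap=0 ~ tight packing, gap>0 ~ expansion room.
    Each round a random file grows by one block; the block goes right
    after the file's last extent if that spot is free, otherwise to the
    first free block on the 'disk'. Returns the number of files that
    ended up fragmented (more than one extent)."""
    random.seed(seed)
    disk = []                                  # disk[i] = file id or None
    for fid in range(n_files):
        disk += [fid] * size + [None] * gap    # file, then its gap
    disk += [None] * (n_files * size)          # free tail space
    for _ in range(rounds * n_files):
        fid = random.randrange(n_files)
        free = disk.index(None)                # first-free fallback
        last = max(i for i, b in enumerate(disk) if b == fid)
        if last + 1 < len(disk) and disk[last + 1] is None:
            free = last + 1                    # contiguous growth possible
        disk[free] = fid
    def extents(fid):
        runs, prev = 0, False
        for b in disk:
            cur = (b == fid)
            runs += cur and not prev           # count start of each run
            prev = cur
        return runs
    return sum(extents(f) > 1 for f in range(n_files))

# Tight packing fragments far more files than a spaced layout.
print(simulate(gap=0), simulate(gap=8))
```

With gap=0 nearly every growing file has nowhere contiguous to go, so its new blocks land far away; with a gap, growth is absorbed in place. That is the whole argument above in miniature.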
When the hard disk in each PC has less than 10% of free space left, NTFS and EXT2/3/4 will both have matching performance that is quite bad.
So much for the simplified background.
The defragger that comes with Windows by default doesn't have many settings (regarding placement of files) for you to adjust. So if you want the advantages of "expansion room" between files, you cannot really rely on the Windows defragger: it will do its best to give you the best performance it can with a relatively close-knit group of files. And after it finishes, the counter reads 0% (ideally), leaving the user under the impression that files are not fragmented.
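For what it's worth, the built-in defragger can at least be asked for a report from the command line. A minimal sketch, assuming Windows' `defrag.exe` is on the PATH and the prompt is elevated (the wrapper name `analyze_fragmentation` is my own):

```python
import shutil
import subprocess

def analyze_fragmentation(volume="C:"):
    """Run Windows' built-in defrag.exe in analysis-only mode
    (/A = analyze, /V = verbose report). Returns the report text,
    or None where the tool is not available (e.g. non-Windows)."""
    if shutil.which("defrag") is None:
        return None
    result = subprocess.run(["defrag", volume, "/A", "/V"],
                            capture_output=True, text=True)
    return result.stdout
```

The `/A` switch only analyzes and reports; it never moves files, so it is a safe way to see the fragmentation percentage the defragger itself works from.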
Other defragmentation software is bound to use the same API calls as the Windows defragger, but most will give you more options regarding the placement of files on the hard disk and which other optimizations should be applied.
Honestly, since Windows 7 the Windows defragger does an adequate job. For some that is not enough, hence there are still quite a few 3rd-party defraggers out there. Some would even argue that preferring a "better" defragger is just a matter of opinion.
Personally, I don't think that is true. Then again, my opinions regarding file placement and optimizations are more extreme (in a nutshell: Windows itself, programs, (user) data files and temporary files, each stored in their own separate partition). In that separation scenario a lot of file fragmentation is eliminated outright, no matter which defragmentation software and/or optimization schemes you plan to use.
I even dare to say that with a strict separation scheme you'll extend the operational lifetime of your hard disk considerably, as you keep the need for defragmentation to an absolute minimum. After all, Windows by default runs a weekly defragmentation to give the user the impression that Windows hardly fragments files.
Added bonuses of the strict separation scheme: you'll have an easier time keeping your (user) data safe from almost any operating system mishap, and it makes backups much easier too.
Of course, with SSDs most of the above has become a moot point: accessing fragmented versus defragmented files on an SSD hardly makes any difference in time, and an SSD is much faster than a standard spinning hard disk anyway. Still, the added bonuses of strictly separating files remain valid.