First of all, a big >THANK YOU!< for the first answers to my request!
@lanux128
I read all the reviews on DonationCoder, but the information about the capabilities of Disc Library sounds nearly the same as for every other app. In no review did I find timings, benchmarks or numbers of any kind. Sometimes they write: "This program is faster than light!" or "quickly finds everything you search for" - from statements like these a normal user cannot imagine the real behaviour. I ask myself why most people forget to mention numbers, because the differences between disc catalogers are very impressive (often factors of hundreds or thousands). I merely tried to read my C: drive with Disc Library (28309 files, 4219 dirs) and couldn't believe my eyes: it slows down extremely with each entry. After 60 minutes I had 2/3 of its content (without looking into zip folders - all additional info deactivated!) and only 2-3 entries per second were being added. I stopped after 1 hour - at this rate it would take ages to read my complete collection. This is slower than anything else I have seen so far. Neither the options/features nor the visuals are enough to change my opinion. Most other programs manage the same task in a few seconds.
More interesting to me are the rarer features, like FTP support, read and search speed, import/export options, special file operations (comparison, diffs), similarity search on picture contents, fuzzy-search options, database encryption, batch searching for different words in saved result lists, and more. Standard features are not of interest. Which special features that no other program has do you find most fascinating or use the most? Do you miss something special that no program offers at the moment?
I need information that goes a bit deeper.
Examples:
Which search options do you use most: searching by filename / file extension / directory name / date / text content, or simply browsing through virtual drive structures?
Do you often use very specific search terms, like several words that all have to match plus others that must not match (like a filter), in combination with directory names and last-modified date?
How often do you use such programs? Once an hour, or only once per day / week / month?
The size of your collection: how many CDs & DVDs are archived? >10, >100, >1000 or perhaps >10000?
What is most important in daily use: speed of reading discs, speed of searching, the detail of the information you can extract, a small collection database, low memory usage?
Personally, I want to archive a very large collection of several thousand CDs/DVDs with millions of files. The search (the most-used function of such a program) should be very fast - faster than any other program offers at the moment. Unnecessary info must not be read - no picture previews or text content. Additional remarks for each entry should be possible, and very low memory usage should be available if I ask for it. I would also like a minimal database size.
@KenR
In the first step of development I didn't want to read file contents the way Google Desktop Search and similar tools do. The reason is the incredible amount of memory that the hash and lookup tables consume. A content index has to read and analyze every text file completely, from the first to the last byte - this costs a lot of time and memory. I tested several such tools on only about half of the text content of my laptop partitions: after nearly an hour the result was a 640 MB database. Don't forget that hard discs are MUCH faster than CD/DVD drives (about 80-120 MB/second). If I took just 8-10 DVDs full of ebooks, website mirrors or source code, the database would double in size and the whole operation would take 10-20 times longer. Perhaps something like that will be implemented in the future, but not in the first version.
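To make the time argument a bit more concrete, here is a rough back-of-the-envelope sketch. The capacity (~4.7 GB per single-layer DVD) and the assumed sustained DVD read rate of roughly 8 MB/s for lots of small files are my own assumptions for the illustration, not measured numbers; only the 80-120 MB/s hard disc figure comes from above.

```python
# Rough estimate: raw reading time for a full content scan of several DVDs
# versus scanning the same amount of data from a hard disc.
# Assumptions (mine, for illustration only): 4.7 GB per single-layer DVD,
# ~8 MB/s sustained DVD read rate for small files, ~100 MB/s for the HDD.
dvds = 8
gb_per_dvd = 4.7
total_mb = dvds * gb_per_dvd * 1000

dvd_minutes = total_mb / 8 / 60      # roughly 78 minutes of raw reading
hdd_minutes = total_mb / 100 / 60    # roughly 6 minutes for the same data

print(f"DVD: ~{dvd_minutes:.0f} min, HDD: ~{hdd_minutes:.0f} min "
      f"(factor ~{dvd_minutes / hdd_minutes:.0f}x)")
```

Under these assumptions the raw reading alone is already more than ten times slower from DVD than from a hard disc - before any parsing or indexing is done.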
A lot of users wrote that they especially want small databases and fast search speed, and in most cases it is not necessary to preview or know the text content of external drives - for that you can get very good free desktop search tools. I don't want to integrate functions that will be used by only 0.0001 percent of the users.
The main problem of nearly all disc catalogers is search speed. Nearly all of the programs on the market must load the whole database into memory before you can search through the data, and the results are also held in memory as a list of strings plus further info. The authors think this increases the speed. Normally that would be right - but they're wrong! Some bottlenecks are not in the search itself - only very few of them were able to see this and find a quick solution. There are other factors that influence the speed much more. I have already started on the main features and experimented with some new home-brewed algorithms - no GUI so far. The result is that the search speed without loading all data into memory is now sometimes several hundreds or thousands of times faster than the others. This speed increases even more if I cache the data in memory or prepare some special speedup hashes.
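Just to illustrate the principle of "searching without loading everything into memory" - this is only a minimal sketch with a hypothetical one-entry-per-line catalog file, not my actual implementation or file format:

```python
def streaming_search(catalog_path, term):
    """Scan the catalog file entry by entry instead of loading the whole
    database into RAM first. Only the matching entries stay in memory."""
    term = term.lower()
    hits = []
    with open(catalog_path, "r", encoding="utf-8", errors="replace") as f:
        for line in f:                      # hypothetical format: one entry per line
            if term in line.lower():
                hits.append(line.rstrip("\n"))
    return hits
```

The point of the sketch: memory usage grows only with the size of the result list, not with the size of the whole catalog.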
For the benchmarks I calculated the objects/second and the speed relative to the current version of my program. I deactivated an algorithm that speeds up the search by about 5-10 times, because the database cannot be compressed very well when it contains that extra information - perhaps the final version will let you choose whether to create this data or not. There are some new ideas I will test in the next weeks, and I expect a further increase of the search speed by up to 10-100 times.
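As one example of what such precalculated speedup data could look like - purely a sketch of the general prefilter idea, not a description of my real algorithm - a small signature stored per entry at cataloging time lets the search skip most entries without comparing the full name:

```python
def signature(name):
    """26-bit mask of which letters a-z appear in the name;
    computed once per entry when the catalog is built."""
    sig = 0
    for ch in name.lower():
        if "a" <= ch <= "z":
            sig |= 1 << (ord(ch) - ord("a"))
    return sig

def prefiltered_search(entries, term):
    """entries: list of (name, sig) pairs with precomputed signatures."""
    want = signature(term)
    term = term.lower()
    # An entry can only contain the term if its signature covers all the
    # term's letters; only then is the real string comparison performed.
    return [name for name, sig in entries
            if (sig & want) == want and term in name.lower()]
```

In a big catalog the cheap bit test rejects the vast majority of entries, so the expensive string comparison only runs on a few candidates - that is the general flavour of such precalculated data.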
Here is an extract of the benchmarks:
There were 3 searches through my complete hard disc:
1.) A direct search for a filename that appears exactly once, with exact name matching
2.) A search for all .exe files on the drives; the result is about 4% of all entries
3.) A search for *.* that returns about 165000 files and dirs
All additional information (zip files etc.) is deactivated - only filenames/dirs/dates are collected - no additional filters - search by name - disturbing programs are deactivated (virus scanner, firewall, PestPatrol) - same computer - same CPU settings - no other programs running in the background (like Firefox or anything else that consumes CPU or accesses the hard disc)
Search 1 - Search 2 - Search 3 - Program name (all values in objects/second)
22860550 - 4228446 - 2286055 My program with precalculated speedup-information
8382226 - 5624437 - 425982 My program with normal (slow) search but other improvements
206827 - 51707 - 12927 Whereisit V3.11
23717 - 3953 - 233 CD Search V2.0
11732 - 7821 - 499 CDWise 1.0
10672 - 2668 - impossible Disccat 3.0 b4
70535 - 30229 - 2273 InsideCAT 2.22
68445 - 3803 - 95 MAKara CD Catalog 1.1
28402 - 644 - 33 MyFindex V3.4
23707 - 4255 - 138 CDTree 2.1.8 std
155330 - 77665 - 77665 Cathy 2.20.4
31066 - 3612 - 1726 JBCat 2.0 b1
77665 - 25888 - 1006 CD Bank 2.7.6
41341 - 20671 - 111 DiskFinder
1102 - 1074 - 60 CD Catalog Expert 9
82682 - 13780 - 322 Disk Explorer Professional 3
55121 - 27561 - 1050 CD Bank Cataloguer V2.7.8
82590 - 23597 - 319 DriveScan Plus 2006 3.5b
82625 - 13771 - 648 Advanced Disk Cataloger 151
414103 - 27607 - 998 Broken Cross Disk-Manager 3.94
331496 - 165748 - 82874 Advanced CATaloguer 2.6 pro
331390 - 27616 - 1624 CDWinder 2.6.1
82848 - 16570 - 118 JeeKey CeDor 5.3
41424 - 1821 - 50 Catalog Max 1.66
1175142 - 19140 - 1043 Deductus 1.61 b 266
297568 - 2976 - 111 Analinx Filookup 1.1.120
411343 - 164537 - 10284 Locate32 V3 RC2
3022 - 1662 - 141 Whatdisk 1.0
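To put the numbers into perspective: assuming the objects/second figure simply means how many of the roughly 165000 catalog entries a program works through per second (my interpretation of the metric), you can convert a row back into an approximate search time like this:

```python
CATALOG_ENTRIES = 165000   # roughly the size of the test catalog (see search 3)

def approx_seconds(objects_per_second):
    """Approximate wall-clock time for one search over the whole catalog."""
    return CATALOG_ENTRIES / objects_per_second

# Whereisit V3.11, search 3:           165000 / 12927   -> about 12.8 seconds
# precalculated speedup run, search 3: 165000 / 2286055 -> about 0.07 seconds
print(approx_seconds(12927), approx_seconds(2286055))
```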
There were many others I didn't include. Some of them had technical problems (exceptions) in some of the searches, could not use the right search term, or could not read content from anything other than CD drives. Others have restrictions on the number of results they can output - some only about 1000, others up to 10000 or 100000 at most.
I look forward to further personal experiences and suggestions!