I'm splitting this from a previous topic...
You know, I'd love to see a more organized process on this forum, where a small group of people could agree to do organized, exhaustive testing of programs and try to come up with some consensus on the best tools for a task.
In other words, it would be nice if we could take a thread like this and have a few people just go out, try every help file maker they can find, and winnow the list down to the top candidates. Not only would that be helpful, but we could then try to get some discounts on those, as well as provide some guidance to the other authors on how they could improve their programs.
Just one consideration...
I'd like to throw in "non-testing" contributions too... For these kinds of things there are people who use the software in real-world production and can contribute that real-world experience.
One of the problems I find with a lot of reviews is that they are little more than "feature lists" with no real commentary on how the application truly performs in real life.
I simply don't have the time to go out and download, install, and test a bunch of software, but I can certainly contribute for those applications that I know very well from real-world experience.
I say "real-world experience" because there's a VERY big difference between running a simple test for 5 minutes and using an application on a daily basis.
Of course feature lists are important, but adding real commentary on performance would take a review from being merely a "review" to being authoritative.
Ok - That's all. Let the ideas start flying on how to get this organized... Perhaps we should start with a list of goals. I'll go first:
1) Make the reviews authoritative and reliable for readers
2) Exhaustive reviews of individual pieces of software
3) Exhaustive comparisons between pieces of software
4) Make it relatively easy to contribute to the process in a structured way, perhaps "templated" - this would require some kind of metadata structure
5) Not sure... Please continue...
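To make goal 4 concrete, here's a minimal sketch of what a "templated", metadata-driven review entry might look like, written in Python just for illustration. Every field name here is a hypothetical example, not a proposed standard - the point is simply that each contribution would carry both a feature checklist and a slot for real-world commentary:

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    """Hypothetical structured review template; all fields are illustrative."""
    product: str
    version: str
    reviewer: str
    months_of_real_world_use: int  # distinguishes daily use from a 5-minute test
    feature_checklist: dict = field(default_factory=dict)
    performance_notes: str = ""    # the "real life" commentary plain feature lists lack
    would_recommend: bool = False

# Example entry (all values made up for illustration)
r = Review(
    product="SomeHelpFileMaker",
    version="2.1",
    reviewer="alice",
    months_of_real_world_use=18,
    feature_checklist={"CHM export": True, "PDF export": False},
    performance_notes="Stable on large projects; indexing is slow.",
    would_recommend=True,
)
print(r.product, r.months_of_real_world_use)
```

A shared structure like this would also make the exhaustive comparisons in goal 3 easier, since entries from different contributors could be lined up field by field.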