Thanks for the reviews. They are appreciated. But I do have a suggestion for published findings:
Place the review criteria in the review, along with an appropriate scale, and show how the software and/or company performed against each criterion. Of the reviews you posted, I've not found anything to disagree with. But I don't know what you were measuring, your background with similar products, where you got your information, or what (if any) annoying little thing you've been waiting for a product to finally address. The pop-up window in the Outpost review seems to be a good example. I don't disagree at all with your findings, but the review of that one feature took up almost a third of the entire review. Surely there are more features to be evaluated and scored?
Again, the reviews are very good, and I'd have to say fair. But they aren't quite informative enough that we readers can look at them, nod to ourselves, and say, "Yup, of the 17 things that are important to me, this one ranks tops in 15 and OK in the other 2."
In other words, please give us more objective material. I'm not one to look at comparison charts with other products; those are usually an advertising gimmick, because most such charts include only the desirable features of the product that is pre-determined to win against the others. I immediately mistrust them. But if you would just list the things you looked at or considered, and how the product ranks against those features (not against other products), it would be a great help. Perhaps do as John Dvorak used to do back in the '80s when he wrote reviews for PC Magazine: rate each feature as "needs some work," "good enough," or "great."
Thanks!
Ric Naff