I agree with you about HTTrack - I hadn't used it in a while, and looking at the latest version it just tries to do too much now. Ten steps to do something simple is not what you want.
If you are looking for freeware/open source and don't mind the command line... I suggest wget. It is command line but can download an entire site - and it could probably be integrated with any launcher / hotkey program so you can type "getall www.thissite.com" and have it call wget with all the right options.
package for windows: http://www.christoph...m/WGet/WGetFiles.htm
official site: http://www.gnu.org/software/wget/
article on mirroring with it: http://www.jim.rober...t/articles/wget.html
(there are better ones but I am lazy, this is the first one I found)
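To give an idea of what that "getall" wrapper might look like - this is just a sketch, the name "getall" and the example URL are made up, but the flags are real wget options for mirroring a site:

```shell
# Hypothetical "getall" wrapper: pass it a URL and it calls wget
# with sensible site-mirroring options.
getall() {
  # --mirror          : recursive download with timestamping, for site copies
  # --convert-links   : rewrite links so the local copy browses offline
  # --page-requisites : also fetch images/CSS/JS needed to render each page
  # --no-parent       : never climb above the starting directory
  wget --mirror --convert-links --page-requisites --no-parent "$1"
}

# usage (example URL): getall http://www.thissite.com/
```

Bind that to a hotkey or launcher keyword and you have a one-word site grabber.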
Back when I used Firefox there was a pretty good web page "snapper" tool which could also do a whole site, but I cannot remember the name of it. The name was related to snapshot but I can't think of it atm.
There are some great commercial tools and decent freeware tools out there - I noticed even IDM has a site grabber now, though I haven't tried it - but if it is only for once in a while, wget can do the job just fine.