For me, every page view on a forum seems to involve redownloading headers and images over and over.
-kurtisnelson
Images shouldn't be redownloaded, since the browser would normally pull them from its cache. Alternatively, you could disable image downloading, though that may make some sites unnavigable.
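For what it's worth, here's roughly the mechanism at the HTTP level: the browser stores the image together with validators (ETag, Last-Modified) and later revalidates it with a conditional request; a 304 reply means the cached copy is still good. A minimal Python sketch of the idea, with a placeholder URL:

    # Minimal sketch of HTTP cache revalidation, the mechanism a browser
    # uses to avoid re-downloading unchanged images. URL is a placeholder.
    import urllib.request
    import urllib.error

    url = "http://example.com/logo.png"

    # First request: server returns the image plus cache validators.
    resp = urllib.request.urlopen(url)
    body = resp.read()
    etag = resp.headers.get("ETag")
    last_modified = resp.headers.get("Last-Modified")

    # Later request: send the validators back. A 304 reply means
    # "not modified", so the cached copy can be reused as-is.
    req = urllib.request.Request(url)
    if etag:
        req.add_header("If-None-Match", etag)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    try:
        body = urllib.request.urlopen(req).read()  # fresh content (200)
    except urllib.error.HTTPError as e:
        if e.code != 304:
            raise                                  # 304 = cache still valid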
A generic application like you describe is not really possible, I think, since each forum has a different layout and slightly different functionality; there is no common "protocol" to follow. A minor change in a forum's layout would require updating the application. Also, if a forum depends on JavaScript, the application would need a built-in JavaScript engine; it would have to support authentication (cookies), encrypted connections, and so on. In effect, it would almost have to be a fully-featured web browser, just with a different interface and the added storage functionality.
What you may want to try instead is a program that downloads a whole website (or parts of it) to disk. Years ago I used a program called Teleport Pro for this, but there are probably newer offerings available now. Reget Deluxe can also download websites recursively, and I know of at least one freeware app, though I can't recall its name. This is worth a try, although it works best with static websites, where pages don't change between views. With dynamically generated sites, such as discussion forums, a recursive download may take a long time, or, if URLs change dynamically, may never finish, because the downloader keeps seeing "new" links every time it grabs a page; I've seen this happen often with Teleport Pro.
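To make that runaway-download failure mode concrete, here's a toy recursive downloader in Python. It is not how Teleport Pro actually works, just a sketch under simplifying assumptions (single host, HTML pages only, no politeness delays, placeholder start URL). The "seen" set normally stops revisits, but if a forum embeds a fresh session ID in every URL, each link looks new and only the hard page cap ends the crawl:

    # Toy recursive downloader: fetch a page, save it, follow same-site
    # links. A real tool also needs depth limits, delays, robots.txt, etc.
    import urllib.request
    import urllib.parse
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def mirror(start_url, max_pages=100):
        host = urllib.parse.urlparse(start_url).netloc
        seen, queue = set(), [start_url]
        while queue and len(seen) < max_pages:
            url = queue.pop()
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
            except Exception:
                continue
            # Save the page under a filename derived from its URL.
            fname = urllib.parse.quote(url, safe="") + ".html"
            with open(fname, "w", encoding="utf-8") as f:
                f.write(html)
            parser = LinkExtractor()
            parser.feed(html)
            for link in parser.links:
                absolute = urllib.parse.urljoin(url, link)
                # Stay on the same host. On a dynamic forum, session IDs
                # in URLs make every link look "new", so without the
                # max_pages cap the queue would never drain.
                if urllib.parse.urlparse(absolute).netloc == host:
                    queue.append(absolute)

    mirror("http://example.com/forum/")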
Also, without cookies you'll be downloading content as seen by a user who is not logged in, so there will be no Reply links, for example.
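If you do need the logged-in view, the usual workaround is to hand the downloader the session cookie your browser received after logging in; whether a given tool supports that varies. A minimal sketch of the idea in Python, with made-up URL and cookie values:

    # Sketch: reuse a browser session cookie so the download sees the
    # logged-in version of a page. Cookie name/value are placeholders;
    # copy the real ones from your browser after logging in.
    import urllib.request

    req = urllib.request.Request("http://example.com/forum/viewtopic.php?t=123")
    req.add_header("Cookie", "session_id=PASTE_SESSION_COOKIE_HERE")
    page = urllib.request.urlopen(req).read()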