Desktop Forum Reader
kurtisnelson:
I am wondering if anyone knows of a program that will in some form crawl a forum and download the posts, kind of like a newsgroup client. I frequent many forums, but get frustrated at how long each individual page load takes and how inefficient it is to browse them in a browser. Does such a program exist? If not, does anyone have an idea on how to create one?
KenR:
I don't know if it's what you're looking for, but you might take a look at http://www.newzcrawler.com/.
Ken
kurtisnelson:
What I mean is something that will crawl a forum and let me browse it like a newsgroup: text-based, with everything in local files. Something that would work well on a slow connection and could connect to the internet, update, then disconnect to read. Every page view on a forum for me tends to involve having to redownload headers and images over and over.
tranglos:
Every page view on a forum for me tends to involve having to redownload headers and images over and over.
-kurtisnelson (February 09, 2007, 12:47 PM)
--- End quote ---
Images shouldn't be redownloaded, since the browser would normally pull them from the cache. Or you could disable downloading images, though it may make some sites unnavigable.
A generic application like you describe is not really possible, I think, since each forum has a different layout and slightly different functionality. There is no common "protocol" to follow. A minor change in a forum's layout would require updating the application. Also, if a forum depends on JavaScript, for example, the application would need JavaScript support built in, and it would have to handle authentication (cookies), encrypted connections... In effect, it would almost have to be a fully featured web browser, with only a different interface and the added storage functionality.
What you may want to try instead is a program that will download a whole website (or parts of it) to disk. Years ago I used a program called Teleport Pro for this, but there are probably newer offerings available now. ReGet Deluxe can also download websites recursively, and I know there's at least one freeware app, but I can't recall its name. This is worth a try, although the approach works best with static websites, where pages don't change between views. With dynamically generated sites, such as discussion forums, a recursive download may take a long time (or, if URLs change dynamically, may never stop, because the downloader keeps seeing "new" links every time it grabs a page - I've seen it happen often with Teleport Pro).
Also, without cookies, you'll be downloading content as seen by a user who is not logged in, so no Reply links, for example.
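If you end up rolling your own, something along these lines might be a starting point - a rough, untested Python sketch that keeps a list of already-fetched URLs (so dynamically changing links don't send it in circles forever), caps the number of pages, and sends a session cookie so pages come back as if you were logged in. The start URL, cookie name/value, and output folder are just placeholders:

--- Code: ---
import os
import urllib.parse

import requests
from bs4 import BeautifulSoup

START_URL = "https://forum.example.com/index.php"   # placeholder
COOKIES = {"PHPSESSID": "your-session-id-here"}     # placeholder: your login cookie
OUT_DIR = "forum_dump"

def crawl(start_url, max_pages=200):
    # remember what we already fetched, so "new-looking" dynamic links
    # don't keep the crawl running forever
    seen = set()
    queue = [start_url]
    host = urllib.parse.urlparse(start_url).netloc
    os.makedirs(OUT_DIR, exist_ok=True)

    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)

        resp = requests.get(url, cookies=COOKIES, timeout=30)

        # save the page to disk for offline reading
        fname = os.path.join(OUT_DIR, urllib.parse.quote(url, safe="")) + ".html"
        with open(fname, "w", encoding="utf-8") as f:
            f.write(resp.text)

        # queue only links that stay on the same forum
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urllib.parse.urljoin(url, a["href"])
            if urllib.parse.urlparse(link).netloc == host:
                queue.append(link)

if __name__ == "__main__":
    crawl(START_URL)
--- End code ---

It would still redownload pages on every run, but at least you could read everything locally afterwards.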
urlwolf:
Hmm, maybe what you want is WebSite-Watcher by Aignes:
http://www.aignes.com/
I think it's a nice tool. I'd buy it, but I'm waiting for a DC discount :). It's a bit obsolete now that most sites can be read with an RSS reader, but it still has nice functionality for tracking changes to static (or slowly updated) pages.
Another solution: subscribe to the RSS feed, if the forum has one, and then use a reader. I like GreatNews. This forum, for example, has plenty of RSS functionality. That is probably the best solution.
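For the RSS route, Python's feedparser module makes a quick-and-dirty text reader almost trivial - another rough, untested sketch; the feed URL is just a placeholder, use whatever feed link your forum actually exposes:

--- Code: ---
import feedparser

FEED_URL = "https://forum.example.com/rss"  # placeholder - the forum's real feed URL

feed = feedparser.parse(FEED_URL)
print(feed.feed.get("title", "(no title)"))
for entry in feed.entries:
    print("-" * 60)
    print(entry.get("title", ""))
    print(entry.get("link", ""))
    print(entry.get("summary", ""))
--- End code ---

Run it once while connected, pipe the output to a file, and you can read the latest posts offline in plain text.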