Topic: IDEA - Webpage archiving
kfitting
Charter Member
« Reply #25 on: January 02, 2011, 05:48:48 PM »

Does anybody have information on the status of CyberArticle?  The last update was in February 2010, and the blog hasn't been updated since August 2009.

Or is there any other similar software?  I'd really like to be able to archive/catalog web pages and PDFs.  I have Local Website Archive, but it has no tags/keywording, and the author is not planning to do anything about multi-page saving (per an email a year or so ago).  I'm impressed by what I see in CyberArticle, just not sure about its longevity.
« Last Edit: January 02, 2011, 05:52:58 PM by kfitting »
rjbull
Charter Member
« Reply #26 on: January 04, 2011, 02:38:48 PM »

Quote from: kfitting

Or is there any other similar software?  I'd really like to be able to archive/catalog web pages and PDFs.

For Web pages (but not PDF), would the "site grabber" features found in some download managers suit you?  E.g. from the Grabber Help of Internet Download Manager (IDM):
Quote

The site grabber feature of Internet Download Manager not only lets you download required files that are specified with filters, for example all pictures from a web site, or all audio files from a web site, but it also lets you download subsets of web sites, or complete web sites for mirroring or offline browsing.
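
Just to illustrate the basic idea (this is only a rough sketch in Python, not how IDM actually does it; the requests/BeautifulSoup libraries and the example URL are my own assumptions), a "download all pictures from a page" filter boils down to something like this:
Code:

import os
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def grab_images(page_url, dest_dir="grabbed_images"):
    # Fetch the page and parse out every <img> tag that has a src attribute.
    os.makedirs(dest_dir, exist_ok=True)
    html = requests.get(page_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    for img in soup.find_all("img", src=True):
        img_url = urljoin(page_url, img["src"])      # resolve relative links
        name = os.path.basename(urlparse(img_url).path) or "unnamed"
        data = requests.get(img_url, timeout=30).content
        with open(os.path.join(dest_dir, name), "wb") as f:
            f.write(data)
        print("saved", name)

if __name__ == "__main__":
    grab_images("http://www.example.com/")           # hypothetical page URL

A real site grabber like IDM's adds link-following, filters by file type, and mirroring of whole sites on top of this, but the core loop is the same.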
panzer
Participant
« Reply #27 on: November 18, 2011, 04:40:32 AM »

http://www.httrack.com/
http://www.freshwebmaster.com/freshwebsuction.html
IainB
Supporting Member
« Reply #28 on: November 18, 2011, 05:30:44 AM »

This might be a bit off-topic, but it could help.
I can recommend the FF add-on Scrapbook for personal browse-archiving of those pages you want to keep - and their files and nested sub-sections (if you want them).
For example, I have just copied this discussion thread with a single copy command.

The discussion thread runs over two "pages", but I enabled the FF add-on AP (AutoPage) before I made the copy. AP makes the pages appear as a single continuous flow of pages, with separation gaps between each "page".

The nice thing about the Scrapbook saved pages is that the folders where they reside on your disk can be indexed by WS (Windows Search), so that they then become integrated into your client-based information repository. If you have an image indexing tool scanning the same folders, then the images from the saved web pages can all be included in your image database as well. It's surprising what Scrapbook sometimes captures that you never saw on reading the original page.
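
If you don't have a desktop indexer handy, even a rough script can do keyword lookups over the saved pages. Here's a minimal Python sketch (the Scrapbook folder path and search term are just placeholders I made up) that walks a folder of saved .html files and lists the ones containing a keyword:
Code:

import os
import re

def search_saved_pages(root_dir, keyword):
    """Return the saved .html/.htm files whose text contains the keyword."""
    hits = []
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            if not name.lower().endswith((".html", ".htm")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as f:
                text = re.sub(r"<[^>]+>", " ", f.read())   # crude tag stripping
            if pattern.search(text):
                hits.append(path)
    return hits

if __name__ == "__main__":
    # Hypothetical Scrapbook data folder and search term.
    for path in search_saved_pages(r"C:\Scrapbook\data", "archiving"):
        print(path)

A proper indexer such as Windows Search is obviously faster and smarter, but this shows why keeping the archive as plain files on disk is so useful: anything can read them.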