
Author Topic: download webpages from a text file of urls  (Read 2420 times)

kalos

  • Member
  • Joined in 2006
  • Posts: 1,824
download webpages from a text file of urls
« on: April 09, 2011, 04:21 PM »
Hello,

I need a program that can load a text file of URLs and download the full webpages (including files, links, etc.), preferably as MHT.

Any ideas?

Thanks.

PS: wget for WinXP often fails to do this.
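
For reference, wget can read a URL list directly with -i urls.txt (adding -p/--page-requisites and -k/--convert-links pulls in the linked files); when that route fails, a minimal Python 3 sketch of the batch step looks like the following. It assumes one URL per line in urls.txt and saves only the raw HTML of each page, not the files it references:

Code: [Select]
import os
import urllib.request
from urllib.parse import urlparse

# urls.txt is assumed to hold one URL per line.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

os.makedirs("pages", exist_ok=True)

for url in urls:
    # Derive a crude filename from the URL (hypothetical naming scheme).
    parts = urlparse(url)
    name = parts.netloc + (parts.path.replace("/", "_") or "_index") + ".html"
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            data = resp.read()
        with open(os.path.join("pages", name), "wb") as out:
            out.write(data)
        print("saved", url)
    except OSError as exc:  # URLError and friends are OSError subclasses
        print("failed", url, exc)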

lanux128

  • Global Moderator
  • Joined in 2005
  • Posts: 6,277
Re: download webpages from a text file of urls
« Reply #1 on: April 09, 2011, 09:16 PM »
Have you tried HTTrack? It has MHT support, IIRC.

It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.
- HTTrack website

http://www.httrack.com/
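
If HTTrack's MHT support doesn't pan out, an MHT file is just a MIME multipart/related envelope around the page, so one can be built with Python's standard library alone. A minimal sketch (the save_as_mht helper is hypothetical, and it archives only the top-level HTML, not the images and stylesheets a full MHT would bundle):

Code: [Select]
import urllib.request
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def save_as_mht(url, path):
    # Fetch the raw HTML; sub-resources are deliberately skipped here.
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    # An MHT file is a multipart/related MIME message; Content-Location
    # ties each part back to the URL it came from.
    archive = MIMEMultipart("related")
    part = MIMEText(html, "html", "utf-8")
    part.add_header("Content-Location", url)
    archive.attach(part)

    with open(path, "wb") as f:
        f.write(archive.as_bytes())

# Hypothetical usage:
save_as_mht("http://www.httrack.com/", "httrack.mht")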