Author Topic: download webpages from a text file of urls  (Read 1070 times)


  • Member
  • Joined in 2006
  • Posts: 1,463
download webpages from a text file of urls
« on: April 09, 2011, 04:21:26 PM »

I need a program into which I can load a text file of URLs, and it will download the full web pages (including images, linked files, etc.), preferably as MHT.

Any ideas?


PS: wget on WinXP often fails to do this.
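If a script is acceptable, here is a minimal Python sketch of the batch-download step (the function name `fetch_urls` and the output file layout are made up for illustration). It reads one URL per line and saves each page's raw HTML; it does not inline images or produce MHT, so it is a fallback when wget misbehaves rather than a complete answer.

```python
import os
import urllib.request

def fetch_urls(list_file, out_dir):
    """Read one URL per line from list_file and save each response
    body to out_dir as page_NNN.html. Returns the saved paths."""
    os.makedirs(out_dir, exist_ok=True)
    with open(list_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    saved = []
    for i, url in enumerate(urls):
        with urllib.request.urlopen(url) as resp:
            data = resp.read()
        path = os.path.join(out_dir, f"page_{i:03d}.html")
        with open(path, "wb") as out:
            out.write(data)
        saved.append(path)
    return saved
```

Usage would be something like `fetch_urls("urls.txt", "saved_pages")`. A real batch job would also want error handling for dead links and a polite delay between requests.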


  • Global Moderator
  • Joined in 2005
  • Posts: 6,258
  • Coding Snacks by Lanux128
Re: download webpages from a text file of urls
« Reply #1 on: April 09, 2011, 09:16:20 PM »
Have you tried HTTrack? It has MHT support, IIRC.

Quote from: website
It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.
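The mirroring described above can also be driven from HTTrack's command line. A hedged sketch, with flag names recalled from memory (verify against `httrack --help` before relying on them):

```shell
# Mirror every URL listed (one per line) in urls.txt into ./mirror,
# pulling each page plus the images and files it links to.
httrack --list urls.txt -O ./mirror
```

Note that HTTrack's default output is a browsable directory tree of the mirrored site, not a single-file archive, so any MHT export would be a separate step.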