
Main Area and Open Discussion > General Software Discussion

download webpages from a text file of urls


kalos:
hello

I need a program that will load a text file of URLs and download the full webpages (including linked files, images, etc.), preferably as MHT.

Any ideas?

Thanks

PS: wget on WinXP often fails at this.
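For reference, the kind of wget command I mean (using GNU wget's -i, -p, -k and -E options) is roughly:

```shell
# Read URLs (one per line) from urls.txt, grab each page plus the
# images/CSS it needs, and rewrite links so the copies work offline.
# -i  read URLs from a file        -p  download page requisites
# -k  convert links for local use  -E  save pages with an .html extension
wget -i urls.txt -p -k -E
```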

lanux128:
Have you tried HTTrack? It has MHT support, iirc.

--- Quote from the HTTrack website ---
It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.
--- End quote ---

• http://www.httrack.com/
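A sketch of how that could look from the command line, assuming the -%L option (feed HTTrack a text file of URLs) and -O (output directory) work as described in the HTTrack docs; double-check the exact syntax for your version:

```shell
# Mirror every URL listed (one per line) in urls.txt into C:\mirror.
# -O   sets the output path for the mirrored site
# -%L  adds all URLs found in the given text file
httrack -O "C:\mirror" -%L urls.txt
```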
