
download url

kalos:
hello

I need a program that will download URLs from a list in a text file

it should just save the HTML (saving the complete webpage, i.e. the HTML plus a folder with its files, should be optional)

also, it should use whatever cookies are necessary from the web browser (IE) in order to authenticate

last, it should wait a few seconds between each URL, or use a random time interval

do you know of any program?
I already know wget, but I have problems with cookie authentication
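
for reference, this is the kind of invocation I mean (just a sketch; --load-cookies expects a Netscape-format cookies.txt, which is not how IE stores its cookies, and that is exactly where I am stuck):

# fetch each URL from the list, saving just the HTML
# --random-wait varies the delay between 0.5x and 1.5x the --wait value
wget --input-file=urls.txt --load-cookies=cookies.txt --wait=5 --random-wait
# adding --page-requisites would also grab the images/CSS for the complete page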

thanks

GeekK:
HTTrack Website Copier

- copy/paste the URLs from your text file into the URL window (or point it at the list on the command line; see the sketch after this list)

- cookies: http://www.httrack.com/html/faq.html#Q1c (copy the IE cookies from the Temporary Internet Files folder into your project folder (or even into the HTTrack folder))

- it can pause after a chosen number of bytes: http://www.httrack.com/html/step9_opt2.html

- FAQs and manuals: http://www.httrack.com/html/index.html; it has many more options, like flow control
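
- a command-line sketch (assuming the IE cookies are already copied into the project folder as in the FAQ above; check the options against your HTTrack version):

# fetch every URL listed in urls.txt, save HTML only (-p1), no recursion (-r1)
httrack --list urls.txt -O ./project -p1 -r1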

kalos:
thanks, but I really need an alternative to HTTrack; I have a hard time with this program and it is not as customizable

I would use wget, but I cannot make it automatically choose and use the appropriate cookie

blackcat:
what about cURL? Have you tried that one?
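
a shell loop (bash, e.g. under Cygwin on Windows) would cover the URL list, the cookie file and the random delay; a sketch, assuming you can export the cookies into a Netscape-format cookies.txt:

#!/bin/bash
# fetch each URL with the stored cookies, pausing a random 1-10 s between requests
while read -r url; do
  curl -L -b cookies.txt -o "page_$(date +%s).html" "$url"
  sleep $((RANDOM % 10 + 1))
done < urls.txt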

kalos:
I am trying to make curl work for this, with no success atm
