Software > Post New Requests Here

Transform list of url links to copies of the actual web pages


Folder A contains one simple text file with 10 website urls: Sites.txt

Have a web browser (any) visit the 10 sites and save the corresponding 'page' of each visited site to a separate file.

At the end of the process, Folder B will contain ten files, 'xyz_01.html' through 'xyz_10.html', one for each URL in the Folder A list.

Any thoughts and help greatly appreciated.

Nicholas Kormanik

This can easily be done.  However, keep in mind that simply downloading the "page" of a site doesn't always view well afterward unless the site is purely static.  If you're okay with that, this can be done with just a few lines of AHK code.  Let me know.

In Powershell:

--- Code: PowerShell ---
$urls = Get-Content K:\sites.txt
for ($i = 0; $i -lt $urls.Count; $i++) {
    $client = New-Object System.Net.WebClient
    $client.DownloadFile($urls[$i], "K:\Site_" + $i + ".html")
}
Change the paths to suit (i.e. the K:\).

NOTE: Media content (images, etc.) fetched from other sites will not be downloaded; you'll just have the URI link.  It will still work when viewed in a browser - the content will be fetched from the external site as needed, just like when viewing the original page.  In fact, all that's downloaded is the page source, since that's all an HTML file is: plain text.
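For comparison, the same page-source-only download can be sketched in Python's standard library.  This is just an alternative sketch, not part of the PowerShell answer above; the `save_pages` function name and the paths are placeholders, and the output files use the 'xyz_01.html' naming from the original request:

```python
# Minimal sketch: read URLs from a sites file and save each page's raw
# HTML source as xyz_01.html, xyz_02.html, ... (paths are placeholders).
import urllib.request
from pathlib import Path

def save_pages(sites_file, out_dir):
    # One URL per line; skip blank lines.
    urls = [u.strip() for u in Path(sites_file).read_text().splitlines() if u.strip()]
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i, url in enumerate(urls, start=1):
        data = urllib.request.urlopen(url).read()   # page source only
        (out / f"xyz_{i:02d}.html").write_bytes(data)

# Example call (change paths to suit):
# save_pages(r"K:\Sites.txt", r"K:\pages")
```

As with the PowerShell version, this saves only the HTML text; images and other media stay as links.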

In AutoHotkey:

--- Code: Autohotkey ---
mySitesFile   := "c:\tmp\15\Sites.txt"
mySitesFolder := "c:\tmp\16"

FileRead, myData, % mySitesFile
Loop, Parse, myData, `n, `r
{
    UrlDownloadToFile, % A_LoopField, % mySitesFolder . "\xyz_" . A_Index . ".html"
}
Change path variables to suit.

Maybe there is a way to make Chrome do a download of the site (that would get images etc., so the web page is perfectly viewable).
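One possibility, instead of driving Chrome, is wget's `--page-requisites` mode, which also fetches images, CSS, and scripts and can rewrite links for offline viewing.  Below is a hedged sketch: it only builds the wget command for each URL (the `wget_commands` helper is hypothetical, paths are placeholders, and actually running the commands requires wget installed on the PATH):

```python
# Sketch: build one wget command per URL so each site is saved with its
# images/CSS (an alternative to Chrome; requires wget to actually run).
from pathlib import Path
import subprocess

def wget_commands(sites_file, out_dir):
    # One URL per line; skip blank lines.
    urls = [u.strip() for u in Path(sites_file).read_text().splitlines() if u.strip()]
    cmds = []
    for i, url in enumerate(urls, start=1):
        cmds.append([
            "wget",
            "--page-requisites",   # also fetch images, CSS, JS
            "--convert-links",     # rewrite links so the copy views offline
            "--directory-prefix", str(Path(out_dir) / f"xyz_{i:02d}"),
            url,
        ])
    return cmds

# To actually run them (requires wget on the PATH):
# for cmd in wget_commands(r"c:\tmp\15\Sites.txt", r"c:\tmp\16"):
#     subprocess.run(cmd, check=True)
```

This still won't capture JavaScript-rendered content, but for ordinary pages the saved copy is viewable offline with its images.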

