Main Area and Open Discussion > General Software Discussion
Is there software for this?
SomebodySmart:
I go to http://www.pedersonfuneralhome.com/obituaries/ and there's a list of twelve obituaries.
Each has a URL that is in the HTML code and is easy to capture, and the target file is easy to curl or wget,
but I want ALL of the hundreds of URLs to the individual obituaries and I don't want to do the work by hand. The NEXT button does bring up the next twelve, but the VIEW SOURCE function still shows only the first twelve in the source code. So, is there a product that will download and capture everything one page at a time so I can leave the machine on auto-pilot?
ayryq:
In a couple minutes of looking, I couldn't find a way. But I'm posting because I used to live a couple blocks from Pederson, in Rockford MI. That's all :)
mouser:
well there are a few programs designed to "spider" a page and download all linked pages, images, etc.
one well known one is "Teleport Pro", but there are others.
SomebodySmart:
I looked at Teleport Pro but it doesn't look like it will be able to scan
and download the output of scripts, just static pages.
--- Quote from: mouser on June 13, 2015, 05:52 PM ---
well there are a few programs designed to "spider" a page and download all linked pages, images, etc.
one well known one is "Teleport Pro", but there are others.
--- End quote ---
ayryq:
I figured it out:
Go to http://www.pedersonfuneralhome.com/obituaries/ObitSearchList/1 (and increment the final number)
Eric
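[Editor's note: the increment-the-final-number approach above lends itself to a short script. This is a minimal sketch only; the `ObitSearchList/<n>` URL pattern comes from this thread, but the assumption that individual obituary links contain `/obituaries/` in their `href` is a guess and has not been verified against the live site.]

```python
import re

# Paginated list pages, per ayryq's discovery: increment the final number.
BASE = "http://www.pedersonfuneralhome.com/obituaries/ObitSearchList/{}"

def extract_obit_links(html):
    """Return href values that look like individual obituary pages.

    Assumption (unverified): obituary links contain "/obituaries/"
    somewhere in their href attribute.
    """
    return re.findall(r'href="([^"]*?/obituaries/[^"]+)"', html)

# In practice you would fetch each page, e.g. with
# urllib.request.urlopen(BASE.format(n)), stopping once a page yields
# no links, then feed the collected URLs to curl/wget. Shown here on a
# small inline snippet so the extraction step is concrete:
sample = '<a href="/obituaries/john-doe">John Doe</a> <a href="/about">About</a>'
print(extract_obit_links(sample))  # ['/obituaries/john-doe']
```

The extracted URLs could then be written to a file and handed to `wget -i urls.txt` to run unattended, which matches the "auto-pilot" goal in the original question.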