
Main Area and Open Discussion > General Software Discussion

web dir in excel


Target:
I use Excel 2003

an example would be a list of all these PDFs: http://www.efloras.org/web_page.aspx?flora_id=12&page_id=1155 -kalos (April 10, 2012, 04:24 PM)
--- End quote ---

Just trying to understand the workflow here - looking at your example, I'm wondering why you're using Excel (I can only assume the records have been supplied in that format) instead of a downloader (like the DownThemAll extension for Firefox - though of course that doesn't do the indexing and launching part).

kalos:
I have the Excel file with some data
I will have to prepend http://domain.com/ to the text of each cell in all rows of a specific column

Then, the created links must be automatically replaced with the paths of the linked files, once the files are downloaded locally

This way, we have a dynamically updated Excel file with the current versions of the files from the server
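The three steps described above (prepend the domain, download each file, map the link to its local path) could be sketched outside Excel roughly like this. Note this is a minimal sketch: the column values, the `downloads` folder, and the function names are assumptions for illustration, and `http://domain.com/` is the placeholder prefix from the post, not the real site.

```python
import os
import urllib.request

BASE_URL = "http://domain.com/"   # placeholder prefix from the post
DOWNLOAD_DIR = "downloads"        # hypothetical local folder

def make_link(cell_value):
    """Step 1: prepend the domain to the cell text to form a URL."""
    return BASE_URL + cell_value

def local_path_for(cell_value):
    """Step 3: the local path that will replace the link once downloaded."""
    return os.path.join(DOWNLOAD_DIR, cell_value)

def download_all(cell_values):
    """Step 2: fetch each file, returning {cell value: local path}
    so the spreadsheet links can be rewritten afterwards."""
    os.makedirs(DOWNLOAD_DIR, exist_ok=True)
    mapping = {}
    for name in cell_values:
        urllib.request.urlretrieve(make_link(name), local_path_for(name))
        mapping[name] = local_path_for(name)
    return mapping
```

In Excel 2003 itself, step 1 alone can be done with a helper column formula such as `="http://domain.com/"&A1`, but the download-and-replace steps would need VBA or an external tool.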

Target:
I have the Excel file with some data
I will have to prepend http://domain.com/ to the text of each cell in all rows of a specific column

Then, the created links must be automatically replaced with the paths of the linked files, once the files are downloaded locally

This way, we have a dynamically updated Excel file with the current versions of the files from the server -kalos (April 12, 2012, 12:23 AM)
--- End quote ---

I got that much; I was wondering where you sourced the data in your spreadsheet, and why Excel instead of something else

kalos:
The Excel file is provided by the website that hosts the files / database as well

It also contains a lot of info in each row about that specific PDF

I.e. the Excel file is also a database and a sitemap

Target:
Kalos

I've only just had the opportunity to sit down and spend some time on this and I need some input from you

I'm still a bit sketchy on your workflow, i.e. you said the source data (the Excel sheet) came from the website. I've been trawling around the site and I can't see anywhere I might download such a file - this leads me to the conclusion that the data was sourced manually and then entered into a spreadsheet (yes? no?)

Also, I don't understand why you're trying to use Excel for this when there might be better tools for the job. The only reason I can think of is that there's some other sort of analysis going on here (though even if there is, it doesn't mean Excel is a good tool for it) - can you expand on this a bit, please?

FWIW, I just ran HTTrack to create a local copy of the web page. There may be better tools, but this is one I've used before - it took just over 4 hours (:o) and I ended up with a copy of all the linked documents, plus a copy of the page with all the links pointing to the local files (about 300 MB in total)

Conversely, DownThemAll pulled all the linked PDF files in about 15 minutes, but of course there's none of the associated info, nor any means of identifying what each document is. You could probably save a local copy of the page and update the links as appropriate, but the source appears to be pretty inconsistent syntactically (though this approach has promise)
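The "save a local copy of the page and update the links" idea could be sketched as a simple link rewrite. This is only an illustration under assumed conditions: it presumes the page uses plain `href="...pdf"` attributes (the post notes the real source is syntactically inconsistent, so an HTML parser might be needed in practice), and the `files` folder name is hypothetical.

```python
import re

def rewrite_pdf_links(html, local_dir="files"):
    """Replace absolute hrefs pointing at .pdf files with paths into a
    local folder, keeping only the file-name part of each URL."""
    def repl(match):
        filename = match.group(1).rsplit("/", 1)[-1]  # last URL segment
        return 'href="%s/%s"' % (local_dir, filename)
    # Assumes well-formed double-quoted href attributes, which the
    # actual page may not guarantee.
    return re.sub(r'href="(https?://[^"]+\.pdf)"', repl, html)
```

This keeps the page's associated info (the per-document rows) intact while pointing each link at the downloaded copy, which is essentially what HTTrack did automatically in the 4-hour run.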

EDIT: added some additional info
