
DonationCoder.com Software > Post New Requests Here

IDEA: Copy text of web page while following links


kab122:
jity2

Ya, I know exactly what you are describing. Wish it were that easy.
I'm thinking they must get $ for the ads on every page I visit  :P

Besides, I just needed the text; don't want to print all the images/ads.

I did this to print off recaps of Lost and House from:
http://www.televisionwithoutpity.com
The writers are absolutely great, lots of yuks for me during my daily commute.

jity2:
Hi Kab,

Now that I can see the website, I understand! ;-)
I have no direct and easy answer! Maybe someone could write a script in Perl (just a guess!)?
Maybe it could also gather all the pages' HTML at once and keep only the text?

Example:
page 1
http://www.televisionwithoutpity.com/articles/content/a12215/
page 2
http://www.televisionwithoutpity.com/articles/content/a12215/index-1.html
page 3
http://www.televisionwithoutpity.com/articles/content/a12215/index-2.html
...
page 13
http://www.televisionwithoutpity.com/articles/content/a12215/index-13.html
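Since the page URLs follow a simple pattern, the "gather all pages and keep only the text" idea could be scripted instead of done by hand. Here is a rough sketch in Python (not Perl, but the same idea); the page count is set manually, the tag stripping is deliberately crude, and the `all_text.txt` filename just matches the macro idea below:

```python
import re
import urllib.request

BASE = "http://www.televisionwithoutpity.com/articles/content/a12215/"
NUM_PAGES = 13  # set manually, like the macro repeat count

def page_url(page):
    # Page 1 is the bare article URL; page N (N > 1) is index-(N-1).html,
    # matching the example URLs listed above.
    return BASE if page == 1 else BASE + "index-%d.html" % (page - 1)

def strip_tags(html):
    # Crude tag stripper: drop script/style blocks, replace remaining
    # tags with spaces, and collapse whitespace. A real script might use
    # a proper HTML parser instead.
    text = re.sub(r"<(script|style)[^>]*>.*?</\1>", "", html,
                  flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def save_all(path="all_text.txt"):
    # Fetch every page and append its plain text to one file.
    with open(path, "w", encoding="utf-8") as out:
        for page in range(1, NUM_PAGES + 1):
            with urllib.request.urlopen(page_url(page)) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            out.write(strip_tags(html) + "\n\n")

if __name__ == "__main__":
    save_all()
```

That would skip the images and ads entirely, since only the text between the tags is kept.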



Here is an idea that should work: I would use the shareware "Macro Magic".
Manually set the number of times the macro will repeat, then record the macro (mouse gestures plus clicks): click on the page, press Ctrl+A to select all, copy the text, switch to all_text.txt (created and opened before the macro runs), paste the text, switch back to the browser, click on the page, scroll down, and click Next with the mouse.

This should work. ;-)
See ya ;-)
Jity

