
Main Area and Open Discussion > General Software Discussion

links collector


someone else posted about this error before too; what is causing this mysterious npptools.dll error?

saw this post recently:

Can you try this:

* uninstall winpcap (not url snooper) from add/remove programs control panel
* reboot
* install winpcap 4.1 beta
* reboot
* see if it works now.


URL Snooper is a great program, but not exactly the way to go for this situation, for three reasons:

1) it doesn't integrate with the browser
2) it doesn't autosave files, text, or links
3) it sniffs the network, trying to catch "hidden" files, while I only need to see what is visible within the browser, nothing more

I was looking for a Firefox extension or an Opera user JavaScript that will see the text and links of each webpage I visit and save specific links/text/files

there is no need to sniff the network just to catch "hidden" URLs etc.; it's overkill

the problem is that most programs that attempt to do what I need (offline browsers, etc.) require you to enter a starting web address, specify retrieval options, and then let the program do the job

but what I want is a browser-integrated solution that does this within my browser "as I browse"

an auto-bookmarker, an auto-file-saver, an auto-text-saver that saves info automatically as I browse the net


first, I need something that will "monitor" every webpage I visit as I browse the net
this monitoring has to be very accurate, of course, which means it must not miss any webpage, even webpages that are only partially loaded

by "monitoring webpages" I mean grabbing the text, links and files of every webpage I visit
by "the text of the webpage", I mean the text that is highlighted/selected when we press Ctrl+A on a webpage (including any other "hidden" text, etc.)
by "the links of the webpage", I mean the links that are grabbed when we hit Ctrl+Alt+L in Opera, or by any other method that shows all the links of the webpage (including any "hidden" links, javascript links, etc.)
by "the files of the webpage", I mean the files that end up in the folder created when we save a webpage as an html file plus a folder (and any other hidden files, embedded files, etc.)

as far as I know (and if you know something else, please inform me), the available methods that can monitor web browser traffic are these:

1) javascript can monitor webpages as I browse the net (Opera, for example, can run a user script with document.addEventListener('DOMContentLoaded', function() { ... }) that does things when a webpage finishes loading)
2) an internet connection sniffer can monitor webpages as I browse the net, sniffing the URLs off the wire
3) a web proxy can monitor webpages as I browse the net, since it works as a caching proxy

then I need to apply filters to specify which of the text, links and files are useful, and save the filtered information
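For what it's worth, the Opera user-JavaScript route could be sketched roughly like this. This is only a minimal sketch, not a finished collector: the file-extension filter pattern is an assumption, and a real version would save the matches somewhere instead of just logging them.

```javascript
// Minimal sketch of a user script that collects links on every page load.
// extractLinks is a pure helper, so the filtering step is easy to adapt.
function extractLinks(hrefs, pattern) {
  // Keep only the URLs that match the user's filter pattern.
  return hrefs.filter(function (href) {
    return pattern.test(href);
  });
}

// Guarded so the helper above can also be exercised outside a browser.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    // document.links covers <a href> and <area href> elements on the page.
    var hrefs = Array.prototype.map.call(document.links, function (a) {
      return a.href;
    });
    // Example filter: file-like links (the pattern is just an assumption).
    var wanted = extractLinks(hrefs, /\.(zip|pdf|mp3)$/i);
    // A real auto-saver would hand `wanted` off for storage here.
    console.log(wanted.join('\n'));
  }, false);
}
```

The same DOMContentLoaded hook could also read document.body's text for the "text of the webpage" part, but it only sees what the browser has rendered, so it naturally avoids the network-sniffing overkill described above.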

any help would be much appreciated

thank you

Ralf Maximus:
Does it need to be real-time?

If you're using IE6 or 7, the browser's cache is just a collection of files that can be accessed via the file system. I imagine any file-search utility that supports regular expressions (FileLocator Pro?) could suss out the patterns you've described. I know for a fact that UltraEdit's file-search feature will do this.

This is not real-time scanning, but you could kick off such a search after your browsing session is complete.

