Software > Web Link Captor

Added a delay to make the script more human


My list contains 600 items; the search engine will block the script after about 30 results.

I added a time.sleep(60) call (and had to import time) to fool the (b i n g) search engine.

Each input line now takes 60 seconds, which makes the script look more human...

So I don't have to enter lots of subselections (600 = 20 x 30 input items)  :D

See both changes in the code below, between the lines with ======== on them.

--- Code: Python ---
#-------------------------------------------------------------------------------
# imports we need
# include top level directory (needed to import from helper below)
import sys
import os
#201302 wjamoe added import time ===================================
import time
#=============================================================

--- Code: Python ---
def DoSearch(self, id, label, search):
    # this function does the main work -- take a search term
    # return a list of results
    results = []

    # initialize result with engine name and id
    baseresult = {"engine": self.get_searchenginename()}
    if id != "":
        baseresult["id"] = id
    if label != "":
        baseresult["label"] = label

    # ok grab a web page using our search terms
    content = self.SearchWebPageGrabContents(search)

    # and now parse results
    addresults = self.ParseWebPageContentIntoResults(search, content, baseresult)
    #201302 wjamoe added delay of 60 seconds =============================
    time.sleep(60)
    #=============================================================
    # add
    results.extend(addresults)
    # return
    return results
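A small variation on the fixed time.sleep(60) above would be a randomized delay, so requests are not perfectly evenly spaced (even spacing is itself a bot-like pattern). This is only a sketch; pick_delay is a hypothetical helper, not part of Web Link Captor:

```python
import random
import time


def pick_delay(base=60.0, jitter=15.0):
    # return a randomized delay around `base`, so successive searches
    # are spaced 45-75 seconds apart instead of exactly 60
    return base + random.uniform(-jitter, jitter)


# usage inside DoSearch, replacing time.sleep(60):
#     time.sleep(pick_delay())
```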
A test run for 2 items takes 120 seconds  :(


Disclaimer: I don't know how to program in Python (at all, yet).
Couldn't you record a start time, fire 5 or 10 searches, then wait until 60 seconds have passed since that start time before firing the next set of searches? That would mean much less waiting. But it would also look/feel less like human searching...
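The batching idea above could be sketched like this, assuming a hypothetical run_in_batches helper and a do_search callable standing in for the tool's real DoSearch method:

```python
import time


def run_in_batches(items, do_search, batch_size=5, window=60.0):
    # fire `batch_size` searches, then wait until `window` seconds have
    # passed since the batch started before firing the next batch
    results = []
    for i in range(0, len(items), batch_size):
        start = time.time()
        for item in items[i:i + batch_size]:
            results.extend(do_search(item))
        # sleep out the remainder of the window, except after the last batch
        if i + batch_size < len(items):
            elapsed = time.time() - start
            if elapsed < window:
                time.sleep(window - elapsed)
    return results
```

With 600 items in batches of 5 this would cost roughly 120 windows of 60 seconds instead of 600 individual 60-second sleeps.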

That would require reshuffling the code, take more time to program, and need a lot more testing to find out what heuristics the search engines use. The engines probably all differ in how they identify robots and the like.
The current solution suffices for me, sorry... :-[

I'm just thrilled that someone other than me is using this tool.  :P

