I don't know if I'm out of line here, but...
I would like to commission someone to create an internet search "delegator": a search spider that can interact with and command the other search engines of the WWW (Google, Yahoo, etc.) to search all databases on the entire WWW using keywords input to the delegator, and come up with an "answer".
To better explain, though it's difficult:
You write a program that orders other people's search engines to do my search work (whatever that work is). Most databases have a search function built in: libraries, museums, universities, governments. So you write a spider that finds their search engines, inputs my keywords, and asks them to search. This way, we harness the power of thousands of engines working in unison. Kind of like parallel processing; a metasystem of sorts.
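To make the idea concrete, here is a minimal sketch in Python of the "fan out and merge" core of such a delegator. The two engines below are made-up stubs standing in for remote search sites; a real version would instead call each site's search API or submit its search form over HTTP and parse the response, which is where most of the actual work lies.

```python
from concurrent.futures import ThreadPoolExecutor

def engine_a(query):
    # Stub standing in for one remote search engine (hypothetical results).
    return [f"a.example/{query}/1", "shared.example/hit"]

def engine_b(query):
    # Stub standing in for a second remote search engine.
    return [f"b.example/{query}/1", "shared.example/hit"]

def delegate(query, engines):
    """Send the same query to every engine in parallel, then merge the
    results, dropping duplicates while keeping first-seen order."""
    with ThreadPoolExecutor() as pool:
        # map() runs the queries concurrently but yields results
        # in the same order as the engines list.
        result_lists = list(pool.map(lambda engine: engine(query), engines))
    seen, merged = set(), []
    for urls in result_lists:
        for url in urls:
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

results = delegate("museums", [engine_a, engine_b])
```

The thread pool is what gives the "engines working in unison" effect: each engine is queried at the same time rather than one after another, so the slowest engine, not the sum of all of them, sets the total wait.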
I appreciate that coding a search spider may (or may not) be a trivial task, depending on one's expertise and experience. But, like everything else, it's only easy if you know how to do it!
Otherwise, it's nigh-on impossible; mind-boggling, in fact.
Since I don't know where you stand in this realm, could you please tell me whether anyone is prepared to consider it? Writing the search spider, or "delegator", I mean.
If anyone would like to have a think about it, let me know "yes" or "no", how long it would take to write, and how much it would cost me.