...this might be the shortest thread ever on DC
-rgdot
Don't speak too soon. A lot of websites nowadays have built-in detection of crawlers and downloaders that behave badly - e.g., firing off hundreds of queries a second, which overloads the server. To protect the server, any norty crawler that gets detected is blocked.
This could well make some site-scraping software obsolete.
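For what it's worth, the detection side doesn't have to be fancy - a per-IP sliding window over recent request timestamps is often enough to catch a misbehaving crawler. Here's a minimal sketch in Python; the threshold, window length, and names are made up for illustration, not taken from any particular server:

```python
import time
from collections import defaultdict, deque

MAX_REQUESTS = 100    # hypothetical limit: requests allowed per window
WINDOW_SECONDS = 1.0  # hypothetical window length in seconds

hits = defaultdict(deque)  # client IP -> timestamps of its recent requests
blocked = set()            # client IPs that have tripped the limit

def allow_request(ip: str) -> bool:
    """Return True if this request may proceed, False if the IP is blocked."""
    if ip in blocked:
        return False
    now = time.monotonic()
    window = hits[ip]
    # Drop timestamps that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    window.append(now)
    if len(window) > MAX_REQUESTS:
        # Too many requests in one window: treat as a norty crawler.
        blocked.add(ip)
        return False
    return True
```

A well-behaved downloader sidesteps this entirely by throttling itself - e.g., sleeping between requests so it never comes close to a threshold like the one above.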