Recently I have noticed that Google has become much more sensitive towards automated queries sent to it. It bans the IP very fast – much faster than it used to. What’s funny is that the tool that got caught when I discovered this wasn’t even a site generator of any kind scraping the SERPs – it was a simple position checker 🙂 Previously, it was possible to solve the problem by simply increasing the wait time between queries sent to Google – not any more, it seems!
Just another example of Google creating a problem for itself – if a search API were still publicly available, who would be querying the human-facing SERPs directly?
As for the solution – I am now simply sending the queries from multiple IPs instead of a single one 🙂 A rough sketch of the idea is below.
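A minimal sketch of rotating requests across a proxy pool, assuming you have your own proxies to route through (the addresses and the `search_google` helper here are hypothetical, purely for illustration):

```python
import itertools
import time

import requests

# Hypothetical pool of proxies under your control -- replace with real ones.
PROXIES = [
    "http://203.0.113.10:8080",
    "http://203.0.113.11:8080",
    "http://203.0.113.12:8080",
]

def search_google(query: str, proxy: str) -> str:
    """Fetch a Google SERP for `query`, routed through the given proxy."""
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query},
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.text

# Round-robin over the pool so no single IP sends every query.
proxy_cycle = itertools.cycle(PROXIES)
for query in ["position checker", "seo tools"]:
    html = search_google(query, next(proxy_cycle))
    print(f"{query}: fetched {len(html)} bytes")
    time.sleep(5)  # still worth pausing between queries
```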
I’ve noticed the same thing. I get blocked just doing research (not a script of any kind, just me physically clicking through pages too fast). It’s not hard to get around, though – just do your searches on AOL; they use Google’s results and don’t block IPs.
With a little scripting you can use the Google AJAX Search API to pull SERPs. They currently don’t have a limit on the number of requests. It returns your results as a JSON object that you can manipulate any way you like.
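A minimal sketch of the kind of call the comment describes, assuming the AJAX Search API endpoint and response shape as they were at the time (results under `responseData.results`); the API has since been retired, so treat this as illustrative only:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# The (since-retired) Google AJAX Search API web-search endpoint.
ENDPOINT = "http://ajax.googleapis.com/ajax/services/search/web"

def ajax_search(query: str) -> list:
    """Return the list of results from the AJAX Search JSON response."""
    url = ENDPOINT + "?" + urlencode({"v": "1.0", "q": query})
    with urlopen(url) as resp:
        data = json.load(resp)
    # Results lived under responseData.results in the returned JSON object.
    return data["responseData"]["results"]

for hit in ajax_search("position checker"):
    print(hit["titleNoFormatting"], "->", hit["url"])
```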