Website slowdown by bots
Suggestions from Iakov about managing bots that make too many requests:
- Crawl-delay in robots.txt for Yahoo & Bing (if they are the problem); see the robots.txt sketch below
- Google Webmaster Tools allows you to set a crawl-rate limit for Google's own crawler
- HTTP status 429 + a Retry-After header for nasty bots (if they are useless, just block them entirely in robots.txt with Disallow: /); see the throttling sketch below
- questionable: robots.txt Disallow rules for certain pages (also covered in the robots.txt sketch below)
- advanced (perhaps we discussed this already): if I were you I would (a) cache all standard page elements and (b) save all gene pages to static files (e.g. excluding header/footer). It is also possible to cache SQL query results if you have enough HDD/SSD space; see the caching sketch below.
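
For the robots.txt items above, a minimal sketch; the bot names and paths are only examples, adjust them to whatever actually shows up in the access logs:

    # Slow down the big crawlers (Crawl-delay is honored by Yahoo and Bing, not by Google)
    User-agent: Slurp
    Crawl-delay: 10

    User-agent: bingbot
    Crawl-delay: 10

    # Keep all crawlers out of certain pages (paths are placeholders)
    User-agent: *
    Disallow: /search
    Disallow: /export

    # Block a useless bot completely
    User-agent: SomeUselessBot
    Disallow: /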
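
For the 429 + Retry-After idea, a rough sketch in Python/Flask (the real site may use a different stack; the in-memory per-client counter and the limits are purely illustrative):

    from time import time
    from collections import defaultdict
    from flask import Flask, Response, request

    app = Flask(__name__)

    WINDOW = 60                # seconds
    LIMIT = 30                 # requests per client per window
    hits = defaultdict(list)   # in-memory only; fine for a sketch, not for production

    @app.before_request
    def throttle():
        now = time()
        key = (request.headers.get("User-Agent", ""), request.remote_addr or "")
        recent = [t for t in hits[key] if now - t < WINDOW]
        recent.append(now)
        hits[key] = recent
        if len(recent) > LIMIT:
            # Returning a response from before_request short-circuits the request
            return Response("Too Many Requests", status=429,
                            headers={"Retry-After": str(WINDOW)})

    @app.route("/gene/<gene_id>")
    def gene_page(gene_id):
        return f"gene page for {gene_id}"   # placeholder handler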
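
For the caching idea, a rough Python sketch; the renderer, the cache location, and the database handle are all hypothetical, and the real site's code will look different:

    import hashlib
    import os
    import pickle

    CACHE_DIR = "/var/cache/genepages"   # hypothetical location

    def cached_gene_page(gene_id, render):
        """Serve the gene page body (without header/footer) from a file if cached,
        otherwise render it once and save it for subsequent requests.
        gene_id is assumed to be validated already (it becomes part of a path)."""
        path = os.path.join(CACHE_DIR, f"{gene_id}.html")
        if os.path.exists(path):
            with open(path, encoding="utf-8") as f:
                return f.read()
        body = render(gene_id)            # the expensive render / SQL work happens here
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "w", encoding="utf-8") as f:
            f.write(body)
        return body

    def cached_query(conn, sql, params=()):
        """Cache SQL result sets on disk, keyed by the query text and parameters."""
        key = hashlib.sha256(repr((sql, params)).encode()).hexdigest()
        path = os.path.join(CACHE_DIR, "sql", key + ".pkl")
        if os.path.exists(path):
            with open(path, "rb") as f:
                return pickle.load(f)
        rows = conn.execute(sql, params).fetchall()
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "wb") as f:
            pickle.dump(rows, f)
        return rows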