|
Re: Block The Baidu Spider From My Site?
I can't see how one spider can eat all your bandwidth. I used to host my site on half the bandwidth you have, with Google's spider and loads of other crawlers hitting it all the time, and it being a forum there were tens of thousands of pages to crawl. They used barely anything; a well-behaved crawler is just like a normal visitor.
The only reason it might use a lot is if it ignores the crawl delay that the other crawlers honour. Google's default is about 5 seconds per page, so if Baiduspider is fetching more than one page per second it could use noticeably more.
As said above, if it's ignoring your robots.txt, the easiest way to block it is to block its IP range.
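For reference, here's a minimal sketch of both approaches. It assumes Baidu's crawler identifies itself as "Baiduspider" in its User-agent, and the IP range shown is only an example placeholder; check your access logs for the addresses actually hitting you before blocking anything.

```
# robots.txt -- politely ask the crawler to stay away
User-agent: Baiduspider
Disallow: /

# .htaccess (Apache 2.4) -- hard block if the bot ignores robots.txt
# The CIDR range below is an example; substitute what your logs show.
<RequireAll>
    Require all granted
    Require not ip 180.76.0.0/16
</RequireAll>
```

The robots.txt rule only works if the crawler chooses to obey it; the IP-level block works regardless, which is why it's the fallback when a bot misbehaves.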
|