If your web server suddenly becomes slow without a noticeable increase in traffic, you need to consider where the problem might be. If the site hasn't undergone upgrades or technical changes, the cause usually lies outside it, such as the site being scraped or crawled too aggressively by search engines. Naiba recently checked the server monitoring data and found that CPU usage was very high, as shown in the figure below:

An analysis of the website logs showed that Bing's bot was crawling far too frequently, so that is what needs to be addressed.
The solution is to limit Bing's crawl frequency.
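If you want to reproduce this log analysis yourself, a small shell pipeline can tally requests per crawler. This is a sketch that assumes the combined log format (user agent in the sixth quoted field, as in default Nginx/Apache configs); pipe your access log into it and adjust the pattern list as needed.

```shell
# count_bots: tally hits per major crawler from an access log read on stdin.
# Assumes the "combined" log format, where the User-Agent is the 6th
# double-quote-delimited field. Adjust the field index for custom formats.
count_bots() {
  awk -F'"' '{print $6}' \
    | grep -oiE 'bingbot|googlebot|baiduspider' \
    | tr '[:upper:]' '[:lower:]' \
    | sort | uniq -c | sort -rn
}
```

Usage: `count_bots < /var/log/nginx/access.log` (path is an example; use your own log location). A disproportionately large count for `bingbot` confirms the diagnosis described above.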
Method 1: Set in the admin dashboard

If you have registered the site with Bing Webmaster Tools, open the Crawl Control menu in the dashboard and set the crawl frequency on the right-hand side.
Method 2: Set via the robots.txt file
If you don't know what robots.txt is, please read this first:
What is robots.txt: Correct WordPress robots.txt Writing Methods and Generation Tools
We can add the Crawl-delay directive to the robots.txt file.
User-agent: *
Crawl-delay: 1
The directives above tell all search engines to crawl slowly. If Crawl-delay is not set, each engine decides its own crawl rate. For Bing, the value can be set to 1, 5, or 10, corresponding to slow, very slow, and extremely slow, respectively; other engines that honor the directive typically treat the value as a delay in seconds between requests.
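Since the problem here is specifically Bing, you can also target its bot alone instead of all crawlers. A minimal sketch (the `bingbot` user-agent token is Bing's documented crawler name; the value 10 is just an illustrative choice of the slowest bucket):

```
# Apply the delay only to Bing's crawler; other bots are unaffected.
User-agent: bingbot
Crawl-delay: 10
```

Targeting a single user agent avoids slowing down crawlers that were behaving well.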
For other search engines, such as Google and Baidu, you can adjust the crawl rate in their respective webmaster tools; note that Google ignores the Crawl-delay directive in robots.txt, so its rate can only be managed through Search Console. Comparatively, the robots.txt method also takes effect more slowly, since engines only pick up changes the next time they fetch the file.