Google is removing the Googlebot Crawl Rate Limiter Tool from Search Console today, January 8th, 2024, because it is no longer necessary. The tool has been available for over a decade, but with the improvements Google has made to its crawling logic and the other tools now available to publishers, its usefulness has dissipated.
The purpose of the tool was to give publishers a way to control Googlebot crawling so that it didn’t overwhelm their servers. There was a time when some publishers experienced so much crawling that the server became unable to serve webpages to users.
According to Google, publishers can still reduce the crawl rate when they need to:
“If you need to urgently reduce the crawl rate for short period of time (for example, a couple of hours, or 1-2 days), then return 500, 503, or 429 HTTP response status code instead of 200 to the crawl requests. Googlebot reduces your site’s crawling rate when it encounters a significant number of URLs with 500, 503, or 429 HTTP response status codes (for example, if you disabled your website). The reduced crawl rate affects the whole hostname of your site (for example, subdomain.example.com), both the crawling of the URLs that return errors, as well as the URLs that return content. Once the number of these errors is reduced, the crawl rate will automatically start increasing again.”
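In practice, that means serving a temporary error status to crawler requests while the server is under strain. Below is a minimal sketch of the idea using Python’s standard library; the `OVERLOADED` flag and the user-agent check are hypothetical illustrations for this example, not part of any Google tool or API.

```python
# Minimal sketch: temporarily return 503 to Googlebot requests so it backs off.
# The OVERLOADED flag and the "Googlebot" user-agent check are hypothetical
# illustrations; verified crawler detection would be more involved.
from http.server import BaseHTTPRequestHandler, HTTPServer

OVERLOADED = True  # hypothetical flag: flip on while the server is under load

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        user_agent = self.headers.get("User-Agent", "")
        if OVERLOADED and "Googlebot" in user_agent:
            # A 503 (or 500 / 429) instead of 200 signals Googlebot to slow down;
            # Retry-After hints when normal crawling may resume.
            self.send_response(503)
            self.send_header("Retry-After", "3600")
            self.end_headers()
            return
        # Normal 200 response for users, and for crawlers once load recovers.
        body = b"<html><body>Hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()
```

As Google notes, this should only be done for short periods, since the reduced crawl rate applies to the entire hostname, not just the URLs that returned errors.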
Crawling algorithms have reached a state where Googlebot can automatically sense when a server is nearing capacity and immediately slow its crawl rate. Going forward, the minimum crawl rate will by default be set to a lower level, similar to what publishers have tended to request.
Removing a rarely used tool will also make Search Console less complex to navigate and more user-friendly.
Publishers who encounter an issue with Googlebot’s crawl rate can still use the Googlebot report form to send feedback to Google.