In a John Mueller Google Hangout, one SEO professional was concerned that their crawl requests had rapidly dropped by over 90 percent.
They had checked everything covered in Google’s developer docs – robots.txt rules, server errors, and other likely causes.
They asked whether there was anything they might be missing.
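One of the first things worth ruling out in a situation like this is an accidental robots.txt change, since a single stray `Disallow: /` can cut crawling overnight. As a minimal sketch (the rules and the `example.com` domain below are hypothetical), Python’s standard-library `urllib.robotparser` can verify what a given robots.txt actually permits Googlebot to fetch:

```python
from urllib import robotparser

# Hypothetical robots.txt body for illustration. Note the "Disallow: /"
# in the wildcard group - a common cause of a sudden crawl drop if it
# ends up applying to the wrong user agent.
SAMPLE = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(SAMPLE.splitlines())

# Googlebot matches its own group, so only /private/ is blocked for it.
print(rp.can_fetch("Googlebot", "https://example.com/"))          # True
print(rp.can_fetch("Googlebot", "https://example.com/private/"))  # False

# Any other bot falls into the "*" group and is blocked entirely.
print(rp.can_fetch("SomeOtherBot", "https://example.com/"))       # False
```

Running this against your live robots.txt (via `RobotFileParser.set_url()` and `read()`) is a quick sanity check before digging into server-side issues.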
John explained that perhaps Google’s systems were having trouble accessing the content quickly enough.
In addition to the number of requests, he said Google also looks at crawl demand – a measure of how much Google actually wants to crawl a given site.
For most established websites, crawl demand tends to be fairly stable. If Google sees a lot of new content, crawl demand can go up significantly, but changes like that usually play out slowly over time.
Crawl rate is also closely tied to server health. If Google sees significant server errors, it can slow down or even stop crawling the site until those are resolved.
This happens at approximately the 7:29 mark in the video.
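Because server errors can throttle crawling, it’s worth measuring how often Googlebot is actually hitting 5xx responses. As a rough sketch (the log lines below are hypothetical, in common log format), a few lines of Python can tally status codes from an access log and surface the error rate:

```python
import re
from collections import Counter

# Hypothetical access-log lines (common log format) for illustration.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2022:10:00:01 +0000] "GET / HTTP/1.1" 200 5120',
    '66.249.66.1 - - [10/May/2022:10:00:02 +0000] "GET /page HTTP/1.1" 503 312',
    '66.249.66.1 - - [10/May/2022:10:00:03 +0000] "GET /other HTTP/1.1" 500 198',
    '203.0.113.9 - - [10/May/2022:10:00:04 +0000] "GET / HTTP/1.1" 200 5120',
]

# Capture the 3-digit status code that follows the quoted request.
STATUS_RE = re.compile(r'" (\d{3}) ')

# Bucket each response by status class (2xx, 3xx, 4xx, 5xx).
statuses = Counter(
    m.group(1)[0] + "xx"
    for line in LOG_LINES
    if (m := STATUS_RE.search(line))
)

total = sum(statuses.values())
error_rate = statuses.get("5xx", 0) / total * 100
print(statuses, f"{error_rate:.0f}% 5xx")
```

In a real audit you would read the actual log file, and ideally filter to verified Googlebot IP ranges first; a sustained spike in the 5xx share is exactly the kind of signal that can cause Google to back off crawling.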