In a John Mueller Google Hangout, one SEO professional was concerned that crawl requests for their company's site had rapidly dropped by over 90 percent.
They had checked everything recommended in Google's developer docs – things like robots.txt and other possible technical errors.
They asked whether there was anything else they might be missing.
John explained that perhaps Google’s systems were having trouble accessing the content quickly enough.
In addition to the number of requests, he said that they also look at crawl demand – a measure of how much Google may actually want to crawl that site.
For reasonable websites, crawl demand does tend to be pretty stable.
If they see a lot of new content, then it’s possible for the crawl demand to go up significantly. However, these types of changes are usually quite slow over time.
The other side of the equation is crawl capacity, which reacts quickly to server technical issues. If Google sees significant server errors, it could scale back – potentially even stop – crawling the site until those are resolved.
This happens at approximately the 7:29 mark in the video.
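As a rough illustration of that first round of checks, the sketch below (Python, using the widely available requests library – not anything Google-specific, and the domain and paths are placeholders) spot-checks robots.txt and a few sample pages for their HTTP status codes:

```python
# Hypothetical spot check: fetch robots.txt and a few sample pages,
# reporting the HTTP status code (or connection failure) for each.
import requests

SITE = "https://example.com"  # placeholder domain
SAMPLE_PATHS = ["/robots.txt", "/", "/category/page-1"]  # placeholder paths

for path in SAMPLE_PATHS:
    url = SITE + path
    try:
        resp = requests.get(url, timeout=10)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        # A connection error here is exactly the kind of problem
        # that can cause Google to pull back on crawling.
        print(f"ERROR  {url}  ({exc})")
```

Repeated 5xx codes or timeouts in a check like this would point toward the server-side issues John describes below.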
John Mueller Hangout Transcript
SEO Professional 4 7:29
Hi, John. Nice to have you here. So my first question is: recently, the crawl requests on my company's site have dropped by over 90 percent. And we have checked all aspects according to the official Google docs, like robots.txt. We also want to know any other possible technical factors that can cause a sudden drop in crawl requests. What main aspects do you recommend that we check as well?
John 8:09
So it sounds to me like our systems have trouble accessing your content quickly enough. So when it comes to the number of requests that we make on a website, we have two kinds of things that we balance. On the one hand, the crawl demand, which is how much we want to crawl from a website.
And assuming this is a reasonable website, then the crawl demand usually stays pretty stable. It can go up if we see a lot of new content, and it can go down if we see very little content. But usually, these changes are kind of slow over time. And the other side is the crawl capacity. This is how much we think the server can support from crawling, without causing any problems. And this is something that we evaluate on a daily basis. And it can react quite quickly if we think that there's a critical problem on the website.
So for critical problems, we think of things like server errors – if we see a lot of server errors, if we can't access the website properly, or if the server speed goes down significantly. So not the time to render a page, but the time to access the HTML files directly. Those are kind of the three aspects that play into that.
And if, for example, the speed goes down significantly, you would see that in the Crawl Stats report in Search Console. And that's something where, if we think we're causing problems by crawling too much, we will scale that back fairly quickly.
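John's distinction here – raw HTML fetch time, not render time – is easy to sample yourself. The sketch below is only an illustration, not Google's measurement: the URL is a placeholder, and requests' elapsed property (the time from sending the request until the response headers are parsed) stands in for server response time.

```python
# Sketch: sample raw HTML fetch time, ignoring rendering entirely.
import statistics
import requests

URL = "https://example.com/"  # placeholder URL

timings = []
for _ in range(5):
    resp = requests.get(URL, timeout=10)
    # resp.elapsed: time from sending the request until the
    # response headers were parsed -- no JavaScript, no rendering.
    timings.append(resp.elapsed.total_seconds())

print(f"median HTML fetch time: {statistics.median(timings):.3f}s")
```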
SEO Professional 4 10:03
Oh, so the response time is highly related to the crawl requests, right?
John 10:16
Yes.
SEO Professional 4 10:18
All right. So do you think a huge number of 5xx and 4xx response codes may also reduce the crawl rate?
John 10:28
5xx errors definitely – those are server errors, which we would see as being potentially problematic. 4xx errors are less problematic, because that basically means the content doesn't exist, so we can crawl normally. So if a page disappears, that's no problem. If it has a server error, that is a problem.
SEO Professional 4 10:49
So at what level of 5xx codes would you see them as a serious server problem? Like, if we have 100 pages, and maybe 15 of those pages return 5xx due to some technical errors, will Googlebot think our server may be overloaded? How many percent would that take?
John 11:20
I don't think we have a fixed number. So if the speed recently dropped, what I would check is the Crawl Stats section in Search Console, and see if one of those numbers there changed significantly. It will have a graph, and then you can see, like, on this date we saw the change. And did the 5xx errors go up or go down? Or is the speed changing there? Anything like that.
SEO Professional 4 11:52
Oh, I see. So assuming that we find the critical problem – maybe it's because of the response time, or maybe the server errors – and we fix this problem in maybe one or two weeks, when will we see the crawl requests come back to normal?
John 12:17
We update the crawl rate on a daily basis. So probably within a couple of days, you would see it increasing again step by step.
SEO Professional 4 12:28
Oh, so if we fix it in the right direction, we can see it increase.
Thanks!
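Since John says the crawl rate is re-evaluated daily, one practical way to watch a recovery (or to pinpoint the original drop) is to tally Googlebot requests and the share of 5xx responses per day from your own access logs. The sketch below is an assumption-laden example: it assumes an Nginx/Apache combined log format and a hypothetical log path, and a production version should verify Googlebot via reverse DNS rather than trusting the user-agent string.

```python
# Sketch: per-day Googlebot request counts and 5xx share from an access log.
import re
from collections import defaultdict

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path
# Combined log format: ... [10/Oct/2000:13:55:36 -0700] "GET /x HTTP/1.1" 200 ...
LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):.*?" (\d{3}) ')

hits = defaultdict(int)
errors_5xx = defaultdict(int)

with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:  # naive user-agent check
            continue
        match = LINE.search(line)
        if not match:
            continue
        day, status = match.groups()
        hits[day] += 1
        if status.startswith("5"):
            errors_5xx[day] += 1

# Log lines are chronological, so insertion order is already date order.
for day in hits:
    print(f"{day}  requests={hits[day]}  5xx={errors_5xx[day] / hits[day]:.1%}")
```

A sustained spike in the 5xx share on the same date the request count falls would match the crawl-capacity behavior John describes; once it is fixed, the daily counts should step back up within a few days.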