During a hangout, an SEO professional was concerned about their crawl rate, which had dropped dramatically in Google Search Console by over 90 percent.
Believing the drop was caused by their server response time, they took several actions to improve it and restored it to its former level of around 400 to 500 milliseconds.
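As a rough way to check where a server sits relative to that 400–500 millisecond range, here is a minimal Python sketch (the function name and sample count are our own, not anything from the hangout) that averages wall-clock fetch times using only the standard library:

```python
import time
import urllib.request

def average_response_ms(url, samples=3):
    """Fetch `url` a few times and return the average response time in ms.

    Reads only the first byte of each response, so the number is a rough
    proxy for time-to-first-byte rather than full download time.
    """
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read(1)
        timings.append((time.perf_counter() - start) * 1000)
    return sum(timings) / len(timings)

# Hypothetical usage: flag anything slower than the ~500 ms discussed below.
# if average_response_ms("https://www.example.com/") > 500:
#     print("response time above the ~500 ms range")
```

Note this measures the client-side view only; the number Search Console reports is Googlebot's own measurement and may differ.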
The crawl rate, however, has not yet been restored, and it is recovering much more slowly than anticipated.
The SEO pro mentioned that John had previously told them that if they fixed their response time quickly, the crawl rate should be restored within several days.
John asked them when the server response time was improved.
The SEO pro said that it was improved several days ago, but they are still not seeing the expected improvements in Google Search Console.
John answered that he believes it just takes a bit longer to improve. Google's systems are very responsive in slowing down, to ensure they don't break anything, but are much slower when ramping back up again. Since it has only been a few days, he suggested giving things a week or longer to improve.
John also offered an alternative: submitting a form, linked from the Search Console Help Center, asking the Googlebot team to look at the site's crawling. The team can review the site and manually adjust the crawl rate upward.
However, the team does not respond to these messages, so it could take a while to see any effect.
Sometimes, though, it is faster than the automatic systems. It is a useful feature for reporting malfunctions and other issues, including cases where Googlebot has not yet caught up with crawling.
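The form discussed below asks for the Googlebot IP addresses you are seeing, so before filing it, it helps to confirm the IPs in your logs really belong to Googlebot. Google's documented check is a reverse DNS lookup followed by a forward lookup. The sketch below is one way to implement that check; the lookup functions are injectable parameters (our own design choice, for offline testing), and the sample IP in the comment is illustrative:

```python
import socket

def is_verified_googlebot(ip, reverse_lookup=socket.gethostbyaddr,
                          forward_lookup=socket.gethostbyname):
    """Two-step check: reverse DNS of the IP must resolve to a host under
    googlebot.com or google.com, and forward DNS of that host must return
    the original IP. Returns False on any DNS failure.
    """
    try:
        host = reverse_lookup(ip)[0]
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Note: gethostbyname returns a single A record; hosts with
        # multiple addresses may need a getaddrinfo-based comparison.
        return forward_lookup(host) == ip
    except OSError:
        return False

# Illustrative usage with live DNS:
# is_verified_googlebot("66.249.66.1")
```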
This happens at approximately the 20:16 mark in the video.
John Mueller Hangout Transcript
SEO Professional 5 20:16
So the first question on one of my first questions is that I don’t know if you remember in our former meeting, I mentioned that our crawl rate has dropped drastically over 90 percent. And we assume that it’s because of our response time. And now. And now, we have done many actions to improve our response time. And now the response time in the backend of the GSC has been restored to, to the former level: around 400 milliseconds to 500 milliseconds.
But the crawl requests, also the crawl rate is restoring much more slowly than the response time. I remember that you said if we fix our response time quickly, the crawl rate will be restored itself in several days. Right. So yeah, so right now the crawl rate isn’t restored like what we expected. So we’re wondering why is that? And maybe is that because our response time isn’t restored to maybe lower level or, or something like that. So I kind of hope that you can give us some advice on that.
John 22:01
Yeah. Do you know when the response rate has improved, or how long that has been?
SEO Professional 5 22:09
The response time, what, like the response time, several days, it’s, it’s in restored to, like 500 milliseconds several days, and now we couldn’t see any improving, or the improving is much slower, slower in the GSC.
John 22:32
Yeah, my guess is probably it just takes longer. So usually, the system that we have is very responsive in slowing down to make sure that we don’t cause problems, but is a little bit slower in ramping back up again. So, I suspect, if it’s been a few days that you improve that, then probably I would give it maybe a week or longer to catch up again.
What you can also do in the Help Center in Search Console, we have a link to a form, where you can request that someone from the Googlebot team takes a look at the crawling of your website.
And you can give them information about this, especially if it’s a larger website where we have a lot of URLs to crawl, you can tell them like “we, we improved our crawl rate significantly, you can crawl again” with I don’t know, whatever crawl rate you think is appropriate. And the Googlebot team sometimes has time to take action on those and kind of adjust the crawl rate up manually. If they see that there’s actually the demand on Google side, and they see that the site has changed in that regard.
So that’s sometimes a little bit faster than the automatic systems. But it’s not guaranteed. I can look for the link later on. Or it should be in the Help Center. I think it’s called Report a problem with Googlebot, or report a problem with crawling, something like that.
And it’s a form where you specify the IP addresses of the Googlebot that you’re seeing. And it’s usually meant to report issues where Googlebot is crawling too much. But you can also report issues where Googlebot is not catching up with crawling yet.
SEO Professional 5 24:35
Oh, about that one. We have already submitted the IP addresses and also the problems through that Googlebot system, but we still didn’t get any response from that.
John 24:50
Yeah, usually they don’t respond. Usually they review these every now and then and then they take action on them. So my guess is probably you just have to wait a little bit longer.
SEO Professional 5 25:04
Okay. Okay. Thank you. So you think our response time has gone back to the former level around 500 milliseconds, do you think? Do you think maybe lower is better? Or we should also
John 25:19
Right. I think 500 is pretty good. That’s usually in the range where I suggest for crawling. And I think if you have a very large website with, I don’t know, millions and millions of URLs, then having an even faster response time is useful. But for normally sized or even medium large websites, I think 500 milliseconds should be okay.