In a hangout, one SEO professional asked John Mueller about server connectivity issues and how they would impact crawling by Google.
They had a problem recently where they saw failure rates of more than 55 percent for robots.txt fetches and server connectivity in Google Search Console.
They had seven days of failures in November, and it happened once more in December. It turned out their server team had blocked some IP addresses used by Googlebot. Since November, their traffic from Google had dropped by about 20 percent. If nothing else changed during that period, is it reasonable to blame the server team for the decrease in traffic?
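Blocking crawler IP ranges wholesale is a common way this kind of outage happens. Google documents a reverse-plus-forward DNS check for confirming that a visitor is really Googlebot before deciding to block it. Below is a minimal Python sketch of that check; the function names are illustrative, and the full check requires network access for the DNS lookups.

```python
import socket

# Per Google's documented verification method: a reverse DNS lookup on the
# requesting IP should resolve to googlebot.com or google.com, and a forward
# lookup on that hostname should return the original IP.
GOOGLE_DOMAINS = (".googlebot.com", ".google.com")


def looks_like_google_host(hostname: str) -> bool:
    """Pure check: does a reverse-DNS hostname belong to Google's crawler domains?"""
    return hostname.rstrip(".").endswith(GOOGLE_DOMAINS)


def is_real_googlebot(ip: str) -> bool:
    """Full verification with DNS lookups (requires network access)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
        if not looks_like_google_host(hostname):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP.
        forward_ips = socket.gethostbyname_ex(hostname)[2]
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        return False
```

The suffix check alone is not enough, since anyone can set their reverse DNS to a Google-looking name; the forward-confirmation step is what makes the verification trustworthy.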
They wanted to know what effect these problems would have on Google Search.
John asked if the problem was fixed now.
The SEO professional responded that the server team had unblocked the IP addresses, but they had another problem in December and don’t know why.
John explained that there are two things to watch out for here.
The first thing is that Google would not treat server connectivity issues as a site quality problem.
The rankings for your pages would not drop as a result of server connectivity issues. That’s the first thing to check.
What does happen with these types of problems, however, is that if Google cannot reach the robots.txt file for a while, it will assume that it can’t crawl anything on the website.
This can result in some of the pages from the site being dropped from their index.
That gives a simple way to figure out whether this kind of issue is from a technical problem or not. Are the pages gone from the index? If so, that’s probably a technical problem. Or are the pages just ranking lower? Then that’s not necessarily a technical problem.
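John's two-question diagnostic can be summarized as a tiny decision function. The inputs and labels below are made up for illustration; in practice you would establish index status with a `site:` search or Search Console's URL Inspection tool.

```python
# Illustrative encoding of the diagnostic: missing from the index points to
# a crawl/robots.txt (technical) problem, while merely ranking lower does not.

def diagnose_traffic_drop(page_in_index: bool, ranking_dropped: bool) -> str:
    if not page_in_index:
        return "likely technical problem (crawling or robots.txt)"
    if ranking_dropped:
        return "not necessarily technical; look at quality and relevance"
    return "no problem detected"
```

For the site in the question, the 20 percent traffic drop alone doesn't settle it; checking whether the affected URLs are still indexed is what separates the two explanations.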
This happens at approximately the 10:20 mark in the video.
John Mueller Hangout Transcript
SEO Professional 3 10:20
I’m sorry. Okay. Um, yeah, I’m Korean. I have a question about the relationship between robots.txt failures, server connectivity failures, and search traffic from Google, because I saw more than 55 percent robots.txt and server connectivity failures recently in Search Console.
So we had seven days of failures in November. And it happened once in December. So our server team blocked some IP addresses from Googlebot. So since November, our website traffic from Google has dropped about 20 percent.
So if nothing else changes during that period, is it reasonable if I blame the server team for the decrease in our traffic? I’m just wondering. So how much effect will the robots.txt and server connectivity problems have on Google Search?
Is the problem fixed now? Or…?
SEO Professional 3 11:30
They kind of opened the IP addresses, but we had one more problem in December, and they don’t know what the reason is. Because they opened the IP addresses, and they don’t know what the reason is from there, from Google Search Console.
So I think there are two things, maybe, to watch out for. On the one hand, if you have server connectivity issues, we would not see that as a quality problem. So it wouldn’t be that the ranking for your pages would drop. So I think maybe that’s the first step.
So if you see that the ranking of your pages is dropping, then that would not be from the technical issue. On the other hand, what does happen with these kinds of server connectivity problems is that if we can’t reach your robots.txt file for a while, then we will assume that we can’t crawl anything on the website. And that can result in some of the pages from your website being dropped from our index.
So that’s kind of a simple way to figure out, is it from a technical problem or not? Are the pages gone from the index? And if so, that’s probably from a technical problem. Or are the pages just ranking lower? Then that’s not necessarily from a technical problem.
And if it is from a technical problem, like if these pages are gone, then usually we will retry those missing pages after a couple of days, maybe, and we will try to index them again.
So if the issue was in November, and you fixed it in November, then I assume you should not see any effects from that anymore. Just because we should have reindexed those pages in the meantime.
If you do still see that we’re not indexing the pages properly from your website, I would double-check the crawl errors section in Search Console to see if there’s still perhaps a technical issue, where sometimes maybe Googlebot is blocked.