In a hangout, one SEO professional asked John Mueller about server connectivity issues and how they would impact crawling by Google.
They had a problem recently where Google Search Console reported robots.txt errors and server connectivity failures on more than 55 percent of crawl requests.
They had seven days of failures in November, and it happened once more in December.
So their server team had blocked some of Googlebot's IP addresses. Since November, their traffic from Google has dropped by about 20 percent. If nothing else changed during that period, is it reasonable to blame the server team for the decrease in their traffic?
They wanted to know what effect these problems would have on Google Search.
John asked if the problem was fixed now.
The SEO professional responded that they had unblocked the IP addresses, but they ran into another problem in December and don't know why.
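The root cause here, blocking Googlebot's IP addresses, is avoidable: Google documents a reverse-then-forward DNS check for confirming that a crawler IP really belongs to Googlebot before blocking it. Here is a minimal sketch in Python using only the standard library (the sample IP is illustrative):

```python
import socket

def is_googlebot(ip: str) -> bool:
    """Verify an IP with the reverse-then-forward DNS check
    Google documents for identifying genuine Googlebot traffic."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP
        return ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        return False

print(is_googlebot("66.249.66.1"))  # a published Googlebot address
```

A server team can run this kind of check against suspicious IPs before adding firewall rules, rather than blocking anything that crawls aggressively.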
John explained that there are two things to watch out for here.
The first is the server connectivity issues themselves. Google would not see these as a site quality problem, so your pages' rankings would not drop as a result of server connectivity issues. That's the first point.
What does happen with these types of problems, however, is that if Google cannot reach the robots.txt file for a while, it will assume that it can't crawl anything on the website.
This can result in some of the site's pages being dropped from the index.
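If you want to check which case you're in, you can test how the site's robots.txt responds from outside your own network. Here is a minimal sketch in Python, using only the standard library (the site URL is a placeholder):

```python
import urllib.request
import urllib.error

def check_robots(site: str) -> None:
    """Report how robots.txt responds; a 5xx or connection failure
    is the case where Google may pause crawling the whole site."""
    url = site.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{url}: HTTP {resp.status} (reachable, crawling can proceed)")
    except urllib.error.HTTPError as e:
        if e.code >= 500:
            print(f"{url}: HTTP {e.code} (server error; risky if it persists)")
        else:
            print(f"{url}: HTTP {e.code} (a 404 here is harmless)")
    except urllib.error.URLError as e:
        print(f"{url}: unreachable ({e.reason}); connectivity problem")

check_robots("https://example.com")  # placeholder site
```

A 404 on robots.txt is fine, since Google treats a missing file as permission to crawl everything; it's the server errors and timeouts that can make Google back off entirely.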
That's a simple way to diagnose this kind of issue: is it caused by a technical problem or not?
Are the pages gone from the index? If so, that's probably a technical problem. Are the pages just ranking lower? Then it's not necessarily a technical problem.
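One way to check this at the page level is Search Console's URL Inspection API, which reports a URL's index status for a verified property. Below is a rough sketch with the requests library, assuming you already have an OAuth access token with the Search Console scope (the token, property, and page URL are placeholders):

```python
import requests

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"
TOKEN = "your-oauth-access-token"  # placeholder credential
PROPERTY = "https://example.com/"  # your verified Search Console property

def inspect(page_url: str) -> None:
    """Ask the URL Inspection API whether a page is still in Google's index."""
    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"inspectionUrl": page_url, "siteUrl": PROPERTY},
        timeout=30,
    )
    resp.raise_for_status()
    status = resp.json()["inspectionResult"]["indexStatusResult"]
    # coverageState reads like "Submitted and indexed" or
    # "Crawled - currently not indexed"
    print(page_url, "->", status.get("coverageState"))

inspect("https://example.com/some-page")  # placeholder page
```

If key pages come back as not indexed shortly after the outage, the technical explanation fits; if they are indexed but traffic is down, the drop is more likely a ranking question.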
This happens at approximately the 10:20 mark in the video.