During a hangout, one SEO professional asked John Mueller about a ranking drop they suffered for one of their domains.
Their question: at the end of January, they saw a massive ranking drop for their .de domain, and later the drop spread to further international domains.
Other domains such as .at, .ch, .it, .fr, and .es were affected, lots of top-level domains. Everything points either to a technical issue or to a loss of Google Trust.
They explained that the IT team has identified an issue with the SSL certificate. There was no manual action or security issue in Google Search Console.
Lots of minor technical aspects have additionally been fixed, and they double-checked the common SEO mistakes such as robots.txt, the robots meta tag, canonicals, hreflang, and so on.
But nothing has resolved the issue. Are they on the right track, or should they be doing something else?
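The kinds of checks the question describes (robots meta tag, canonical, hreflang) can be scripted rather than eyeballed. Below is a minimal sketch using Python's standard-library HTML parser; the `HeadAudit` class and the example.de/example.at URLs are illustrative, not part of the site in question:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collect the head tags most often involved in common SEO mistakes:
    the robots meta tag, the canonical link, and hreflang alternates."""
    def __init__(self):
        super().__init__()
        self.robots = None     # content of <meta name="robots">
        self.canonical = None  # href of <link rel="canonical">
        self.hreflang = {}     # hreflang -> href for alternates

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content")
        elif tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.hreflang[a["hreflang"]] = a.get("href")

# Illustrative page head: a stray noindex like this would explain a drop.
html = """<head>
<meta name="robots" content="noindex">
<link rel="canonical" href="https://example.de/">
<link rel="alternate" hreflang="de-AT" href="https://example.at/">
</head>"""

audit = HeadAudit()
audit.feed(html)
print(audit.robots)     # noindex
print(audit.canonical)  # https://example.de/
```

Running this across the affected pages would quickly confirm whether any of these basics regressed at the time of the drop.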
John answered that the bigger picture, where the drop appears first on one top-level domain and then on other domains over time, suggests to him that there is no simple technical tweak that needs to happen on the website.
Rather, it's more a matter of Google's understanding of the website overall.
In particular, the SSL certificate: if SSL works well in a browser, then it will work fine for indexing, so it would not be the cause of a drop in visibility in the search results.
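The "works well in a browser" test can be reproduced in a few lines: Python's default SSL context performs the same chain and hostname verification a browser does. A minimal sketch (the helper names and the sample certificate dict are illustrative):

```python
import socket
import ssl
import time

def cert_days_remaining(cert: dict) -> float:
    """Days until the certificate expires, given a dict in the shape
    returned by ssl.SSLSocket.getpeercert(); negative means expired."""
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    return (expires - time.time()) / 86400

def fetch_cert(hostname: str, port: int = 443) -> dict:
    """Fetch the server certificate, verifying the chain and hostname
    with the default context, i.e. roughly what a browser checks."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

# Offline example with a hand-written cert dict (no network needed):
sample = {"notAfter": "Jan 01 00:00:00 2020 GMT"}
print(cert_days_remaining(sample) < 0)  # True: long expired
```

If `fetch_cert("example.de")` succeeds without raising `ssl.SSLCertVerificationError`, the certificate is fine for browsers, and per John's answer, fine for indexing too.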
Many of the things the SEO professional mentioned are just minor technical issues.
These are things that tend not to cause bigger problems across a website.
And especially if these problems were not recently introduced, they would not cause a sudden change in how a website is shown in search.
For example, if you had your robots.txt file completely wrong, it would not be the case that after one or two years Google would suddenly react and say, "Oh, we can't deal with this website anymore."
That is a clear technical issue, and it would have been a technical issue from the moment you uploaded that robots.txt file.
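John's robots.txt example is easy to verify mechanically: a broken file hurts crawling from day one, and the effect shows up immediately in a parser. A minimal sketch using Python's standard-library `urllib.robotparser` (the file contents and URL are illustrative):

```python
from urllib import robotparser

# A deliberately broken robots.txt that blocks all crawlers from the
# whole site -- the "completely wrong" case John describes. This would
# have been a problem from the day it was uploaded, not a year later.
broken = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(broken.splitlines())
print(rp.can_fetch("Googlebot", "https://example.com/"))  # False
```

A `False` here for Googlebot is the kind of unambiguous, immediate technical failure that differs from the gradual, multi-domain visibility decline described in the question.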
Whereas if, later on, you see over time that the visibility of the website overall is changing, that is more of a broader SEO matter, and you would want to look into the various SEO topics as well, not just focus purely on small technical tweaks.
It's really hard to say what particular thing you should be focusing on here, because SEO takes so many different aspects of a website into consideration.
This happens at approximately the 23:51 mark in the video.
John Mueller Hangout Transcript
John (Submitted Question) 23:51
Let’s see, since the end of January, we see a massive ranking drop for our .de domain. And then later also on further international domains, like at, ch, it, fr, es — lots of top level domains. Everything either points to a technical issue or an issue of lost Google Trust. The IT team has identified an issue with the SSL certificate. There is no manual action or security issue in Search Console, lots of minor technical aspects have additionally been fixed. And we double checked the Common SEO Mistakes like robots.txt, robots meta tag, canonical, hreflang, etc. But nothing is fixed. Could it be–like are we on the right track? Or is there something else?
John (Answer) 24:40
So I think kind of from taking a step back without looking into the specific site, I know you’ve specified the site there. And I’ll double check some things there. But essentially, kind of the bigger picture where you’re seeing at first in one top level domain and then some–like over time, you’ll see it in the other top level domains as well. To me, that kind of suggests that there’s not a kind of a simple technical tweak that you need to do on your website.
But it’s rather more a matter of Google’s understanding of your website overall. And in particular, the SSL certificate. If SSL works well in a browser, it will work well for indexing, that would not cause a drop of visibility in the search results. And a lot of the kind of, you mentioned them already, as minor technical things. Those are things that tend not to cause bigger issues across a website, as well.
And especially if these are not things where the problem has just recently been added to a website, then probably that wouldn’t be affecting, like a sudden change in how a website is shown in search. So for example, if you had your robots.txt file completely wrong, it would not be the case that after one or two years, then suddenly Google would react and say, Oh, we can’t deal with this website anymore.
It’s really like, that’s a clear technical issue. And that would have been a technical issue from when you uploaded that robots.txt file. Whereas if you later on over time, see that actually the visibility of the website overall is changing, then that’s more of a kind of a broader SEO type of thing where you would want to look into all of the various SEO topics as well, and not just purely focus on like small technical tweaks. And it’s really hard to say what particular thing you should be focusing on there, because SEO takes into account so many different aspects across a website.