An SEO professional was curious about their indexed pages.
They had found that, from approximately January 13 onward, their indexed pages had dropped by over 90 percent. Both indexed pages and actual traffic fell in the Google Search Console back end, and the drop was also showing up in their log report.
So they checked the coverage report in the GSC back end, and it turned out that almost 14,000 indexed URLs had been dropped.
Those same 14,000 URLs also showed up as an increase in the “Crawled – currently not indexed” portion of the back end. The numbers matched exactly, and they verified this against the sample URLs in both the indexed and the crawled-but-not-indexed reports.
On their site, they didn’t find anything unusual, such as incorrect canonical tags or meta robots tags marked as noindex. Nothing indicated an on-site issue.
So they couldn’t figure out how to identify the problem.
They were wondering if John could give some pointers and information about how to solve it.
John explained that he’s worried this is something the Google team would need to dig into a bit more.
The one aspect he recommends the SEO professional check is whether Googlebot can actually crawl the pages properly.
John assumes they have already looked into that, but it’s a good idea to double-check there.
The SEO professional explained that yes, they have double-checked, and their URLs are live, fully crawlable, and indexable.
They have checked GSC, their sitemap, and their robots.txt. Everything checked out, and there was nothing unusual.
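The checks described above (meta robots, X-Robots-Tag, canonical) can be partly automated. The following is a minimal sketch of such a check, assuming you already have a page's HTML and its HTTP response headers; the function and signal names here are illustrative, not part of any Google tooling:

```python
from html.parser import HTMLParser


class RobotsSignalParser(HTMLParser):
    """Collects <meta name="robots"> content and <link rel="canonical"> href."""

    def __init__(self):
        super().__init__()
        self.robots = []      # contents of any meta robots tags found
        self.canonical = None  # href of the canonical link, if present

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots.append((a.get("content") or "").lower())
        if tag == "link" and "canonical" in (a.get("rel") or "").lower():
            self.canonical = a.get("href")


def check_indexability(html, response_headers=None):
    """Return a dict of common signals that could block indexing.

    `response_headers` lets the X-Robots-Tag HTTP header be checked
    alongside the on-page meta robots tag.
    """
    parser = RobotsSignalParser()
    parser.feed(html)
    headers = {k.lower(): v for k, v in (response_headers or {}).items()}
    return {
        "meta_noindex": any("noindex" in r for r in parser.robots),
        "header_noindex": "noindex" in headers.get("x-robots-tag", "").lower(),
        "canonical": parser.canonical,
    }
```

For example, a page served with `<meta name="robots" content="noindex,follow">` would report `meta_noindex` as `True`, which is exactly the kind of signal the SEO professional was ruling out. A full audit would also need to check robots.txt rules and HTTP status codes, which this sketch omits.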
John then explained that Google always discovers a lot of URLs for websites. If Google doesn’t think those URLs are important, it keeps them in its list and, at some point, tries to recrawl them.
John suspects that these are just random URLs that Google discovered over time, and that Google tries to crawl them again from time to time to see if there is anything it is missing.
This happens at approximately the 23:50 mark in the video.