An SEO professional presented John with a rather interesting problem. They manage an e-commerce site of around 2,000 to 3,000 pages.
When a few of the site's pages were developed, they were mistakenly set to noindex.
This issue has been fixed for several months now.
However, in Search Console, approximately 80 percent of these pages were still reported as noindex, even though Googlebot had crawled the pages back in June.
They attempted a fix in development, but it did not change the situation.
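Before assuming a reporting glitch, it is worth verifying that the live pages really no longer carry a noindex directive. The following is a minimal, illustrative sketch (not something from the video) that scans a page's HTML for a robots meta tag containing noindex; note that noindex can also be delivered via an X-Robots-Tag HTTP response header, which this check does not cover:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots_directives.append(attrs.get("content", "").lower())


def is_noindexed(html: str) -> bool:
    """Return True if the page's robots meta tag still contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.robots_directives)
```

Running this against the fetched HTML of a sample of the affected URLs would confirm whether the fix actually reached production.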
They kept requesting indexing via Search Console and tried submitting new sitemaps, but they could never get these pages indexed again.
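Resubmitting a sitemap only helps if the sitemap itself is well-formed and lists the affected URLs. As a hypothetical illustration (not discussed in the video), a minimal sitemap in the sitemaps.org format can be generated with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Build a minimal sitemap XML string from a list of absolute URLs."""
    root = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        url_element = ET.SubElement(root, "url")
        loc = ET.SubElement(url_element, "loc")
        loc.text = url
    return ET.tostring(root, encoding="unicode")
```

The resulting string, saved as sitemap.xml and submitted in Search Console, would cover the pages that need recrawling.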
The SEO pro wondered whether there were any current problems or known glitches with Google indexing pages reported in Search Console.
John answered that he doesn't think there are currently any indexing problems.
He explained that there are two possibilities. The first is that Google is more conservative when it comes to indexing requests: if a page is left noindexed for a long period of time, Google will automatically slow down crawling of that page.
It’s also possible that the Google Search Console reports are out of date, making the report look worse than it actually is.
By using the URL filtering in Search Console's performance report, you can check whether the affected URL patterns actually account for the high number of noindex pages being reported.
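If you export the URL lists from Search Console, the same comparison can be done locally. A small sketch, assuming a list of absolute URLs and a hypothetical path prefix such as /products/ for the affected section of the site:

```python
from urllib.parse import urlparse


def count_by_pattern(urls, path_prefix):
    """Count URLs whose path starts with the given prefix,
    e.g. '/products/' for an e-commerce category."""
    return sum(1 for url in urls if urlparse(url).path.startswith(path_prefix))
```

Comparing this count against the number of noindex pages reported for the same pattern shows whether the report reflects the pages that actually matter.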
It could also be that Search Console is reporting on pages that aren’t really all that important.
John then said that improving internal linking is another way to signal to search engines that these pages are really important.
This happens at approximately the 8:14 mark in the video.