An SEO professional asked John Mueller during a hangout about their blog posts being discovered but not indexed.
They have run a two-sided marketplace since 2013. The site is fairly well established, with about 70,000 pages, roughly 70% of which are generally indexed.
Then there is the crawl budget that covers the new pages being created. They see movement there: old pages drop out of the index and new pages come in.
At the same time, their editorial team is also writing blog entries.
These get pushed to the top of the queue: they always use Request Indexing on these posts so they will be indexed faster, and they add them to the sitemap as well.
They write the posts and want them to get onto Google as quickly as possible.
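As a sketch of what that sitemap entry looks like, here is a minimal file following the sitemaps.org protocol; the URL and date are placeholders, and `lastmod` is the optional field that signals freshness to crawlers:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder: one <url> entry per new blog post -->
  <url>
    <loc>https://www.example.com/blog/new-post-title</loc>
    <!-- lastmod tells crawlers when the page was last changed -->
    <lastmod>2022-05-10</lastmod>
  </url>
</urlset>
```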
They have been growing over the last year and have more content on their site. They have noticed that this sometimes doesn't work as well for new blog entries, which also sit in the "discovered but not indexed" queue for a longer period of time.
Is there anything they can do to combat this, like creating more internal links? Is the decision content-based? Or do they just have to live with the fact that some of their blog posts may never make it into the index?
John explained that it is normal for Google not to index everything on a website. This can happen to the marketplace entries on the site as well as the blog posts; it's not tied to any specific kind of content.
John said that using the URL Inspection tool to submit them for indexing is fine and is not going to cause any problems.
John would also try to find ways to make it as clear as possible that these are pages you care about and want indexed.
Internal linking is a good way to do that. To really ensure this happens, you can show on your homepage, "here are the five new blog posts," and link to them directly so that it's easy for Googlebot. When Google crawls and indexes your homepage, it will see that something new is linked from the homepage and conclude it must be important.
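In practice, that homepage block is simply a list of direct links to the newest posts. A placeholder sketch (URLs and titles are illustrative, not from the site in question):

```html
<!-- Hypothetical "latest posts" block on the homepage -->
<section>
  <h2>Latest from the blog</h2>
  <ul>
    <!-- Direct links give Googlebot a one-hop path from the homepage -->
    <li><a href="/blog/first-new-post">First new post</a></li>
    <li><a href="/blog/second-new-post">Second new post</a></li>
  </ul>
</section>
```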
This happens at approximately the 18:52 mark in the video.
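Later in the hangout (see the transcript below), John also suggests submitting the blog's RSS feed in Search Console, since RSS feeds tend to focus on newer content and can be picked up faster than sitemap files. A minimal RSS 2.0 feed, with placeholder URLs and dates, looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://www.example.com/blog/</link>
    <description>Latest posts from the example blog</description>
    <!-- Newest items first; pubDate helps crawlers spot fresh content -->
    <item>
      <title>Newest post</title>
      <link>https://www.example.com/blog/newest-post</link>
      <pubDate>Tue, 10 May 2022 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```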

John Mueller Hangout Transcript
SEO Professional 5 18:52
Hi John, thanks for the time. My question pertains to a crawling question pertaining to discovered not indexed. We run a two sided marketplace since 2013 that’s fairly well established, we have about 70,000 pages, and about 70% of those are generally in the index. And then there’s kind of this budget that crawls the new pages that get created. And those we see movement on that so that old pages go out and new pages come on.
At the same time, we’re also writing blog entries, kind of from our editorial team. And to kind of get those to the top of the queue we always use this request indexing on those, so they’ll go quicker. We add them to the sitemap as well, but we find that we write them and then we want them to get on to Google as quick as possible. As we’ve kind of been growing over the last year and we have more content on our site, we’ve seen that that sometimes doesn’t work as well for the new blog entries and they also sit in this “discovered not indexed” queue for a longer time. Is there anything we can do? Like internal links or something? Is it content-based? Or do we just have to live with the fact that some of our blogs might not make it into the index?
John 20:13
Yeah. I think overall, it's kind of normal that we don't index everything on a website. So that can happen to kind of the entries you have on the site, and also the blog posts on the site. It's not tied to a specific kind of content. I think using the Inspect URL tool to submit them to indexing is fine. It definitely doesn't cause any problems. But I would also try to find ways to make those pages kind of as clear as possible that you care about that. So essentially, with internal linking is a good way to do that. To really make sure that from your homepage, you're saying, like, here are the five new blog posts, and you link to them directly so that it's easy for Googlebot, when we crawl and index your homepage to see oh, there's something new and it's linked from the homepage. So maybe it's important, maybe we should go off and look at.
SEO Professional 5 21:12
Okay, so if it’s linked from the homepage, it’s more likely that Google sees it as important, than if we just add it and it kind of gets added on a sub-blog page.
John 21:21
Yeah. Definitely. That helps. Yeah, just making it as obvious as possible for us to figure out, okay. There’s also usually, if you have a blog section on your site, you also have RSS feeds. And if you have that set up, I would also submit those to Google in Search Console, just because RSS feeds tend to focus more on the newer content. And that kind of helps us to pick those up a little bit faster. So we use them similar to sitemap files, but sometimes RSS feeds are a bit easier for us to pick up.
SEO Professional 5 21:59
Okay, that’s a good hint. I can implement that. Cool, cool. Thanks for your time.