We wanted to share yet another round of SEO insights from John Mueller’s latest Google Search Central Office Hours Hangout on May 21, 2021.
John’s Hangouts are usually chock-full of SEO insights, which we cover in detail here (and in much easier-to-read technical wording).
This Hangout was no exception, featuring new SEO insights on Chrome User Experience Data (CrUX data), Schema.org, JSON-LD, HREFLANG, new Google SERP features and much more.
Let’s dive right in!
Our SEO Insights
At Google I/O, Google debuted a brand-new adventure game that allowed attendees to network with each other virtually.
This turned out to be quite a hit, judging by social media comments from Barry Schwartz and others.
It turns out that replicating a conference virtually can come surprisingly close to attending one in person.
Google Has No Preference for Schema.org Markup Language
One question from a webmaster dealt with whether they can use another format for Schema.org markup to render it on the site.
John recommends using JSON-LD because it is usually easier for sites to implement. The other formats do not carry any less value, though. So if you use microdata (in-HTML Schema.org markup), you don’t have to worry about losing any SEO value: Google does not give JSON-LD any special treatment over other formats.
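To make the "easier to implement" point concrete, here is a minimal sketch of generating JSON-LD server-side, the pattern John describes for e-commerce sites. The product record and field names are hypothetical; only the `@context`/`@type` structure comes from Schema.org.

```javascript
// Hypothetical product record, e.g. looked up from a database.
const product = {
  name: "Organic Dog Food",
  sku: "DF-1001",
  price: "24.99",
  currency: "USD",
};

// JSON-LD is plain JSON plus "@context" and "@type" keys, which is why
// it is usually the easiest format to generate server-side: build an
// object, serialize it, and emit a single <script> tag.
function toJsonLd(p) {
  return {
    "@context": "https://schema.org",
    "@type": "Product",
    name: p.name,
    sku: p.sku,
    offers: {
      "@type": "Offer",
      price: p.price,
      priceCurrency: p.currency,
    },
  };
}

const scriptTag =
  '<script type="application/ld+json">\n' +
  JSON.stringify(toJsonLd(product), null, 2) +
  "\n</script>";
```

The equivalent microdata would spread the same properties across `itemscope`/`itemprop` attributes in the page’s HTML; per John’s answer, Google treats both identically when the feature supports them.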
Lost Rankings Immediately After Image Changes; When Will They Come Back?
One webmaster had a problem after making changes to images: they changed the source URLs of the main image on their product pages to a new, optimized image. The previous image was supposed to remain in the srcset attribute for high-pixel-density devices, but they forgot to implement that.
After two days their traffic dropped by 30 percent and hasn’t returned after two months.
John thought that two months is quite a long time for traffic not to return. He noted, however, that larger e-commerce sites will take much longer to see traffic recover, because Google needs more time to understand such sizable sites.
One thing we don’t know is the exact size of the site in question. For smaller sites, traffic would probably take only a few days to return, if it were affected by this particular change (and assuming no other changes were afoot that the webmaster didn’t know about).
For larger sites, though, you could expect such a change to take two to three months for Google to fully understand and reflect in its ranking algorithms.
Context of Categories and Tags Is Important, Not Backlinks
One webmaster had a question regarding categories and tags, and they could not get a straight answer: Are backlinks to categories and tags pages important?
John answered that Google doesn’t rely on backlinks alone to find category and tag pages. How they are grouped and their overall context matter much more than backlinks. Backlinks to category and tag pages are secondary, and are unlikely to supersede the overall context of the category and tag structure.
Use the Canonical Link Element When You Have Multiple Pages With the Same Content
The same webmaster had questions regarding canonicals. Specifically, whether or not he should create canonicals based on general recommendations from certain tools.
John explained that Google recommends using the canonical tag when you have multiple URLs for the same content.
If you have multiple URLs for the same content everywhere, then you definitely want to do so.
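As a sketch of what that looks like in practice (the URLs are illustrative), one common convention is to strip tracking parameters and point every duplicate at the clean URL:

```javascript
// Several URLs that all serve the same content (hypothetical examples).
const duplicates = [
  "https://example.com/shoes?utm_source=mail",
  "https://example.com/shoes?sessionid=abc123",
  "https://example.com/shoes",
];

// Point every variant at the clean, parameter-free URL via rel="canonical".
function canonicalTag(url) {
  const clean = new URL(url);
  clean.search = ""; // drop the query string
  return `<link rel="canonical" href="${clean.href}">`;
}

// Each duplicate gets the same canonical tag in its <head>.
const tags = duplicates.map(canonicalTag);
```

Every variant then declares the same preferred URL, which matches Google’s recommendation here. Whether stripping all parameters is safe depends on your site: if a parameter actually changes the content, that URL is not a duplicate and shouldn’t be canonicalized away.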
Don’t Use Word Count to Judge How Much Content Should Be On a Tag or Category Page
The same webmaster also asked whether Google expects category and tag pages to contain 500 to 800 words of content.
John explained that they don’t count the words. He recommends not looking at word count and instead (again, reinforcing his previous statement) looking at the category pages and how they are grouped together.
His example included pet food: If you’re making a category page about pet food, and you have a bunch of pages full of content about pet food, then creating this category page makes a lot of sense. He also addressed the other side of the coin: If you have a single page on a topic, it doesn’t make sense to make category pages just for that one page.
If nothing unique exists for that particular topic, then you may not need that category page for the topic.
Creating category pages that each contain only a single page likely dilutes the overall value of your site across many unnecessary pages. Instead, concentrate on your most important pages and make them stronger in terms of SEO value.
Will Redirects Pass Core Web Vitals Values to the New Page?
This was an interesting question posed by another webmaster. He asked: if you have Core Web Vitals data on a particular page and you redirect it, will the old page’s Core Web Vitals be passed to the new page?
John answered that yes, just like any other signal in search, if Google sees a redirect then they will likely forward the signals they have and apply them to the new URL. It doesn’t matter if the URL itself is actually different, just that you’re moving from one URL to another.
How Does Google Collect CrUX Data?
The same webmaster also asked about how Google collects CrUX data. Do they capture the CrUX data from indexed pages only?
John explained that Chrome collects CrUX data on a per-URL basis, using the usual anonymization techniques. If 100 of a site’s pages are shown in the search results, Google will use the signals for those 100 pages rather than for every page on the site.
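For readers who want to inspect this field data themselves, the Chrome UX Report API exposes it per URL or per origin. Here is a minimal sketch of building the request body; the endpoint and metric names come from the public CrUX API, while the page URL is hypothetical:

```javascript
// Build the JSON body for a per-URL CrUX API lookup. A real request
// would POST this body to:
// https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=API_KEY
function buildCruxRequest(pageUrl) {
  return {
    url: pageUrl, // use { origin: "https://example.com" } for site-wide data instead
    formFactor: "PHONE",
    metrics: ["largest_contentful_paint", "cumulative_layout_shift"],
  };
}

const body = buildCruxRequest("https://example.com/products/widget");
```

If the API returns a 404 for a URL, Chrome simply doesn’t have enough anonymized samples for it, which is exactly the "no field data" situation covered later in this recap.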
How Do I Optimize Pages With the Language Meta Tag?
The webmaster also asked about the language meta tag: how to best optimize pages that have the same content in English, Spanish, French, et cetera.
John mentioned that internal links are what matter most for getting those pages crawled and indexed. Google does not use the language meta tag at all. Rather, it attempts to detect the language a page is in from the actual text, and ranks the page based on that text rather than on what’s in the lang meta tag.
As an aside, Bing does use the content-language meta tag, unlike Google. So if you’re optimizing for both search engines, you’ll want to use both hreflang and the language meta tag.
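A minimal sketch of that combined setup (URLs and language codes are illustrative): each language variant lists all variants via hreflang, and the content-language meta tag is added for Bing’s benefit.

```javascript
// Language variants of the same page (hypothetical URLs).
const variants = {
  en: "https://example.com/en/pricing",
  es: "https://example.com/es/precios",
  fr: "https://example.com/fr/tarifs",
};

// hreflang annotations must be reciprocal: every variant's <head>
// lists every variant, including itself.
function headTags(pageLang, vars) {
  const tags = Object.entries(vars).map(
    ([lang, url]) => `<link rel="alternate" hreflang="${lang}" href="${url}">`
  );
  // The content-language meta tag is ignored by Google but used by Bing.
  tags.push(`<meta http-equiv="content-language" content="${pageLang}">`);
  return tags.join("\n");
}

const spanishHead = headTags("es", variants);
```

The reciprocity requirement is the part most often gotten wrong: if the Spanish page lists the English one but not vice versa, the annotations may be ignored.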
What if My Website Doesn’t Have any Field Data for Core Web Vitals and Page Experience?
This is a question John called out as being very interesting. Basically, the webmaster was wondering what happens if their website doesn’t have any field data in PageSpeed Insights or the Chrome UX Report. Will it be counted in ranking for page experience when the update rolls out?
John reinforced the point that Google uses field data as the benchmark metric for search. When they begin rolling out the page experience update, this field data is what will be used for ranking purposes. If a site doesn’t have field data available, Google has nothing to work with there, but site owners can still use lab tests to figure out which pages need improvement.
After Using the Yoast Video Plugin, Click-Throughs Dramatically Decreased
Basically, the webmaster used the Yoast video plugin to submit a video sitemap for their recipe site. Since then, the site has experienced a dramatic decrease in click-throughs.
John explained that the quality of your video snippet has a lot to do with whether it drives click-throughs. He also recommends using the robots meta tag to control whether a video preview is shown for those pages. If you want to remove the preview entirely, simply removing the video from the sitemap file won’t work.
Use the robots meta tag to suppress the video, but also be sure to remove the video from the page if you don’t want to show it in the search results. If you’re seeing a decrease in traffic as a result of this, you may want to consider improving the quality of your video overall. In short, impress your audience and they will reward you with page views.
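A sketch of that robots meta tag approach, using Google’s documented `max-video-preview` directive (the helper function itself is hypothetical): a value of `0` tells Google not to show a video preview (a static image may still appear), while `-1` places no limit on preview length.

```javascript
// Emit a robots meta tag controlling video previews in search results.
// max-video-preview:0  -> no video preview (a static image may still show)
// max-video-preview:-1 -> no limit on preview length
function videoPreviewMeta(allowPreview) {
  const limit = allowPreview ? -1 : 0;
  return `<meta name="robots" content="max-video-preview:${limit}">`;
}
```

Remember John’s caveat, though: to drop the video from the results entirely, you also need to remove the video from the page itself, not just from the sitemap.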
Is There Any Way to Limit the Use of Google Search Console?
This was an interesting one. The webmaster asked if there is any way to limit the use of Google Search Console by certain parties, primarily using a credit system.
John says that while Google Search Console does not use credits in this fashion, it does have a DoS (denial-of-service) and abuse system that kicks in if you do things like use a script to automate Google Search Console. For normal interactions, of course, there is no way to enforce a limit.
What Can We Do if We Use Schema on Our Site, Follow All the Documentation, But Aren’t Getting Anywhere With Results?
The webmaster asked this question regarding their online reading platform for books. They followed all documentation for Schema, etc. but are not getting anywhere and were wondering how they could get around this issue.
In John’s answer, he explained that Google has a system where a site is submitted and someone at Google manually examines the site to see what’s going on. If everything is aligned, they may end up activating things for the site.
On the other hand, for certain setups—especially if work is currently being completed before launch—then there may be a waiting period. This period varies depending on the feature that’s being launched and if the site is optimizing for a feature that has yet to be fully released.
Google Search Central Office Hours Hangout Transcript
Below is the entire transcript of John’s awesome hangout session on May 21, 2021. Enjoy.
All right, welcome, everyone to today’s Google Search Central Office Hours Hangout. My name is John Mueller. I’m a Search Advocate at Google here in Switzerland. And part of what we do are these office hour hangouts where people can join in and ask their questions around their website and web search. And we can try to find some answers to a bunch of stuff that was submitted on YouTube. So we can go through some of that. But it also looks like a handful of you are already raising your hands. So maybe we’ll just start off with all of that. I hope you’re doing well. I hope you caught up and had a chance to take a look at all of the Google I/O stuff. It seems like there’s some pretty neat stuff happening there. So I don’t know, I don’t have a ton of really super new insights to the super new announcements, but maybe I can help if anyone has questions. All right, Barry, looks like you’re lined up first.
Cool, thanks. Um, I guess two questions on I/O. One is can you tell us what? What do you like most about the show? Like, what session, or what did you find most interesting about it? And two, is there anything that you may have not seen the community discussed that maybe should have been discussed from the I/O event?
I don’t know. I think the parts I found really cool are on the one hand, the adventure game, which I thought was done really well. And it was almost like seeing people at a conference, like you run across names that you’ve seen before? Not quite the same, but it was pretty good. And the other one is everything around RSS, where there’s like, suddenly, all of that is kind of back. And you can follow sites in Chrome and kind of keep up with them, which I thought was pretty cool.
Excellent. Thank you.
He had a second question. Right. Like,
Yeah, like anything you saw that you didn’t see the community talking about? That from that came up in I/O that wasn’t discussed by the community? Like, I can’t believe this, this announcement or…?
I don’t know. I don’t know. It’s like such a whirlwind. I don’t know, I’ll probably notice in the coming weeks. But it seems like a lot of stuff got picked up. I mean, some of the aspects are a bit tricky. With regards to having to record them ahead of time, you never really know exactly what will happen until then. But I think, some of the coolest stuff got announced anyway. So that was pretty nice. But I don’t know, the big announcements. It’s hard to judge what people didn’t pick up because it feels like people picked up a ton of details. So sounds good. Thank you. Let’s see Show.
Sure. I mean, you can use anything that runs on your server, essentially, to generate the HTML and that can include JSON-LD. I mean, that’s usually how most ecommerce sites do that. They look up the database information, the pricing, the availability. And they put that into markup in JSON-LD or microdata, or whatever format. And we recommend using JSON-LD, because it usually makes it easier for sites to implement. But that doesn’t mean that there’s any less value passed with the other types of markup. So if whatever feature you’re looking for supports that type of markup, and that’s what you want to do, then go for it. I wouldn’t say you need to do Google’s preference, right? Because usually, people just come to us and say, it’s like, oh, you have two or three options, which one is the best one? And if we just say, Oh, I don’t know, just figure it out yourself, then everyone’s like, Oh, I won’t do anything. But if we say, Oh, we really like this option, then it’s a lot easier for them to say, Okay, I’ll just start and do that.
So Google prefers JSON-LD, but treats the three formats practically equally, would you say that?
Yeah. Yeah. I mean, if the feature that you’re looking for is available in those different formats, then they’re completely equivalent. It’s not that we would give any special treatment to JSON-LD.
John, we had a quick chat a few weeks ago about Images and an Image Search related issue we got on our website, and I wanted to know your thoughts about it. Personally, I also discussed with Glenn Gabe about this. And I know that he’s very interested in your thoughts, too. So on March 23, we changed the source URLs of our main image on our product pages, to our new optimized image. And the previous image was supposed to be used for high-density devices, set inside the srcset attribute. The fact is that we forgot to implement that. And after two days, we had a drop of 30% traffic for image search. And we didn’t realize that until after four weeks. So after four weeks, we added the srcset attribute. And so we are still hoping to have our traffic back. But after two months, that hasn’t happened yet. So I wanted to know from you: is it just a matter of time, or is there anything else we can do about it?
I don’t know. It feels like two months is a pretty long time. But especially since you mentioned these are product pages. It sounds like an e-commerce site. In that case, and you’re looking specifically at traffic from Image Search.
Yeah, because we’ve got a 30 percent drop in image search. But now consequently, we had also sent like 5, 10 percent drop in traffic from regular search. Now, because of that…
I don’t know. So if it’s possible the two months time there is just the time that it’s still taking to catch up with a larger site. One thing you could do is maybe send me some details, and I can look at that with the image search team directly. I would also double check the details on your side with regards to which image URLs are actually shown in image search. Has it picked up the change in image?
No, I still see that the old images, but it looks like they dropped in rankings for some reason. And I think it is related to that, because that happened two days after the change.
Two days is actually a really short time because yeah, if you’re looking at an e-commerce site, then if we recrawl the different products, my guess is, depending on the site, it’d be in the range of two or three weeks for a bigger e-commerce site or even a medium-sized e-commerce site. So if you’re seeing changes like two days after a change that you made, that feels like it wouldn’t be related to that.
That’s amazing because we had that kind of traffic for years. And then we made this change and after two days, we dropped 30%. And we didn’t do anything else other than that around that time frame. Yeah. And so I don’t know, they also suggested that it was directly related to that. But, you know, I wanted to hear from the source. Yeah.
I mean, you’re welcome to send me some examples. Sure. If you’re seeing a strong change across a larger part of a website within a short period of time, then that doesn’t feel like it would be related to something technical on the pages. Unless it’s like your homepage, and suddenly, your homepage is not showing anymore due to the big change, like immediately. But for like a larger set of product pages, each one individually is a small change, you would see a gradual decline.
Right. But in case that was the case, should Google pick up the updated version with the srcset defined and fix things later on? Yeah, that’s something that should happen.
That’s if it’s really just related to a change of image URLs. I would expect that to happen over the course of a couple months. Okay. Like, so after two months I would expect to see some effects by now. But actually, it just passed a month since I put the srcset attribute. Okay. So maybe I still need to wait. I mean, to me, it sounds more like maybe a quality change where our algorithms are overall, kind of re-evaluating the quality of your website, because that’s something that could happen from one day to the next. Okay. And I wouldn’t see this as something that’s tied to like a technical change across a lot of product pages.
Yeah. So actually, I have some basic questions that I have asked many people, but I am unable to get a straight answer. Okay. So I have a few questions. Like, one is, suppose I have a WordPress blog, in which I have created some Tags and Categories. Now I want to delete those Tags and Categories. These tags were created for some posts. But these tags don’t have any backlinks. And these categories also don’t have any backlinks. And if I delete, what will happen, like Google will decrease my rankings, or will it not?
So just because there are no external links to those pages, doesn’t mean that Google will not be able to find them. So that’s kind of the first thing because especially if you have posts that have Tags and Categories on them, then you have links from the posts to the category tags and category pages. So what I would look at there, instead of external links is how those pages are currently being shown. So in Search Console, and the performance report, look at the individual tag and category pages or kind of like a group of those pages. And then based on that, you can make a decision, can I just delete these pages? Do I need to redirect them or not? If there’s almost no traffic or no visibility of those pages, then cleaning them up is perfectly fine.
Okay, on the same question, I have another question like, I have been using some SEO tools, which suggests that I should create canonical links. Okay, for suppose there are two tags, and there is a post. So it is better to direct all the tags and the posts to the same canonical link. So is it correct or I should not do this?
We usually recommend using the canonical link element when you have multiple URLs for the same content. And it sounds like these are different pieces of content, like the category page and the tag page…
Suppose my post is dog food. So I have a tag called dog, I have a tag called food. Okay. And there is a category called animal. Or pets, there is a category. So now I want to delete pets plus food and dog, because I want to create some different types of categories. So you already gave me like, I have to group tags after that. So if, like suppose let’s take an example WordPress using which we can define the canonical links. Okay, so while testing the SEO, SEO for readability and SEO, we find that each source like it within the tag, also, you have to create some content, or within the category also, you have to write some content. Because if I check on any SEO tool, it will suggest that your tag is empty, you need to create content. Again, the problem is like you don’t, I am unable to reproduce the same content. Suppose I have a content called food. And I have tagged for dog food. And food is a generic term. And how can I write the content for dog, it canonically will also not work, I have to again, if I create a tag that Google will expect that I would write around 500 to 800 words. So I saw I guess a couple of things like add a one, just to add, do you think tags have an importance, like if the tag should have proper content, like 500 keywords for a tag in WordPress?
We don’t count the words. So I would not look at the word count and say it needs to have this much content. Usually, what I recommend doing is looking at the tag and category pages, more as groupings of categories of things. So if you have a lot of posts about different kinds of pet food, then making a category for pet food makes total sense. However, if you only have one post about dog food and making a category for dog food is the same thing. It’s not a group of things or category of things. So that’s something where I’d say Well, probably you don’t need that. But from an SEO point of view, I would try to minimize the number of pages that you create across your website. And try to make sure that for all of the pages that you do have that you have significant content there. So if you’re saying this category, dog food, doesn’t have anything unique on it, then maybe you don’t need that category. And that’s kind of the direction I will take there. It’s not that we would penalize a page. If it’s a category page, and there’s just one item there. It’s more that you’re diluting the value of your website across a lot of unnecessary pages that you could concentrate on and make your existing important pages stronger.
Good evening, john. It’s the first time I’m talking to you and thank you. So I have a couple of questions. I’ll go one by one. So the first question is like, my website is 100% passed core web vitals and all the URLs are core web vitals valid. Now I want to restructure my site. I mean to say that I want to change the URL to the better SEO friendly URL. So earlier, like a lot of sub folders were present and a lot of sub directories were present for the primary content, which we want to rank organically. So my first queries now I have already changed the URL. So will the core web vital metrics, whatever exists for my pages will they be passed to the redirected URL? This is the first question. Because those pages are entirely different. Like page x is in folder xx x. Now there is a new page, which is not under the folder, but now we have a redirect. So the score is directly captured from the chrome user experience report. So users have not opened the page. So do you think that core web vitals benefits, whatever captured in the Google Search Console will be passed?
My understanding is that they would be like, like any other signal from search, if we see a redirect, then we would forward the signals that we have and apply them to the new URL. And it doesn’t matter so much if the URL looks different. It’s more that you’re moving from one URL to another URL. But I would expect this kind of a change. Whenever you’re doing a restructuring of a website internally, especially with changing URLs, I would expect that to be something where you would see fluctuations in search, maybe for a month or two.
Okay, fine. I got the answer. So I’ll go with the next question. So my next question is like, suppose I have a website, and I have only 100 pages indexed. So Google will capture the CrUX report, the Chrome User Experience Report, only for those pages indexed by Google?
So in general, Chrome would be collecting this data on a per-URL basis and be using the usual anonymization techniques, kind of what is used with the Chrome User Experience Report data to collect that. And then how that would be applied to your website and search would essentially try to be done on a per-URL basis. So you might have 1000 pages in total. And if 100 of those are shown in search, then for those 100 that we would show in search, we would use those 100 pages and their signals, we wouldn’t really use the other pages that you also have.
I got your answer. Thank you. So like, my third question is I have a page where I’m putting lang equal to lang English in the meta meta tag, marking the language that I’m using is English. But I have other content also, like in Spanish or French or something, which is in non English, but I’m not marking this by adding the tag meta tag Lang equal to French or something like that. So when will my non English language content be used by Google for ranking factors, or it will be considered like, Okay, I have other content also, which is a non English language, and I still want to rank it?
I mean, if we find internal links to those pages, we can crawl and index those pages. And it’s something where, like, if we have links there, we will crawl and index those pages. We don’t use the lang meta tag on pages, but rather, we try to understand which language your page is in based on the text. So if you have content in different languages, we’ll try to find that and show that in search appropriately. And then it will be used for ranking because it could be shown in search.
Okay, so my last question I have is a little bit more related to the schema.org – not last question – second last. So like, we are an affiliate coupon site. So we have seen the post by Google that Google will not rank the site, if it has thin content, thin content means most of the affiliate site, they had the same similar content, not adding any value to the content system or the web system or not even adding much value to the content or the user experience. So what we have done we have done a lot of hard work from almost more than two years to things on that, and we have added a lot of a schema.org, JSON-LD, whatever required, like, FAQpage. And we have captured almost most of the things which can add value to the users. When forgetting about whether we are going to get traffic or not, we have considered that okay, the user experience. And it should add the episode benefit like you if the user is browsing our site or browsing our page, let him benefit from the site first. So considering that, like, I have a page, like I am listing a Walmart sale event, so the sale event like have they can have n number of sale events like Black Friday sale, or New Year sale or Christmas sale? So can I have a page on my site where I list all the sale events? And can I also mark them with the sale event schema? Is that allowed? Because I have seen on Google like, we should not add a markup and it can bring down our ranking and can penalize if we put a wrong schema or something which is not misleading the user or the system? Now the question is, like, I’ll summarize. I want to list down the sale event for a particular merchant. And I want to mark those sale events, like what is the sale event name? Like say Black Friday? What is the time period that I just want to mark them in the form of schema.org or JSON LD. So is that fine? Should we go ahead and make this change?
I don’t know offhand the requirements for that type of markup, my understanding is if you’re using the event markup, that would not be suitable for this. And in general, the kind of the structure data on the pages should be applicable to the primary type of content on the page. So I suspect listing a bunch of events from one vendor on a single page and marking that up would not be in line with the guidelines for structured data on our site. But I don’t know all of the guidelines by heart. So I would recommend checking out the documentation. And just double checking before you implement it. Also, keep in mind, there’s a lot of structured data in schema.org, which we don’t necessarily show in search. So it might be the sale event structured data, but I’m not aware of it. It might be something that we don’t even use at Google.
Oh, okay, fine. I’ll double check the standard. Okay, so the last question is, what is the importance and significance of marking a page as cornerstone content? In a plugin like the Yoast SEO plugin, we can mark a post as cornerstone content.
I don’t know. That sounds like something specific to the plugin.
Okay. The thing is, we have followed the best practices and considering the user experience and added the best content. So we are still struggling with some traffic. So if we can get some pointers, like what is causing the issue with our site or something? Like what is exactly? Making our site struggle to rank organically instead of like, after we put a lot of words?
Now, I, I don’t know, it feels like something that’s kind of out of scope for the office hours here. My recommendation there would be to go to the search central help forums, and post some of the details of what you’ve seen, some examples from your website with URLs, and try to get advice from other people there. You probably won’t find this one thing that you need to do to improve your ranking. But they might be able to give you advice and say, Oh, it looks like this is just a collection of affiliate links, or it looks like there is not enough content on these pages. And that I think that’s the kind of input that can be quite useful. Yep.
Let me run through some of the submitted questions. And then I’ll get back to all of the hands that are raised as well.
Just so that we don’t lose track of the submitted things.
We have different types of pages on our site: contact pages, vendor pages, and recently released product pages and category pages. After our release of product pages, we saw a meaningful site traffic decrease, as product pages were frequently ranked above supplier pages. Any tips on how to indicate to Google that product pages are less relevant than other types of pages, other than de-indexing our product pages?
Yeah, it also mentions the priority in the sitemap. We don’t use the priority in the sitemap file. Originally, I think we thought that would help us to figure out which pages to crawl more frequently. But it turns out not to be that super useful. With regards to a general website, where you have different types of pages, and you want to have one type of page ranking above another one, there’s no HTML markup where you can kind of specify this is my most important type of page.
And this is a less important type of page. But what you can do is create web structures such that Google will recognize which pages you care about most. In particular, for most websites we tend to look at the homepage as something that’s the most important part of the website. And from there, we see the links that lead out to the rest of your website.
And then kind of all of those individual steps have links, like going to the first step, and then going from there to the next step. All of that, essentially gives us a little bit of a guidance on what you consider to be the most important part. So we’ll start with your homepage and want to think well, the most important pages you probably mentioned on your homepage, so we’ll go off and crawl those and try to give those also a little bit more weight in the search results. And from there, we’ll go to the next pages. So if you have a structure where you have like category pages, subcategory pages, and then product pages, and they’re kind of linked like a tree structure, from there, usually we’ll see kind of the category pages as being kind of more important and kind of less importance as things kind of fan out a little bit.
And you can also control that a little bit yourself in that you could kind of individually take individual products, where you say, Oh, this is my most important product, either. Maybe it’s a new product, or maybe it’s a product where you sell a lot of things or where you have a kind of a high ROI on the specific products. And you could specifically mention those on their homepage. So when we crawl your site, we’ll see Oh, category pages, but also, these five products are mentioned on the homepage.
Therefore, those five products must be really important. And you can apply the same thing with your specific site structure. So take the URLs that you care about the most and make sure that they’re linked from the root of your website, and from there, think about what the structure for the rest should be. Especially if you have a medium to large website, that’s something you can probably pull together fairly nicely.
If you have a smaller website with, say, 10 pages that are all equally important, then it’s a lot harder for us to figure out which of these pages is actually the most important for you, because they’re usually all linked together anyway. But it sounds like you have a structure where you could tweak things a little bit, so that’s the direction I would go there.
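The linking pattern John describes might look something like this on a homepage (hypothetical paths; the point is that pages linked directly from the root are treated as more important):

```html
<!-- Hypothetical homepage excerpt: Google starts crawling here,
     so pages linked from this page get extra weight -->
<nav>
  <a href="/category/vitamins/">Vitamins</a>
  <a href="/category/skincare/">Skincare</a>
</nav>
<section>
  <h2>Featured products</h2>
  <!-- Linking a product directly from the homepage signals its importance -->
  <a href="/products/flagship-multivitamin/">Flagship Multivitamin</a>
</section>
```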
My website has limited traffic, within a threshold, and I didn’t get a notification in Search Console about page experience. I couldn’t get any field data from tools like PageSpeed Insights or the Chrome User Experience Report. I’m wondering how to find field data to benchmark the website and work on that.
So yeah, I think this is a super important question, because we do use the field data as essentially the metric for search. When we start rolling out the page experience update for rankings, we will base that on the field data. And if your website doesn’t have any field data available, then of course we don’t have anything to work with. But that doesn’t mean that you don’t have anything to work on. On the one hand, you have the different lab tests that you can use to try to figure out which pages you need to work on.
On the other hand, there is, I think, an article on web.dev that essentially uses Google Analytics to collect your own field data on a website. So that might be an approach you could take.
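The approach covered on web.dev commonly uses the open-source web-vitals JavaScript library to collect your own field data from real visitors. A sketch, assuming a hypothetical `/analytics` collection endpoint on your own backend:

```html
<!-- Sketch: gathering your own Core Web Vitals field data with the
     web-vitals library; the /analytics endpoint is a placeholder -->
<script type="module">
  import {onCLS, onLCP} from 'https://unpkg.com/web-vitals@4?module';

  function sendToAnalytics(metric) {
    // Beacon each measurement from the user's browser to your backend
    navigator.sendBeacon('/analytics', JSON.stringify({
      name: metric.name,   // e.g. "LCP"
      value: metric.value,
      id: metric.id,
    }));
  }

  onCLS(sendToAnalytics);
  onLCP(sendToAnalytics);
</script>
```

Aggregating these beacons over time gives you a benchmark even when your site is below the Chrome UX Report traffic threshold.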
A couple of months ago, I used the Yoast video plugin to submit a video sitemap for my recipe site. Since then, I’ve seen a dramatic decrease in click-throughs on my recipes; my rank itself has stayed the same, and impressions are about the same. But now on mobile, I’m seeing recipe videos start auto-playing for a few seconds in the search results. I’m wondering if that is contributing to the drastic decrease in click-through rate on my recipes. There’s a little bit more in the question, but essentially, I think there are a few parts here.
On the one hand, it is possible that if your site is being shown in a different way in the search results, people are interacting with it slightly differently. Especially if you have a video snippet shown in the search results, maybe they look at the video and say, oh, this isn’t actually what I was looking for, and then they don’t go to your site. They could also look at the video snippet and say, oh, this is exactly what I was looking for, and then actually go to your website.
So that’s something where, based on the content you have available, you can control that a little bit. The other thing, with regards to the video sitemap and the video snippet in general: there’s a robots meta tag that you can use to control whether or not you want a video preview shown for your pages, and if you do, how long that preview should be. The other thing is, with regards to removing that video snippet completely, just removing it from the video sitemap file won’t necessarily remove the video thumbnail. Essentially, if we spot a video on a page and we think it’s a relevant part of the page, we can still show that video thumbnail in the search results.
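The robots meta tag John refers to supports a `max-video-preview` directive, for example:

```html
<!-- Allow a video preview of at most 15 seconds -->
<meta name="robots" content="max-video-preview:15">

<!-- Disallow moving video previews entirely (a static image may still be shown) -->
<meta name="robots" content="max-video-preview:0">
```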
I think you can suppress that with the robots meta tag; I’m not 100% sure offhand. But essentially, if you want to make sure that we never show a video thumbnail for your pages, then removing the video is the right step there. So those are the different options. My general recommendation here is, if you’re seeing a decrease in traffic to your site based on that video thumbnail, then consider that maybe this is a sign that you need to improve the video so that it’s a better representation of what you’re providing on your pages, so that users look at the video and say, oh, this is actually fantastic, and then go off and look at your site.
That’s, I think, the long-term recommendation I would take there. I wouldn’t just blindly say, oh, people see that my pages aren’t that great, and then they don’t come and visit, but rather use that as a nudge that you should improve the quality of those videos and the content, too.
Why does the crawling request for my homepage fail? This has started happening in the last two weeks.
I don’t know; I’d probably need to take a look at the individual URLs there. This is probably also something that could be done in the Help Forum if you don’t have a way of giving me those URLs directly. Essentially, the folks in the Help Forum have a lot of experience with different kinds of website issues, and they can probably help you figure out whether this is a technical problem and, if so, where it could be coming from. They can also recognize when it’s an issue that’s on Google’s side, or probably on Google’s side, and escalate that to the Google team that monitors the forums.
Is Google currently testing a new search result layout for local businesses on mobile? We’re seeing instances of the Google My Business listing showing for brand searches in the top position, with no organic listing underneath.
I don’t know what is happening with local search results. I am not involved on that side at all; that’s usually more something on the Maps side, so I honestly don’t know. I have seen some tweets about changes in the Google My Business search results, but I don’t know what is actually happening there. With regards to testing new layouts, I would go with the assumption that Google is always testing and always trying to figure out better ways of showing content in the search results.
We’ve previously been told that the impressive organic visibility growth of Google-owned domains has nothing to do with their owner, and that the folks who work on SEO at Google are almost at a disadvantage compared to other people. And then the question goes on into a fairly long discussion of, why is this website ranking like it is?
I’d love to dig into this and give some more input here, but in general, we don’t talk about why other people’s websites are ranking the way they are. Rather, we try to focus on things like what you could be doing to improve the visibility and the quality of your own website. So questions like, why is this website ranking like this, why are they showing like this, are, from my point of view, not particularly useful. I’d rather focus on individual websites themselves and try to give some input on what you could be doing to appear differently there.
I want to add a page on my site where I list all sales events; I think we talked about this one briefly. My crawling has been slowed since the new Search Console was launched. I see sites take longer now to index; a new site could take two to three days versus minutes with the older Search Console. Also, is there any limit on the number of interactions you can have with Search Console? The limits seem to be unseen. Where can you see your daily credits?
So I think there is some confusion in the question itself. Search Console is essentially mostly a reporting tool for what is happening in Search; it doesn’t control what is happening in Search. So it’s not the case that when a new Search Console is launched, all of Google’s search systems change. Rather, the new Search Console is a new UI that uses new systems to display information about your site. If you’re seeing that we’re crawling your site slower than before, that would not be related to Search Console. That’s, I think, the most important aspect here. With regards to limiting the interactions that you can have in Search Console, I’m not aware of any limits like that.
I could imagine that if you use a script to automate Search Console, at some point our denial-of-service and abuse systems will kick in and say, hey, this is not the way that Search Console was meant to be used. But essentially, for normal interactions, there’s no limit or credits with regards to how you use Search Console. Let’s see: we’re still experiencing a canonical issue with our international sites. Currently our page (such-and-such) is indexed and canonicalized to the .de/.ch version. I have no errors in Search Console for my hreflang sitemaps, but I do find the URLs in the “Duplicate, submitted URL not selected as canonical” section. Is this a bug? Is there something I can do to fix this?
So I’d probably need to take a look at the examples to see what exactly is happening here, but it looks like one URL is the German-for-Germany version and the other is the German-for-Switzerland version of the same page. What is often done across websites that target different variants of the same language is to use the same content. I don’t know if that’s the case here, but for example, what we would often see is that a page in German for Germany, Austria, and Switzerland might have exactly the same content. In a situation like that, if the pages are mostly the same, our systems would try to recognize that and pick one of these URLs as the canonical for indexing.
The advantage there is that it’s a lot easier for us to focus on one URL rather than on three separate URLs; we can recrawl and reprocess that one URL a lot faster than three individual ones. In practice, with the hreflang annotations on those pages, what would happen is that we would still process those hreflang annotations, and we would still be able to swap those URLs out in the search results.
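The hreflang annotations John refers to might look like this on each page (hypothetical URLs; each version should list all alternates, including itself):

```html
<!-- Hypothetical hreflang annotations across German-language country versions -->
<link rel="alternate" hreflang="de-DE" href="https://example.de/produkt" />
<link rel="alternate" hreflang="de-AT" href="https://example.at/produkt" />
<link rel="alternate" hreflang="de-CH" href="https://example.ch/produkt" />
```

The same relationships can alternatively be declared in an hreflang sitemap, as the question mentions.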
So practically speaking, we would index one of these versions as the canonical; then, if a user from the other location is searching, we would swap out the URL that is shown based on the hreflang annotations. So practically speaking, the hreflang would be working; technically speaking, we would be indexing just one version. And the most confusing part of all of this is that Search Console tries to be smart about this and focuses on the canonical URL.
So it doesn’t report to you that we’re showing these different language or country versions to users. In the performance report, you would only see performance data for the canonical; in the indexing report, you would only see indexing data for the canonical. And for the other country versions, you would see that it’s a duplicate URL and we picked a different one as the canonical.
So that makes it a bit tricky. What I recommend doing in a case like this is just searching with those country settings: use the hl= URL parameter for the language and the gl= parameter for the country, search for something like a title from the page, and then just double-check that it’s actually swapping the URLs. If it’s swapping the URLs, then everything is fine. The reporting side is messy and hard to trust, but at least from a search point of view, it’s working well.
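As a sketch, a check like the one John describes might use a search URL of this form (the query and the language/country values are placeholders, here simulating a German-speaking searcher in Switzerland):

```
https://www.google.com/search?q=some+page+title&hl=de&gl=CH
```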
And if you need to change that so that we absolutely index these versions individually, then they absolutely need to be different. It needs to be something where, when our systems look at these pages, they can say, oh, this is really a completely different page; then we’ll index them individually, report on them individually, all of that. From an SEO point of view, you’re not going to get any advantage by having completely separate URLs indexed. It’s really just that the reporting side is a lot easier if we index them individually.
Okay, let me see if I can find something short here. I’m good at that.
We operate an online reading platform for books. When you search for a book, Google Search can pull up the book in a section below the knowledge graph on the right-hand side. To submit books, you need to implement the schema markup and submit your books through a form to a team at Google. We followed all the documentation and submitted a number of times, but we haven’t had any progress. What can we do there?
Yeah, so especially for some of the newer or less-used features, we tend to have this setup where you submit your site, and then someone at Google will take a look at it and see if it’s aligned and if everything is okay. If we’re still adding new sites to that part of Search, then we’ll activate that for the site. But I guess there are two situations where that doesn’t work so directly. On the one hand, if there isn’t a big team working on this feature at the moment, then maybe they won’t be processing those submissions.
On the other hand, if work is being done to activate this across the board, so that anyone can use this markup, then sometimes they just wait until they can launch the fully open version of that setup. But if you can give me some sample URLs, and maybe some information about what you tried to submit there, I can try to find a team internally and pass that on. I can’t guarantee that they’ll be able to activate this for your site, but I’m happy to let them know that this site has been trying to get into this feature for a while and hasn’t had any response so far. Okay, we’re running low on time, but maybe I’ll take a couple of live questions.
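For reference, basic Schema.org Book markup in JSON-LD on such a platform might look like this (all values hypothetical; Google’s book feature has additional requirements beyond on-page markup, as the question notes):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Book",
  "name": "Example Book Title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "isbn": "9781234567897",
  "url": "https://example.com/books/example-book-title"
}
</script>
```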
Hello, everyone. We are one of the big pharmacies in Switzerland, and recently we gained about 300 to 400 spammy backlinks, most of them hosted on Blogspot. So my question is, do you recommend disavowing them in Search Console, or should we just let them go?
I think if you’re a big site and you’re seeing a couple hundred spammy links, I would just ignore them. It’s something where, if you know that in the past you’ve bought links, then that’s worth disavowing. Or if you look at it and you say, anyone from Google looking at this would think, oh, they bought these links, then that’s worth disavowing. But if these are random, spammy, weird blogs that are linking to your site, I would just ignore them.
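If you do decide a disavow is warranted, the file you upload to Search Console is a plain text list, one entry per line. A hypothetical example:

```
# Hypothetical disavow.txt (lines starting with # are comments)
# Disavow a single spammy URL
http://spammy-example.blogspot.com/2021/05/some-post.html
# Disavow every link from an entire domain
domain:spammy-example.blogspot.com
```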
They are weird, spammy backlinks. So thank you so much.
Quick question, actually. So I’m out here with bucket pills. We are one of the biggest online pharmacies here in Canada; we serve the whole country. And we are debating this question, which I’ve tackled before as well. I know you’ve probably covered it, but I wanted to bring it up once again. It is very interesting for us to be able to target a term like online pharmacy plus a particular geo-target, province, or city, such as Ontario or British Columbia.
So the question here is really, should we aim to have separate landing pages for each province, and potentially each city? Or should we just stick to our main homepage? The issue is, of course, that from the data we see coming in from our current users, people might benefit from having some dedicated content tailored to their particular location, and some medications are more popular than others in different areas of Canada. So we really want to serve the user in that way, and that’s why we thought about separate landing pages. But at the same time, we are concerned about keyword cannibalization. Could you give us any input on that?
Now, I think the aspect I would worry about more is doorway pages. I don’t know how many provinces there are in Canada, so I feel like I’m missing some crucial piece of information. But if this is something along the lines of a handful, maybe up to 20 pages, then that feels like something where it might make sense to have individual pages.
Especially if you know that there are individual guidelines for these individual provinces, for example, then that’s something where I’d say, well, maybe it makes sense to have something for those provinces, if you’re really saying this is unique information that people in that province need, and not just popular products from your region, but something genuinely useful. On a city level, I think that’s going almost way too far, because then you suddenly have hundreds or thousands of these pages, and you can’t realistically provide unique guidance on a city level if you’re a global or country-wide company. So if you end up with, I don’t know, 10 or 20 different versions of these pages, that seems reasonable. If you go way past that, then you’re essentially getting into the area of doorway pages, where you create a lot of pages that maybe map to individual keywords, but you’re diluting the value of your site. The weight you’ve already built up gets spread thin, and if there’s a lot of competition, and I imagine there is, that makes it harder for you. It’s easier to rank for some random city name plus pharmacy, but you really lose out on the head terms. By not diluting things too far, you also avoid the risk of being seen as having a bunch of doorway pages.
Perfect. John, thank you so much. I have another quick question, if you don’t mind. This is regarding a topic that has also been covered before, and I’m bringing it up once again: the matter of how to handle on-page citations. To give you a specific example very quickly: we sometimes have to reference certain details from the manufacturer or pharmaceutical company that we get our medication from. So we would obviously like to provide a link and a source for our visitors, so they can always go back and check the manufacturer’s details.
So the question here is, how should we handle these citations on our pages? For instance, using something like the Chicago style of writing, with footnotes at the bottom of the page. Should they be followed link attributes? Should they open in a separate tab or the existing tab? Should they be in an accordion or not? How do you suggest we handle those?
Thank you. So from an SEO point of view, I would make these normal links, not with a nofollow, because it’s essentially information that you’re providing. As long as those are not affiliate links, they’re not paid links; it’s information that is organically provided within your website. How you structure that within your website is more something I would look at from a user experience point of view. Maybe do some tests with users to see which way works best for them and how they can best recognize that these are useful, authoritative links, because they’re from the manufacturer, and all of these subtle details. From an SEO point of view, I don’t think it matters if they open in the same tab or a different tab, or if you have an accordion or whatever. It’s really more the user side that I would worry about there.
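In markup terms, the distinction John draws looks like this (URLs hypothetical):

```html
<!-- A normal, followed citation link to the manufacturer's details,
     as John recommends for organic source references -->
<a href="https://manufacturer.example/product-leaflet">Manufacturer's product details</a>

<!-- A qualified rel attribute would only be appropriate if the link
     were paid or affiliate, which is not the case here -->
<a href="https://partner.example/product" rel="sponsored">Partner offer</a>
```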
Perfect. Okay. I wanted to add to this because I think it might actually be helpful for everyone here. You had mentioned the notion of linking category pages from your homepage, and perhaps featuring the most important products right on the homepage. This is something we deal with quite frequently, because we regularly add new medications to our product catalog. The question we have is: would swapping out some of the featured products on our homepage in favor of newer products from our catalog, so they get indexed first and then ranked by Google, somehow dilute the ranking or the credibility of the previously featured products in our pharmacy? Can you give us some input?
Yeah, I mean, if you change the structure of your website, then we’ll try to reflect that in Search. So if you previously said this product is one of the most important ones, it’s linked from the homepage, and you remove that link, then suddenly we say, oh, it’s no longer one of the most important ones. We will still be able to crawl it, because it’s probably still indexed; we will still be able to find it and update it. But we’ll say, well, it’s not one of the most important ones anymore. With that in mind, what I usually see on e-commerce sites is that they have different sections in that homepage area, with things like most popular products and newest products. That works for users, because they go to your homepage and see, oh, new products, I’ll take a look. And it also works for search engines, because we can pick up the most popular products and also see the newest products as well.
I see. Perfect. Thank you so much, John. I’d like to give some time back to the rest of you as well. Thank you.
Cool. Okay, let me take a break here with the recording, and I’ll be around a little bit longer so we can continue to chat, but I want to make sure that the recording is a reasonable length. Thanks to all of you who are watching this recording afterwards; I hope you find it useful. And maybe I’ll see some of you in one of the future Hangouts as well. All right, bye for now.
Where to Find John Mueller’s Google Search Central Office Hours Hangouts
John Mueller holds Google Search Central Office Hours Hangouts every other Friday.
You can find previous versions of the recordings from these sessions on the Google Search Central YouTube channel.
Click play on the video below to view this session: