This time, we bring to you John Mueller’s Google Search Central Office Hours hangout from July 23, 2021.
The topics John covered during this hangout included the following:
- Adding a rel=canonical tag on many different sites,
- Creating links for different languages,
- Accidentally de-indexing and blocking search engines from crawling the site,
- Press releases and how Google sees them,
- Negative SEO attacks,
- Embedded software on web pages,
And much, much more.
Be Sure to Watch His Hangout!
We recommend watching his hangout from July 23, 2021 here:
You will also find the full transcript from the hangout below.
John Mueller SEO Insight #1: Can We Add a rel=canonical Tag for Many Different Websites?
A webmaster was concerned about adding rel=canonical tags across roughly 100 pages. They have more than 100 pages set in the English language, but they had translated only about ten of them into French. Their specific question was whether, in that situation, they could still add a canonical for the English version.
John explained that you can apply the canonical tag to any page you want to have indexed. The main criterion is that you have one preferred version of each URL and set the rel=canonical to that version. Google understands that for international sites serving different countries or languages, each of those versions is unique, so each version should carry its own canonical: the English version would have a canonical pointing to itself, the French version a canonical pointing to itself, and so on, with hreflang handling the cross-linking between them.
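To make that concrete, here is a minimal sketch of what the head of each language version might contain. The URLs are hypothetical; the key point is that each version's canonical points to itself while the hreflang annotations cross-reference the alternates:

```html
<!-- English version at https://example.com/en/page (hypothetical URL) -->
<link rel="canonical" href="https://example.com/en/page">
<link rel="alternate" hreflang="en" href="https://example.com/en/page">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page">

<!-- French version at https://example.com/fr/page: canonical points to itself -->
<link rel="canonical" href="https://example.com/fr/page">
<link rel="alternate" hreflang="en" href="https://example.com/en/page">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page">
```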
John Mueller SEO Insight #2: Do We Need to Create Links for Different Languages?
The same webmaster also asked about links. Their question was about creating quality links in bulk to each separate language page.
John explained that link building is something that happens naturally over time and isn't something you need to do manually. The important part, he said, is that you have strong internal cross-linking. Over time, French users will come to appreciate the site and perhaps begin linking to it; backlinks should accumulate naturally rather than be built in bulk.
John Mueller SEO Insight #3: Our Client Did Not Block Search Engines and Google Indexed the Wrong Pages. We Fixed It and Resubmitted the Sitemap and Nothing Happened. What Can We Do?
A webmaster was working with a client who had not blocked search engines from crawling the new design of a website. When they went through the reindexing process by resubmitting a sitemap, Google didn’t index a single URL of the site. The client’s question was if there was anything that could be done differently to repair this and get Google to crawl and index the new site.
John explained that it sounded like there were no technical issues involved—since the individual URLs get indexed, then it sounds like there isn’t an overall block of the website. An overall block could occur if they submitted removal requests and they hadn’t realized that, but it sounded like both of those issues were okay.
He said it’s mostly a matter of Google recognizing that it’s a useful website and then crawling it. Submitting URLs manually is one approach. Google does, however, need to understand how the website fits in with the context of the overall web. This is something that just takes time.
During this period while Google is learning about the site, one of the best approaches is to find ways to promote the site directly. Instead of publishing and hoping that Google finds it, you have to do a lot more work up front to drive users and traffic yourself, finding the right channels where you can promote the website. It takes time and practice to find the channels where people who would appreciate these topics hang out.
Run ads, do public shows, do social campaigns, work together with other sites (maybe get on their newsletter regularly), and so on. All of these things can help drive awareness.
Over time, Google will then understand that this is a site it should be indexing much more.
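As background to the fix described above, the "tags" involved in accidentally blocking or allowing indexing are typically the robots meta tag. A minimal sketch, assuming that was the mechanism in play for this client:

```html
<!-- Hypothetical pre-launch state: keeps pages out of Google's index -->
<meta name="robots" content="noindex">

<!-- At launch, the tag is removed or relaxed so pages can be indexed -->
<meta name="robots" content="index, follow">
```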
John Mueller SEO Insight #4: How Does Google See Press Releases?
Another webmaster had a question regarding press release submissions. The content they use for a submission is the same every time, and the domain name at the bottom is hyperlinked. Because the event is the same, identical content is published across a variety of newspapers, and it also matches the content on their own site. Would these types of press releases be a problem for Google?
John explained that if you want to rank for that content, then putting it on other people's websites actually makes doing so quite a bit harder. Using press releases to drive awareness of your business, however, is a reasonable approach. Regarding links in press releases, John explained that the important thing is that it's clear they're press releases and that they were written by the website owners themselves. The way to do that is to make sure there is a rel=nofollow on all the links in the press release. Just having a plain domain-name link is fine as well: Google can recognize that it's just a domain name rather than keyword-rich anchor text.
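A minimal sketch of the kind of press release footer link John describes, with a hypothetical domain, nofollowed and using the bare domain name as the anchor:

```html
<!-- Nofollowed, plain domain-name link in a press release footer (hypothetical) -->
<a href="https://example.com" rel="nofollow">example.com</a>
```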
If you just want to drive awareness of the site, Google doesn't really care whether the press release content ranks on your site or on any other site. It IS different, however, if you wrote an exceptional piece and it's copied and pasted into all sorts of press releases that appear on other websites: those press releases could then rank above the original content on your site.
John Mueller SEO Insight #5: Google Ignores Negative SEO Attacks
A webmaster had two questions about this complex topic: how do you get away from someone who is constantly attacking your site with negative SEO, and should you disavow the attacking domains or their individual backlinks?
John explained that in general, Google will recognize and ignore such negative SEO attacks. He wouldn’t necessarily worry about it. If they’re really worried about it, and it will make them feel better, adding these domains to the disavow file is just fine.
He also reiterated to just list the domain as opposed to specific URLs. Even if Google Search Console shows more than 50,000 URLs, the webmaster should disavow at the domain level. It's just easier, and listing the domain in the disavow file is exactly the same as disavowing each of those URLs individually.
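For reference, a sketch of what domain-level entries look like in a disavow file (the attacking domains here are invented):

```
# Disavow entire domains instead of listing tens of thousands of URLs
domain:spammy-attacker-example.com
domain:another-bad-example.net
```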
John Mueller SEO Insight #6: Google Needs Clear Text on Pages That Have Embedded Software (Such as Calculators)
Another webmaster was having challenges getting a client's URL reindexed. The page was in the sitemap, but Google Search Console was reporting a soft 404, and the URL inspection tool kept saying to try again in a few hours. The webmaster had been trying for two weeks; the page passed the mobile-friendly test, and there were no issues on the webmaster's side either. The page is a car service pricing calculator tool, so the webmaster was wondering why they couldn't get the URL reindexed.
John explained that Google tries to recognize soft 404 errors automatically, especially when there's clear text on the page suggesting there's no actual content there. He recommends making sure the page doesn't show a default empty-state message such as "we have no values calculated for you today," and instead adding some static text explaining to the user that this is a calculator. He also said Google has recently seen a rise in reports of soft 404 misclassifications, and the team has been looking into them.
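A minimal sketch of the kind of static, crawlable text John suggests for a tool page like this; the copy and element names are invented:

```html
<!-- Clear text so the page isn't mistaken for an empty result (hypothetical markup) -->
<h1>Car Service Pricing Calculator</h1>
<p>Use this calculator to estimate the cost of your next car service in seconds.</p>
<div id="calculator"><!-- interactive calculator widget renders here --></div>
```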
John Mueller SEO Insight #7: If You Have a High Percentage of Redundant Meta Descriptions, Google Will Attempt to Generate Them Automatically
The fifth webmaster was worried about React and JavaScript. They have a situation where the meta description shown in search picks up snippets from the website, and it's not even a full sentence; it also picks up a title from one of the products or services. The displayed description is actually fabricated by Google itself, even though they have a meta description entered on the page.
John explained that having a real meta description is a first step that's really important. He suggests spending the time to craft correct meta descriptions and making sure they are actually present in the page's HTML. He also explained that you don't want things to be repetitive across the website: if the meta descriptions are basically the same across the site, Google will generate its own. Double-check that your meta descriptions are not overly repetitive.
In addition, John said Google attempts to recognize when descriptions are accidentally spammy. If you stuff in a lot of keywords, or just use the queries you want to rank for without adding value, Google will decide the description is not really useful for the user and rewrite it into a better one.
The other reason Google changes the meta description is to match the queries people are actually searching for, because sometimes the supplied descriptions don't match those queries. If part of the query appears in the meta description itself, Google will try to use it; if the query isn't in the description at all, Google may generate a new description from the content of the page instead.
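A minimal sketch of a page-specific description, as opposed to one boilerplate sentence reused site-wide; the copy and site are invented:

```html
<!-- Unique, page-specific description rather than site-wide boilerplate (hypothetical) -->
<meta name="description" content="Compare prices for 126 car service providers in Austin, with reviews and same-day availability.">
```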
John Mueller SEO Insight #8: Google Tries to Pick the Right Content Based on Region
The sixth webmaster was concerned with some international SEO problems. A client they are working with has more than one foreign-language version of the same website, and the foreign counterparts are ranking for regional search queries instead of the regional version, even though the regional version publishes regional content.
John explained that hreflang annotations would not change the ranking in this case. They help Google pick the right URLs to show in the SERPs while keeping the ranking the same: Google ends up swapping in the URL of the local version. He used the analogy of a shop that sells basketballs with one location in India and another in Singapore; for a user in Singapore, Google would try to show the Singapore version at the same ranking. Since the wrong country version was the one being shown, John said hreflang is exactly the fix here. Beyond that, it's a matter of normal ranking and not something you can easily influence with technical SEO, so having all the traditional SEO fundamentals in place is recommended.
Within a single website, the other lever you can work on is internal linking, to ensure that Google identifies exactly which pages are important.
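To illustrate, a sketch of country-targeted hreflang annotations for John's India/Singapore example; the URLs and language-region codes are hypothetical:

```html
<!-- Country-targeted alternates so each region's URL is shown there (hypothetical URLs) -->
<link rel="alternate" hreflang="en-in" href="https://example.com/in/basketballs">
<link rel="alternate" hreflang="en-sg" href="https://example.com/sg/basketballs">
<link rel="alternate" hreflang="x-default" href="https://example.com/basketballs">
```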
John Mueller SEO Insight #9: Multiple Location Landing Pages Can Be Classified as Doorway Pages
One webmaster asked whether having multiple location landing pages is okay when a business legitimately services those locations but has no physical presence there. The webmaster had heard these are considered doorway pages and wanted clarification.
John explained that it's largely a matter of how many you have. If, for example, a business is global and delivers to every city in the world, and you create city-level landing pages for all products, then those are doorway pages: many extremely similar pages trying to rank for different queries that all lead to the same funnel. A small, clearly defined set of nearby locations, by contrast, is less of an issue.
Either Google's quality algorithms or the webspam team would pick the large-scale case up and conclude there isn't much useful content on those pages.
John Mueller SEO Insight #10: Google Doesn’t Guarantee Any Rich Results
Another webmaster was concerned about Google removing all FAQ snippets from a site without even showing a single warning in Google Search Console.
John began by explaining that Google never guarantees any rich results in the search results, so it wouldn't provide a warning. Using the correct markup makes you eligible to be shown; it doesn't guarantee anything. Google may show a lot of rich results for a site, scale them back over time, scale them up again, or even turn them off entirely.
The things Google looks at when it comes to rich results are:
Level 1: Are they technically implemented properly?
Level 2: Are they compliant with Google’s policies?
Level 3: Is the website’s quality at least okay overall?
If you have seen a significant change in ranking around a core update, for example, it’s likely that Google’s algorithms refined what they think about your website overall in terms of its quality.
There's also the rare case where something about the structured data on the page is so far off that the webspam team takes manual action, but that would come with a notification in Search Console.
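For context on the first of those levels, a minimal sketch of technically valid FAQ markup; the question and answer text are invented:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you deliver on weekends?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, we deliver seven days a week."
    }
  }]
}
</script>
```

Even with valid markup like this, eligibility only means Google can show the FAQ snippet, not that it will.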
John Mueller 07/23/2021 Hangout Transcript
John
All right, welcome, everyone, to today's Google Search Central Office Hours Hangout. My name is John Mueller. I'm a search advocate on the search relations team here at Google. And part of what we do are these office hour Hangouts, where people can join in and ask their questions around search and their website, and we can try to find some answers. A bunch of stuff was submitted on YouTube already, so we can go through some of that. But, as always, maybe we can start off with some live questions to get us started. It looks like some of you already have your hands raised, which is great.
Webmaster 1
I have a question related to international market targeting. Okay. So, like, especially related to canonicalization. My question is, if you have 100 websites in the English language, okay. And now you just wanted to translate, like create another directory for the French language. But in that you only created 10 pages, okay, only 10 pages you have translated out of a total of 100. Now, my question is, can we set a canonical inside the French version or not? Or inside the English version or not?
John
How do you mean canonicals?
Webmaster 1
Like, look, I’m saying, if suppose you have an English version. So for a particular English version, you also have a French version for 10 days only. Okay, now I’m asking, Can we add a canonical for English version or not? Because I’m already applying hreflang.
John
So usually you can apply the link rel=canonical to any page that you want to have indexed. So that's kind of the basic criterion: you have one URL, and you have a preferred version of that URL, and then you set the rel=canonical to that. And when it comes to international websites for different countries, or different languages, each of those versions is unique. So it would be its own canonical. So in a case like that, the English version of a page would set the rel=canonical to itself. And the French version of a page would set the rel=canonical to itself. And you will have the cross-linking with hreflang, but the canonical would be per language.
Webmaster 1
Okay, so, but the content on the English version and the French version is 100% similar, like 100% copied content. Now, still, can I use a canonical for the French language, as in a French version canonical?
John
Even if the content is translated, it's unique content, so you have the rel=canonical per language again. So from that point of view, whichever URLs you want to have indexed, those are the ones that you would use for the rel=canonical.
Webmaster 1
Okay. Okay. Understood, understood. And another question is related to linking, for backlinking purposes. Suppose I have those pages in French, 10 pages in the French language and 10 other pages in the Portuguese language. So can we start creating links, quality links, like in bulk? Can we create links for those different languages? Or do we only have to work on English content?
John
I mean, it’s normal that these pages would collect links over time. So it’s not something that you’d need to kind of like manually do. So it’s, yeah, I think the important part with multilingual websites or international websites is that you have really strong internal cross-linking. But over time users in French will appreciate your website and they will link to it perhaps. So that’s something that essentially happens naturally. That’s not something that you have to kind of go off and do manually.
Webmaster 1
Now – suppose I’m creating backlinks for English content. So similarly, if I want to create backlinks for French content for Portuguese content, okay, so can we create all those with the help of those French versions of link…
John
It's completely fine to have the different international versions collect links. But I'm kind of worried about the way that you frame it, "we will start building backlinks to these pages," because that sounds a lot like you will start dropping links in different places to get those links. So that's kind of the part where I'm like, you need to be careful. The aspect that an international webpage collects links over time, that's completely normal.
Webmaster 1
Okay, okay. And one more thing: so we can also go ahead with a change, I can also change the slug part of a link?
John
How do you mean, slug?
Webmaster 1
The slug part? If we have a page like abc.com/xyz, okay, and I want to convert /xyz into the French version. So can we change our slug also to the French version?
John
Sure. Does it impact SEO? Not really. But you can do it. So some sites have localized URLs and other sites have kind of global URLs, and that's up to you.
Webmaster 2
I have two questions. One is about one of our clients. They launched a website in June, and by mistake they did not block the search engine from crawling and indexing the site. They realized it at the beginning of July and told us to fix it. So what we did was change the tags and submit the sitemap on Google Search Console again. And there was an issue: Google Search Console sent a message that there are a lot of noindexed pages. So we clicked validation on the site, it went to verification in progress, and it became a check. But unfortunately, Google did not index a single URL of the site. It's almost one month, so the client is asking whether we can do something different to make the process faster. So I'm asking what we can do, because it's almost one month, and it looks like no URL has been indexed so far.
John
Like so you’re seeing nothing in the reports, then for index coverage, or…?
Webmaster 2
It's almost the same. What we tried, when we saw that nothing's happening: we submitted a few URLs manually, like inspect the URL and then request Google to index. Those few pages only got indexed by Google. But the ones we submitted through sitemap submission did not get indexed at all.
John
Okay, so it sounds like there’s not a technical issue involved. And, like, if you’re saying like individual URLs, when you submit them manually, they do get indexed, then that sounds like on the one hand, there is not a technical issue involved. And there’s also no kind of overall block of the website involved. So those are two aspects that sometimes play a role. The overall block might be if site removal requests were submitted, and you didn’t realize that, for example, but it sounds like those two sides are okay.
And then it’s essentially really mostly a matter of us recognizing that this is a useful website and going off and crawling it. And on the one hand, submitting URLs individually, that’s one approach. But we also kind of need to understand how this website fits in with the context of the overall web. So that’s something where sometimes it just takes time. And during that time, a really good approach is to find ways to promote your website directly.
So instead of like putting the website up and hoping that Google finds it and sends you lots of traffic, I kind of take the initiative and say, well, in the beginning, we have to do a lot more work to drive users to our website, because they don’t know about it, and finding ways to promote your website. So sometimes it just takes a bit of time and a bit of practice finding the right channels where you can reach people who are interested in these topics.
And then I don’t know, maybe running ads, maybe doing some shows, social campaigns, maybe working together with other sites. Any of these things, they can kind of help to drive awareness of the website. And with that, over time we will understand that this is actually a website that we should be indexing a lot more. But kind of like just dropping a website onto the web and submitting a sitemap file and hoping Google shows it visibly in search. That’s usually not a great practice.
Webmaster 2
And one more question. This is about press release submissions. One of our clients is planning to do a submission for one of their recent events. Now, the question is about when they do the press release submission. At the bottom of the press release, they provide information about the business and add the domain name, and the domain name is hyperlinked. Now, the content they use for the submission is the same every time.
Each time the content is similar; when the event is the same, the same content is published in various newspapers. So will that be a problem from Google's point of view, because this is the same content published on various news websites? We are a bit concerned about how Google will consider this.
John
So that it’s the same content, I wouldn’t worry about that. If you want to rank for that content, then obviously, putting the content on other people’s websites makes that a bit harder. But if you just want to drive awareness of your business, then that’s kind of a reasonable approach. And with regards to links in press releases, the important part for us is kind of that it’s clear that these are press releases, and that you wrote them yourself.
And you can do that by making sure that there's a rel=nofollow on the links that are there. Or if you just have a domain name link, then usually that's fine as well, where we can kind of recognize, oh, it's just a domain name, it's not kind of like this keyword-rich anchor text that is there. So from that point of view, that sounds like a reasonable approach. And again, if you're mostly wanting to drive awareness of the website, then you don't really care if your content is ranking through the press release on this site or on a different site.
It would be different if this is content that you want to rank for, like you did something really fantastic and you put the whole report on your website and into the press releases. Then it's possible that these press releases will rank instead of your content.
Webmaster 3
I have two questions about negative SEO. The first question is, how do you easily get away from someone who is doing negative SEO to you? I mean, someone points a lot of domains with a lot of bad backlinks at your site, and he's doing that constantly, for example, for a couple of months.
John
So with something like that, in general, we would recognize that and just ignore it. So I wouldn’t necessarily worry about it. If it’s something where you’re really losing sleep, because you think maybe Google thinks that I’m buying these backlinks, then putting the domain in the disavow file is fine. And I don’t know, I would just do that from time to time. It’s not something that you need to do every day. And just by including the whole domain, in the disavow file, you kind of are a little bit faster than what they’re doing. So it’s almost more efficient on your side.
Webmaster 3
So should I put the domain in the disavow file, or their backlinks? Because previously I asked questions, and you said that Google is looking at the redirect between two canonical pages, and these domains are redirecting their backlinks. So they have tons of backlinks on them.
John
Not quite sure how you mean, but I think with the disavow file, I will just list the domain. Like if you look into Search Console, you can see the links report there. And I would just take the domain name from there and use that.
Webmaster 3
Yes. But for example, in Google Search Console, it’s showing me their backlinks. For example, if someone points a domain to my website, it’s showing their backlinks on it. And there might be more than 50,000 URLs. So I…
John
No, I would just do it on a domain level. So I mean, it's something where most websites don't need to do this. And I think even if there's someone trying to do negative SEO for you, our systems are so used to this kind of behavior that they ignore it. So I would say 99 percent you don't need to do anything about this. But here's what I would do in the situation where you're just really worried and want to make sure that you have everything covered.
I would download the links from Search Console, I would extract the domain names from there, maybe in a spreadsheet or something like that, and then just submit those. And that way you kind of have that covered, because the links that we show there are kind of the ones that are similar to what we would pick up. So from that point of view, I would just submit those. But again, 99%, you don't need to do anything.
Webmaster 3
The second question is, if there are no issues reported in Google Search Console, and no manual actions are taken, can this be a reason for a recent drop?
John
I would suspect not. It's something where, especially with negative SEO, we have a lot of practice with that. And we're really good at recognizing these kinds of things. And the main thing that we do when we recognize it is we ignore those links. So it's not that we would say we will drop your website in search; it's just, well, we noticed there's some weird stuff happening, we will just ignore it, and we will focus on the rest of the good things for your website. So if you're seeing a drop in search, then probably that's related to something else.
Webmaster 4
So a page of my client's dropped out of the index from Google a couple of weeks ago. This page was in the sitemap, but Google Search Console now reports it as a soft 404. And when I use the URL inspection tool, Search Console says, you know, something went wrong; if the issue persists, try again in a few hours. But I've been trying this for two weeks. I tried this URL in the mobile-friendly test, and it says that it's mobile-friendly. From the webmaster's point of view, there are no issues as well. There are no manual actions as well. So I'm just wondering if I could get closer to what exactly is wrong with this page. This page is a car service pricing calculator tool. So I was just wondering what could be wrong with this?
John
Yeah. So we try to recognize soft 404 errors automatically, especially when there's something like clear text on the page that tells us there's no content actually here. So if this is kind of a calculator page, I would just make sure that you don't accidentally have, by default, some message saying, oh, we have no values calculated for you today, kind of before you enter the numbers or something like that. The other thing is, with soft 404 pages, we saw a bunch of these reports recently, in the past couple of weeks, I'd say. And the team has been looking into that. And I think they turned one of the classifiers off now, based on some of the feedback that we got. So I would suspect that maybe this will just catch up again and work out, I don't know, in the next couple of days or so. So that might be related to that.
Webmaster 4
So just to clarify, the first thing you're suggesting is Googlebot will probably need to see some clear text on the page indicating the nature of this page, what it's actually about?
John
Not so much the nature of the page, but that there's actually content there. So for example, a common situation on an e-commerce site is you go to a category, and then it says, like, there are no products in this category. If we see a text that says, kind of like, there are no products here, then we might say, oh, it's actually a soft 404 page. And with a calculator page like that, maybe it's set up in a way that when you load it the first time it says, oh, I don't have any results for you. Whereas if you load it up the first time and it says, oh, this is a calculator, and kind of like some general text, then that makes it a lot easier for us to recognize there's actually something here and we should keep this page indexed.
Webmaster 4
Cool. And then the second suggestion you have is that, you know, it’s something on Google’s side that this thing may resolve on its own in a couple of days or weeks. Yeah. Okay. So then one additional question on this is, is there any chance that failing core web vitals, specifically the cumulative layout shift, could that have anything to do with it being seen as a soft 404?
John
No, no, that’s completely separate now.
Webmaster 5
My question regards React and JavaScript. One of my clients has a React JavaScript website. And for some reason, in the meta description, it picks up snippets from the website. And it's not even a full sentence. It's, like, a results page, a category page, where they have, let's say, 126 results found. And then it's saying map view. And then it's picking up a title from one of the products or one of the services. So the meta description is actually fabricated by Google itself, while we have the meta description entered. How can we tackle this and make sure that Google actually picks up the meta description instead of...
John
Chaos?
Webmaster 5
*laughs* Yeah.
John
I think having a meta description, as a first step, is really important. So I would just double-check that. The other thing is our systems try to create their own descriptions and titles when we recognize that things are very repetitive across the website. So if you have, like, the same meta description across hundreds of pages, then we might say, oh, it's not so descriptive actually, we will figure it out. And when it comes to "we will figure it out," sometimes we do a pretty good job and sometimes we don't do so well. So kind of the repetitive nature, I would double-check that. The other thing is we try to recognize when things are accidentally spammy in the description, in that if you have a lot of keywords in the description, or it looks like you're just including the kind of queries that you want to rank for in the meta description, that's something where we might say, well, this is not actually useful for the user, we will figure it out. And the third type of reason why we sometimes pick a different meta description is we try to match the description based on what people are actually searching for. So you'll see different descriptions depending on the query that goes to the page. And if we can recognize that a part of the query is clearly written in the meta description itself, then we'll try to use that. Whereas if we can recognize that actually the query is not in the description at all, then we might pull out some content from the page and say, like, here is some kind of context for why we showed this page in the search results.
Webmaster 5
Well, I can assure you the query is there. And also, when it comes to repetitiveness: when you're talking about SEO at scale, you do not want to go to every category page and make sure that they have a unique meta description. So you find a way to make it somewhat unique, but also, like, templatized, in a way that, hey, these are the keywords that we are trying to use, and we have, let's say, three versions of some meta description sentences that you use, and you juggle them around to different pages. Okay, um, is that valid? Or would you say, well, we will still recognize the repetitiveness even in that sense?
John
I think that's usually okay. So especially on e-commerce sites, you kind of necessarily end up like that. The important part is that the description is not, like, a description of the website, essentially, where you have the same description across all of the pages. If you're automatically generating a description based on the current page, that's a perfectly fine approach. The other thing that you might be able to do is, when we try to create the description to show in the search results, we often try to focus on kind of the first text that we find on the page. And that's something that you can control a little bit, in that if you can give us kind of a clear heading on a page, way on top in the HTML, then when we look at the page and see we need to create our own description for whatever reason, which might be wrong, we can take kind of the first part of the page there, and that's, like, a clear heading. It's not like a collection of menu items, for example. Yeah. And that sometimes makes it so that even if we have to revert to creating it ourselves, we can pick something that's kind of a reasonable piece of text.
Webmaster 5
Okay, well, it makes sense, because when I started with that client, we had some content above the listing view of all the products. And that is where it actually started to pick up one of the first sentences in the description of the site itself. But we moved the content below the products, because the goal of the pages, of course, is to get somebody to buy a product instead of going through a bit of content. So that's why we moved it down. But I want to suggest that we put it back up.
John
I mean, you can use things like CSS to just display it somewhere else. So that's something where, like, technically it could be on top of the page in the HTML, essentially, but with CSS it's located or displayed somewhere else. So that could be kind of a workaround. But also, if you can't get this to work at all, and the descriptions look really terrible, then send me some examples, ideally, like, screenshots of queries, and then we can pass that on to the team and say, like, hey, we're trying to do the right thing, but look at this chaos that we're creating here.
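A minimal sketch of the CSS workaround John describes here, with invented class names: the intro text sits first in the HTML source but is displayed below the product listing.

```html
<!-- Intro text first in the HTML source, visually moved below the products (hypothetical) -->
<div class="page">
  <p class="intro">Browse our full range of green and herbal teas.</p>
  <div class="products"><!-- product listing renders here --></div>
</div>
<style>
  .page { display: flex; flex-direction: column; }
  .intro { order: 2; }    /* shown second on screen */
  .products { order: 1; } /* shown first on screen, though later in the HTML */
</style>
```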
Webmaster 5
Okay, I got a second question, regarding the core update. We actually had a bump in the beginning of May, where we actually saw more traffic coming into one of our main pages, actually all of the main pages, and that was very good. And we're actually doing good in the core web vitals: pages went from poor to needs improvement in the page experience report, and that was good. That's when we saw the uplift. And then the core update came, and it got reversed. Our clients, I see them recover in the last couple of weeks, but the client that we just talked about actually remains at the same level. We believe that it still has something to do with the rendering of the page, that it is not fast enough, the first contentful paint is not low enough, and LCP is still too high. But it seems that the values have been changed, or the testing of our website is saying, hey, now you've got some pages needing improvement, and all of a sudden we're back at square one. How would you approach this, in order to sort this out, or just to know where the pain is? Because right now you think you have solved it, and then the update comes in.
John
So I think, first of all, the core update that we did is not related to the core web vitals. So that's kind of an unfortunate naming, I guess, now that I think about it, but they're really unrelated. The core update is really about kind of updating the way that we see relevance across a website. And the core web vitals is essentially just kind of the speed and user experience side of things. So that is a coincidence. And the core web vitals, we started rolling them out, I think, in early June, and they're rolling out until the end of August, something like that. So that would be something where, at most, you would see kind of a subtle, slow drop over time until the end of August; it wouldn't be the case that you would see a drop from one day to the next. If you see something like a drop from one or two days to the next, then probably that's more related to the core update.
Webmaster 5
Yeah, okay. Okay, then it's totally the core update, meaning that we need to look at the quality of the content.
Webmaster 6
So I had two questions, actually. First would be that the client that I'm currently working on has multiple foreign counterparts of the same website. And sadly, we don't have correct, up-to-the-mark hreflang implemented on our content, correct? So what is happening currently is, let's say for example, some of our foreign counterparts are ranking on most of our global search queries, along with our regional search queries as well, whereas the content serving these regional queries on our website is also regional, right? So, given that we are posting regional content to cater to these regional search queries, is there any way around this, except implementing hreflang tags, so we can rank on these regional queries specifically? And this just started happening post the algorithm update that was released.
John
Okay. So essentially, you're seeing regional pages not ranking the way that you would like them to in those regions? So in a case like that, the hreflang would not change anything. So the hreflang annotations help us to pick the correct URLs to show, but we would kind of keep the ranking the same. And we would swap out the URLs for whatever local versions. So if you have a shop that sells, I don't know, basketballs, and you have one shop in India and one shop in Singapore, if a user in Singapore is searching, we will try to show the Singapore version there. And we would essentially try to keep the ranking the same. So that's something where, with regional content, it wouldn't rank differently; we would just try to pick the right ones to show. With regards to ranking content like that, I suspect it's just a matter of normal ranking; it's not something that you can kind of easily influence with technical means. So definitely doing things like having clean internal linking is really useful. Kind of all of the traditional SEO things is definitely...
Webmaster 6
But the problem, John, here would be that, basically, the content themes are very similar. Like, the counterpart also has the same regional content, I mean, content themes specifically, and the genre is the same, right? But what is happening is, for these regional queries, we have similar content catering to that, correct? But our foreign counterpart is ranking on regional search queries, which is what is happening currently. So is there any other way? I mean, like you mentioned, other than normal SEO techniques, which we are doing up to the mark, any other solution to this issue that we have?
John
Okay, so it’s basically all your website, and the content from a different country from your website is ranking instead of the content that you have uploaded?
Webmaster 6
Yeah.
John
Okay. And then by regional, you mean countries?
Webmaster 6
So let's say, for example, we have our website from India, and the website that's ranking is based out of Singapore. So the Singapore website is ranking on Indian search queries. So that's what we are looking at currently, right now.
John
Okay, then, then you really need to use hreflang. Yeah.
Webmaster 6
That’s even if the content is regional?
John
Yeah. I mean, if you have an alternate version for one country, then that's a perfect place to use hreflang. If it's different countries, different content, then essentially you could theoretically still use hreflang, if you think it's similar enough. But if it's like completely different content, then it's essentially just different content. And within a website, you can't easily control which of your pages are ranking. The only thing you could do is make sure that the internal linking really clarifies which of these pages is more important. But you can't say, well, for this query I want this page, and for a different query I want another page.
Webmaster 6
Got it. Got it. Great. Okay, so the second part of my question was: the client that I'm working on currently also has links which redirect to an e-commerce website, correct? Wherein the buying journey isn't on the website, but goes on to another e-commerce website. Is there any way wherein we can optimize the whole conversion rate of these pages specifically, so that we get more transactions through that?
John
I don't know of a kind of SEO trick to improve the conversion rate. That's more things like usability and kind of on-page work.
John
Let me go through some of the submitted questions, and then we'll get back to more of the live questions as well. Let's see, starting off: is having multiple location landing pages okay to have when you're a business that serves these locations without being physically located there? I heard these are doorway pages, can you please clarify?

So doorway pages are essentially a situation where you create multiple pages that are very similar, that are just trying to rank for different queries, and which essentially lead to the same funnel. And so, for the situation that you're describing here, the thing I would keep in mind when it comes to doorway pages is usually it's a matter of having a lot of them. So if you say, my business is global, but I deliver to every city in the world, and you create city-level landing pages for all of your products, then that's definitely a doorway page. And on the one hand, the webspam team might pick that up, our webspam algorithms might pick that up, but almost certainly our quality algorithms will pick that up as well and say, well, there's actually not much useful content on these pages. On the other hand, if you have maybe a clearly defined, very small set of locations that you want to target, then that seems less of an issue.
So for example, if you’re a local business, and you have kind of like two big cities nearby, and you’re located right in the middle, then that seems okay to say, Well, I’m targeting the city. And I’m also targeting that city and the type of services that you provide are very similar. So on the level of like, you have two cities, and you’re just making something similar for both of these cities, that seems okay.
And so that’s kind of the thing here and finding the level in between, I think is super tricky. My recommendation is to try to have fewer pages, and then expand from there. So instead of saying, I will create 20 pages and see if that’s good or not, create one page and see how that works out, look at the search queries that go to those pages. And then over time, you think, oh, it makes sense to have a page separately on a slightly different topic area for that area, then maybe expand to that.
And so that kind of makes it easier for you to grow organically, and to make sure that you're not running into this situation where you're just automatically generating millions of pages.

Is it okay to translate your whole website with Google Translate and show the translations in a subfolder or subdomain? This way I can decrease the cost of translation. Is that useful for SEO?
So theoretically, you can use Google translate to translate a website. However, on the one hand, we would see that as automatically generated content and the webspam team would be unhappy about that. The other thing is most likely this would be low-quality content. Because while the automatic translations are getting better and better, they’re still not the same level as a normal translation.
So the approach I would take in a situation like this is to start out with an automatically translated piece of content, and work to actually make sure that it's a manually reviewed translation in the end. And when you're ready with the manually translated page, then make that live, and kind of work through your website like this. And quite often, when you work with localization agencies, they do something similar: they start out with an automatic translation, and they kind of refine it.
And when it's well refined, they will send it back to you and say, like, here's our translation. I would not just blindly translate a whole website, because it makes the website low quality, and it goes against our webmaster guidelines. It's kind of a bad approach, I think.
What reasons might cause Google to remove all FAQ snippets from a site without providing a warning in Search Console?
So we don’t guarantee any rich results in the search results. So it’s not that we would provide a warning per se, for any kind of rich result type in search. But rather, when you use the correct markup, you’re eligible to be shown. And it can happen that we show a lot of these rich results for your website, and maybe we’ll scale them back over time. Maybe we’ll scale that up again over time. It can also happen that we turn them off completely.
When it comes to rich results, we have different levels that we look at. On the one hand, it has to be technically implemented properly, which probably is okay if you used to have it there. On the other hand, it has to be compliant with our policies. I don't know, it's hard to judge here, but I'm guessing maybe that's okay. And in general, we have algorithms that try to understand our policies and try to kind of, I don't know, act on that.
So that's something where, again, you wouldn't see a message around that. And the third thing that we think about when it comes to rich results is we need to make sure that the website's quality overall is okay. And our understanding of a website's quality changes over time, in particular when it comes to things like core updates. So if you saw this change kind of around when we launched the core update, then it might be that our algorithms have kind of refined what we think of your website, how we kind of see that fit in, how relevant we think that is for certain queries.
And based on that, maybe these rich results are not shown anymore. So those are kind of the directions I would go there. There’s also the option that there was something so bad with regard to the structured data on the page that the webspam team took manual action. But that’s something that’s really rare. And that would be something that you get a notification on in the Search Console.
And then we have the meta description question; I think we looked at that.

Is it okay to choose a domain name with two hyphens? Or is one hyphen better? Or should hyphens be avoided completely?
Up to you; it's whatever you think makes sense. Some websites have hyphens, some don't. I don't think anything in our algorithms looks specifically for hyphens in domain names. The aspect of just putting keywords into your domain name, I think that's kind of overrated, in the sense that, I don't know, our search algorithms try to understand the quality and relevance of a website overall, and the domain name isn't really, like, the strongest factor there.
So that's something, I don't know. Like, if you're trying to move to a new domain just to add keywords in there, my guess is that the whole move-to-a-new-domain part will be much more complicated and cause more issues than any value you would get out of just having an extra keyword in the domain name. So I would try to avoid doing that.
But again, it's not related to hyphens or anything like that. It's really just, like, should I add a keyword into my domain name or not?

What's the best way to know if Googlebot is rendering a page? Well, the best way is kind of to try the inspect URL tool in Search Console. It gives you a screenshot, it gives you the rendered HTML, and you can double-check the rendered HTML to see if it's okay. And that's essentially what happens. So the test in Search Console runs almost completely the same way that we do it for search. It's not completely the same way, because in Search Console, on the one hand, we want to give you an answer back quickly.
So the timeouts are slightly different. On the other hand, in Search Console, we try to pick up all of the content fresh. So in particular, when it comes to CSS and JavaScript files, and any kind of other files that you have embedded in your code, that's something in Search Console we try to pick up fresh, and we have those timeouts. So sometimes that's kind of tricky. But when it comes to search, we tend to have a lot more flexible timeouts, because we can render a page whenever we need it.
And we can cache the content, the JavaScript, the CSS files a little bit longer, so that we don't need to fetch everything individually. But if you're seeing that it works well in the inspect URL tool, then it definitely works well in search.
Does having missing or incorrect HTML language attributes affect search rankings?
No. So the lang attribute in HTML is not used by Google. So whatever you put in there is kind of up to you. I would, however, kind of do it properly, because it does affect things like screen readers. So that's something where I would still try to get it right. But I wouldn't assume that you'll see any SEO effect on Google with regards to that. I'm not 100% sure about other search engines; it might be that one or the other of them uses the lang attribute in HTML.
But at least at Google, we don't use that. What we try to do is understand the primary language of the page based on the actual content. Because one of the things that we noticed is that people often use the same template as other people: they'll copy and paste the template, or they'll install a template automatically in their CMS. And a lot of these templates come with lang attributes already predefined. And because you don't see it when you look at the page, it's very easy to end up in a situation where your language attribute says English, but the content is actually in French.
So because of that, we try to focus on the actual content and recognize the language from there.

How different is SEO for a product or service page than for a normal article, for example, for keywords like "piano classes near me," where the user intention would be defined as a service rather than a solution? It's essentially the same. So I don't think we do anything different with regards to these different kinds of pages. Rather, we try to find the content on the page, understand how it's relevant, and treat that appropriately.
The one thing, I think, to kind of mention here as well is, it sounds like what you're looking at is a local service, or local business, essentially. And for that, I would make sure that you have a really strong Google My Business entry set up, because that's something that can be shown a little bit easier in the search results for queries like this. And in particular, for queries that include something like near me, it's not that you need to rank for near me, because near me is essentially, like, global; it's not something specific on your website. But rather, what you need to do there is just make sure that you have your location very clearly defined on your pages, so that we can recognize this location is associated with your website or with this page. And if the user is in that location, then for any query that includes near me, we can apply this geographic distance algorithm to figure out, like, these are actually results that are near them, and they match what they were looking for.
So that's something to keep in mind there. With Google My Business set up, you automatically have a location specified anyway, so it's a little bit easier there. But having all of that combined makes it a lot easier for us to actually understand: this is a local result, and the user is local, and they're looking for something local; therefore, we should highlight this better in search. Okay, wow, still a bunch of questions left, and a bunch of hands. Maybe, like usual, I'll go back to more live questions from you all. And I also have a bit more time afterwards as well, where we can stick around a bit after the recording pauses.
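A hypothetical sketch of the kind of clearly defined on-page location John describes, using LocalBusiness structured data; the business name and address are invented:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Piano School",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
</script>
```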
Webmaster 7
Finally, I get to ask you a question.
John
Okay.
Webmaster 7
So I'm doing SEO for an e-commerce website, pretty new, around eight months old. And it's not a huge website; it's got like 30 or 40 products in total. We sell green tea and herbal tea online. And so I've been seeing some movement with the efforts that we put in: a regular backlinking process, on-page SEO, structured data. We have all that in place. And we are seeing some movement there. We are seeing an increase in impressions. We are working a lot on our blog section, because that's what we see a lot of other competitors are doing. And we try to create skyscraper content, master blogs, 2,000 to 3,000 word blogs, and they end up ranking.
Right. So some of our blogs are also ranking ahead of Healthline or Medical News Today. So I think the efforts that we are making there are falling into place. It's just that the product pages don't rank, man. They don't rank at all. I mean, I don't know what else we can do. They have valid product structured data, review data. And yeah, I told you, it's not a big website. Like, if I go to any competitor's website, they have a lot of main content and a lot of supplementary content as well.
That's one thing where I see differences, and sometimes they have backlinks to their product and collection pages, and their product and collection pages rank on the first pages. We are just having a hard time getting our product pages to rank. A couple of times our product pages are ranking, but then they are ranking on the seventh or eighth pages. So we really wanted to understand better what my approach should be from here on.
John
Yeah, I think that’s always challenging. And I don’t think there’s a simple trick to make that happen. The one thing I would watch out for is if these are products of this ecommerce site, then make sure that you have all of the Merchant Center set up correctly. Because I don’t know if this is in every country yet. But we do have the Google Shopping feed, where you can kind of like submit your products for free. And then we can show those in the shopping search results as well. And sometimes, we mix that in with the normal search results. So that’s kind of a way to additionally get your products a little bit more visibility.
With regards to the kind of the products themselves, if you’re saying that some of your content like the blog posts are ranking well, but the products aren’t, then sometimes you can help that by making sure that there’s clear internal linking between those things. So the pages where you are seeing that they’re working out, well make sure that there’s really a clear linking from those pages to your actual products. My guess is probably you have some of that setup.
If you have really long kinds of blog pages that are ranking, the challenge might be that it’s very tempting to link to a lot of different products from there. And by linking to a lot of different products, you’re kind of saying, Well, everything is a little relevant, but nothing is really. Whereas if you link to a few products from those pages, then you can kind of really say, well, these are really important products. And that’s something where you wouldn’t necessarily need to focus on things that are ranking badly at the moment, but things that you care about instead.
So if you say these are products where you earn the most money from or that are the most popular, or maybe they’re the easiest to sell, something like that, then those are the kinds of things where with internal linking, you can kind of promote those a little bit. And with really strong internal linking, then over time, the rankings of those product pages might change as well.
Webmaster 7
Just to add to that, we have a shopping setup; we are spending heavily on shopping campaigns. So of course, the feed that we set up there is also getting exposed to the organic listings. And we are seeing some traffic here and there, very, very small, not significant. As far as internal linking goes, what we do is, let's say I create a blog for, say, green tea or herbal tea, and that's a 2,000-word blog. And at the end, we just kind of conclude our blog, right, and then we link to that particular product page, not to anything else. So we try to create blogs focused on one product, one kind of tea that we have, and we only link to that tea.
John
That sounds good. Yeah. It sounds like you're doing a lot of things right. And if you're ranking kind of competitively with some of these really big sites with your blog posts, then yes, that's really hard work, and that's kind of a sign that there are different aspects involved there, and that one angle you have covered really well. So that's something where I'd say I would tend to continue in that direction. And I would still think about internal linking a lot with regard to these pages, especially, like, maybe from your blog posts, but also from the rest of the website itself. If you have few products at the moment, then it's sometimes tempting to link to all of these products from your shop homepage. And if you can find a way to funnel that a little bit better, then that can make it a little bit easier for individual pages to rank better.
Webmaster 7
Okay, so like, Can I sum it up in the way that if there is a blog, and there is a collection page, there is a product page? So either of them ranks? That is good enough for us? Right? So we go a little aggressive on internal linking, find out the most important products, the products that sell the most or are our most searched for, and focus on internal linking of those products and continue blogging the same way?
John
No, I think that’s generally a good approach. I think you’re on the right track, so it’s not completely wrong what you’re doing there. It’s just sometimes the competition is quite strong. And it takes a while to…
Webmaster 7
We always end up competing with Amazon for no reason. I mean, it's always there. So that makes it all the harder. Cool. I think I get the approach.
John
Okay. Cool. Thanks. Okay. Let me pause the recording here. It's been great having you all here. And like I mentioned, you're welcome to stick around, and we can go through more questions as well. If you're watching this on YouTube, feel free to join in one of the next office hours sessions if you'd like to, or at least submit questions if you have anything pending. Thank you for watching, and hopefully see you one of the next times.
Watch John’s Hangouts on Most Fridays
Be sure to catch John live over at the Google Search Central YouTube Channel when you get the chance.
Don’t forget, they happen most Fridays at 7:00 a.m. PT and you can get the schedule either by catching John Mueller on Twitter and asking him about it at his handle @johnmu, or going to the Google Search Central YouTube channel link above.
After they occur, you may want to stay tuned to iloveseo.com as we continue to cover his upcoming hangouts in-depth!