This time, we bring to you John Mueller’s Google Search Central Office Hours hangout from July 23, 2021.
The topics John covered during this hangout included the following:
- Adding a rel=canonical tag across many different sites
- Creating links for different languages
- Accidentally de-indexing a site and blocking search engines from crawling it
- Press releases and how Google sees them
- Negative SEO attacks
- Embedded software on web pages

And much, much more.
Be Sure to Watch His Hangout!
We recommend watching his hangout from July 23, 2021 here:
You will also be able to find the transcript from the hangout as well.
John Mueller SEO Insight #1: Can We Add a rel=canonical Tag for Many Different Websites?
A webmaster was concerned about adding rel=canonical tags across 100 different websites. They manage more than 100 sites, all in English. Their specific problem was that a French version would exist for approximately ten days only. Was it possible for them to add a canonical pointing just to the English version?
John explained that you can apply the canonical tag to any page you want to have indexed. The main criterion is to have one URL that is the preferred version of that page, and then set the rel=canonical tag to that version. Google understands that for international sites covering different countries or different languages, each of those versions is unique. John recommends giving each version its own canonical: the English version would have one canonical, the French version another, and so on.
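As an illustration of that advice (the URLs here are hypothetical), each language version would carry a self-referential canonical rather than all pointing at the English URL:

```html
<!-- On the English version (hypothetical URL) -->
<link rel="canonical" href="https://example.com/en/page/">

<!-- On the French version: its own canonical, not the English URL -->
<link rel="canonical" href="https://example.com/fr/page/">
```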
John Mueller SEO Insight #2: Do We Need to Create Links for Different Languages?
The same webmaster also asked about links. Their question was about creating quality links in bulk to each separate language page.
John explained that link building tends to happen naturally over time and isn't something you need to do manually. The important part, he said, is strong internal cross-linking. Over time, French users will come to appreciate the site and perhaps begin linking to it on their own.
John Mueller SEO Insight #3: Our Client Did Not Block Search Engines and Google Indexed the Wrong Pages. We Fixed It and Resubmitted the Sitemap and Nothing. What Can We Do?
A webmaster was working with a client who had not blocked search engines from crawling the new design of a website. When they went through the reindexing process by resubmitting a sitemap, Google didn’t index a single URL of the site. The client’s question was if there was anything that could be done differently to repair this and get Google to crawl and index the new site.
John explained that it sounded like there were no technical issues involved: since individual URLs do get indexed when submitted, there doesn't appear to be an overall block on the website. An overall block could also occur if removal requests had been submitted without the team realizing it, but it sounded like both of those issues were fine.
He said it’s mostly a matter of Google recognizing that it’s a useful website and then crawling it. Submitting URLs manually is one approach. Google does, however, need to understand how the website fits in with the context of the overall web. This is something that just takes time.
During this time when Google is learning about the site, one of the best approaches is to find ways to promote the site directly. Instead of publishing and praying that Google finds it, as a webmaster, you’re required to do a lot more work to find that traffic. You have to find the right channels where you can promote the website in order to drive users and traffic. It takes time and practice to find the channels where people who would appreciate these topics hang out.
Run ads, do public shows, do social campaigns, work together with other sites (maybe get on their newsletter regularly), and so on. All of these things can help drive awareness.
Over time, Google will come to understand that this is a site it should be indexing much more.
John Mueller SEO Insight #4: How Does Google See Press Releases?
Another webmaster had a question regarding press release submissions. The content they use for the submission is the same every time, and the domain name at the bottom is linked. The event is the same each time, and the content is published in a variety of newspapers while also appearing on their own site. Will these types of press releases be a problem for Google?
John explained that if you want to rank for that content, putting the same content on other people's websites actually makes doing so quite a bit harder. Using press releases to drive awareness of your business is a reasonable approach. Regarding links in press releases, John explained that the important thing is that it's clear they are press releases written by the website owners themselves. The way to signal that is to make sure there are nofollow attributes on all the links in the press release. Having just a domain-name link is fine as well; Google will see that it's only a domain name, not keyword-rich anchor text.
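A press-release link marked up this way (the URL is a placeholder) would look like:

```html
<!-- A nofollowed, plain domain-name link in a press release -->
<a href="https://example.com/" rel="nofollow">example.com</a>
```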
If you just want to drive awareness of the site, then Google doesn't really care whether the content is ranking or whether the press release has the same content on your site or on any other site. It IS different, however, if you wrote an exceptional piece and it was copied and pasted into all sorts of press releases appearing on other websites. In that case, the press-release copies could end up ranking above the original content on your site.
John Mueller SEO Insight #5: Google Ignores Negative SEO Attacks
A webmaster had two questions about this complex topic: how can they get away from someone who is constantly attacking their site, and should they disavow the attacker's domain or the individual backlinks?
John explained that in general, Google will recognize and ignore such negative SEO attacks. He wouldn’t necessarily worry about it. If they’re really worried about it, and it will make them feel better, adding these domains to the disavow file is just fine.
He also reiterated that you should list the domain rather than the specific URLs. Even if there are 500,000 URLs in Google Search Console, the webmaster should disavow at the domain level. It's just easier: listing the domain in the disavow file is exactly the same as individually disavowing all 500,000 URLs.
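In the disavow file format, a domain-level entry uses the `domain:` prefix; one line covers every URL on that domain (the domains below are placeholders):

```text
# Disavow at the domain level — covers every URL on the domain
domain:spammy-site.example
domain:another-attacker.example
```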
John Mueller SEO Insight #6: Google Needs Clear Text on Pages That Have Embedded Software (Such as Calculators)
Another webmaster was having challenges getting a client's URL reindexed. The page was in the sitemap, but Google Search Console was reporting a soft 404 and saying to try again in a few hours. The webmaster had been trying for two weeks. They ran the mobile-friendly test, which passed, and there were no issues on the webmaster's side either. The page is a car service pricing calculator tool, so the webmaster was wondering exactly why they couldn't get the URL reindexed.
John explained that Google tries to recognize soft 404 errors automatically, especially when there is little or no clear text on the page, which suggests to Google that there's no actual content there. John recommends adding some visible text to the page explaining to the user that this is a calculator ("we have no values calculated for today," for example) or something similar. He also said that Google has been seeing a rise in reports like this for pages with embedded software, so the team will look into those cases as well.
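A minimal sketch of that fix, assuming a hypothetical calculator page: give the embedded widget some surrounding visible text so crawlers see real content, not an empty shell.

```html
<!-- Visible text around the embedded widget, so the page
     is not mistaken for an empty (soft 404) page -->
<h1>Car Service Pricing Calculator</h1>
<p>Enter your car's make and model below to estimate the
   cost of a full service at garages near you.</p>
<div id="calculator"><!-- embedded calculator widget loads here --></div>
```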
John Mueller SEO Insight #7: If You Have a High Percentage of Redundant Meta Descriptions, Google Will Attempt to Generate Them Automatically
John explained that having a real meta description is a first step that's really important. He suggests spending the time to craft correct meta descriptions and making sure they are actually present in each page's HTML. He also explained that you don't want things to be repetitive across the website: if the meta descriptions are basically the same across the site, Google will generate meta descriptions automatically. Double-check that your meta descriptions are not overly repetitive.
In addition, John said Google also tries to recognize when a description is, even accidentally, spammy: if it is stuffed with keywords, or simply repeats the queries you want to rank for without adding value, Google will decide it is not really useful for the user and rewrite the meta description into a better one.
The next reason Google changes the meta description is to match the queries people are actually searching for, because sometimes the written description doesn't match those queries. Google will use your meta description if the query is mentioned in it; if the query isn't in the description, Google will automatically generate a new one based on the content of the page instead.
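Putting those points together, a unique, page-specific description in the page's head is the goal (the content here is purely illustrative):

```html
<!-- A unique, descriptive meta description for one specific page,
     not a boilerplate description reused site-wide -->
<meta name="description"
      content="Compare full car service prices from local certified garages and get an instant online quote.">
```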
John Mueller SEO Insight #8: Google Tries to Pick the Right Content Based on Region
Another webmaster was concerned with some international SEO problems. The first one was that a client they are working with has more than one foreign-language version of the same website. The client's competitors rank for regional search queries that the client does not, even though the client publishes regional content.
John explained that hreflang tags would not change anything in this case. Hreflang tags help Google pick the right URL to show in the SERPs, but the ranking stays the same; Google simply swaps in the local version of the URL. He used the analogy of a shop selling basketballs with one location in India and another in Singapore: with regionally specific content, Google would simply pick the right pages to show. It's a matter of normal ranking, not something you can easily influence with technical SEO. In this case, having all the traditional SEO fundamentals in place is what he recommends.
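For reference, hreflang annotations like these (hypothetical URLs, using the India/Singapore example) tell Google which regional URL to swap in, without changing where the pages rank:

```html
<!-- hreflang annotations repeated on every listed version -->
<link rel="alternate" hreflang="en-in" href="https://example.com/in/">
<link rel="alternate" hreflang="en-sg" href="https://example.com/sg/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```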
In a confusing case like this, the only thing you can really do is work on internal linking to ensure that Google identifies exactly which pages are important.
John Mueller SEO Insight #9: Multiple Location Landing Pages Are Classified as Doorway Pages
One webmaster asked whether having multiple location landing pages is okay when a business legitimately services those locations but has no physical presence there. The webmaster had heard these are considered doorway pages and wanted clarification.
John explained that yes, they are doorway pages, and it's partly a matter of how many of them there are. If, for example, a business is global and delivers to every city in the world, and it creates city-level landing pages for every product, those are doorway pages. These are situations where you create many pages that are extremely similar, each trying to rank for different queries, but all leading to the same funnel.
John Mueller SEO Insight #10: Google Doesn’t Guarantee Any Rich Results
Another webmaster was concerned about Google removing all FAQ snippets from a site without even showing a single warning in Google Search Console.
John began by explaining that Google never guarantees any rich results in the search results, so it wouldn't provide a warning either. Using the correct markup only makes you eligible to be shown. Google may show a lot of rich results for a site at times, scale them back over time, scale them up again later, or even turn off a given type of rich result entirely.
The things Google looks at when it comes to rich results are:
Level 1: Are they technically implemented properly?
Level 2: Are they compliant with Google’s policies?
Level 3: Is the website’s quality at least okay overall?
If you have seen a significant change in ranking around a core update, for example, it’s likely that Google’s algorithms refined what they think about your website overall in terms of its quality.
There's also the case where something about the structured data on the page is so extraordinarily bad that the webspam team has to take a manual action.
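For the first level (technically correct implementation), a minimal valid FAQPage markup might look like the sketch below; the question and answer text are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you deliver nationwide?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, we deliver to every state."
    }
  }]
}
</script>
```

Passing this technical bar only satisfies level 1; levels 2 and 3 (policy compliance and overall site quality) still determine whether the rich result actually appears.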
John Mueller 07/23/2021 Hangout Transcript
Watch John’s Hangouts on Most Fridays
Be sure to catch John live over at the Google Search Central YouTube Channel when you get the chance.
Don’t forget, they happen most Fridays at 7:00 a.m. PT and you can get the schedule either by catching John Mueller on Twitter and asking him about it at his handle @johnmu, or going to the Google Search Central YouTube channel link above.
After they occur, you may want to stay tuned to iloveseo.com as we continue to cover his upcoming hangouts in-depth!