Google’s John Mueller, a Search Advocate on the Google webmaster relations team based out of Switzerland, regularly holds Google Search Central Office Hours Hangouts. During these hangouts, he delivers exceptional insights into how SEO works at Google. They last around an hour, and you can catch them approximately every other Friday.
For this hour-long hangout, we cover everything from minute one to minute 60.
So sit back, grab a cup of coffee, and enjoy this post. First we cover the SEO insights from the hangout; then you will find the actual hangout transcript.
We hope you enjoy it and leave with some insights of your own.
Is it Important for Googlebot to See Cookie Consent Messages?
One webmaster asked if serving cookie consent messages only upon user interaction is okay. These consent messages pop up, and the user needs to click to accept or decline them; because they appear only after an interaction, Googlebot never sees them.
John’s answer: They do not matter. Googlebot does not need to see a cookie banner, and Googlebot doesn’t keep cookies anyway. The most important thing is that the banner does not block Googlebot from crawling the website or act as an interstitial that blocks access to the rest of the content.
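As a rough illustration of the setup this webmaster describes (our sketch, not something shown in the hangout), here is how a consent banner might be rendered only after the first user interaction, so a non-interacting crawler like Googlebot never triggers it. The element id and banner copy are hypothetical:

```ts
// Minimal sketch: defer the consent banner until the first user interaction,
// so a crawler that never scrolls, clicks, or types never renders it.
function showConsentBanner(): void {
  const banner = document.createElement("div");
  banner.id = "cookie-consent"; // hypothetical element id
  banner.textContent = "We use cookies to improve the site. Accept?";
  document.body.appendChild(banner);
}

const interactionEvents = ["scroll", "click", "keydown"] as const;

function onFirstInteraction(): void {
  // Remove every listener so the banner is only injected once.
  interactionEvents.forEach((e) =>
    window.removeEventListener(e, onFirstInteraction)
  );
  showConsentBanner();
}

interactionEvents.forEach((e) =>
  window.addEventListener(e, onFirstInteraction, { passive: true })
);
```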
How Many Times Do You Need to Use a Keyword On the Page?
Another webmaster explained an issue they are having with keyword targeting. They have a directive to use keywords a certain number of times in the body and a certain number of times in meta tags, and they asked whether this makes sense.
John’s answer: He does not think the number of times you use a keyword matters, nor does the directive make sense; writing naturally usually resolves this automatically. He would not disregard individual keywords completely, though. Just don’t over-focus on exact keywords: things like singular versus plural do not matter. Instead, mention the topic that your site is about.
He gives the example of news articles, where a site that doesn’t understand SEO may write in a way that reads more like literature, so the exact words on the page don’t map to the topic it wants to rank for. Mentioning the actual topic on the page at least once is still important, but don’t go overboard with the number of mentions or with all the different synonyms and ways of writing it.
Does the URL Removal Tool Only Affect the Canonical Version of Said URL?
Another webmaster asked whether using the URL removal tool affects only the canonical version of the URL, or whether it works on the entire duplicate-content cluster that the canonical is part of.
John’s answer: The URL removal tool matches the URL exactly as it was submitted, but it also covers the http://, https://, www, and non-www versions of that URL. Nothing is removed from the index; the indexing and crawling sides both stay the same. Just submit the URL you see in the SERPs, and Google will simply hide that URL from the search results.
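To make the variant matching concrete, here is a hedged sketch (our illustration, not Google’s implementation) of the four URL variants John says a single removal request covers:

```ts
// Illustration only: a removal request for one URL also covers its
// http/https and www/non-www counterparts.
function removalVariants(submitted: string): string[] {
  const url = new URL(submitted);
  const bareHost = url.hostname.replace(/^www\./, "");
  const pathAndQuery = url.pathname + url.search;
  return [
    `http://${bareHost}${pathAndQuery}`,
    `https://${bareHost}${pathAndQuery}`,
    `http://www.${bareHost}${pathAndQuery}`,
    `https://www.${bareHost}${pathAndQuery}`,
  ];
}

// Submitting any one of these hides all four from the SERPs:
console.log(removalVariants("https://example.com/old-page"));
```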
Does Mobile-First Content Above the Fold Matter?
Webmaster #3 got pretty detailed in this question, but the part he’s concerned with is that the initial designs of their landing pages all include the same hero banner, header, and roughly the same two sentences. They want to know exactly what priority above-the-fold content has in terms of search.
John’s answer: The main priority is that there is at least some amount of unique content in the above-the-fold area. Banners and generic hero images are fine. At a minimum, a unique heading should be visible above the fold, and some of the text content should be above the fold as well.
Does Domain Age/Expiration Matter?
A relatively brief question: Webmaster #4 asks whether domain age and expiration date matter.
John’s answer was a short one: no, Google doesn’t use either of those, and it doesn’t check the whois information.
Does the Skyscraper Technique Really Work? Does Google Consider Word Count or Wrong Information?
Webmaster #5 got really detailed with their question, but they’re essentially asking two things: first, whether you can rank by writing much longer content than your competitor, and second, what Google does about incorrect information being spread by others in their posts.
John’s answer: Rankings can improve when you make changes that genuinely improve the quality of your content. But there is no correlation between a higher word count and better rankings; word count is absolutely not something Google takes into account.
Will the DOM Size Matter in the Next Page Experience Release?
Webmaster #5 also wondered whether DOM size (which grows when you add much more HTML as a result of increasing word count) will be a consideration in the next page experience algorithm release when it drops in mid-June. (Editor’s note: DOM refers to the Document Object Model.)
John’s answer: The loading time of your page is partially related to the size of the HTML file that you send, but many other factors exist, such as the images on the page and the CSS and JavaScript it uses. It’s not just about the HTML.
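If you want a rough sense of how these factors add up on your own page, here is a small browser-console sketch (our example, not an official tool) that counts DOM nodes alongside the images, stylesheets, and scripts John mentions:

```ts
// Rough sketch: DOM size is one input among several to loading performance.
// Paste into DevTools on any page to see the raw counts side by side.
const pageWeightSignals = {
  domNodes: document.querySelectorAll("*").length, // Lighthouse flags roughly 1,500+
  images: document.images.length,
  stylesheets: document.styleSheets.length,
  scripts: document.scripts.length,
};
console.log(pageWeightSignals);
```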
How do Discrepancies in GSC Affect Data Interpretation?
Webmaster #6 asked about data sampling in Google Search Console and how to interpret it. The clicks reported for a subfolder at the main-property level differ drastically from the clicks reported when that subfolder is set up as its own sub-property, and they wanted to know which numbers to trust when the two tell different stories.
John’s answer: It depends on the size of your site and how traffic arrives on it. The discrepancy is likely due to the limit on the number of data points Google collects per site per day. Most sites shouldn’t need to pull numbers from sub-properties, but if the sub-property numbers are more useful and actionable, as they seem to be for this site owner, that’s fine.
John also explained that different databases are involved for main properties and sub-properties, so comparing totals across them doesn’t make much sense. For more detailed information about one subsection of a site, though, looking at the sub-property level can be the right approach.
He also said the per-site limits were chosen so that for most sites the data is sufficient and useful and the totals roughly line up, although that won’t happen every time. In individual cases, digging into deeper detail at the sub-property level is worthwhile.
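For site owners who do decide to pull subfolder numbers programmatically, here is a hedged sketch using the Search Console API via the googleapis Node client. It assumes OAuth is already configured; the property URL, date range, and /blog/ subfolder are placeholders:

```ts
// Hedged sketch: querying clicks for one subfolder with the Search Console
// API. The same query against a sub-property (e.g. a /blog/ property) may
// return larger totals, per the sampling behavior John describes.
import { google } from "googleapis";

async function subfolderClicks(auth: any) {
  const searchconsole = google.searchconsole({ version: "v1", auth });
  const res = await searchconsole.searchanalytics.query({
    siteUrl: "https://example.com/", // placeholder main property
    requestBody: {
      startDate: "2021-05-01", // placeholder date range
      endDate: "2021-05-28",
      dimensions: ["page"],
      dimensionFilterGroups: [
        {
          filters: [
            { dimension: "page", operator: "contains", expression: "/blog/" },
          ],
        },
      ],
      // Per-request row cap; site-wide daily data-point limits still apply.
      rowLimit: 25000,
    },
  });
  return res.data.rows ?? [];
}
```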
Will We See a Premium Paid Version of GSC with More Data?
Webmaster #6’s last question of the day was about whether or not Google will release a premium paid version of GSC with more data.
John’s answer: He has no idea; Google tends not to talk publicly about what’s lined up for the future. He did say it feels weird to have a premium version of GSC, though. That said, seeing what people are doing with Search Console and where they’re running into limitations is always useful for the team.
GSC Has Discrepancies Compared to Lighthouse
Webmaster #7 explained that they’re seeing different page speed data for the same URLs in Google Search Console compared to Lighthouse. If something is entirely green in a Lighthouse report, Google Search Console doesn’t say the same thing.
John explains: These are two different kinds of data, referred to as lab data and field data. Lab data is what you get when you test a page directly, as in Lighthouse or PageSpeed Insights; field data is what real users actually experience, and that is what Search Console reports. Lab tests make a lot of assumptions about users’ devices, connectivity, and configuration, so the two can differ in practice, which is likely what this webmaster is seeing.
In search, Google uses the field data from real users, which takes 28 days to update in Search Console. You can also track field data yourself, for example with Google Analytics plus an extra script on your site, to get faster feedback. Ideally, you should be optimizing for the field data over the lab data.
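John’s suggestion to track your own field data can be done with Google’s open-source web-vitals JavaScript library. Below is a minimal sketch; the /analytics endpoint is a hypothetical placeholder for wherever you collect metrics, whether Google Analytics or your own logging:

```ts
// Minimal sketch of do-it-yourself field-data collection with the
// open-source 'web-vitals' library (npm install web-vitals). Recent
// versions expose onCLS/onLCP/onINP; each callback fires with a
// real-user measurement.
import { onCLS, onLCP, onINP } from "web-vitals";

// "/analytics" is a hypothetical collection endpoint.
function sendToAnalytics(metric: { name: string; value: number; id: string }) {
  // sendBeacon survives page unloads, so late-arriving metrics still get sent.
  navigator.sendBeacon("/analytics", JSON.stringify(metric));
}

onCLS(sendToAnalytics);
onLCP(sendToAnalytics);
onINP(sendToAnalytics);
```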
Can You Use Google Products When Doing A/B Testing?
This is the question Webmaster #8 posed during this hangout: whether it’s okay to use Google products like Google Analytics A/B testing on pages you’re optimizing for SEO, given that swapping content in and out could look like cloaking.
John’s answer: When it comes to A/B testing, as long as the page’s purpose doesn’t change and it is essentially the same page with a different design, it’s fine. You shouldn’t change the page from selling one product to selling another, for example. Also, A/B testing is temporary, so Google won’t treat the A/B test version as a permanent thing.
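For illustration, here is a minimal sketch of the kind of test John describes as safe: the page’s purpose stays the same and only a presentation detail, in this case the call-to-action label, varies. The storage key and button id are hypothetical:

```ts
// Minimal sketch of a purpose-preserving A/B test: same product page,
// only the call-to-action wording changes between variants.
function getVariant(): "A" | "B" {
  const KEY = "ab-cta-experiment"; // hypothetical storage key
  let variant = localStorage.getItem(KEY) as "A" | "B" | null;
  if (!variant) {
    variant = Math.random() < 0.5 ? "A" : "B";
    localStorage.setItem(KEY, variant); // sticky assignment per browser
  }
  return variant;
}

const cta = document.querySelector<HTMLButtonElement>("#buy-button");
if (cta) {
  // Same page, same product; only the wording changes.
  cta.textContent = getVariant() === "A" ? "Buy now" : "Add to cart";
}
```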
Does Google Treat Pages More Favorably Than Posts in WordPress?
Webmaster #10 wondered whether, in the case of a WordPress site, Googlebot recognizes the difference between pages and posts.
John’s answer: Google does not recognize a difference. The distinction is strictly internal to WordPress; Google only sees the final, rendered HTML page.
John Mueller Hangout Transcript
All right. Welcome, everyone, to today’s Google SEO Office Hours Hangout. My name is John Mueller. I’m a Search Advocate on the Search Relations side. And part of what we do are these office hours hangouts, where people can join in and ask their questions around their website and web search. Wow, lots of people joining. Cool, it looks like we have a bunch of people already raising their hands, which is great, so we can go through some of those. There are also a bunch of questions submitted on YouTube. If you want, you can add more questions on YouTube. Or if you’re in here live, feel free to raise your hand as well. And we’ll see how far we can make it today.
Is it really important for Googlebot to be able to see the cookie consent message that we serve to users? Because we decided to serve it upon user interaction, so Googlebot won’t be able to see it. I wonder if that can cause us any problems in terms of Google friendliness?
In general, that should be fine, because Googlebot doesn’t really need to see a cookie banner. Googlebot also doesn’t keep cookies, so it wouldn’t accept cookies anyway if you served them to it. So if you have it set up like that, I think that’s fine. Lots of sites serve the cookie consent banner to Googlebot as well, just because they serve it to everyone. That’s usually also fine. The important part is essentially that Googlebot is not blocked from crawling the website, and that you don’t have kind of an interstitial that blocks access to the rest of the content. Yeah. Thank you. Sure. All right. Let me see, Michael?
Hi John, I have kind of a softball question, but it’s something that I’m running into right now. Going back to basics: I have a directive to use keywords, specifically target keywords, in meta tags, use them in the h1, use them this many times in a piece of content. And that really just seems outdated to me, especially with all the advances in semantic search, and MUM and all that other stuff that’s coming down the pipe. I just wanted to ask a basic question: do you think that that’s still a legitimate SEO tactic? Or should we not be focused on using this particular keyword this many times on a page?
In general, the number of times that you use a keyword on a page, I don’t think that really matters or makes sense. When you’re writing naturally, usually that resolves itself automatically. And with regards to the individual keywords, I think that’s something where I wouldn’t disregard it completely. But at the same time, I wouldn’t over-focus on exact keywords. So in particular, things like singular and plural, or the different ways of writing individual words, that’s something that you probably don’t need to worry about. But mentioning what your site is about, and what you want to be found for, that’s something I would still do. In particular, what we sometimes see when we look at things like news articles: if a news site doesn’t really understand SEO, they might write in a way that is more, I don’t know, almost like literature, in that you read it and you kind of understand what it means, but the exact words that are used on the page don’t really map to exactly that topic. So from an SEO point of view, if there’s something you want to rank for, I would still mention that on the page. I wouldn’t go overboard with the number of mentions, I wouldn’t go overboard with all of the synonyms and different ways of writing it. But mentioning it at least once definitely makes sense.
So I have a question regarding the URL removal tool. If you use that tool, does it only affect the canonical version of the URL? Since I guess this is affecting the index you’re using for publishing search results. Or does it affect the entire duplicate content cluster which this canonical is a part of? So for instance, what if I enter a URL which, in my opinion, was the one to be excluded, but it’s basically the non-canonical variant? What happens in that situation?
Okay, good question. So we don’t consider the canonical at all when it comes to URL removals. Rather, we match it one to one, exactly as you submitted it. And we include the http, https, and www/non-www versions of that URL. And essentially, we don’t remove it from our index. The whole indexing side stays the same, the crawling side stays the same. We just don’t show it in the search results. So essentially, you would submit the URL that you see in the search results, and then we will just hide that from the search results. Right, thanks. Cool.
My question relates to template content above the fold from a mobile-first perspective. This relates to a blog restructuring we’re working on, where we’re moving from all of our blog articles living in a blog subdirectory to building out topic clusters with specific topic landing pages. Based on initial designs of the topic landing pages, they’re all going to include the same hero banner with the same header, and then the same two sentences. Below that will be a sub-navigation to let the user navigate to a specific topic landing page. And my concern is that from a mobile viewport, each of these topic landing pages has a unique URL and a unique title, but they have that same header and above-the-fold content. Right below that is a unique subheader and then a button, and that subheader would be the h1 tag of the page. So I just want to get your thoughts there. I know Google in the past has said that they weigh above-the-fold content, and I think it’s something important for us to look at. But is it detrimental to non-brand rankings for those topic landing pages?
No, the important part for us is really that there is some amount of unique content in the above-the-fold area. So if you have a banner on top and you have a generic hero image on top, that’s totally fine. But some of the above-the-fold content should be unique for that page. And that could be something like a heading that’s visible, in a minimal case, but at least some of the above-the-fold content should be there. So that’s kind of the guidance that we have in that regard.
Okay, cool. Thanks. Maybe I’ll just leave the header as it is and change a sentence or something based on the topic?
Yeah. I mean, it’s probably also something where you want to look at how users interact with those pages afterwards. That’s kind of more from a non-SEO perspective, but I think it’s always important for cases like that to take a look and see what actually happens with the users afterwards. Awesome. Thank you so much.
John, I have a question regarding domain age, and does the domain expiry matter? Suppose I have a domain that is a couple of years old, and it will expire in the next six months. So does Google check the age of the domain for SEO?
No, no, I don’t think we use that at all.
Okay, okay. So I have one more question on the domain only. Do you check the whois, whether it’s private or public?
I don’t think so. At least not that I know of. It’s also something where some of the top-level domains and the registrars just have everything private by default, so you wouldn’t be able to compare that anyway or use it for anything useful.
So, John, I’m a little worried right now. Let me explain what I was talking about before, so we can clear things up. There is a lot of crap, a lot of wrong information and data spread widely across the internet. It gets copied, and the content length keeps increasing. Suppose one SEO expert writes 300 words on on-page SEO listing 10 points that you should do. The next SEO expert comes and writes 20 points, and the content increases to 1,000 words. If you search on Google, only five or six people come up, and they are the main culprits spreading all the wrong information.
Then the next one comes and writes 2,000 words of content listing 20 points, and another person comes with 200 Google algorithm points and 4,500 words. And he claims: I have been increasing my word count, and Google has increased my rank from number three to number one.
So this is not directly my question, but this is how people are using the skyscraper technique, building a bigger building. The next person writes crap again, with no useful information, only bigger, and then the next one is bigger still. If I want to find real information, the actual gurus are also spreading misinformation, but those five or six gurus keep competing, and many people are taking their courses.
So suppose I want to write something truthful in SEO. Whatever we discuss, I write the real answers in 200 or 300 or 500 words. So my suggestion to Google is to also try to read the meaning of the content. For example, the core meaning of the content might be that domain expiry does not matter, but someone explicitly writes, maybe because they are an affiliate, that your domain should expire after three years, so keep renewing it so the expiry is longer, because supposedly Google checks whether a genuine website has a longer expiry. And they will have an affiliate link below, because they want to sell. That’s the psychology, but it is wrong information. That’s just my concern to share with Google, something you can advocate on.
Yeah. I think it’s always tricky. And there’s also the aspect that sometimes there’s just old and outdated information out there on specific topics. But yeah, I don’t know. People like to write a lot of things and to promote the things that they’ve been working on. So it’s hard for me to say. Yeah…
So why do the skyscraper techniques still seem to work? They showcase examples like: three months ago I had a piece of content with 500 words, say on on-page SEO, and I used to rank at number six. Then I increased it to 1,000 words and moved to number five. And as the word count slowly increases, the page naturally climbs the ladder up to number one.
Yeah, I don’t think there is really a correlation there. I think sometimes these things happen, that you rank higher when you make changes, when you improve things. But purely from increasing the number of words on a page? That’s absolutely not something that we take into account.
Quality matters. So in the next algo update, will the page experience look at the DOM size? Suppose you write a very good, relevant article with 2,000 words, and I write again with 3,000 words. So my DOM size will increase.
My question is related to Google Search Console, and it is more to do with the sampling of data. Here is what I’m stuck with and confused about. In the main property of Search Console, if I look at a particular subfolder, I see only 4,000 clicks. But when I create a sub-property for that subfolder, I see 40,000 clicks. So when I’m analyzing the numbers at the main-property level, I might skip that subfolder because I see only 4,000 clicks and think the other sections are doing great. But the sub-property gives me a huge number, a 10-times difference. So what I did eventually was create many sub-properties inside Google Search Console to get numbers that give me a better picture: okay, this subfolder is doing great, it’s actually not 4,000 or 2,000, it’s actually 20,000 or 30,000. So what should I do? Should I continue pulling numbers from sub-properties, or should I go to the main property and consider the numbers from there? Because if I add up all the sub-properties, the total goes beyond the numbers I see at the main property.
Yeah, I think it depends on the size of your site, and the way that the traffic comes to your site. Because there is a limit of the number of data points that we collect per site per day. And perhaps that’s what you’re seeing there. So from my point of view, if you’re seeing more useful numbers or more actionable numbers that way, I think that’s totally fine. I don’t know if that’s something that the average site needs to do. My guess is most sites don’t need to do it. But maybe it makes sense for your site.
But the concerning part is that if I continue pulling the numbers from the sub-properties, that’s fine, but when I look at the main property, at the overall numbers, it gets a little confusing, because the sub-properties are telling a different story than the main property’s overall website numbers.
Yeah, I mean, you’re looking at different databases there. So from that point of view, comparing those different properties, that probably doesn’t make sense. But if you need to look at more detailed information within one of those subsections of your website, then maybe that does make sense to look at it and on that level.
Okay, so you would recommend that it’s case by case.
I think for most sites, like, we chose those numbers to make sure that for most sites the data was sufficient and useful, and that the totals kind of line up. But there are definitely individual cases where it makes sense to dig more into detail and kind of verify subsections of a site.
And a very short question: will we see anything in the future, like a premium version of Google Search Console where we can pay and get more data?
I have no idea. Yeah. I mean, it’s something where we tend not to talk about what is lined up for the future. It feels weird to have a premium version of search console or something like that. But like, who knows, I think seeing what people are doing with search console and seeing where they’re running into limitations. That’s always useful for us.
So my question is about Core Web Vitals. My team and I have been preparing and making optimizations, whether it’s theme-related or CSS. What we have witnessed is that the data in Google Search Console is completely different from when we go to the website in incognito mode, open inspect, go to Lighthouse, and generate a report from there. There I can see a 91 performance score and everything is good – Largest Contentful Paint, Time to Interactive, everything is green. However, if I go to Google Search Console, I can see something like 100 poor URLs. So we are in the middle of a situation where we don’t know where to go from here. If you could help with that…
Okay. So the important part is, these are different ways of looking at the data. And so we differentiate between the lab data, which is kind of like if you test it directly, which is what you would see in PageSpeed Insights or in Lighthouse, when you look at it incognito in your browser. And the other variation of the data is the field data, which is what users actually see when they go to your website. And that’s the data that’s shown in Search Console. And from there, the differences are essentially that the lab tests that you do – they make a lot of assumptions with regards to whether users will have this kind of device or this kind of connectivity, or have this kind of configuration. And they will try to use those assumptions to figure out what it might be. But what they actually see in practice could be very different. And that’s probably what you’re seeing there.
Okay, so having said that, Google would consider the practical data over the lab data going forward?
So in search, we use the field data from real users. There’s one tricky thing with the field data in that it takes 28 days to update in Search Console for various reasons. What you can do is use something like Google Analytics together with an extra script on your site to track the field data yourself as well. So then you would have the lab data, your own field data and the search field data. And usually, you would see kind of the differences between the lab and the other field data.
So our aim should be optimizing for the field data.
I had a question about cloaking. Yeah, I know Google doesn’t like cloaking on a site. But that’s essentially what happens when you use Google Analytics A/B testing, for example, because it’s swapping content underneath the page when it does A/B testing. So anything that’s optimized for SEO cannot use Google Analytics A/B testing, is that right?
The important part for us with A/B testing is that the A/B testing is not a permanent situation, and that Googlebot essentially also falls into that A/B test, so we can kind of see what users are seeing. And when it comes to A/B tests, the important part for us is also that the purpose of the page remains equivalent. So if you’re A/B testing a landing page and you’re selling one product, then it shouldn’t be that instead of selling a car, suddenly you’re selling a vacation or a flight or something like that. It should be something where the purpose of the page is the same. So that if we understand this is a page that has a car for sale, for example, then we can send users there. And if the page looks slightly different when users go there, like happens sometimes when you personalize, sometimes when you do A/B testing, all of that is essentially fine. So the A/B testing that you would do with Analytics usually is in that area where you say, oh, I am changing my call to action, I’m changing the colors, the layout slightly, but you’re not changing the purpose of the page.
Yeah, totally. Well, we are not changing the purpose of the page. But we do plan to run some UX enhancements through the experiment. So it’s kind of risky.
It shouldn’t be. Especially if the purpose of the page remains the same, then even if Googlebot were to see both of the A/B versions, we would be able to index the page, and normally it wouldn’t change anything. So that’s perfectly fine. The cloaking side that is more problematic is if, I don’t know, you’re selling a car, and when Googlebot looks at the page it’s showing the car, but when a user looks at it, it goes to a pharmacy. That’s usually the more spammy kind of cloaking that we worry about, where the webspam team would get involved. All of these subtle changes, also if you have device-specific changes on a page, that’s perfectly fine for us.
I’ve made maybe four more – “Why will we use AdSense, for example?” And maybe for mobile, we both stopped showing so much.
John, regarding the same thing: if ranking goes down because of A/B testing, is it a sign that Google is not liking the different version? And if we roll back to the older page, should the ranking come back?
I don’t know. My assumption is the ranking would not change with normal A/B testing, because you have the same content, essentially; the purpose of the page remains the same. If you were to significantly change the page, like remove all of the textual content in the B version, and Googlebot sees that version, then the ranking might change. But for normal A/B testing, I don’t see that as being problematic. And if the ranking did drop, then as a first assumption I would assume that it’s not related to the A/B testing, but rather maybe related to something else.
So ranking will not have any fluctuation.
I mean, there are always fluctuations in ranking, but it shouldn’t be just because of the A/B testing.
My question is: we are a WordPress site, so we have a blog section and a services section. Our services sections are labeled as pages on the WordPress back end, whereas our blog sections are labeled as posts. Our services section gets a lot of traffic, but our blog section does not get a comparable amount of traffic. So is it because Google treats pages more favorably than posts, or maybe we are missing out on other fronts, like blog marketing?
I don’t think Googlebot would recognize that there’s a difference. Usually, that difference between posts and pages is something that is more within your back end, within the CMS that you’re using, within WordPress in this case, and it wouldn’t be something that would be visible to us. So we would look at these as: it’s an HTML page, there’s lots of content here, and it’s linked within your website in this way. And based on that, we would rank this HTML page. We would not say, oh, it’s a blog post, or it’s a page, or it’s an informational article. We would essentially say: it’s an HTML page, there’s this content here, and it’s interlinked within your website in this specific way.
So my other question is: the blog section has a longer URL, root URL/blog/category/article-name. So is it because the URL is longer that it is getting harder to rank the blog? It shouldn’t be…
No, and I don’t know your website, so it’s hard to say. But what might be happening is that the internal linking of your website is different for the blog section than for the services section, or the other parts of your website. And if the internal linking is very different, then it’s possible that we would not be able to understand that this is an important part of the website. But that’s not tied to the URLs, and it’s not tied to the type of page. It’s really that we don’t understand how important this part of the website is.
How important is the length of the blog post? We are following a guideline of 300-plus words. And recently I’ve read in many places that Google favors long-form content. So maybe we are missing out on that front?
We don’t use word count at all. So the number of words in your articles is totally up to you. I think some people like to have a guideline with regards to the number of words, but that’s really an internal guideline for you and your authors. It’s not something that we would use for SEO purposes.
I have a question about Google Search Console. My website has around 120 external links, and about 40% of them are from non-working Japanese domains. I have no idea where they came from. What do I have to do with them?
You probably don’t need to do anything with them. If these are just random links from the internet, I would just ignore them. Sometimes, and it’s not specific to Japanese links, spammers include normal URLs within the kind of URLs that they use to promote spam. That means on random forums and blogs, they will drop these URLs as well. And sometimes that ends up with a lot of links that are posted in non-English or foreign-language content. I’ve seen that a lot with Japanese, Chinese, Arabic, all kinds of languages.
Okay, and the almost-50% count doesn’t matter at all? I shouldn’t be worried that I’ll get some kind of penalty for that, or…
Yeah. No, I mean, if these are not links that you placed, or things that you bought where you put these links there, then I would just say…
Yeah, what if my competition was the one buying those links for me? You wouldn’t worry?
I mean, those are the kind of links that, on the one hand, if they don’t exist anymore, then we will ignore them over time anyway. So it doesn’t matter if you disavow them or not; if they don’t exist, they don’t have any value. And on the other hand, these are the kind of links that we have seen so often for, I don’t know, 10 or 20 years, and we just ignore them.
But why do you show them in Google Search Console?
Just because we want to show you everything that we know about. Yeah, I agree, in a case like this it’s like, oh, if Google knows these are stupid links, then maybe they shouldn’t show them. But in Search Console, we try not to make a judgment with regards to the links, and we also include things that are nofollow and all of that. So we show them to you, but if you look at them and say, oh, these don’t matter for me, then you can just ignore them.
Okay, thank you very much.
Let me run through some of the submitted questions, and then I’ll get back to all of the raised hands. Wow, so many people. Let’s see, maybe I’ll see how I can go through these a little bit faster, and then I’ll have a bit more time afterwards as well.
Let’s say I have 500 physical shops that are selling my products, and I want to create a specific landing page for each of them. Would this be considered doorway pages?
No, that would be essentially fine. It’d be kind of like having different products. Because these are unique, physical locations, having individual pages for them is perfectly fine. Sometimes it might make sense to combine these and put them on a shared page, for example if you have a lot of shops in a specific country, maybe just list the shops there instead of having individual pages per shop. But that’s totally up to you.
Then second, from a ranking point of view, does Google treat the nofollow, ugc, and sponsored rel attributes any differently?
We do try to understand these and treat them appropriately. So I could imagine that our systems might learn over time to treat them slightly differently. But in general, they’re all within the same theme: you’re telling us, these are links that I am placing for this reason, and Google doesn’t need to take them into account.
Where Can You Watch the Google Search Central SEO Office Hours Hangouts with John Mueller?
You can find them on YouTube over at the Google Search Central YouTube Channel. While it’s usually every other week, recently John has been holding them more frequently.
Stay updated on the schedule by following this page over on YouTube.
For this hangout from June 4, 2021, you can watch the video here: