Several times a month, John Mueller hosts his own Google Search Central Office Hours hangouts over on YouTube.
Here, he accepts questions from the webmaster community on just about everything related to SEO.
If you have a pressing issue that you can’t find an answer to anywhere else, chances are John can help you, or at least point you in the right direction.
As per usual, this past office hours hangout included many questions on a variety of SEO topics.
First, we’ll cover our SEO insights and then go into all of the questions in detail, so that if you haven’t seen the hangout, you don’t have to watch it.
Our SEO Insights
In this SEO office hours hangout, John addressed a number of situations applicable to SEO professionals, including discrepancies in different reports, how long it takes to see changes, grouping URLs by type, and more.
Question: Even After Implementing Changes, Nothing Happens to My Ranking. What Gives?
It takes time to see changes reflected in the search results. Even if you delete all 150 pages on a 150-page site and start anew with new redirects and everything, Google still takes time to learn the new site structure. You can’t expect to rank again immediately.
An entire website architecture overhaul could take at least a couple of months to settle down, if not longer.
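While you wait for things to settle, one thing you can control is redirect hygiene: every old URL should 301 directly to its final destination, with no chains or loops. A minimal sketch of flattening a redirect map (the paths here are hypothetical examples):

```python
# Resolve a redirect map and flag loops, so every old URL can point
# directly at its final destination (single-hop 301s instead of chains).
# The paths below are made up for illustration.

def resolve_redirects(redirect_map):
    """Return {old_url: final_url}, raising on redirect loops."""
    resolved = {}
    for start in redirect_map:
        seen = {start}
        target = redirect_map[start]
        while target in redirect_map:  # follow the chain to its end
            if target in seen:
                raise ValueError(f"redirect loop at {target}")
            seen.add(target)
            target = redirect_map[target]
        resolved[start] = target
    return resolved

redirects = {
    "/old-category/widgets": "/shop/widgets-temp",
    "/shop/widgets-temp": "/shop/widgets",  # chain: should be flattened
    "/about-us.html": "/about",
}

flat = resolve_redirects(redirects)
print(flat["/old-category/widgets"])  # /shop/widgets
```

Flattening chains this way means crawlers reach the final URL in one hop, which is generally easier for search engines to process during a migration.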
Question: I Have NAP Details (Name, Address, Phone Number). Do I Need to Stuff My Pages With Content for Higher Rankings?
It’s more about making the pages unique for the user rather than stuffing them with content. If content is what will make it unique, go for it. But it shouldn’t be the core of everything you do. NAP information is found on many different websites, and doesn’t necessarily make a website unique. High-quality content is more likely to make that site unique, but don’t just stuff content on the page for the sake of having more.
Question: Why Are There Discrepancies Between My Core Web Vitals Report, Page Experience Report, and Google Search Console?
The likely culprit here is timing. For example, Google may have field data for a set of URLs, but if those URLs were recently changed, the new URLs won’t have accumulated any field data yet, so the two reports can disagree until the data catches up.
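You can verify this yourself with the PageSpeed Insights API, whose JSON response includes a populated `loadingExperience` object only when Chrome UX Report field data exists for the tested URL. A sketch of that check, using a hand-made stand-in for a real API response:

```python
# Check whether a PageSpeed Insights API (v5) response carries
# real-world (CrUX) field data for the tested URL. A response without a
# populated "loadingExperience" means Google has no field data to show.
# The sample dicts below are hand-made stand-ins for real responses.

def has_field_data(psi_response):
    exp = psi_response.get("loadingExperience", {})
    return bool(exp.get("metrics"))

sample_with_field = {
    "loadingExperience": {
        "metrics": {
            "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2300, "category": "GOOD"},
        }
    }
}
sample_without_field = {"lighthouseResult": {}}

print(has_field_data(sample_with_field))     # True
print(has_field_data(sample_without_field))  # False
```

If `has_field_data` returns False for your URLs, that would be consistent with the timing gap described above: the page experience report simply has nothing to aggregate yet.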
Question: Does Google Group URLs by Type?
John confirmed that yes, Google does group URLs by type, particularly for the Chrome User Experience Report (field) data. Google tries to recognize when pages are similar enough, and may group them together as a result.
Google also falls back on data from similar pages: if it doesn’t have current data for a certain set of URLs, it may use the data for the group overall to estimate how those pages perform.
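Google hasn’t published its grouping logic, but the idea can be illustrated with a toy grouping by page type, approximated here as the first path segment. This is an assumption for illustration only, not Google’s actual method:

```python
# Illustrative only: group URLs by a crude page-type key (the first
# path segment), the way field data might be aggregated across similar
# pages. This is NOT Google's actual grouping logic.
from collections import defaultdict
from urllib.parse import urlparse

def page_type(url):
    segments = [s for s in urlparse(url).path.split("/") if s]
    return segments[0] if segments else "home"

def group_urls(urls):
    groups = defaultdict(list)
    for url in urls:
        groups[page_type(url)].append(url)
    return dict(groups)

urls = [
    "https://example.com/category/shoes",
    "https://example.com/category/hats",
    "https://example.com/product/red-shoe",
    "https://example.com/",
]
groups = group_urls(urls)
print(sorted(groups))  # ['category', 'home', 'product']
```

Under a scheme like this, a brand-new `/category/boots` page would inherit the aggregate data of the “category” group until it accumulates data of its own, which matches the behavior John describes.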
Question: Is There a Long-Term Value in Terms of SEO Value Being Shared Across Pages That Are Ranked?
There is no specific “domain authority” score that Google measures (which is what this question alludes to). It’s actually a bit simpler than that: if you build a strong site with a good reputation, search engines pick up on that, but there’s no specific “domain authority” threshold or score. John also noted that any new pages you add are assumed to be similar to the pages already on the site.
Question: We’re Moving From Adaptive to Responsive Design. What’s the Best Way to Handle Category Pages?
If you’re moving from thinner category type pages to a more informational user experience, that’s fine. There shouldn’t be any issues from a ranking standpoint. This particular kind of transition is going from thin content to more valuable content, so it should help you in search.
Question: A Site: Query Shows 3,475 Pages vs. 170 Pages in GSC. Should I Worry?
John mentioned that we should not worry too much about the difference between what a site: query shows and what Google Search Console shows. A site: query is not meant to be used for diagnostic purposes. For some sites you’ll see lower numbers, and for other sites you’ll see numbers 100 times larger. The numbers in site: query results are optimized for speed, not comprehensiveness.
However, he did mention that if a site has 500+ pages and fewer than 100 of them are indexed, that could be a real problem you need to diagnose, making sure all the technical fundamentals are in order.
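One of those technical fundamentals is making sure robots.txt isn’t accidentally blocking the pages you want indexed. Python’s standard library can check URLs against a rule set; the rules and URLs below are hypothetical:

```python
# Check a set of URLs against robots.txt rules using only the standard
# library. The robots.txt content and URLs are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for url in [
    "https://example.com/products/blue-widget",
    "https://example.com/cart/checkout",
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "blocked")
```

A quick loop like this over your sitemap URLs is a cheap first pass before reaching for a full crawling tool.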
After sorting out the technical issues, make sure that you’re promoting your site more. Get more users, get more ads, or maybe get someone else to work on the site for a while to get the ball rolling.
Question: After We Stopped Running Ads, Keyword Ranking Returned to Number 1. Do Ads (Or Lack Thereof) Affect Rankings?
John had to reiterate this important fact: whether or not you run ads has no impact whatsoever on rankings. Let us repeat ourselves: ads do NOT have an effect on your performance in search. Despite repeated assurances from Google that this does not happen, SEO professionals tend to believe that it does.
This is one of those “correlation equals causation” fallacies that doesn’t mesh with the reality of what’s going on. As site owners, developers, etc., we don’t always have a bird’s-eye view of everything happening on a given website at any given time. It could be that someone changed something on the backend of the site that caused it to drop at exactly the same time you bought ads, or that major changes to redirects and content occurred at exactly the same time your ad was purchased and cancelled.
Either way, correlation is not causation. Just because you stopped running ads and then saw an improvement in ranking does not mean the ads were the cause.
John Mueller Office Hours Hangout Transcript
All right, welcome, everyone, to today’s Google Search Central SEO Office Hours Hangout. My name is John Mueller. I’m a Search Advocate at Google here in Switzerland.
And part of what we do are these office hours where people can jump in and ask their questions around their website and web search.
As always, a bunch of things were submitted on YouTube, we can go through some of those, but it looks like people are already raising their hands to get started.
So I guess we can get started with you all.
We can’t hear you. I know. Something doesn’t seem to be working. Okay. Okay. Otherwise, like, see if you can get it working again.
I think I heard you before. So let’s be close. But we can go back to you. As soon as you’re ready. No problem. All right. Mohawk.
So I came to your office hours about a month and a half back, and you gave really good advice, where you asked me to cut down on the number of pages and focus on value and unique content, and not on scale.
So ever since then, we drastically cut down the number of pages. Well, we went from like 500,000 to five pages. And now those pages are buffered with a lot of useful tools, and they have a 100% conversion rate from those pages to app downloads, which is phenomenal.
But what we’ve been struggling with is that even with all that engagement, and I just posted a link on the chat as well. Those pages always rank at number seven and number eight pages.
And what I’m concerned about is that, because no one ever goes to Google page seven or eight, how will Google ever get the signal that those pages are more engaging and better at answering user intent than pages that always rank first?
And so I don’t actually see a clear path to ranking higher on Google, even with creating really high engaging pages.
Okay, so I’d have to take a look at your pages and see if there’s something specific that I can help with there.
But in general, this is a kind of process, it takes quite a lot of time to see things kind of change in that regard.
And that’s less a matter of Google seeing signals that people on page six find this a great result.
But more a matter of us recognizing that across the web, this is actually something that is relevant, and that we should be showing more visibly in the search results.
And I think concentrating your site on fewer pages makes a lot of sense, and helps make it so that you don’t have to collect signals for each of these individual pages, but rather everything is nicely concentrated there. But that’s still something that I would assume would take quite a bit longer. So, again, I don’t know your website, I don’t know exactly what you’re doing there.
But my guess is if you’re making a change going from several 1000 to a handful of pages, then that’s something that would take a couple of months, at least for things to settle down, maybe I don’t know, up to a half a year, something along that line.
And that’s just with kind of Google waiting to see where things are settling down. But in the meantime, you can still do things to help promote your website: work on things, make things that are interesting for people to link to, make it easy for people to link to them, and bring people to your website so that they understand that your website exists, so that they in turn can also link to your website.
Alright, thank you. And then one more question separately.
We also noticed that, you know, we’re a visa tech company. So we noticed that a lot of people search for embassies and foreign embassies near them. But a lot of the pages ranking right now, directories, are stuffed with keywords and only get to the valuable information right at the end.
So we created better pages where we answered the user intent directly and our objective is that users get to those pages and leave those pages immediately.
So, you know, they find the phone number or the address. But on the webmaster forum, people perceive those pages as thin.
Because, you know, they have very little content, there’s just like phone number, NAP addresses, and we haven’t stuffed them with a lot of content.
And is it necessary to stuff pages with content so that they rank higher, or is all that matters the user objective and intent?
You don’t need to stuff the pages with content. But I do think, again, I don’t know those particular pages, but with this kind of content, you have to watch out that you’re not just providing commodity information. So if you’re talking about things like addresses and phone numbers, that information is available on so many different sites, maybe it’s even directly available in Google My Business, I don’t know. But that’s something where essentially the information that you have there is the same that everyone else has, because it’s like a fact, right? And because of that, you need to find a way to provide significantly more value to users than just the facts that everyone else knows. So maybe you can build out those pages, maybe there are other parts of your website that you need to build out to make those pages less critical for your website. But you kind of need to watch out for that situation where the information that you’re providing is essentially exactly the same as everyone else has. And then from Google’s point of view, it’s like, well, we can point at any of these sites and users will be happy. So you kind of have to make something that goes beyond just that.
Hi John, I see a discrepancy between the Core Web Vitals report and the page experience report in Google Search Console. So for example, if I look at the Core Web Vitals report, we’ve got 4000 URLs that have been listed as good. However, if I look in the page experience report, we have zero URLs listed as good. Now, if I look at the buckets: a URL is a good URL if the URL has a good status in the Core Web Vitals report, the URL has no mobile issues according to the mobile usability report, the site has no security issues, the URL is served over HTTPS, and there were no ad experience issues. Now, all of these points are ticked; it passes all of them. So how can there be 4000 good URLs in Core Web Vitals, but the score is zero in page experience? Surely those numbers should be the same if it passes?
I don’t know offhand why there might be a difference there. There might be something with regards to timing, there might be something with regards to the grouping of these pages. One thing I would do there is check in PageSpeed Insights directly and see if you also have field data there. So in the PageSpeed Insights tool, you can test individual pages, and it will show you the lab scores. But it also shows you if there’s field data available for those pages. And that can give you a little bit of a sense of whether there’s actually information that Google would have available to show in the page experience report.
Yeah, so if I click into an individual URL in the Core Web Vitals report, for ones that are listed as good, you get the list of examples. And if I click into that to get more information, it says, I think, that there’s not enough real data for this page. There’s an icon, and it says there is not enough real world data. But doesn’t there have to be real world data for these pages to appear as passed in Core Web Vitals?
I don’t know. I don’t know why.
I have made a forum post in the Google Help Forum that has a bit more explanation, but it hasn’t really had any official replies. Perhaps I could share that with you and you could look in a bit more detail, if you wanted a bit more of an explanation there.
Okay. If you can send me the forum link. I’d be happy to pass that on to someone from the team. We can take a look.
I had one situation where I could imagine that happening: if we used to have field data for a set of URLs on your site, and for some reason we don’t have that much field data anymore for the site, then we might know the URLs, but we don’t actually have data for the URLs. But I don’t know how you would check for that offhand. So I am happy to pass that on to someone from the Core Web Vitals team.
John, you don’t group URLs by type, do you? Because we’ve noticed something similar with category pages; they also don’t have enough Chrome views to give proper data. But we get messages saying these are similar pages that…
Yeah, we do that. We do that with the Chrome User Experience Report data, the real-world field data, essentially, where we try to recognize when there are pages that are similar enough that we could group them together. And then, I don’t know how that would look in practice, it could be something where all of your category pages are in one group, and we say, well, these pages perform similarly. So if we find a new URL that is also part of this group, we don’t have to have data for that new URL; we can rely on the data for the group overall.
Okay, so let’s throw that in there.
Yeah. And I think that sometimes throws things off a little bit, in the sense that we might have one group, essentially, for a site, but that could contain 1000s of URLs. So in the report in Search Console, I think we would report that as 1000s of URLs having this problem, and then we just show that one part of the group, essentially. But not seeing any data at all in one report and seeing a lot of data in the other report, that feels kind of weird.
Hi, John. So I attended your last office hour session, and I raised a question regarding the above the fold content.
And content. Sorry, I didn’t catch it.
Above the fold, okay. Yes.
So recently, one of our main keywords has been deranked. And the change that we made on our page was, we had a banner image, and we have just removed that content, and no changes apart from that. So could that be the reason we have been deranked?
I think if you make that kind of design change on your website, where suddenly the content moves up, or suddenly the content moves down, you would generally see that as a fairly soft change, like a very small change. So if you’re saying that your whole website is essentially no longer visible in search anymore…
It is visible. Initially, we were at first position, now we have moved to second after that change.
Yeah, I don’t think you would be able to tie it back to that change. That feels like a fairly, I don’t know, almost like a subtle, normal change in search that can always happen: a site moves from position one to position two or position three, and then to position two, and then position one. These kinds of changes are fairly common.
Thank you, sir. One more question. In local SEO, do NAP details play a very vital role?
I have no idea about local SEO. Like, do you mean Google My Business essentially?
Yes, yes. Like if I have a keyword like support digital marketing course, Melbourne, and our location with that. In that case, I have an institute in Melbourne. So the NAP details play a very vital role as far as directory submission and other submissions are concerned.
I don’t know how Google My Business would handle that. It feels like it would matter but I honestly don’t know.
Ok, let’s say you’re ranking for, you know, hundreds and 1000s of keywords, right? And we create a few more pages. Is there something like, you know, the value that you have generated from the hundreds and 1000s of pages that we have already built and gotten ranked, that is being passed on to those new pages? And that can also help, you know, so that rather than ranking again from scratch, there could be some SEO value which is accumulated over time, like six months, or one year, or 10 years, and then passed on to those pages, where we can rank those pages much easier, rather than working harder and harder and harder. Is something like an SEO value being shared across your website, or the pages that are getting ranked?
So if I understand your question correctly, it’s essentially: is there value in having a long-term presence in search, like being visible for the long term in search?
Yes, and then those things are likely to pass on to some new pages that we have created. Because we have been ranking for hundreds and 1000s of keywords, we have made the best content, and others rank directly below it. So it feels like there are a few pages where there’s something like an SEO value or perception, you know, that Google has about our domain, that, I don’t know, they produce some good content or better content. And that can help, you know, other pages to do better.
Okay. So, I mean, if you build out a website, and it becomes successful, and it does well in search, then obviously, with new pages that you create which are linked from within your existing website, we understand a little bit more about those already. So that’s something that, in general, is always a good thing, which is also one of the reasons why I recommend focusing on making one really strong website rather than a lot of individual small ones. So essentially, building out a big and strong website, I think that always kind of makes sense. But whether there’s an inherent value or kind of like a domain authority or something like that associated with a domain, I don’t think I’d say that. It’s essentially: you’re building something strong, it has a good reputation, people like it, search engines like it. So when you add new content, they assume that it’s similar to the old content, which kind of makes sense.
So I think that kind of means my website, or my little thing, has some great value on the Internet and on the SERP, something like that, right?
I’m honestly not quite sure what exactly you’re asking, because it sounds like you already have a big website and it’s working well. So I’m not quite sure what you would do differently, or what kind of help you’re looking for.
Hi, John. My question is around, or is focused on, an e-commerce website. So we’re making several changes to layout, UX, etc., essentially moving with the times from adaptive to responsive design, and, you know, focused on mobile parity, etc. One of the pages in particular that will be changing quite a bit is a key category page. So on the desktop experience, currently, it’s very much informational, whereas on the mobile experience, currently, it’s more focused on just the subcategories, etc., the mobile accordions and menu items. So given the fact we’re already being crawled predominantly with the mobile crawler, and it’s a relatively small change that’s happening, the mobile experience is actually going to be better because it’s essentially going to be turning into a product listing page. Should we be concerned about ranking changes and impact given that scenario? Or, given the fact that on mobile we’re already being crawled with the mobile site, and it is the thinner content of the two experiences, at the end of the day it’s actually going to be better, and we shouldn’t really worry about the fact that, yes, we’ve got a ton of content on the desktop site, and that’s going to be moved over to another page?
Now, I think that sounds like you’re on the right path in terms of the migration from thinner category pages to more informational category pages. I think that kind of makes sense. One way you can double-check which page Google is indexing is to look at the cached version of the URL. So if you take one of the category pages and you look at the cached version in Google, you should be able to see: is it the mobile version or the desktop version? Especially if you say that there are content differences at the moment. And based on that, you can judge a little bit better: is it going to be transitioned from the desktop version to a better mobile version, or will it be transitioned from a thinner mobile version to a better mobile version? And that should give you a little bit of a sense of whether this is going to be more of a positive change or more of a negative change.
So this is a website, you know, I probably asked about in February, that is actually taking longer to index. And the current situation is, I have 170 valid pages in Search Console. When I check it through site colon, it says 3475 pages. That has been happening for three to four months now, and I’m not sure what the correct way to go is from here. It’s actually an e-commerce website, and it has less than 500 product pages. So yeah.
So in general, I would not worry about the difference between what a site colon query shows and what Search Console shows. In practice, a site colon query is not meant to be used for diagnostic purposes. So sometimes you do see quite different numbers: for some sites you see much lower numbers, for other sites you see, like, 100 times larger numbers. And essentially, the numbers we show in the site query result are optimized for speed and to give kind of a sense of the website, but they’re not optimized to be comprehensive. So that’s where the Search Console numbers would come in.
I think in general, if you’re talking about a site that is about, like, 500 pages or so, and after a period of several months only, like, 100 are indexed, that seems like something where probably you could do a lot better with minimal extra effort. So the first thing I would check is just the technical side of things, to make sure that technically everything is okay, that the website can be crawled. There are some website crawling tools available, I think they’re even free for very small websites, where you can check your site to see if it’s crawlable or not.
And if it’s crawlable, then the next thing I would consider is figuring out what you can do to promote your website a little bit better. And that could be something like encouraging users to come visit, maybe by buying ads, maybe by working together with someone else for a while, just to kind of get the ball rolling. It could also be, if you’re an e-commerce site, or especially if you’re a small, local business site, maybe there’s a local Chamber of Commerce that would be interested in linking to your website, to give us a little bit of extra information so that when our systems look at your website, they say, “Oh, this is actually a legitimate small business, we should try to index everything.”
Because especially if you’re talking about a smaller website with, like, a couple 100 pages, that feels like something where, if we have a little bit of a hint, then we’ll go off and get all of that. If you’re talking about an e-commerce site that has 500,000 pages, then obviously, whether we get all of those pages or not, that’s a totally different story. But with 500 pages, it feels like something where our systems, with a little bit of extra incentive to say we should at least check it out, should be able to get a significant part of those indexed.
Alright, so another thing which was happening to this website: it has a few hundred errors in Search Console. That was due to noindex meta tags being set earlier. Then what I did, I just, you know, removed the noindex from the product pages. And there are still a few unwanted pages that we want to noindex, and we kept those pages noindexed. And now when I start validation, you know, it keeps on failing, because the pages at the top of the report are noindexed. And the bottom pages, the product pages, when I check them through the live test, they, you know, validate as available for Google to crawl, but they still show under the error section for a long…
Now, I think the transition from noindexed to getting the content indexed, that could be a reason why things take a little bit longer. But the validation tool in Search Console is not something that I would say would be holding you back; it’s more a matter of us using that tool as a way of giving you extra information. So just because the validation fails for that, and the other URLs are okay, I don’t think that would be holding your website’s indexing back. It’s more that Search Console is a little bit confused, because you said you fixed the issue, but you didn’t really fix the issue, and then we don’t know what we should show you as an error. Because we want to help you fix these issues, if there are issues. But if you’re saying, like, some of these are noindexed by design, then Search Console will be a bit confused for a while, but that’s fine.
Alright, so I’ll check the traffic. Basically, this site had a social media presence earlier, it has a good presence and good followings out there, and then they launched the website. So we are getting good traffic already on the website. But yeah, the website has been, like, you know, basically stagnant since last January. So yeah, I’ll double-check things. But any advice would be helpful.
Now, I do think switching from noindex to indexed can make things take a little bit longer and get a little bit confusing. But it’s been a couple of months now, so it feels like that should be slowly settling down. I would still try to see what you can do to actively promote your website a little bit better. Especially if you have a strong social media presence, maybe there are things you can do with your users there to encourage them to promote the website for you as well.
Because I feel like, especially with a smaller website with a couple 100 pages, it doesn’t take a lot for us to say, oh, we can spend a little bit more time here to index more of this content. With really large websites, with very competitive websites, that’s kind of a different story. But with small websites, small ecommerce shops, I think we should be able to pick that up a little bit better. Just sometimes it takes a little bit more.
Hey, John, we recently, you know, had this situation where we were ranking in first position for a particular keyword. But when we started running ads for the same keyword, our organic page just didn’t appear, not even in the top 100 results. Then right after we stopped running ads, the keyword was back to the number one position. So, you know, I know that Google Ads shouldn’t affect your organic results, but I’m curious to know whether there is any other particular reason for this to happen.
I don’t know why that would happen like that. But it would not be related to the ads. That’s something where the systems are completely separate on our side. The ranking within the ads and the ranking within search are completely separate systems, and there’s essentially no real connection there. And I get this question sometimes in this way: if I run ads, then my website disappears from search. But I also get the same question the other way around: people run ads and suddenly their website ranks higher in search because they’re paying for something. And from our point of view, we work really hard to differentiate ads and search.
So much so that even when large advertisers go to their ad manager or account manager and they have, like, the smallest search question, we push back on that; we don’t give any answers at all when it comes to search questions from ads clients or partners. So that’s something where I would definitely not expect any kind of ranking change, especially something as visible as that, because I think that would also be something that lots of other sites would see, and it would be kind of, I don’t know, super obvious. So the ranking change that you saw there seems like something that would be totally unrelated to the ads.
Sure. Let me go through some of the submitted questions. And then I’ll get back to more of the live questions as well. Let me just see, refresh the page here.
Will noindexing a page that already had external links remove the ranking benefit that the website was getting from those links, assuming that the linking pages are indexed?
Pretty much, yes. So from our point of view, links are tracked between two canonical URLs. In other words, one URL links to another URL, and we kind of have that connection there; we have some understanding of what is happening between those two URLs. And if one of those URLs becomes non-canonical, if it becomes noindexed, if it drops from our systems, then essentially that link disappears.
And that could be if one of these pages turns into a 404, or if it gets a noindex. All of these things are reasons for us to say, well, this page no longer exists; therefore, the links that used to go to this page are no longer relevant. And taking a step back, it kind of makes sense: if someone is recommending something very specific on a website, and that very specific thing doesn’t exist anymore, then they’re not recommending anything anymore, because it doesn’t exist anymore.
On the other hand, if they’re recommending the whole website overall and pointing at the homepage, then obviously that’s something that remains valid as long as your homepage continues to exist.
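To make the mechanics concrete: a page drops out of this link graph as soon as it carries a robots noindex directive in its HTML. A small standard-library sketch for detecting that meta tag (the sample markup is made up):

```python
# Detect <meta name="robots" content="... noindex ..."> in a page's
# HTML using only the standard library. Sample markup is hypothetical.
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        if a.get("name", "").lower() == "robots":
            directives = (a.get("content") or "").lower()
            if "noindex" in directives:
                self.noindex = True

def is_noindexed(html):
    detector = NoindexDetector()
    detector.feed(html)
    return detector.noindex

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

Auditing pages this way before noindexing them helps you spot cases where a page with valuable inbound links is about to lose that link benefit, which is exactly the trade-off John describes.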
Does the FAQ still work?
I assume this means the FAQ structured data, and from what I know, that continues to exist and continues to work. What usually tends to happen with some of these structured data types, or rich result types, is that over time we try to fine-tune how often we show them, just to make sure that we’re not overloading the search results with all of this, I don’t know, bling and extra functionality that just confuses people in the end.
So what often happens is, when we start a new type of rich result, people will kind of reluctantly try it out. And then, if it works well, everyone tries it out, and suddenly the search results page is totally overloaded with this type of structured data. And then our systems and our engineers work to fine-tune that a little bit, so that we continue to use that structured data, we just don’t show it for all sites all the time. Which kind of makes sense, similar to how we tune the snippets that we show for websites, and tune the rankings, and tune the search results overall. So, at least as far as I know, I don’t think we’ve turned off any of the FAQ rich result types.
The second question here is: the h1 is already in my post title; can I still use it inside the post?
So I assume this means the text in the h1 and in the title, not the h1 tag itself, because I don't think you can put an h1 tag in your title. From my point of view, sure, you can definitely do that. The title of the page and the headings on a page give us a little bit more context about what the page is about, and you can use that to give us context. Users expect a bit of context as well, and sometimes it's the same text in both of those places.
In particular, the title of a page is not very visible on mobile. So if you want to give some context for the page overall, don't rely on the title alone; show that context visibly on the page itself, using a heading like an h1 or h2. That's something you can definitely use headings for.
So just because a bunch of keywords are in the title doesn't mean you can't use those keywords elsewhere on the page.
Can signed exchanges significantly improve Core Web Vitals, in particular LCP, even if LCP is more than six seconds? So I know a little bit about signed exchanges, but I don't know all of the details. From what I know, signed exchanges are a way that we can prefetch content from your website and serve that to users ahead of time, so that they essentially have a part of your pages already loaded in the browser's memory when they go and click on your search result.
If you have things like bigger images on the page and they can be preloaded, that also helps us a little bit there. And this is something that I think is super hard to measure, because you tend to measure a single URL, not the whole transition: going to a search results page, having it prefetch everything in the background, clicking on a result, and then seeing the prefetched result. That's super hard to measure with tools like PageSpeed Insights.
But with signed exchanges, that's a direction I think we're headed in more and more. We also had something similar in the past, without the privacy-preserving nature of signed exchanges, where we would prefetch content through the search results page from a server. The server itself would see that as a normal request, which would skew analytics and confuse people, and with cookies, who knows what else was happening there.
On the other hand, maybe just prefetching the first result is a little bit too limited in the long run; maybe we have to find some balance in between. So I assume that's something that will still be worked out, and over time, as we see more and more sites implement this, we will be able to figure out what the right balance is, and maybe even have some tools that give you a little bit of insight into what is actually happening there.
With regards to Core Web Vitals: because we use real-world field data for Core Web Vitals when we use that in Search, this is something that would be taken into account there. So if you're using signed exchanges and it significantly improves the speed for a significant number of users of your website, then that could be reflected in the Chrome User Experience Report data.
Can you please confirm that the PageSpeed Insights score isn't a ranking factor, but is a vital tool for diagnosing possible speed issues that, when fixed, might lead to a better user experience on a website?
This seems kind of like a trick question, because there are multiple things that come together here. On the one hand, we don't use Core Web Vitals as a ranking factor yet; I think that is coming in June or later this year, at least.
So that's the one aspect there. Then, the Core Web Vitals are only a part of what we show in PageSpeed Insights. In particular, the PageSpeed Insights score, that zero-to-100 number, is not something that maps directly to the individual Core Web Vitals, so that's something slightly different. Also, the score that you see in PageSpeed Insights is based on a lab test, essentially a live test that is run on the page. That gives you some insight into how the page is performing, but it doesn't tell you what users are actually seeing. What users are actually seeing is the field data, and that is what we would use for ranking when Core Web Vitals becomes a live ranking factor.
I don't know if there's a short answer for this, but one version, essentially, is: within PageSpeed Insights you do see some data that we would, in the future, use for ranking with regard to speed and usability. But you also see a lot of other data in PageSpeed Insights, and that other data is more for you, as a site owner, to figure out where the problems are, and to give you some mechanisms to iterate on your website and speed things up.
How do you do SEO for questions and answers on educational websites? Should every question get a separate page or blog post, or should questions and answers be grouped topic-wise? Because there are many websites already providing solutions for these questions, there will be a greater chance of duplicate content, so how do you deal with that? And also, which HTML tag is recommended for every question?
So, lots of questions here; questions about question-and-answer sites feel a bit meta. But anyway, the balance between having one question and answer per page, having a lot of questions on a page, or grouping everything by topic is something that you need to figure out for your own site. For some sites, if you have a lot of content, maybe having one page for each question and answer makes sense. For other sites, if there are shorter questions and answers, maybe grouping them together makes sense. From Google's point of view, we don't have a strong preference where we say you should do it like this or like that. You really need to figure out what works well for your site and for your users.
With regards to structured data, I think you can do it both ways. I'm not 100% sure how this maps, but I think you have the FAQ markup and the Q&A markup that you can use if you want to use structured data; I'd double-check the documentation, because I might be confusing it with something else just now. In the structured data documentation, you see the different types of structured data that we have there. The thing to keep in mind is that just because there is a type of structured data for something does not mean that that is the optimal setup of a website.
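To illustrate the two markup types mentioned here: FAQPage lists several question-answer pairs written by the site itself, while QAPage describes a single question with submitted answers. A minimal QAPage sketch as JSON-LD (the question content is hypothetical, not from the hangout):

```python
import json

# Hypothetical QAPage markup: one question with an accepted answer
# and one additional suggested answer, as per the schema.org shape.
qa_page = {
    "@context": "https://schema.org",
    "@type": "QAPage",
    "mainEntity": {
        "@type": "Question",
        "name": "What is two plus two?",
        "answerCount": 2,
        "acceptedAnswer": {"@type": "Answer", "text": "Four.", "upvoteCount": 5},
        "suggestedAnswer": [
            {"@type": "Answer", "text": "2 + 2 = 4.", "upvoteCount": 1},
        ],
    },
}
print(json.dumps(qa_page, indent=2))
```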
So if, for example, there were only structured data for multiple questions on a page, that doesn't mean that you should always put multiple questions on a page. That's something to keep in mind. The other part you have there is: what if other people have the same question or the same answer on their website, with regard to duplicate content? Similar to the person who asked earlier today about the embassy locations, where you have an address and a phone number, if you have a question-and-answer pair that is essentially commodity information, where it's just a fact (it's like, did Barack Obama exist? And the answer is yes).
That's something where, obviously, if people are asking for this and you're giving them the answer, yes, then that's an answer, and other people might have it too. But you need to find a way to differentiate yourself from everyone else who just has this fact as an answer. Maybe that's something you can do with additional content for individual questions; maybe with the way that you structure information on your website; maybe with the functionality and usability of your website overall.
But that is something to keep in mind: if you're focusing on information that exists in lots of places on the web, then you need to make sure that you're not just doing the same thing as everyone else. Because if you are just doing the same thing as everyone else, then our systems will say, oh, we already have this answer once; why should we show the same answer again in the search results page? It doesn't really make much sense.
So that's always something to keep in mind there. Depending on the type of site, that might also play a role. For example, if you have math problems, where it's like, what is two plus two, then you could just give the answer there, but I don't think that would be very useful for people. I would not see it as worthwhile to take all the numbers in the world, make an addition table, and make individual pages for each sum; I don't see that as being super useful for Search. And then the last part, about HTML tags: I don't think we have a preferred HTML tag for questions and answers.
We see two versions of our site appearing in Search Console; one is HTTP, and the other one is HTTPS. What is causing this, and should we fix it?
So this one is easy. It's essentially happening because someone submitted both of those versions in Search Console, and if you didn't do it, then maybe someone else did for you. In practice, it doesn't cause any problems; probably all of your data will be concentrated on one of those two versions. Essentially, the value that you get out of having both of them listed in Search Console is that if something goes wrong for one of those versions, you'll get information in Search Console. One way to get the same amount of information without having them listed twice is to use a domain verification, which automatically includes both HTTP and HTTPS. The slight downside of a domain verification is that, I think, not all of the older Search Console tools are available for it, so having the versions verified individually still makes sense too.
And let's see, I have a question about button text and hyperlinks. I have a blog where I have a list of some artists and a "view profile" button for each of them, and I also have "read more" anchors for every post. Is it possible that this would be considered duplicate content, or spammy internal linking? What's the best practice?
From that point of view, if the only navigation within your site were with buttons, that would be a problem. However, since you also have these "read more" links, which sound like normal HTML links going to the same page, we would see that and say, oh, this is a normal link, and then we can crawl and follow it normally. So that would be completely normal internal navigation.
If both of these elements were normal HTML links, so both the "view profile" button and the "read more" link, then that would also not be a problem. It's not an issue to have multiple links to the same page from one of your pages. In practice, what would happen here is that we would try to combine those two links, essentially treat them as one link, and follow them normally within your website.
The one thing I would recommend here, though, is that you use good anchor text instead of "read more." It's very tempting to have these "read more" links on a page pointing to a more detailed page, but our systems do take the anchor text of a link into account, and they use it to get some context about the page that you're linking to. So if the anchor text is just "read more," that doesn't tell us anything; we find the link, we can follow it, we can crawl it, that's perfectly fine. But if the link were "read more about XYZ," for example, then we would automatically know, when we find that link, that the other page is going to be about XYZ, so we can categorize it a little bit ahead of time already.
So if you're looking for optimizations and it really is currently a "read more" link, that would be something I would recommend changing. I also noticed this on my site when I tried to clean things up a little bit. In Lighthouse within Chrome, I think it's the accessibility report or the SEO report, not 100% sure, if you run that report on your site and you have links that just contain "read more," then that will be flagged as something that you should improve. So that might be something to try out there as well.
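The Lighthouse check described above can be approximated with a small script. A hedged sketch, using only the Python standard library; the list of generic phrases is my own assumption, not Lighthouse's exact heuristic:

```python
from html.parser import HTMLParser

# Assumed list of generic anchor phrases to flag (illustrative, not Lighthouse's).
GENERIC_PHRASES = {"read more", "click here", "view profile", "learn more"}

class AnchorTextChecker(HTMLParser):
    """Collect links whose visible text is a generic phrase like 'read more'."""

    def __init__(self):
        super().__init__()
        self._in_anchor = False
        self._text_parts = []
        self.generic_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_anchor = True
            self._text_parts = []

    def handle_data(self, data):
        if self._in_anchor:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_anchor = False
            text = "".join(self._text_parts).strip().lower()
            if text in GENERIC_PHRASES:
                self.generic_links.append(text)

checker = AnchorTextChecker()
checker.feed('<a href="/artist/1">Read more</a> '
             '<a href="/artist/1">Read more about XYZ</a>')
print(checker.generic_links)  # only the bare "read more" link is flagged
```

Note that "Read more about XYZ" passes the check: descriptive anchor text gives context about the target page, which is exactly the optimization recommended here.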
Will the new Core Web Vitals signal be a live, ongoing signal? For example, if my pages are rated as poor when the update rolls out, but a few months later I manage to improve them, will that be taken into account? Or will we have to wait until Gary pushes a button?
So my understanding is that this would be live, based on the current data, because that's the way that we can run this kind of pipeline: when we get new field data for Core Web Vitals for individual URLs, we can automatically take that into account in our systems. I don't think it would make sense for us to keep a cached version of all of those signals and only use the old data. The thing to keep in mind, though, is that the Core Web Vitals data that we use from the Chrome User Experience Report is, I think, 28 days old.
It's not the case that if you improve your site today, tomorrow you will see a ranking change. Rather, if you improve your site today, then in 28 days we would have information about how your users have been seeing those pages since then, and we'd be able to take that into account at that point. With regards to moving between the different Core Web Vitals values, I think we'll have to see how strong that effect is. But in general, we differentiate between pages that are in the poor bucket, the needs-improvement bucket, which is that yellow zone, and the good bucket. Essentially, as soon as you pass from poor into needs improvement, things start to improve in Search. And I think, at least at the moment, it's not planned for there to be further gradations within the good bucket.
So between needs improvement and good, things will slowly get a little bit better, and then once your pages are completely in the good bucket, it's not going to be the case that tweaking milliseconds will change anything with regards to ranking. Obviously, for users, that might be a different story: maybe you see things like conversion rates change if you improve things by a couple of milliseconds, but that's more something for you to care about than something for Search to specifically care about.
Find John Mueller’s Google Hangouts on YouTube
Again, you can find John’s Google Hangouts on YouTube.
If you’d like to watch the hangout, you can watch it below.
You can also follow him on Twitter.
The hangouts are usually quite comprehensive, just like this one, so you’re sure to learn something new even if you’ve been doing SEO for a while.