During this Google Search Central office-hours hangout with John Mueller, there were plenty of great questions.
Among them were questions about page experience updates, other core updates, quality issues, international SEO, and much more.
Will the Page Experience Algorithm be Real Time?
Just a very quick question about this page experience algorithm, which is coming in May. I just wanted to know if it will be a real-time algorithm or something like core updates, where it will be updated from time to time?
I don’t know. I don’t know that it’s decided completely yet. Part of that is also that there’s just a general lag for the data anyway. So we kind of have to wait that period of time until we have collected enough data. So I suspect it’s not something that will be optimized for speedy updates, but more kind of, to have a clear understanding of the overall picture. So my guess is it’ll be more something of a slow thing rather than a real-time change.
Our SEO Insights: Page Experience Algorithm Will Not be in Real-time
We already know that there are two components to page speed calculations: the overall data collected over the past 28 days, and the data currently being processed for your site. With confirmation that the algorithm won’t be updated in real time, we can expect page speed changes to look roughly the same and follow a 28-day data review schedule.
The algorithm won’t be real-time the way Panda and Penguin eventually became. At least, not at first. Google could eventually roll Page Experience into the normal core algorithm; it has done so with just about everything else.
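As a rough sketch of that schedule: if field data is collected over a rolling 28-day window, a fix you deploy today is only fully reflected about 28 days later. The dates below are hypothetical; this just illustrates the lag Mueller describes.

```python
from datetime import date, timedelta

# Sketch of the ~28-day CrUX collection window: a change deployed on
# deploy_date only fully shows up in field-data reporting about 28 days
# later, because the rolling window still contains pre-change visits.
deploy_date = date(2021, 4, 1)        # hypothetical deploy day
window = timedelta(days=28)           # approximate collection window
fully_reflected = deploy_date + window

print(fully_reflected)                # 2021-04-29
```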
Hi, John, I’ve got a couple of questions with regards to Core Web Vitals. The first one is that we recently updated our site and now we’re getting a 100% score in Lighthouse. But since we deployed the changes, the number of poor URLs being reported in Google Search Console has increased dramatically. So poor URLs have gone up, and our good URLs have actually gone down. Now, I understand the difference between lab and field data. But it seems a bit weird that we’re getting a much better score in Lighthouse, the lab data, while our real field data seems to be getting worse. It just seems strange that we’ve improved so much, but the real-world data seems to show the opposite.
I think one aspect that you need to take into account is the delay with regards to the Chrome User Experience Report data, which is, I think, like a 28-day period, essentially, during which the data is collected. Essentially, if you make a change now and you test it in Lighthouse, or PageSpeed Insights, that’s what you would approximately see in about a month in the data from Search Console. From a timing point of view, that’s a little bit decoupled. So if you made a change now, and immediately afterwards you see a change in Search Console, then the change in Search Console would not be connected to the change that you just made.
Our SEO Insights: Even in Page Experience Correlation is not Causation
This is an interesting question. The webmaster is asking if they make a change, and see an immediate change in GSC, whether or not that change in GSC is the result of the change they made just then.
John Mueller says that GSC seldom mirrors what happens in real time. Instead, the changes you see in GSC at that moment are entirely separate from the change you just made.
Why Are We Not Seeing an Increase After 28 Days?
Yeah, so we’re aware of the 28 days delay, but when we deployed it more than 20 days ago, we still haven’t seen an increase in performance.
I don’t know. It’s hard to say without looking at the details. One thing that I would try to do there is to figure out which part of Core Web Vitals is affected, whether it’s Largest Contentful Paint or CLS, and based on that, try to figure out where it might be coming from.

One of the things that generally happens with lab versus field data is that the lab data is basically an assumption. It’s an approximation of what our systems think might happen in the field, because there are just so many unknowns out there that depend a little bit on your users: where they’re coming from, what kind of devices they have, all of that. Which means that you can use the lab data to incrementally improve, but you don’t necessarily see a clear connection between the lab results and the field results. I don’t know if that’s something that might be playing a role there.
Our SEO Insights: Lab Data vs. Field Data and Their Differences
John Mueller explains the difference between lab and field data for Core Web Vitals: the lab data is an assumption, an approximation of what Google’s systems think might happen in the field. Because of this, it can be a little off.

The field data, by contrast, is real data from real users who are actually using these sites. The lab data can be used to improve results incrementally, but you won’t necessarily see a clear connection between the lab results and what the field data reports after changes on your site.
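The lab/field gap is easy to see side by side in the PageSpeed Insights API response, which carries both: Lighthouse lab metrics and CrUX field metrics. The sample payload below only imitates the shape of a v5-style response; its values and structure are illustrative assumptions, not real API output.

```python
# Hedged sketch: pull the lab (Lighthouse) LCP and the field (CrUX) LCP
# out of a PageSpeed-Insights-v5-style payload so the gap is visible.
# The sample_response dict is a made-up example, not real data.

sample_response = {
    "lighthouseResult": {
        "audits": {
            "largest-contentful-paint": {"numericValue": 1800.0}  # lab, ms
        }
    },
    "loadingExperience": {
        "metrics": {
            "LARGEST_CONTENTFUL_PAINT_MS": {
                "percentile": 4200,   # field, ms
                "category": "SLOW",
            }
        }
    },
}


def compare_lcp(response):
    """Return (lab_ms, field_ms) LCP values from a PSI-style payload."""
    lab = response["lighthouseResult"]["audits"][
        "largest-contentful-paint"]["numericValue"]
    field = response["loadingExperience"]["metrics"][
        "LARGEST_CONTENTFUL_PAINT_MS"]["percentile"]
    return lab, field


lab_ms, field_ms = compare_lcp(sample_response)
print(f"Lab LCP: {lab_ms:.0f} ms, field LCP: {field_ms} ms")
```

In this hypothetical case, the lab run looks fast while real users still report a slow LCP, exactly the mismatch the webmaster above is describing.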
Do Core Web Vitals Scores Change Depending on Internet Speeds in Different Countries?
Well, that actually leads quite nicely to my second question, which is: with the Core Web Vitals scores, is there any consideration for different internet speeds in different countries? We’re based in nine countries in Southeast Asia, where speeds are lower, especially on mobile. For example, one of our countries is the 90th slowest country for mobile speed. So how does that affect the score? Are there different scores for different countries? And if there aren’t, in some of our English-speaking countries, for example the Philippines, we’re potentially competing with Western websites whose users are mostly in America, Europe, etc., where speeds may be faster, while our users come from the Philippines, where speeds are slower.
Now, I don’t know what the final setup there will be. It is something where we have country information in the Chrome User Experience Report data, so we’d be able to figure out where users are primarily coming from. But the general idea is still that users should be able to have a good experience. And if the bulk of your users see a slow experience, regardless of why, then essentially that’s what will apply there. At least from what I know, that’s the general standpoint. If 90% of your users are coming from locations that are slow, and essentially 90% of your users have this kind of suboptimal experience with your site, then that’s what will be taken into account.
Our SEO Insights: Page Experience is Still Critical, Regardless of Internet Speeds
This is a perfect time to mention what John Mueller has consistently said: that page experience will still be critical, regardless of internet speeds throughout the rest of the world.
You can work toward this by ensuring fast Core Web Vitals numbers across a wide range of users, platforms, and devices, rather than concentrating only on your own audience.
You want your website design to be versatile and load for every single potential user, not just prioritized based on your audience in Google Analytics.
The idea is comprehensive cross-platform compatibility rather than prioritizing one platform. The easiest way to get there is to focus on lean, clean code and a responsive design. W3C-valid code is also preferable. Google has said time and time again that no specific type of code is required; however, W3C-valid code creates wider opportunities for cross-platform compatibility, which is always a nice tick of the box for the user experience.
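As a small illustration of that responsive baseline (a sketch, not a Google requirement): a viewport meta tag plus fluid images already covers a lot of cross-device ground.

```html
<!-- Sketch of a responsive baseline: let the layout match the device
     width and keep images from overflowing small screens. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  img {
    max-width: 100%;  /* never wider than the viewport */
    height: auto;     /* preserve aspect ratio */
  }
</style>
```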
John Mueller: Referral Traffic Does Not Mean Better Ranking
Hi, John. My question is about referral traffic. Does referral traffic improve search rankings for a particular page if that page is ranking for a certain keyword?
No, no. It’s sometimes useful to get traffic to your pages regardless, but we don’t take that into account when it comes to ranking. Primarily because you can have lots of different traffic sources that don’t necessarily mean that your page is good. For example, if you go out and spend a lot of money on social media advertising, you might get a lot of people coming to your site. But just because there’s a lot of people coming doesn’t mean that we should be ranking higher in search.
Our SEO Insights: Referral Traffic Still Not Taken Into Account When Ranking
This one is generally straightforward: there is no SEO benefit to having referral traffic constantly driven to your website. While it’s nice for the numbers, especially when sharing reporting with your client, there’s really no SEO reason to have it and it won’t increase your rankings.
How Can I Use User-Generated Content Effectively?
Can I use my users’ reviews below my health articles? Will this user-generated content affect my articles’ authority?
You can use user-generated content like that. The important part to keep in mind is that we essentially look at your page overall. If your page overall looks lower quality because of the user-generated content, then we would see your page overall as lower quality. So it’s less a matter of whether it’s user-generated content or not, and more whether the page overall is good or not.
Our SEO Insights: Google Looks at the Page Overall
Google’s algorithms look at the page overall. They don’t look at how each piece of content is used on an individual basis.
If your page looks lower quality, then the algorithms will assess the page as being a lower-quality page.
Do You Need a Country-Specific Sitemap File?
Thanks, John. I have an international sitemap question. The advice for webmasters is that having a separate international sitemap seems to help, but that advice really talks about it in conjunction with hreflang language tags, etc. Our site (the new one) is global. What we do is break it down by country, but it’s all English. Is it still helpful for crawling to have a country-level XML sitemap, or does it not matter until you get to 50-100,000 URLs?
Yeah, you don’t necessarily need a country-specific sitemap file. For country-specific or localized content, if you use hreflang, then you can put that in a sitemap file, or you can put it into the head of the page. We don’t really differentiate between the sources of the hreflang annotations. So essentially, the hreflang side is the part which is relevant for us, and how you split things up into sitemaps is totally up to you. Some people split them into logical sections, some by country, some, I don’t know, alphabetically. It’s totally up to you.
Our SEO Insights: A Country-Specific Sitemap File is Not Needed
You don’t need a country-specific sitemap file, and having one won’t affect rankings one way or the other. Split your sitemaps up in whatever way works best for you.
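Mueller mentions that hreflang annotations can live in the sitemap itself. A minimal sketch of that, using `example.com` placeholders (the URLs and locale codes are hypothetical), where each URL lists every alternate, including itself:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en-sg/widgets</loc>
    <!-- each entry repeats the full set of alternates, itself included -->
    <xhtml:link rel="alternate" hreflang="en-sg"
                href="https://example.com/en-sg/widgets"/>
    <xhtml:link rel="alternate" hreflang="en-ph"
                href="https://example.com/en-ph/widgets"/>
  </url>
  <url>
    <loc>https://example.com/en-ph/widgets</loc>
    <xhtml:link rel="alternate" hreflang="en-sg"
                href="https://example.com/en-sg/widgets"/>
    <xhtml:link rel="alternate" hreflang="en-ph"
                href="https://example.com/en-ph/widgets"/>
  </url>
</urlset>
```

Whether these annotations sit in one sitemap or several is, per Mueller, entirely your choice.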
Do Desktop-only Sites Fall Under Mobile-First Indexing?
If our site is only meaningful to desktop users, such as driver downloads, will Google still use mobile first indexing?
Our SEO Insights: Desktop-Only Sites Still Get Mobile-First Indexing
Even if your site is a desktop-only solution, with no mobile counterpart to that desktop functionality, Google will still use mobile-first indexing as the priority indexing consideration.
What is the Difference Between Core Updates and the Product Review Update?
What’s the difference between core updates and this product review update? Does it work in a similar way? Is there a difference, in that the product review update looks at queries overall that deal with reviews and best-of-list articles?
I don’t know how to frame the difference. I mean, from our point of view, there are different algorithms that try to look at different things. Usually the core updates tend to look at things overall, across the board, for websites. And this particular update is really more focused on product review websites or product review type content. So it’s kind of like this: they both look at sites overall in a broad way, but they look at different aspects. I could imagine at some point we would see this product review algorithm just as a part of our normal web search ranking, and not tied to specific kinds of pages, for example.
But for whatever reason, they decided to roll this out a little bit separately.
Will the Core Web Vitals and Google passages roll out at the same time? Or did Google passages already start rolling out? I don’t know. I bet Barry would know if the passage indexing stuff has rolled out. Passage indexing has rolled out as of January, I believe.
Our SEO Insights: Core Updates are Broad, Product Review Update is Specific
The core updates are broader, and these updates cover quality issues on the website overall. They do not cover specific types of website attributes.
The product reviews update, however, targets specific types of website elements such as product reviews and services reviews. Lower-quality reviews overall are the target, and you probably can’t get away with “this product is amazing!” one-sentence reviews anymore.
Will an Exploding Number of Search Pages Have a Negative Impact on the Site?
On our website, we have 100 white labels that are copies of the main site, and they’re closed off by the X-Robots-Tag, noindexed, due to technical limitations. All pages have meta robots HTML tags set as index,follow. Over the course of the last week, we have had over 84 million pages added in Google Search Console as excluded but blocked by robots, and the number keeps increasing by millions. Will this bring any negative effects for the site? It seems that the X-Robots-Tag is not being respected, even though documentation clearly states that it should take precedence over the robots meta tag. What should we do to stop Googlebot from endlessly crawling auto-generated search pages?
Yeah, so I think there are two aspects to keep in mind. One is that the noindex robots meta tag, or X-Robots-Tag, is essentially specific to indexing; it’s not something that would prevent crawling. In particular, we have to first crawl the page in order to see the noindex.
And because of that, we end up crawling a lot of these pages just to see, “Oh, actually, there’s a noindex, so we can drop it.” And especially in the beginning, as we start to see things like this across a website, we will probably go off and crawl a ton of content and see that it’s all noindex. And usually what happens there is that, over time, our systems realize that crawling these specific URL patterns leads to nothing, like we can’t index the content, and therefore we will reduce the crawling of those specific URL patterns.
That’s something that kind of settles down on its own, but we still need to crawl all of these patterns first. And if the crawling is a problem for you, then on the one hand, you can use robots.txt to block crawling of those pages completely. The tricky aspect with robots.txt is that it doesn’t necessarily prevent indexing. With robots.txt, we wouldn’t be able to look at the page.
And we wouldn’t know there’s an X-Robots-Tag with a noindex associated with the page. So it’s possible that we might still index these pages without the content. In practice, that means if you do a site: query for those URLs, you might see them show up without any snippet.
Our SEO Insights: Google Crawls Patterns of URLs to Determine Site Crawlability
In order to determine a specific site’s crawlability, Google will crawl URL patterns to figure out exactly what’s permitted.
If a site has URL patterns that consistently turn out to be noindexed, then once Google crawls enough of them and sees that nothing there can be indexed, it will reduce crawling of those patterns over time.
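To make Mueller’s trade-off concrete, here is a hedged sketch of the two mechanisms he contrasts, using a hypothetical /search/ path:

```
# robots.txt: blocks crawling entirely. Googlebot never fetches these
# pages, so it never sees any noindex on them, and they can still be
# indexed as bare URLs without a snippet.
User-agent: *
Disallow: /search/

# X-Robots-Tag HTTP response header: keeps pages out of the index, but
# each page must be crawled first before the directive is seen.
X-Robots-Tag: noindex
```

The choice is one or the other: stop the crawling (robots.txt) and risk URL-only indexing, or allow the crawling (noindex) and let crawl volume taper off on its own.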
John Mueller’s Search Central Google Office Hours Hangouts
As a reminder, you can find John Mueller’s Google Search Central Office Hours Hangouts at the official Google Search Central Youtube Channel.
If you’re in any way interested in SEO, now is the time to subscribe and keep up with them.