In his hangout on 08/13/2021, John Mueller explained that the Page Experience Update is unrelated to the core updates. While the names sound similar, Core Web Vitals and the core updates are not the same thing. The Page Experience Update focuses on Core Web Vitals and the other page experience factors, while the core updates focus on the relevance and overall quality of your website.
One webmaster’s problem stemmed from being unable to validate or corroborate the Core Web Vitals issues reported in Google Search Console using tools like PageSpeed Insights.
John explained that Google Search Console shows field data, which is collected from real-world users. Tools like PageSpeed Insights run lab tests, and that lab data is different from the field data you would find in Google Search Console.
John’s recommendation included using the JavaScript libraries available on web.dev. These libraries let you measure Core Web Vitals from your own users and report the data to your analytics tool, giving you faster and more accurate tracking of those metrics.
Because Search Console aggregates field data over roughly 28 days, if you make a change now, it will take close to a month to show up there.
This is why it’s important to use these JavaScript libraries: they let you corroborate the issues showing up in Google Search Console and give you faster, more accurate insight into exactly what’s happening with your Core Web Vitals.
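John doesn’t name a specific library in the hangout, but the open-source web-vitals package documented on web.dev is the usual option for this kind of field measurement. The sketch below is illustrative rather than anything Google prescribes: the /analytics endpoint and the sendToAnalytics function are placeholders for your own collection setup, and newer releases of the library expose on-prefixed functions (onCLS, onLCP) instead of the ones shown here.

```js
// Minimal sketch: measure Core Web Vitals from real users with the
// web-vitals library and report them to a placeholder /analytics endpoint.
import {getCLS, getFID, getLCP} from 'web-vitals';

function sendToAnalytics(metric) {
  // Each metric object includes its name (CLS, FID, LCP), the measured
  // value, and an id that ties multiple reports to one page load.
  const body = JSON.stringify({name: metric.name, value: metric.value, id: metric.id});

  // Prefer sendBeacon so the report still goes out when the page unloads;
  // fall back to fetch with keepalive where sendBeacon is unavailable.
  if (navigator.sendBeacon) {
    navigator.sendBeacon('/analytics', body);
  } else {
    fetch('/analytics', {method: 'POST', body, keepalive: true});
  }
}

getCLS(sendToAnalytics);
getFID(sendToAnalytics);
getLCP(sendToAnalytics);
```

Because this records what your actual visitors experience, it approximates the field data behind the Search Console report without the month-long aggregation delay.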
This comes up at approximately the 00:34 mark in the video.
John Mueller 08/13/2021 Hangout Transcript
Webmaster 1 0:34
My question is around Core Web Vitals in Search Console. What concerns me is I’ve seen an increase in my URLs that have been bucketed into the poor category. But unfortunately, with PageSpeed Insights, it’s impossible for me to validate or corroborate that information. Do you have any information on which tools Google is actually using for the algorithm? And, you know, what concerns me is that this trend started after the release of the latest core updates. So I’m curious if you could give me any insight and guidance into what tools to trust?
John 1:16
Okay. So I think the whole topic of speed is surprisingly complicated, in that you look at it and it has, like, three measurements. You’re like, “Oh, I can figure three things out.” But there are just so many things involved.
Webmaster 1 1:31
I’m sorry to interrupt, John, this is actually a CLS issue. So this one is not speed. But similar. Similar problem.
John 1:40
Yeah, exactly. And I think maybe, first of all, just to be clear, the whole page experience update is completely unrelated to the core updates. So the core updates that you might see happening, or that we announced, they’re more about understanding the relevance of the site. So more about the quality and the content of the site. And the page experience update is purely about the Core Web Vitals and the other page experience factors. So that’s maybe one thing to keep in mind, that these are really, really separate.
Then in Search Console, what we show is essentially field data, which we also call “real user metrics,” or “RUM data,” which is essentially data that’s collected from users when they go to your website. So that’s essentially kind of the scale on which we operate. They’re, I don’t know, the purest measure that we can get, essentially, because this is what users actually see. And PageSpeed Insights and some of the other tools are so-called lab tests, which are essentially kind of emulations of the field data, where basically they run similar calculations.
And they make assumptions and say that, on average, users are probably like this: they have this kind of connection, this kind of screen, they do this kind of thing on their phones. And we’ll try to estimate what those metrics could be. And because of that, you might see differences between the data in Search Console and the data that you see when you run the tests yourself. Because the tests, when you run them yourself, are essentially estimations. And in Search Console, you see what users actually saw. So that’s kind of the main difference there. And that applies to speed, CLS (Cumulative Layout Shift), FID (First Input Delay), all of that.
And I think, especially with regards to CLS, that’s something where sometimes you have weird things that play in there, which are kind of hard to figure out. Where, for example, users in your country see the page completely normally, but users in the primary location of your site, where most users are, maybe see a pop-up or something that comes up in between, which shifts the CLS around. And then when you test it yourself, you might think, well, this metric looks okay. But what users actually see is slightly different.
With regards to CLS, I think that’s sometimes super hard to pinpoint. What I would recommend doing there is trying to make some simplified pages and trying to see if you can see what users might be seeing, so that you can narrow down a little bit which elements on the page are actually causing this shift. What you can also do is kind of instrument the pages yourself. So on web.dev, we have some JavaScript libraries that you can use, where essentially you add the code to your pages, and then it reports the metrics that users are actually seeing through Google Analytics or some other analytics tool that you use. That lets you get that data a little bit faster, because in Search Console, because of the way that we aggregate the data, it’s always about 28 days delayed.
So if you make a change now, it takes almost a month for you to see that in Search Console. And if you instrument the pages yourself, you can do things like A/B testing and track it directly. Essentially, if you release something today, you can check tomorrow what users actually saw there. And you can narrow things down a little bit faster. And once you figure out which elements on the page are causing the CLS shift, then you can work to improve that on the site.
So that’s kind of the direction that I would go there: using the field data that you see in Search Console as a way of recognizing, oh, there is an issue, then trying to narrow that down iteratively, and using the lab tests to confirm that you’re on the right track and to try to reproduce what users are actually seeing.
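To narrow down which elements are actually causing the shift, as John suggests, the browser can also report layout-shift entries directly. The sketch below is an editorial illustration, not part of the hangout: it assumes a Chromium-based browser that supports the Layout Instability API and simply logs the offending nodes to the console.

```js
// Minimal sketch: log which DOM nodes contribute to layout shifts using
// layout-shift performance entries (Layout Instability API, Chromium only).
let totalShift = 0;

const observer = new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // Shifts that follow recent user input don't count toward CLS.
    if (entry.hadRecentInput) continue;

    totalShift += entry.value;

    // Each entry lists the elements that moved and their old/new positions.
    for (const source of entry.sources || []) {
      console.log('Layout shift of', entry.value.toFixed(4), 'caused by', source.node,
                  'moving from', source.previousRect, 'to', source.currentRect);
    }
  }
  // Note: the official CLS metric groups shifts into session windows,
  // so this running total is only a rough upper bound.
  console.log('Running layout-shift total:', totalShift.toFixed(4));
});

observer.observe({type: 'layout-shift', buffered: true});
```

Running something like this while you reproduce the problem, or reporting the shifting element through your analytics tool, makes it much easier to confirm a fix long before the next Search Console report catches up.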