One SEO professional was concerned about page speed issues he was experiencing.
In short, he saw different numbers everywhere: GTmetrix, WebPageTest.org, and Lighthouse all reported different results.
He was not entirely sure which of these numbers to trust.
John talked about the differences between lab data and field data. He explained that lab testing is useful for making incremental page speed improvements, since you can re-run a test immediately to see whether a change worked.
The main difference between lab data and field data is this: lab data is measured under simulated, speculative conditions, while field data reflects what real users actually experienced.
He said that the data you see in Search Console is field data. He also suggested that asking yourself a series of diagnostic questions can help narrow down what's causing the page speed issues.
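Field data is aggregated across real page loads rather than taken from a single test run; Core Web Vitals are assessed at the 75th percentile, so a tail of slow experiences can flag a page even when an individual test looks fast. Here is a minimal sketch of that kind of aggregation; the LCP timings are made up for illustration.

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of measurements."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-indexed nearest rank
    return ordered[rank - 1]

# Hypothetical LCP timings (seconds) from real page loads. Most users are
# fast, but a tail of slow loads pulls the 75th percentile over the 2.5s
# "good" LCP threshold.
lcp_seconds = [1.8, 2.1, 2.0, 4.9, 2.2, 1.9, 6.3, 2.8, 2.7, 2.1]

p75 = percentile(lcp_seconds, 75)
print(f"75th-percentile LCP: {p75:.1f}s")            # 2.8s
print("Good" if p75 <= 2.5 else "Needs improvement")  # Needs improvement
```

This is why a single fast scan and a Search Console warning can both be "right": one is a spot check, the other is a percentile over everything users saw.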
Are these page speed issues something we can actually fix? Or are they something that is more like a red herring that we may never be able to fix?
Could they be a network quirk that keeps recurring? Or is it something concrete, like a third-party plugin you can remove?
These are the kinds of questions that John recommends you ask yourself when it comes to diagnosing and repairing technical page speed issues.
The SEO professional then asked about GTmetrix and WebPageTest.org. These tests are not as easy to re-run, so which results should he trust?
John further explained that lab testing tools may show that things are okay from a theoretical point of view, while in practice, with the field data, users see something entirely different.
Even if a page scores perfectly in lab testing tools, those tools make assumptions about what users might see, such as artificially throttling a data center's fast connection to mimic a typical device. This is why they are speculative in nature.
These tools can only go so far. They don't actually measure the field data, which is what you get in Google Search Console.
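One way to see both views side by side: the PageSpeed Insights v5 API returns lab data (`lighthouseResult`) and CrUX field data (`loadingExperience`) in a single response. The sketch below compares the two using a hand-built fragment in the API's shape; the metric values are hypothetical.

```python
# Hand-built fragment in the shape of a PageSpeed Insights v5 API response;
# the metric values are hypothetical.
sample_response = {
    "lighthouseResult": {                      # lab data (Lighthouse)
        "audits": {
            "largest-contentful-paint": {"numericValue": 1900.0}  # ms
        }
    },
    "loadingExperience": {                     # field data (CrUX)
        "metrics": {
            "LARGEST_CONTENTFUL_PAINT_MS": {
                "percentile": 3100,            # 75th-percentile LCP, ms
                "category": "AVERAGE",
            }
        }
    },
}

def lcp_lab_vs_field(psi):
    """Extract lab LCP and field p75 LCP (both in ms) from a PSI-shaped dict."""
    lab = psi["lighthouseResult"]["audits"]["largest-contentful-paint"]["numericValue"]
    field = psi["loadingExperience"]["metrics"]["LARGEST_CONTENTFUL_PAINT_MS"]["percentile"]
    return lab, field

lab_ms, field_ms = lcp_lab_vs_field(sample_response)
print(f"Lab LCP: {lab_ms / 1000:.1f}s | Field p75 LCP: {field_ms / 1000:.1f}s")
# A page can look fast in the lab while real users, in aggregate, see it slower.
```

When the two numbers diverge like this, the field number is the one that matters for search, which is exactly the distinction John draws below.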
This happens at approximately the 7:04 mark in the video.
John Mueller Hangout Transcript
SEO Professional 3 7:04
Yes. I just wanted to know – the new design for PageSpeed Insights. I wish it could be reverted back, because I have to keep on scrolling down. See, you’re laughing because you knew this was going to happen. But I mean, you know, my main concern is a lot of sites. I remember when HTTPS came, when that was introduced. And there’s a bunch of other algorithms inside everything that makes the algorithm what it is today.
So this is not something again, we can’t just continue to focus on making sure that, you know, this site is going to be, you know, 98, out of 100, and so on. But in February, when the ranking I guess, what, when PageSpeed Insight will count? What do we do? Because at this point in time, I mean, there’s certain sites I’m having issues with.
They include LCP issues and CLS and a lot of sites right now are some sites that are on a very popular CDN platform. I’m not going to name them, but they told me like, look, you know, it’s servers within your area that are creating a specific score, and you’re seeing, you know, sometimes I would see the LCP and CLS, at a perfect score, which would be almost 99 out of 100.
And then it goes back into another score. And which has nothing to do with the site. So I’m kind of, I mean, I’m getting worried because Google sometimes crawls the site, and then I get the error in Search Console saying, hey, you know, you have CLS, CLS issues and LCP. And then I go and scan it, and it’s fine. And it’s like, 99 out of 100.
And then some pages on the blog are 100 out of 100. So then I’m like, What is going on here? And then the new design that just got introduced? And then I’m seeing some areas where it has no data found. And I’m like, Okay, I guess you guys need to crawl it. And then data will come. John, this is just I don’t know.
John Mueller
I think there's one big aspect that plays a role there, which is important. On the one hand, we have these lab testing tools like PageSpeed Insights and Lighthouse and things like that, where essentially you test that either from your browser with your connection, or with some server. WebPageTest, I think, does similar things. Some of the various other tools do that as well; that's kind of the lab testing that you can do.
And that’s more something that is useful for incremental improvement. So if you need to fix something you need to see if it worked or not, you can just see it again. We don’t use the lab testing for search. Instead, what we use is the field data, which is basically what users actually saw. And that’s the data that you see in Search Console.
So essentially, what you’re probably seeing there is users saw that something was slow, kind of on an aggregate level, we’d look at that, that it’s not, it’s not on a per URL level. And we give you that information, like some users saw this kind of an issue in Search Console. And then you can essentially use the individual lab testing tools to see where could this be coming from? Is it something that I can fix?
Is it something that was like, I don’t know, some quirk that happened in the network overall, that you can’t really kind of help with? That kind of helps you to narrow things down.
SEO Professional 3 11:10
The thing is, in GTmetrix, and I know WebPageTest.org is a favorite of yours, because in a lot of hangouts you do have that shown, it's perfect there. So when I rescan it, of course, you've got to wait, there's 25 people ahead of you, all that stuff. And it's not as easy to rescan as PageSpeed Insights. But when I do that, it's fine. It's perfectly fine. So what should I do then? Who do I trust?
John Mueller
Well, I mean, essentially, what is happening in a case like that is that the lab testing tools are saying things are okay from a theoretical point of view. But in practice, people see something different.
So all of the lab testing tools, they make assumptions on what users might see. So if you’re running in a data center somewhere, you can’t just act like oh, my, I don’t know gigabit connection through the internet is what every user has, but rather they slow things down artificially, and they try to act like a normal user.
But of course, they can only go so far and make assumptions, and what your actual users see might be different. It might be that your actual users are using something faster than the assumption, or using something slower.
And essentially, the Search Console part of that, the search ranking side, is based on what users actually see, and not based on some artificial number that some data centers…
SEO Professional 3 12:39
So as 5G develops, and gets better and better, this, of course, will change and then it’s gonna be easier, easier for everyone. Right? So this will be a thing of the past.
John Mueller
Well, I mean, speed is more than just the network connection…
SEO Professional 3 12:52
I don’t know. No, it’s just that I’ve gone and I’ve done my research up there. And if you just go and just stand on a mall, I don’t see anyone that has a 3G network anymore. And…
John Mueller
And you collect the actual data that users saw in analytics. And that gives you a little bit of a faster feedback loop where you can try things out and say, “Okay, what happens if I make this image even smaller?”
Or “What happens if I remove I don’t know, AdSense, or analytics,” or whatever, then you can see fairly quickly, okay, like, the number is still 99 for me in the lab tests, but users actually saw it faster. So maybe I should go in that direction.
SEO Professional 3 13:46
Okay, I see. Alright, so yeah, it’s, it’s hard. And it’s fun at the same time, as you’re getting the hang of it. So it’s not something we should worry about.
John Mueller
I mean, I would dig into it to try to figure out what it is. But the thing to also keep in mind is that Core Web Vitals, the page experience, is a ranking factor, but it's not the biggest ranking factor. So if you have critical things that you need to do, if it's a matter of restructuring your website, then I would do the critical things first, and then, as you find time, think about what you can do to improve speed.