It has long been Google's policy that engaging in cloaking can get your site penalized or banned.
One webmaster, during John Mueller’s 09/17/2021 hangout, asked about cloaking techniques they were using for their site.
John explained, unequivocally, that the webmaster should not be engaging in cloaking or cloaking behavior.
The webmaster was also concerned about their paywall. John said they just need to mark up the paywalled content using paywall structured data so Google can see that is actually what’s being served.
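Google’s structured-data documentation for paywalled content describes marking the article with `isAccessibleForFree` and a `hasPart` block whose CSS selector points at the paywalled section. As a hedged sketch, the Python helper below builds that JSON-LD; the `.paywall` class name and the helper’s name are illustrative choices, not required values.

```python
import json

def paywall_jsonld(headline: str, paywalled_selector: str = ".paywall") -> str:
    """Build JSON-LD paywall markup in the shape Google documents.

    The ".paywall" selector is an illustrative default; it should match
    the CSS class wrapping the paywalled section of the page.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "headline": headline,
        # Google's documented examples use the string "False" here.
        "isAccessibleForFree": "False",
        "hasPart": {
            "@type": "WebPageElement",
            "isAccessibleForFree": "False",
            "cssSelector": paywalled_selector,
        },
    }
    # This string would be embedded in the page inside
    # <script type="application/ld+json"> ... </script>
    return json.dumps(data, indent=2)

print(paywall_jsonld("Live TV schedule"))
```

With this markup in place, Googlebot can crawl the full content behind the paywall without the site being treated as cloaking, because the markup declares that ordinary users will hit a paywall.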
Basically, the overall rule is that you must show Googlebot the same content that you show other users from the same location.
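As a minimal sketch of that rule, the serving decision should depend only on the visitor’s country, never on the user agent. Everything in the snippet below (the `BLOCKED_COUNTRIES` set, the page variants, the function name) is an illustrative assumption, not anything from the hangout:

```python
# Sketch: country-based serving that treats Googlebot like any other
# visitor from the same country. No user-agent special-casing.

BLOCKED_COUNTRIES = {"US"}  # markets where the live TV can't be shown

FULL_PAGE = "<html><!-- live TV player --></html>"
TEASER_PAGE = "<html><!-- description only: schedule, synopsis --></html>"

def select_response(country_code: str, user_agent: str) -> str:
    """Pick the page variant by the visitor's country alone.

    user_agent is deliberately ignored: serving Googlebot a different
    variant than other visitors from its country would be cloaking.
    """
    if country_code in BLOCKED_COUNTRIES:
        # Same teaser for Googlebot and for human visitors in that country.
        return TEASER_PAGE
    return FULL_PAGE

# Googlebot crawling from the US gets exactly what a US user gets:
assert select_response("US", "Googlebot") == select_response("US", "Mozilla/5.0")
```

Serving a reduced but real page (a description, a schedule) to blocked countries, as John suggests later in the transcript, at least gives Google something indexable while staying inside the guidelines.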
This discussion occurs at approximately 28:48 in the video:
John Mueller 09/17/2021 Hangout Transcript
Webmaster 5 28:48
Can I add something to this? I have a similar question. We have live TV for our news channel, which we are not providing in the U.S. But since, as you said, the site is crawled from the U.S. only, we show the message “this live TV is not available in the U.S.”

So the U.S. crawler crawls that, and the Google results show “this live TV is not available in your location,” even when I search from India. Because of that, can we use cloaking techniques for our live page?

That is, we show the India version of the live TV to the Google crawler, but to U.S. users we show the “live TV not available” page, so the issue is fixed automatically. Can we do that to avoid this issue?
John Mueller

Based on our guidelines you can’t, or you shouldn’t. Our guidelines are pretty clear that you should be showing Googlebot the same content as other users from that location. So if everyone else in the U.S. is blocked, you shouldn’t be showing Googlebot the content. That’s the baseline situation, I think.
There is, now that you mention it, especially with regards to news sites, one way that might work, I don’t know: the use of the paywall structured data. In that case, you could show Googlebot the normal content, and then, if a paywall is required to access the content, you would just need to mark up that content with the paywall structured data. Users who have access can then log in on their side and actually get that content.
Webmaster 5 31:02
We can’t provide it even after the paywall.
John Mueller

Yeah, if you can’t provide it even after the paywall, then you wouldn’t be able to provide it to Google. I think it’s kind of an awkward situation, and a little bit unfair towards people who are not in the U.S., or not in the location where sites are being crawled, because the other way around is, of course, possible.

Like, if you have a website that is based in the U.S. and you block India, that wouldn’t affect the search results, because we crawl from the U.S. I think it’s a little bit unfair, but from a technical point of view, it’s just how we have to deal with crawling; we can’t crawl from all countries.

And the policies, at least at the moment, are such that you should treat Googlebot the same as other users from that country. It might be that at some point we change those policies, but they’ve been like that for a very long time.
Webmaster 5 32:07
Can we not even redirect to different pages in the U.S., based on the user agent or based on the country, from the server end?
John Mueller

I mean, what you serve to users in the U.S. is essentially what we would use for indexing. So another approach that I’ve seen for other kinds of content is to provide some level of information that you can provide in the U.S.

For example, I think I’ve seen this with casino websites, where sometimes the content is illegal in the U.S., so they have kind of a simplified version of the content which they can provide in the U.S., which is more like descriptive information about the content.

So if you have movies that you can’t provide in the U.S., you can still serve the description of the movie in the U.S.; you just can’t serve the movie itself. That way, at least you have a basis of content that would be indexable, and that could then be findable in search.
Webmaster 5 33:29
So earlier, I think it was maybe Gary or someone else from Google who said that in some condition you can use cloaking. I don’t know which condition that is, but I think my condition is the same. Can we use that…
John Mueller

Our policies are pretty clear in that regard: it should really be what users see in that country. I mean, there’s an aspect where you might be willing to take a risk and say, “Well, I assume the webspam team will not take action on my website, because I’m a legitimate business, and if they check, they will see what I’m doing.” But that’s something where you’re essentially breaking the guidelines and hoping that it comes out well, and for normal businesses, I don’t think that’s really a good idea.

The other tricky part is that a lot of the teams at Google that work on search are also based in the U.S. So if someone in the U.S. were to double-check what is happening on this website, they would also see that it’s blocked. So it’s kind of tricky, I think.

I think this question also goes into the Core Web Vitals aspect, where maybe you want to block countries where connectivity is slow. From that point of view, blocking other countries would be a problem with regards to search, first of all. With regards to Core Web Vitals, if you want to block other countries, you can do that; I still don’t think it’s a good idea, so I would try to avoid it if at all possible.