One SEO professional asked John Mueller about hiding links to other sites in order to avoid passing link juice.
Their question: they want to mask links to external sites to prevent the passing of link juice, and they think the PRG approach is a potential solution. Is this overkill, or is there a simpler solution?
John explains that the PRG (Post/Redirect/Get) pattern is a complicated way of sending a POST request to the server, which then redirects to the external content.
Googlebot will, essentially, never find that link.
From John’s perspective, this is complete overkill. Unless there is a genuine technical reason that you absolutely need to block crawling of those URLs, John recommends linking to those pages normally or using rel=nofollow on the links.
There’s absolutely no reason to go through this convoluted POST-redirect pattern. It just causes a lot of server overhead and makes it hard to cache these requests and get users to the right place.
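For illustration only, here is a minimal sketch of what a PRG-style “masked” link usually looks like on the server side, assuming a hypothetical Node/Express app with an /out endpoint (the endpoint name and setup are assumptions, not anything from the video):

```typescript
// Sketch of the PRG pattern John is describing (not a recommendation).
// The page replaces a normal <a href="..."> with a small POST form, e.g.:
//   <form method="post" action="/out">
//     <input type="hidden" name="url" value="https://example.com/partner">
//     <button type="submit">Our partner site</button>
//   </form>
// Googlebot generally won't submit the form, so it never discovers the link,
// but every click now costs an extra, uncacheable round trip to your server.
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false }));

// Hypothetical endpoint: receive the POST, then redirect (the "RG" part).
app.post("/out", (req, res) => {
  const target = String(req.body.url ?? "/");
  // A real setup would validate `target` against an allowlist of known
  // destinations to avoid creating an open redirect.
  res.redirect(302, target);
});

app.listen(3000);
```

Every outbound click has to pass through this handler before the user reaches the destination, which is exactly the server overhead and caching problem John mentions.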
John would just use a nofollow on these links if you don’t want them followed.
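The simpler alternative John describes is ordinary link markup with rel="nofollow". As a tiny sketch (the URL and anchor text are placeholders):

```typescript
// Build an ordinary anchor tag with rel="nofollow" -- no POST, no redirect.
function nofollowLink(href: string, text: string): string {
  return `<a href="${href}" rel="nofollow">${text}</a>`;
}

console.log(nofollowLink("https://example.com/partner", "Our partner site"));
// -> <a href="https://example.com/partner" rel="nofollow">Our partner site</a>
```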
The other point is that blocking all of your external links rarely makes sense. Instead, John recommends taking part in the web as it is, which means linking to other sites naturally.
They link to you naturally, and so on. Essentially, participate in the web normally rather than trying to keep Googlebot locked onto your specific website.
This happens at approximately the 20:00 mark in the video.
John Mueller Hangout Transcript
John (Question)
Let’s see. See, I don’t know the timing. We’ll have to figure out how long we make these. We want to mask links to external websites to prevent passing of our link juice. We think the PRG approach is a possible solution. What do you think? Is the solution overkill? Or is there a simpler solution out here?
John (Answer)
So the PRG pattern is a complicated way of essentially doing a post request to the server, which then redirects somewhere else to the external content. So Googlebot will never find that link. And from my point of view, this is super overkill. There’s absolutely no reason to do this. Unless there’s really like a technical reason that you absolutely need to block crawling of those URLs, I would either just link to those pages normally, or use the rel=nofollow to link to those pages.
There’s absolutely no reason to kind of go through this weird post redirect kind of pattern there. It just causes a lot of server overhead and makes it really hard to cache kind of like that request and take users to the right place. So I would just use a nofollow on those links, if you don’t want to have them followed. The other thing is, of course, is like just blocking all of your external links, that rarely makes any sense.
Instead, I would make sure that you’re kind of like taking part in the web as it is, which means that you link to other sites, naturally, they link to you naturally. Kind of like taking part of the normal part of the web and not trying to like, keep Googlebot locked in to your specific website, because I don’t think that really makes any sense.