One SEO professional asked John Mueller about hiding links to other sites in order to avoid passing link juice.
Their question: they want to mask links to external sites to prevent passing link juice, and they think the PRG (Post/Redirect/Get) approach is a potential solution. What do you think? Is this overkill? Is there a simpler solution?
John explains that the PRG pattern is a complicated way of making a POST request to the server, which then redirects to the external content.
Googlebot will, essentially, never find that link.
From John’s perspective, this is super overkill. There’s no reason to do this unless there’s a genuine technical reason you absolutely need to block the crawling of these URLs. John recommends linking to these pages normally, or using rel="nofollow" on the links to those pages.
There’s absolutely no reason to go through this weird POST-redirect pattern. It just causes a lot of server overhead, makes these requests hard to cache, and complicates getting the user to the right place.
John would just use a nofollow on these links if you don’t want them followed.
The other thing is that blocking all of your external links rarely makes sense. Instead, John recommends taking part in the web as it is, which means that you link to other sites naturally.
They link to you naturally, and so on. Essentially, participate in the web normally rather than trying to keep Googlebot locked onto your specific website.
This happens at approximately the 20:00 mark in the video.