An SEO professional asked about problems they were having with Google AdsBot.
They wanted to block their ads URLs from Googlebot in the robots.txt file, and were curious whether this would impact their Google Ads or cause their ads to stop working normally.
John explained that AdsBot does not follow the usual robots.txt directives; you must address its user-agent directly.
However, he does not know what the exact impact on the ads side would be if you block AdsBot. His understanding is that Google uses it primarily as a quality check for the landing page.
The SEO professional then explained that they had found Googlebot crawling their ads URLs more than their normal URLs. They want Googlebot to focus on their product pages instead of the ads pages.
To accomplish this, they want to block the ads URLs from Googlebot using robots.txt.
John explained that this is perfectly fine: blocking the ads landing pages from Googlebot is not a problem. However, he does not know the impact of blocking the ads pages from Google AdsBot; that would be a question for the Google Ads team.
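Based on John's explanation, a robots.txt along these lines would block the ads landing pages for organic (Googlebot) crawling while leaving AdsBot free to run its landing-page checks. The /ads/ path here is a hypothetical placeholder, not any site's actual configuration:

```
# Block Googlebot (organic crawling) from the ads landing pages.
# The /ads/ path is an illustrative assumption.
User-agent: Googlebot
Disallow: /ads/

# Note: AdsBot ignores generic groups such as "User-agent: *".
# To restrict AdsBot you would have to name it explicitly, e.g.
# "User-agent: AdsBot-Google" -- which, per John's caution above,
# could affect the ads-side quality checks, so no such group is
# added here.
```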
The SEO professional clarified that they are not blocking AdsBot. Their next question was whether Google AdsBot shares the same crawl budget as Googlebot.
John replied that yes, Google AdsBot shares the same crawl budget as Googlebot.
It's possible for crawl requests from AdsBot to increase so that Googlebot crawls less, but John said this usually settles down fairly quickly: once AdsBot has checked the pages in a newly submitted ads campaign, it does not need to re-check them as often.
This prompted the SEO professional to express concern that even if they block the ads URLs for Googlebot, Googlebot may still crawl too many ads URLs.
John said that if they block the ads landing pages for Googlebot, then Googlebot should never crawl those pages; only AdsBot would actually be crawling them.
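John's point here, and his later note about subtle mistakes such as upper- versus lowercase in paths, can be sanity-checked offline with Python's standard-library `urllib.robotparser`. The rules and URLs below are illustrative assumptions, not the site's real configuration:

```python
# Sanity-check robots.txt rules per user agent with Python's
# standard-library parser. The rules and the /ads/ path are
# illustrative assumptions, not the site's real configuration.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /ads/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from the ads landing pages...
print(parser.can_fetch("Googlebot", "https://example.com/ads/landing"))    # False

# ...while product pages stay crawlable. Note that path matching is
# case-sensitive: /Ads/ would NOT be caught by "Disallow: /ads/".
print(parser.can_fetch("Googlebot", "https://example.com/products/item"))  # True
```

This only models generic robots.txt matching; it does not reproduce AdsBot's special behavior of ignoring wildcard groups, so AdsBot-specific questions still belong with the Google Ads side.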
This happens at approximately the 7:08 mark in the video.
Looking for a new way to improve your SEO audits? Our Ultimate SEO Audit Template could be right up your alley!
John Mueller Hangout Transcript
SEO Professional 4 7:08
Hey, John. Good morning. It’s me again. So we want to ask some questions about Google AdsBot. So do you think disallowing ads URLs for Googlebot in the robots.txt will affect our Google Ads, and maybe that will cause ads not to work normally?
John 7:33
So as far as I know, AdsBot doesn’t follow the normal robots.txt directives; you have to use the user agent directly for it. But I don’t know what the effect would be on the ad side if you block AdsBot. My understanding is we use this as a way to do kind of, like, quality checks for the landing page. And I don’t know what that would mean, from a quality point of view for ads, if you don’t let AdsBot check these pages.
SEO Professional 4 8:06
The reason we talked about this question is that we’ve recently found that Googlebot crawled our ads URLs more than normal page URLs. But we want Googlebot to promote normal product pages instead of these ads. So we just think, is it OK to block the ads URLs for Googlebot?
John 8:30
Sure. That’s perfectly fine. Blocking the ads landing pages for Googlebot is fine. If you, again, if you block the ads pages for AdsBot, then that’s something you’d have to check with the ads folks.
SEO Professional 4 8:46
Yeah, that’s not what we do. So also, about Googlebot: does AdsBot share the same crawl budget with Googlebot?
John 8:58
Yes. Yes.
SEO Professional 4 8:59
So that means maybe the crawl requests from AdsBot increase and maybe Googlebot crawls less?
John 9:08
Yes, it can happen. Usually, it settles down fairly quickly. I don’t know the details from the ads side. But my understanding is, when you submit new ads campaigns, then AdsBot checks all of those pages. And once that’s checked, then they don’t need to be checked as often.
SEO Professional 4 9:31
Okay, I see. So if, like, if we’re blocking the ads URLs for Googlebot, Googlebot may still crawl too many ads URLs. What can we do? Maybe some measures to handle this situation, to let Googlebot crawl more product pages?
John 9:58
I think if you’re blocking the ads landing pages for Googlebot specifically, then we shouldn’t be crawling the ads landing pages with Googlebot. It would really only be the AdsBot that’s crawling it.
SEO Professional 4 10:15
Because the previous time, we blocked certain pages in the robots.txt, but we still see that sometimes Googlebot still crawls these pages, even though we block them. So we just want to make sure that it’s working.
John 10:33
Now, it should never be the case that we recognize the robots.txt file and still crawl the URL anyway with Googlebot. So that feels like either something we would have to know about as soon as possible, or maybe some mistake with things like upper- and lowercase in the URLs, where the exact path isn’t the one being included in the robots.txt file.
SEO Professional 4 11:03
Okay, so normally, if we blocked that in robots.txt, it shouldn’t be crawled by Googlebot?
John 11:11
Yeah. And if you see situations where the AdsBot crawling, like, severely drowns out everything else, I would use that contact form that we have in the Help Center to report problems with Googlebot. I think you’ve used it in the past as well. Because that also goes through the Googlebot team, and they can kind of help to distribute the requests a little bit better, and contact the ads team and tell them to slow down with this website, those kinds of things.