
Why AMP cache pages have error "Blocked by robots.txt" in Google Search Console?

Created on: Wednesday, 06 June 2018 00:00

This error in Google Search Console is normal and depends on how Google manages cached URLs. Google deliberately disallows direct crawling of cached URLs by publishing a robots.txt file at the CDN domain: https://cdn.ampproject.org/robots.txt

More information about the policy for crawling Google AMP Cache URLs can be found here: https://docs.google.com/document/d/1V_uLHoa48IlbFl7_3KWT_1JmCf6BnFtt3S_oR4UsasQ/edit
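If you want to see for yourself that cache URLs are disallowed at the CDN level, you can test one against that robots.txt file. Below is a minimal sketch using Python's standard urllib.robotparser; the cache URL in the example is a hypothetical placeholder, not a real page.

from urllib.robotparser import RobotFileParser

# Download and parse the robots.txt published at the AMP Cache CDN domain.
parser = RobotFileParser()
parser.set_url("https://cdn.ampproject.org/robots.txt")
parser.read()

# Hypothetical AMP Cache URL, used only for illustration.
cache_url = "https://cdn.ampproject.org/c/s/www.example.com/my-article.html"

# can_fetch() returns False when the rules disallow crawling for this user agent.
if parser.can_fetch("*", cache_url):
    print("Crawling allowed:", cache_url)
else:
    print("Blocked by robots.txt:", cache_url)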


Google serves cached URLs on its own; crawlers are not meant to index those pages directly and should index only the actual URLs on your domain.
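For reference, the cached copy lives under the CDN domain while the actual URL stays on your own site, and only the latter is meant to be indexed. The sketch below builds a cache URL in the legacy path-based format (newer caches typically use a subdomain-based format instead); the site and path are hypothetical examples.

from urllib.parse import urlparse

def legacy_amp_cache_url(page_url: str) -> str:
    """Return the legacy path-based Google AMP Cache URL for a page.

    The "/c/s/" prefix is used for pages served over HTTPS, "/c/" for HTTP.
    """
    parts = urlparse(page_url)
    prefix = "c/s" if parts.scheme == "https" else "c"
    return f"https://cdn.ampproject.org/{prefix}/{parts.netloc}{parts.path}"

# Hypothetical example: the actual URL that crawlers should index...
actual_url = "https://www.example.com/blog/my-article.html"

# ...and the cached copy that Google serves from its own CDN.
print(legacy_amp_cache_url(actual_url))
# https://cdn.ampproject.org/c/s/www.example.com/blog/my-article.html

Whichever format the cache uses, these CDN URLs are the ones Search Console flags as blocked, while the page on your own domain remains fully crawlable.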

This misleading error message appeared recently in the Beta version of the Google Search Console and will probably be fixed by Google at a later stage.


To avoid false-positive error messages in the Google Search Console and to ensure correct and quick indexing of new AMP pages, consider switching off the AMP CDN cache parameter in the plugin settings:

[Screenshot: the AMP cache parameter in the plugin settings]

Note that activating the AMP CDN cache may prevent new links from being indexed, so if you want to enable this feature it is strongly recommended to wait a few weeks until your AMP links are well indexed by Google. In our experience, new AMP pages are often indexed slowly when the AMP CDN cache is active, so we recommend leaving this feature off, or enabling it only once you already have several static pages that are well indexed.