TOPIC: Re: Error in robots.txt file after upgrading to v. 3.5
#2991
John Dagelmore
Admin
Posts: 3722
I agree with you; indeed we have revised the robots.txt edit.
The problem is mainly caused by the dedicated 'User-Agent: Googlebot' rule block: even if that block only grants access to '.js' and '.css' resources, Google then skips the generic 'User-agent: *' block entirely.
It's better to avoid it and simply remove the 'Disallow: /media/' line. This is our conclusion after several tests.
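
For reference, a minimal sketch of the adjusted generic block; the surrounding paths are those of a stock Joomla robots.txt and may differ on your site, the only change being that the 'Disallow: /media/' line is dropped:

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /includes/
Disallow: /installation/
Disallow: /libraries/
Disallow: /logs/
Disallow: /tmp/
# no 'Disallow: /media/' line, so Googlebot can fetch the .js/.css files served from /media/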

Everywhere it seems that the correct and easy way is to add these lines:

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
http://upcity.com/blog/how-to-fix-googlebot-cannot-access-css-and-js-files-error-in-google-search-console/

But, as you pointed out, this solution does not seem fully correct.
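
To illustrate why: a crawler follows only the most specific robots.txt group that matches its user agent, so once a 'User-Agent: Googlebot' group exists, Googlebot ignores the 'User-agent: *' group completely. A hypothetical example of the effect:

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
# With this group present, Googlebot no longer applies anything from the
# 'User-agent: *' block, so rules like 'Disallow: /administrator/' would
# have to be repeated here to keep protecting those paths.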

Thanks for your help.
 