J!Extensions Store™
Forum
TOPIC: Error in robots.txt file after upgrading to v. 3.5
#2988
Better Web
Fresh Boarder
Posts: 5
Hi,

I noticed that when you install the new version of JSitemap Pro 3.5, the robots.txt file gets edited.

Specifically, these lines are added to the top:

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
Allow: /*.png*
Allow: /*.jpg*
Allow: /*.gif*

I understand the intention, but unfortunately this gives Googlebot access to EVERYTHING on the server.
If you test, for instance, "administrator/index.php" with the robots.txt testing tool in Google Search Console, you will see it is not blocked.
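The behavior can be reproduced with Python's standard-library robots.txt parser. The "User-agent: *" group below is an assumed stock Joomla default used for illustration, not taken from any actual site:

```python
from urllib import robotparser

# robots.txt as it looks after the upgrade: the new Googlebot group sits on
# top of an assumed stock Joomla "User-agent: *" group (illustrative only).
ROBOTS_TXT = """\
User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
Allow: /*.png*
Allow: /*.jpg*
Allow: /*.gif*

User-agent: *
Disallow: /administrator/
Disallow: /tmp/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot now matches its own dedicated group, which contains no Disallow
# lines, so the Disallow rules in the "*" group no longer apply to it.
print(rp.can_fetch("Googlebot", "/administrator/index.php"))     # True  (not blocked!)
print(rp.can_fetch("SomeOtherBot", "/administrator/index.php"))  # False (still blocked)
```

Note that `urllib.robotparser` selects a group per user-agent the same way Googlebot does, even though it does not expand `*` wildcards in paths the way Google's own parser does.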

I guess it comes from the way robots.txt files work: a crawler obeys only the one group that best matches its user-agent, so once a dedicated "User-Agent: Googlebot" group exists, Googlebot ignores the Disallow rules in the generic "User-agent: *" group. Within that group you have to be specific about what you allow AND disallow.

So this works better:

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
Allow: /*.png*
Allow: /*.jpg*
Allow: /*.gif*
Disallow: /administrator/
Disallow: /cli/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /log/
Disallow: /logs/
Disallow: /tmp/

Googlebot will have access to ALL js, css, ... files on the server, EXCEPT those in the directories listed with a "Disallow" statement.
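The corrected file can be checked with the same stdlib parser (a caveat: `urllib.robotparser` matches Disallow rules by literal path prefix and treats the `*` wildcards literally, so the Allow lines are inert here, while Google's own parser does expand them; the example path `/media/template.css` is hypothetical):

```python
from urllib import robotparser

# The corrected Googlebot group: explicit Allow lines plus the Joomla
# directories repeated as Disallow lines inside the same group.
ROBOTS_FIXED = """\
User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
Allow: /*.png*
Allow: /*.jpg*
Allow: /*.gif*
Disallow: /administrator/
Disallow: /cli/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /log/
Disallow: /logs/
Disallow: /tmp/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_FIXED.splitlines())

# The Disallow lines now live inside the Googlebot group itself, so they
# apply to Googlebot again; paths outside those directories stay crawlable.
print(rp.can_fetch("Googlebot", "/administrator/index.php"))  # False (blocked)
print(rp.can_fetch("Googlebot", "/media/template.css"))       # True  (crawlable)
```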
 