FAQ: Why does Google Search Console report 'Couldn't fetch' or 'Could not be read' for the XML sitemap?

This is a known bug in Google Search Console.
If you receive the red status 'Couldn't fetch' or 'Could not be read' after submitting an XML sitemap generated by JSitemap Pro (or any other tool) to the new Google Search Console, wait some time and refresh the page. The sitemap should then be marked with the green status 'Success'. The new Google Search Console replaced the 'Pending' status of the previous Webmaster Tools with 'Couldn't fetch', which is misleading.
In addition, since February 2019 there has been a bug in the new Google Search Console. If this is your case, wait some time after submitting the sitemap link, then refresh the page, or delete and resubmit the sitemap link multiple times. As an alternative, submit the sitemap link using the JSitemap integration in the Joomla! backend.
Keep in mind that for a brand new website, Google Search Console may report the message 'Processing data, please check again in a few days' in the 'Coverage' section, and you may have to wait several days before Googlebot crawls and processes your website and sitemaps.

[Screenshot: sitemap reported with the 'Couldn't fetch' status in Google Search Console]

If instead the red 'Couldn't fetch' status persists after a few weeks, double check for problems on your server or Joomla! website to ensure that Googlebot is not blocked in some way and can reach every submitted URL.

The first thing to do is open the XML sitemap link in a browser to check that the XML document is properly generated and accessible. To double check that the sitemap validates correctly, you can run a test by submitting the XML sitemap link at https://www.xml-sitemaps.com/validate-xml-sitemap.html.
Once you are sure there is no problem on the sitemap side, the issue is most probably at the server level or related to an installed plugin such as Akeeba Admin Tools, RSFirewall, etc. Certain extensions can block Googlebot and prevent your website from being accessed by Google; if you have similar extensions installed, try disabling them.
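As a quick programmatic version of this check (a minimal sketch using only the Python standard library; the function names and example URL are hypothetical, not part of JSitemap), you can fetch the sitemap and verify that it is well-formed XML with the sitemaps.org root element:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def is_valid_sitemap_xml(body: bytes) -> bool:
    """True if the body parses as XML with a sitemaps.org root element."""
    try:
        root = ET.fromstring(body)
    except ET.ParseError:
        return False
    # A sitemap root must be <urlset> or <sitemapindex> in the sitemaps.org namespace.
    return root.tag in (f"{{{SITEMAP_NS}}}urlset", f"{{{SITEMAP_NS}}}sitemapindex")

def fetch_and_check(url: str) -> bool:
    """Fetch the sitemap URL and check both the HTTP status and the XML payload."""
    with urllib.request.urlopen(url, timeout=15) as resp:
        if resp.status != 200:
            print(f"Unexpected HTTP status: {resp.status}")
            return False
        return is_valid_sitemap_xml(resp.read())
```

For a healthy sitemap, `fetch_and_check("https://www.example.com/sitemap.xml")` should return True; a False result points at either the HTTP response or the XML itself.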

You can perform an additional test to simulate how Googlebot sees your website: open https://technicalseo.com/seo-tools/fetch-render/ and submit the sitemap link to test whether it is correctly accessible.
If no issues are detected even with this test, the problem could be related solely to your server, so contact your hosting provider to analyze it further. In particular, check the validity of the HTTPS certificate and any use of server caching or CDNs such as Cloudflare, which can cause similar problems for Google.
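You can also approximate this test locally by requesting the sitemap with Googlebot's published user-agent string (a sketch under the assumption that the blocking extension filters by user-agent; the example URL is hypothetical):

```python
import urllib.request

# Googlebot's desktop user-agent string, as published by Google.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as_googlebot(url: str) -> int:
    """Request the URL with Googlebot's user-agent; return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.status
```

If a plain browser request succeeds but `fetch_as_googlebot` returns a non-200 status or raises an HTTP 403 error, something on the server is rejecting Googlebot specifically. Note that real firewalls may also verify Googlebot by reverse DNS, so this test can only reveal user-agent-based blocking.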
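Before contacting the host, you can check the HTTPS certificate yourself. This sketch with Python's standard `ssl` module reports how many days remain before the certificate expires (the host name passed in would be your own domain; function names are hypothetical):

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse the 'notAfter' date format returned by ssl.getpeercert()."""
    return datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z").replace(
        tzinfo=timezone.utc)

def cert_days_remaining(host: str, port: int = 443) -> int:
    """Days until the host's HTTPS certificate expires (negative if expired)."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        # wrap_socket verifies the certificate chain; an invalid or
        # mismatched certificate raises ssl.SSLError right here.
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = parse_not_after(cert["notAfter"])
    return (expires - datetime.now(timezone.utc)).days
```

An exception from `cert_days_remaining("www.example.com")` or a small/negative result is exactly the kind of certificate problem that can make Googlebot fail while browsers (which may tolerate more) still load the site.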

It's also important to check that the domain registered in Google Search Console matches exactly the submitted sitemap link and all URLs included in the sitemap. Be sure to have registered the property using the 'URL prefix' method and that all domains match exactly, including www/non-www: the property domain, the sitemap domain, and the domain of the sitemap URLs must all be the same.
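This consistency check can be automated. Given the property URL registered in Search Console, the submitted sitemap URL, and the sitemap's XML, the sketch below (function name and example domains are hypothetical) lists every URL whose scheme or host, including www/non-www, differs from the property:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def mismatched_urls(property_url: str, sitemap_url: str,
                    sitemap_xml: bytes) -> list:
    """Return every URL whose scheme or host differs from the GSC property."""
    prop = urlparse(property_url)
    expected = (prop.scheme, prop.netloc)
    # Collect every <loc> entry from the sitemap's <url> elements.
    locs = [loc.text.strip()
            for loc in ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", NS)]
    return [url for url in [sitemap_url] + locs
            if (urlparse(url).scheme, urlparse(url).netloc) != expected]
```

An empty result means the property, sitemap, and URL domains all match; any entry in the list is a www/non-www or http/https mismatch that Search Console treats as belonging to a different property.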

If the server also has no issues and all configurations are correct, then it is definitely a bug in Google Search Console.


If you are using Akeeba Admin Tools, pay attention to the feature named ".htaccess Maker": if you enable "Frontend Protection" and generate the .htaccess file, the sitemap file may no longer be fetchable. To allow access to the sitemap's XML files, add the sitemap file names under "Exceptions: Allow direct access to these files" and add the "xml" file type under "Frontend file types allowed in selected directories".