Google’s John Mueller answered a question about why Search Console was reporting a sitemap fetch error even though server logs showed that Googlebot had successfully fetched the sitemap.
The question was asked on Reddit. The person who started the discussion described a comprehensive set of technical checks they had performed to confirm that the sitemap returns a 200 response code, uses a valid XML structure, allows indexing, and so on.
The sitemap is technically valid in every way, yet Google Search Console keeps displaying an error message about it.
The Redditor explained:
“I’m encountering very tricky issue with sitemap submission immediately resulted `Couldn’t fetch` status and `Sitemap could not be read` error in the detail view. But i have tried everything I can to ensure the sitemap is accessible and also in server logs, can confirm that GoogleBot traffic successfully retrieved sitemap with 200 success code and it is a validated sitemap with URL – loc and lastmod tags.
…The configuration was initially setup and sitemap submitted in Dec 2025 and for many months, there’s no updates to sitemap crawl status – multiple submissions throughout the time all result the same immediate failure. Small # of pages were submitted manually and all were successfully crawled, but none of the rest URLs listed in sitemap.xml were crawled.”
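The checks the Redditor describes can be reproduced with a short script. This is a minimal sketch, not the Redditor's actual tooling: it fetches a sitemap (the URL below is a hypothetical placeholder), confirms a 200 response, and parses the `loc` and `lastmod` tags using the standard sitemap namespace.

```python
# Sketch of the sitemap checks described above: confirm a 200 response
# and extract <loc>/<lastmod> pairs. The URL is a hypothetical example.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder, not a real site
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def parse_sitemap(xml_text):
    """Return a list of (loc, lastmod) tuples from sitemap XML."""
    root = ET.fromstring(xml_text)
    entries = []
    for url_el in root.findall("sm:url", NS):
        loc = url_el.findtext("sm:loc", namespaces=NS)
        lastmod = url_el.findtext("sm:lastmod", namespaces=NS)
        entries.append((loc, lastmod))
    return entries


def check_sitemap(url):
    """Fetch the sitemap, verify the status code, and parse its entries."""
    with urllib.request.urlopen(url) as resp:
        if resp.status != 200:
            raise RuntimeError(f"Expected 200, got {resp.status}")
        return parse_sitemap(resp.read())
```

A script like this confirms only that the sitemap is technically accessible and well-formed, which, as the thread shows, is not enough on its own to get Google to use it.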
Google’s John Mueller answered the question, implying that the error message is triggered by a content-related issue.
Mueller responded:
“One part of sitemaps is that Google has to be keen on indexing more content from the site. If Google’s not convinced that there’s new & important content to index, it won’t use the sitemap.”
While Mueller did not use the phrase “site quality,” site quality is implied because he says that Google has to be “keen on indexing more content from the site” that is “new and important.”
That implies two things: the site may not produce much new content, and the content it does produce might not be considered important. “Important” is a very broad description that can mean many things, and not all of those meanings necessarily indicate that the content is low quality.
Sometimes a site is missing an important form of content, or a structure that makes it easier for users to understand a topic or come to a decision. It could be an image, a step-by-step guide, a video, or any number of things, but not necessarily all of them. When in doubt, think like a site visitor and try to imagine what would be most helpful for them. Or it could be that the content is trivial because it is thin or not unique. Mueller was broad, but circling back to what makes a site visitor happy is a good way to identify how to improve content.
Featured Image by Shutterstock/Asier Romero