
I am using a custom sitemap for our community website, referenced in the robots.txt file. How can we remove the existing default sitemap now that its URL has been replaced in robots.txt?

At the moment it is not possible to disable the default sitemap.

That said, if you link your own sitemap in robots.txt, the fact that the default one still exists somewhere should not cause any issues.
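For reference, a Sitemap directive in robots.txt looks roughly like the snippet below; the host and sitemap path are placeholders, not the actual URLs from this community:

```
User-agent: *
Allow: /

Sitemap: https://community.example.com/custom-sitemap.xml
```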


Now, since we have the custom sitemap in the robots.txt file, external crawlers will be routed to the custom sitemap. But there are still a few external sites that rely on the site's sitemap and point directly to /sitemap.xml, which in our case has a different URL. How are these cases normally handled?


We don’t have a workaround in place.

As far as I know, the sitemap standard does not prescribe any preferred filename; if third-party websites hardcode a URL instead of reading it from robots.txt, that is on them.
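To illustrate that point, a consumer that respects robots.txt can discover the sitemap location without assuming /sitemap.xml. Here is a minimal sketch using Python's standard library (Python 3.8+); the community URL is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the site's robots.txt.
rp = RobotFileParser()
rp.set_url("https://community.example.com/robots.txt")
rp.read()

# site_maps() returns the list of Sitemap URLs declared in robots.txt,
# or None if no Sitemap directive is present.
sitemaps = rp.site_maps()
print(sitemaps)  # e.g. ['https://community.example.com/custom-sitemap.xml']
```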


Hi,

Posting an update here - unfortunately there is currently no way to remove the default sitemap, and there are no workarounds, as Bas mentioned. I will post another update here if that changes.

