Here is an example of a robots.txt file with this set-up:
User-agent: *
Crawl-delay: 5
Sitemap: http://community.insided.com/feed/topics
Disallow: /search/activity/recent
Disallow: /search/activity/sincelastvisit
Disallow: /search
Disallow: /topic/new
Disallow: /inbox/overview
Disallow: /inbox/conversation?
Disallow: /badges
Disallow: /search/activity/unanswered
Disallow: /example-category-to-not-crawl-44
Disallow: /moderation-board-45
Disallow: /members/
Allow: /media/terms-and-conditions.pdf
We have also included Disallow rules covering one of the most requested items: keeping various unimportant platform pages out of Google search results. This is especially useful for categories and content you would prefer not to appear in Google results.
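If you want to sanity-check which paths are actually blocked before relying on these rules, you can run the file through a robots.txt parser. Below is a minimal sketch using Python's built-in urllib.robotparser; the community URL is the example one used in this article, so swap it for your own.

import urllib.robotparser

# Point the parser at your community's robots.txt file.
# Replace the example domain with your own community URL.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://community.insided.com/robots.txt")
rp.read()

# Check whether the generic crawler rules ("*") block a given path.
for url in [
    "http://community.insided.com/search",
    "http://community.insided.com/inbox/overview",
    "http://community.insided.com/media/terms-and-conditions.pdf",
]:
    print(url, "allowed" if rp.can_fetch("*", url) else "blocked")

With the example robots.txt above, the first two URLs should report as blocked and the PDF as allowed.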
Also note that the URL for the XML feed follows the same pattern for all communities: you can access it by adding /feed/topics to the end of your community URL, for example: http://community.insided.com/feed/topics
The RSS feed contains the most recent items added to the community; however, it cannot be customised.
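To see what this feed exposes to crawlers, you can fetch it and list its entries. The sketch below assumes a standard RSS layout (channel > item elements with title and link children); the exact element names in the inSided feed may differ, so treat them as assumptions and adjust as needed.

import urllib.request
import xml.etree.ElementTree as ET

# Build the feed URL by appending /feed/topics to your community URL,
# as described above. Replace the example domain with your own.
feed_url = "http://community.insided.com/feed/topics"

with urllib.request.urlopen(feed_url) as response:
    tree = ET.parse(response)

# Assuming a standard RSS structure: <rss><channel><item>...</item></channel></rss>
for item in tree.getroot().iterfind("./channel/item"):
    print(item.findtext("title"), item.findtext("link"))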
This tutorial follows on from our article on constructing a robots.txt file: https://community.insided.com/how-to-s-37/how-to-setup-a-robots-txt-file-1358