Question

Gainsight's approach to GEO

  • January 21, 2026
  • 3 replies
  • 65 views

Suvi Lehtovaara

I would like to hear what Gainsight's approach to GEO is, meaning optimizing community sites for AI bots and crawlers. We are heading towards, or perhaps already living in, the era of zero clicks: instead of users clicking through, search tools (Google, ChatGPT, etc.) source answers directly from sites, and people are less and less likely to visit our communities.

I would like to hear:

  • how you are making sure it is easy for the (key) AI tools to pick up the contents of your sites?
  • how key AI tools can discover the contents of the site through the Google Index / Google Search API?
  • how AI tools can fetch new data from the site?

3 replies

mitchell.gordon

AI tools and bots are actively scraping your site if it's public. You can see it in the metrics for page views and unique viewers. Gainsight has confirmed to me that LLMs on VPNs have scraped my community multiple times. The only recommendation they offered was to block all IPs from certain geographic regions.

 

It just happened again on the 12th for my community: double the page views and unique viewers of any other day in the history of our site. We have to explain these data spikes to management, so we have started reporting on additional numbers to account for the artificially inflated metrics.

 

To my knowledge, there is no solution for this. Gainsight would have to filter out the data, and that's a gray area: some will want to see it, others won't; some will say part of the data is bad, others will say it's good. I'd recommend getting together with a BI team to look at your raw API data and filter out what you'd like.
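To illustrate the kind of filtering a BI team could apply, here is a minimal Python sketch. The record shape (dicts with a user_agent field) and the bot signatures are my assumptions for illustration, not a Gainsight schema; you would adapt both to whatever your raw API data actually returns.

```python
# Hypothetical sketch: split raw page-view records into human vs. likely
# AI-crawler traffic based on user-agent substrings. Signatures and record
# shape are assumptions, not any official Gainsight format.

AI_BOT_SIGNATURES = [
    "gptbot", "oai-searchbot", "chatgpt-user",
    "claudebot", "claude-user", "claude-searchbot",
    "perplexitybot", "google-extended", "applebot-extended", "ccbot",
]

def is_ai_bot(user_agent: str) -> bool:
    """Return True if the user-agent string matches a known AI crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in AI_BOT_SIGNATURES)

def split_traffic(records):
    """Split raw page-view records into (human, bot) lists."""
    human, bot = [], []
    for rec in records:
        (bot if is_ai_bot(rec.get("user_agent", "")) else human).append(rec)
    return human, bot

# Example with made-up records:
raw = [
    {"page": "/topic/1", "user_agent": "Mozilla/5.0 (Windows NT 10.0)"},
    {"page": "/topic/1", "user_agent": "Mozilla/5.0 (compatible; GPTBot/1.2)"},
    {"page": "/topic/2", "user_agent": "Mozilla/5.0 (compatible; ClaudeBot/1.0)"},
]
human, bot = split_traffic(raw)
print(len(human), len(bot))  # 1 2
```

Reporting the two counts side by side would let you show management the spike and the bot-excluded trend in the same chart.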


Graeme Rycyk
  • Gainsight Product Manager
  • January 22, 2026

Hey ​@Suvi Lehtovaara,

@mitchell.gordon has a great answer here, but let me post my thoughts and some steps that can be taken today to help with this.

I run Product for our AI and Search teams on Customer Community, and this has been on my mind for a while now.

llms.txt has been discussed as a new standard for directing AI crawlers to a site, but since it is not yet a standard (and might never become one), the best place to handle this is still what you include in robots.txt. It is not a guarantee, but it helps, or at least does not restrict, AI crawlers accessing the data found on your community.

Some recommendations for AI Crawler rules inside robots.txt:

# ===========================================
# AI CRAWLERS
# ===========================================

# --- OPENAI ---
User-agent: GPTBot
Allow: /
User-agent: OAI-SearchBot
Allow: /
User-agent: ChatGPT-User
Allow: /

# --- ANTHROPIC ---
User-agent: ClaudeBot
Allow: /
User-agent: Claude-User
Allow: /
User-agent: Claude-SearchBot
Allow: /

# --- PERPLEXITY ---
User-agent: PerplexityBot
Allow: /

# --- GOOGLE ---
User-agent: Google-Extended
Allow: /

# --- APPLE ---
User-agent: Applebot-Extended
Allow: /

# --- COMMON CRAWL ---
User-agent: CCBot
Allow: /


Your robots.txt can be edited in Control at {your-community-name}.insided.com/seo. However, for a use case of maximum AI visibility, I would suggest using the wildcard instead: a long explicit list per bot is just noise unless you need specific per-bot rules.
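For reference, the wildcard version collapses the per-bot list above into a single rule that applies to every crawler (the Sitemap line is optional and only useful if your community exposes a sitemap; its path is an assumption here):

```
# Allow all crawlers, AI or otherwise, full access
User-agent: *
Allow: /
```

Per RFC 9309, a crawler uses the most specific matching User-agent group, so keep in mind that any explicit per-bot group you leave in place will override the wildcard for that bot.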

The bottom line: GEO is currently about 80% traditional SEO. With good technical SEO, fast page speed, server-side rendering, FAQ / Organization schema, content properly structured for extraction, and fresh content, everything that worked before should still apply to LLMs.
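As an example of the FAQ schema mentioned above, here is a minimal JSON-LD FAQPage snippet using the schema.org vocabulary (the question and answer text are placeholders; this would go in a script tag of type application/ld+json on the page):

```
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How do I reset my password?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Go to Settings > Account and click Reset password."
    }
  }]
}
```

Structured markup like this gives both search engines and LLM crawlers an unambiguous question-and-answer pair to extract, rather than forcing them to infer it from thread layout.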


Cheers,

Graeme


Meredith.E
  • Gainsight Employee ⭐️
  • January 22, 2026

@Suvi Lehtovaara this is top of mind for me today, so I was happy to see your post. I'll leave all the technical directions to ​@Graeme Rycyk because he's awesome at that side of things.

From a non-technical lens we’ve been discussing this and how community leaders can ensure their community content is surfaced across LLMs.

LLMs strongly favor:

  • Short, declarative answers
  • Clear problem → solution structure
  • Explicit “best answer” signals
  • Repetition of the same answer pattern across pages
  • Brand authority markers (schema, navigation, About context)

They struggle with:

  • Long comment threads
  • Opinion battles
  • Buried answers
  • Inconsistent terminology
  • No canonical “final” answer

As a community manager, I’d first ask if there are any tools in place (typically in marketing) that are measuring the LLM metrics today across the .com or others and see if Community can be added as a domain they are tracking. There are several 3rd parties specializing in these analytics today.

Getting a baseline prior to enhancing the community for LLMs forms the narrative with data.

Second, I’d use Gainsight’s Moderation AI Agent to ensure AI pre-moderation is happening in my community. There may be a good amount of time savings in leveraging this tool, which is available in CC today.

Third, I’d leverage the AI summaries in community moderation to see what’s new and conversations that need a best answer. Ensuring best answers on posts and discussions is a quick win to ensure content is surfaced on LLMs. 

The good thing about Gainsight is it’s already structured for SEO and LLM GEO content surfacing (assuming your community is not gated).  Having the best answer may help the LLMs pick up your community content in its surfaced results. 

Fourth, I’d go back to the tool tracking LLM and see how the data results have changed.

I'm interested to learn more about what you are all implementing, how you are measuring it, and any other tips and tricks you find along the way. I’d love to hear about any external tools you are using in your org to get data across LLM/GPT search as well, as I’ve been looking at a few. ​@mitchell.gordon, are you all using any 3rd-party tools to measure content surfacing on the LLMs?