As Gainsight continues to expand its suite of native AI tools, such as Write with AI, Cheat Sheet, CoPilot, and Staircase, it would be incredibly valuable to include OOTB analytics and insights into the internal adoption of these AI features.
Today, admins are often left relying on anecdotal or qualitative feedback from users to understand how (or if) these AI features are being leveraged. However, this user-reported data can be inconsistent and limited. Having clear, quantitative insights directly in Gainsight CS would help admins:
- Evaluate adoption trends of specific AI tools
- Identify which AI features are driving engagement or saving time
  - Both of these become increasingly relevant as organizations seek to measure the value of AI tools
- Understand where further training, enablement, or awareness might be needed
- Track usage over time and by user/team for deeper operational visibility
While any reliable OOTB reporting on AI feature-specific usage would be a significant improvement, a logical approach would be to expand the Gainsight 360 > Feature Usage dataset to include detailed tracking for native AI tool usage. Ideally, each AI capability (Write with AI, Cheat Sheet, CoPilot, etc.) would be broken out as its own enum value so admins can view individual adoption rates and trends.
It would also be helpful to have the ability to build reports and dashboards using this AI utilization data, and potentially tie it to outcomes like CTA closure rates, timeline activity, or overall productivity metrics.
As AI becomes a bigger part of the Gainsight experience, it’s crucial for admins (and their organizations) to be able to measure its actual adoption and impact internally. These insights would help admins better understand how their CS teams are engaging with AI, where there may be resistance or untapped value, and how Gainsight’s AI investments are translating into daily usage, efficiency, and ROI.