
These are the core tenets that I’ve picked up about how to measure the effectiveness of a CS Ops team. How do you measure your CS Ops team’s success?

Team Throughput

It starts with tracking simple CS Ops activity. You can use a metric like features deployed. Since a large portion of our CS Ops team uses sprints, we can show "points delivered". @kendra_mcclanahan (Gainsight’s Director of CS Ops) has even built mechanisms in Monday.com that compare expected vs. actual points and report them broken down by high-level strategic initiatives and major areas of run-the-business responsibility.
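
(If you don’t have a tool doing that rollup for you, a back-of-the-napkin version is pretty simple. A minimal sketch, assuming a hypothetical export of sprint items; the initiative names and fields below are placeholders, not how Kendra’s Monday.com board is actually set up.)

```python
from collections import defaultdict

# Hypothetical sprint data: (initiative, expected_points, actual_points).
# In practice this would come from whatever tool tracks your sprints.
sprint_items = [
    ("New CSM onboarding", 8, 5),
    ("Health score revamp", 13, 13),
    ("Run-the-business: admin requests", 5, 8),
]

# Roll points up by initiative so you can report expected vs. actual.
totals = defaultdict(lambda: {"expected": 0, "actual": 0})
for initiative, expected, actual in sprint_items:
    totals[initiative]["expected"] += expected
    totals[initiative]["actual"] += actual

for initiative, t in totals.items():
    pct = t["actual"] / t["expected"] * 100
    print(f"{initiative}: {t['actual']}/{t['expected']} points ({pct:.0f}% of plan)")
```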

Leading Indicators of Impact

You’ll then want to start collecting signs that your CS Ops team is actually making a difference. For example, @jennifer_bruno (Gainsight’s leader of CS Enablement & Transformation) runs a survey of our CSMs about how enabled they feel, how useful our health scores are, and so on. She coordinates with the rest of CS Ops on which areas of impact we want to track. You might even have leading indicators tied to specific projects. For example, if a short-term initiative is to improve new CSM onboarding, you might, for a period of time, track how soon those CSMs have a first strategy-alignment call with customer decision-makers.
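
(Purely as an illustration of how that last indicator could be tracked; the CSM names, dates, and "first strategy-alignment call" field are made up.)

```python
from datetime import date
from statistics import median

# Hypothetical data: for each newly onboarded CSM, their start date and the
# date of their first strategy-alignment call with a customer decision-maker.
new_csms = {
    "csm_a": (date(2024, 1, 8), date(2024, 2, 2)),
    "csm_b": (date(2024, 1, 15), date(2024, 2, 28)),
    "csm_c": (date(2024, 2, 5), date(2024, 2, 26)),
}

days_to_first_call = [
    (first_call - start).days for start, first_call in new_csms.values()
]
print(f"Median days to first strategy-alignment call: {median(days_to_first_call)}")
```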

Lagging Outcomes

CS Ops is essentially targeted at improving 4 things:

  • CS efficiency (account ratios)
  • Retention rate (GRR)
  • Expansion rate (NRR)
  • Advocacy (new logo $$$ influenced)

Every CS Ops team has different focuses, and a given team's focus will change over time, depending on the company's strategic priorities. For example, if you acquire a new company, “CS efficiency” could jump from being last priority to first, in a heartbeat.
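
(For anyone newer to the GRR/NRR math above, here’s roughly how those two shake out from ARR movements. A simplified sketch with made-up numbers; exact definitions vary a bit by company.)

```python
# Hypothetical ARR movements over a 12-month window (all in $)
starting_arr = 1_000_000
churned_arr = 60_000       # customers who left entirely
contraction_arr = 20_000   # downgrades from retained customers
expansion_arr = 120_000    # upsell/cross-sell from retained customers

# Gross Revenue Retention: ignores expansion, so it can't exceed 100%
grr = (starting_arr - churned_arr - contraction_arr) / starting_arr

# Net Revenue Retention: includes expansion, so it can exceed 100%
nrr = (starting_arr - churned_arr - contraction_arr + expansion_arr) / starting_arr

print(f"GRR: {grr:.1%}")  # 92.0%
print(f"NRR: {nrr:.1%}")  # 104.0%
```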

The best way I've heard to measure lagging outcomes is by comparing cohorts against each other. Sticking with the example from before: if you embarked on new CSM onboarding improvements because you felt that was the biggest reason your retention was struggling, you can track the retention rate of customers owned by CSMs who went through that program versus customers owned by CSMs who did not. Or, in the case of my own work on community-building (since I sit within our broader CS Ops team), I’m working to pull together data on retention, expansion, and so on for customers who engage in our community versus those who don’t.
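
(To make the cohort idea concrete, here’s a rough sketch of that comparison. The customer records and the simple capped-GRR calculation are assumptions for illustration; in practice this data would come out of your CS platform or data warehouse.)

```python
# Hypothetical customer records:
# (csm_in_onboarding_program, starting_arr, retained_arr_one_year_later)
customers = [
    (True, 100_000, 100_000),
    (True, 50_000, 45_000),
    (False, 80_000, 60_000),
    (False, 120_000, 120_000),
]

def cohort_grr(rows):
    start = sum(s for _, s, _ in rows)
    # Cap retained ARR at starting ARR so expansion doesn't sneak into GRR.
    end = sum(min(e, s) for _, s, e in rows)
    return end / start

in_program = [r for r in customers if r[0]]
not_in_program = [r for r in customers if not r[0]]

print(f"GRR, customers owned by CSMs who went through the program: {cohort_grr(in_program):.1%}")
print(f"GRR, customers owned by CSMs who did not:                  {cohort_grr(not_in_program):.1%}")
```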

Totally agree with this! Here are some indicators I’ve been playing with as well. Looking forward to hearing from others!

-CSM Reach. We centralize our CSM questions in one CS Ops Slack channel. Looking at the number of questions and issues we’ve answered and addressed there was insightful: it showed how much time we saved, while also broadcasting knowledge out to the entire team and identifying gaps in our internal knowledge base.

-Documentation Reach. Last quarter I reported on how many knowledge articles we created and updated. It’s a quieter indicator, but more (and more current) documentation means CSMs spend less time searching for information.

-Collaboration. While our number one stakeholder will always be CS, we are key collaborators with multiple other teams. We reported on how much we delivered not only for CS but for other teams within the company. It can sometimes be less visible and tangible to CS leadership, but it’s an important part of measuring our effectiveness.


@owbou Such great ideas! It reminds me that we’re also covering a couple of those:

  • CSM Reach: We have Halp integrated with a Slack channel for all questions about our own implementation of Gainsight, which both helps manage the flow and gives us reporting and categorization of what comes up. I just haven’t seen what reports @kendra_mcclanahan actually pulls from it.
  • Documentation Reach: We’ve thrown ourselves as a company into using Coda for our internal guidebook, and @jennifer_bruno has been leading an effort to build out section after section for CSMs. There’s a fuzzy line, though, between this metric and “sprint points delivered”. Your point makes me wonder whether Coda could give us usage metrics. We do track documentation usage (and “Knowledge Check” quiz completion) as part of any major new workflow that we’re rolling out to CSMs.

Thank you both @seth and @owbou for sharing!! 
Our CS Ops team is very new and I’m newer, haha, so we are eager to learn how experienced folks manage and measure the outcomes of their CS Ops teams 😊


@AnnieMo @jenniferpa @kerri.lindstrom Thought you might be interested in this post.

