
Introduction:

Hey everyone! I'm excited to share our journey in automating our end-to-end process for self-adoption assessments and scorecard updates. This initiative has significantly improved our efficiency and accuracy, and I believe our learnings can help you too! We've streamlined everything from survey delivery to scorecard updates, and I'm going to walk you through how we did it.

Our Automated Process: An Overview

Our automated process has several key steps:

  • Survey Creation and Delivery: We kick things off by creating and delivering surveys to our customers.
  • Survey Completion: Once the customer completes the survey, the process moves forward.
  • Backend Data Management: This is where we handle the data behind the scenes.
  • Manual Intervention (as needed): For our high-touch customers, we've built in a step for manual intervention to ensure accuracy.
  • Automation Engine: The rules and logic that drive the automation.
  • Scorecard Updates: Finally, updating the scorecards within Relationship 360.

Delivery Methods: High Touch vs. Low Touch

Like many of you, we have different service delivery methods:

  • Scaled Method (Low Touch/No Touch): For these customers, we use Gainsight Journey Orchestrator to send surveys with an initial invite and a reminder.
  • High Touch Method: Here, CSMs have more control. A CTA is triggered, allowing them to decide when and how to deliver the survey through email assist tasks within the playbook. This gives them the ability to target specific contacts and personalize their communication to better align to their current customer engagements.

Key Tip: We built the survey at the Relationship level, which is how our Products are mapped in Gainsight. As a result, the contacts used in either delivery method need to exist in the Relationship Person object.

Internal Notifications and Survey Responses

  • Once a survey is completed, a JO program sends internal notifications to our team with a link to the survey response. This lets the CSM view the entire response, including who completed the survey and for which company and product, alongside a comprehensive view of all responses.
  • These notifications also include links to a Wiki site with process details, next steps, and best practices, as well as a link to the survey response itself. 

Data Management: The Backbone

Data management is crucial! Here are some of the ways we handled it:

  • Low Volume Object: We used a low volume object to mirror the survey design and allow for editable fields. This provides flexibility for managing updates. These objects are also used to create reports.
  • Custom R360 Layout: We created a custom R360 layout so CSMs can easily see adoption assessment results and related reports, specific to the product.

Key Tip: This step is only needed because we wanted to add the next step for manual intervention in our high-touch accounts. 
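To make the mirroring idea concrete, here is a minimal Python sketch of how a survey response could be copied into an editable record. The field names and structure here are hypothetical illustrations, not the actual low volume object schema; in Gainsight this shape would be configured on the object itself.

```python
def make_assessment_record(survey_response):
    """Mirror a survey response into an editable record.

    Each question gets two fields: the original answer (left untouched)
    and an editable copy that defaults to the original value.
    """
    record = {
        "relationship_id": survey_response["relationship_id"],
        "respondent": survey_response["respondent"],
        "included": True,  # by default, every response counts
    }
    for question, answer in survey_response["answers"].items():
        record[f"{question}_original"] = answer   # preserved survey data
        record[f"{question}_override"] = answer   # CSM-editable copy
    return record

# Hypothetical response used only to illustrate the record shape
resp = {
    "relationship_id": "REL-001",
    "respondent": "jane@example.com",
    "answers": {"uses_feature_x": "Yes", "uses_feature_y": "No"},
}
rec = make_assessment_record(resp)
```

Keeping an original/override pair per question is what makes reporting and later corrections possible without losing the raw survey data.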

Manual Intervention: Ensuring Accuracy

  • We recognized that high-touch customers often have complex organizations, and survey respondents may not always have a complete understanding of every product feature.
  • To address this, we built in a manual intervention step that allows CSMs to modify survey results without altering the original survey data.
  • By default, all survey responses are included, but CSMs can exclude entire responses or modify answers to specific questions.

Real-World Example: We had a customer who answered "yes" to using a feature, but the CSM knew they weren't using it. The manual intervention step allowed the CSM to correct this, ensuring accurate scorecard calculations.
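As a hedged sketch of that default-include, override-wins behavior (the field names are invented for illustration; the real logic lives in the low volume object and rules):

```python
def effective_answers(records):
    """Resolve what feeds the scorecard: excluded responses are
    dropped entirely, and a CSM override replaces the original answer.
    The original survey data is never modified."""
    resolved = []
    for rec in records:
        if not rec.get("included", True):
            continue  # CSM excluded this response
        answers = dict(rec["original_answers"])   # copy, don't mutate
        answers.update(rec.get("overrides", {}))  # overrides win
        resolved.append(answers)
    return resolved

records = [
    {"original_answers": {"uses_feature_x": "Yes"},
     "overrides": {"uses_feature_x": "No"}},      # CSM corrected the answer
    {"original_answers": {"uses_feature_x": "Yes"}, "included": False},
]
print(effective_answers(records))  # [{'uses_feature_x': 'No'}]
```

The key design point is that corrections are layered on top of the raw responses rather than overwriting them, which preserves the audit trail.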

Automation Engine: Keeping It Running

The automation engine is what keeps the whole process running smoothly.

  • Data Designer: We use a data designer to create weighted averages for each product feature, based on the survey results.
  • Rules Engine: We created a multi-step rule chain to automate the process. The first rule populates our low volume object from the initial set of survey data; the second applies the calculations defined in our Data Designer to that data and loads the results into the Relationship scorecard measures.
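For illustration, the weighted-average idea behind the Data Designer step can be sketched in a few lines of Python. The weights and the 0–5 rating scale here are assumptions for the example, not our actual configuration:

```python
def feature_score(responses, weights):
    """Weighted average of numeric answers for one product feature.

    responses: list of answer values (e.g. 0-5 adoption ratings)
    weights:   per-question weights, same length as responses
    """
    total_weight = sum(weights)
    if total_weight == 0:
        return None  # nothing scorable for this feature
    return sum(r * w for r, w in zip(responses, weights)) / total_weight

# e.g. three questions about one feature, the last counting double
score = feature_score([4, 2, 5], [1, 1, 2])
print(score)  # 4.0
```

In practice the same calculation is expressed declaratively in Data Designer, and the second rule in the chain simply loads the resulting averages into the scorecard measures.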

Scorecard Updates: The Final Step

  • The rule that loads the scorecard measures runs weekly, so there may be a delay before an updated manual override is reflected in the scorecard.

  • To maintain transparency and provide valuable context, each update triggers a detailed timeline entry, explaining the reason for the change and the contributing average response. This clear audit trail empowers our CSMs with a comprehensive understanding of the data driving the scorecard.

Key Takeaways:

  • Automation can significantly streamline self-adoption assessments and scorecard updates.
  • Providing options for both low-touch and high-touch customers is essential.
  • Manual intervention capabilities are crucial for ensuring data accuracy, especially with complex customers.
  • Gainsight’s low volume objects, data designer, and rules engine are powerful tools for building this type of automation.

I hope this overview is helpful! I'm happy to answer any questions you have.

@Carol_Keyes - This is gold!

Thank you for sharing! 


@Carol_Keyes This is extremely helpful. Thanks for sharing this! 


I am so proud of our technical team @BMC who were able to develop and execute on this vision!!!  Such a great partnership between the Business and Operations!


@Carol_Keyes We have a very similar setup for our Healthfest program.

Technically speaking, the problem I find is the Gainsight’s survey’s lack of connectivity with data management. Pushing survey data with rules is… 

Ideally, we should be able to sync questions with fields in data management to skip the need for a rule, which becomes a nightmare whenever a multi-picklist is involved (I exaggerate: whenever any picklist, single or multi, no discrimination, because maintaining picklists in both MDA and Surveys is 😰, country picklists especially 🙅). Ultimately, the data we collect is always going to end up in either a standard object or a custom object. It won't just sit in survey flattened objects… NPS doesn't. Why would others?

We should be able to link questions to MDA fields so responses sync on their own into whatever object we determine. At the very least, picklist questions should be connected to data management picklists.

@lloonker what we discussed the other day


This was a great read, and an interesting approach, allowing an "override" to correct those inaccuracies. Thanks for sharing!
And 100% agree with you @alizee! I went looking for the idea to link here, and of course I had already voted on it, and it was yours 🤣

 

 

