In evaluating customer engagement, benchmarks tailored to the nuances of different user bases offer a vital perspective.
Completion rates from some of PX's top performers offer a benchmark, but success is relative—measured not just by broad numbers, but by the appropriateness of content to its audience and the context within which it is delivered.
Because the size of the engaged audience differs across user bases, completion rates are most indicative of success when paired with content type and user intent.
This approach allows for a clearer understanding of effective engagement and provides a more relevant yardstick against which to measure and drive user interaction.
With this in mind, let’s explore specific engagement metrics from Gainsight PX’s most successful customers, with a focus on how completion rates for various content types can inform best practices and strategic planning.
These benchmarks are taken from customers with 4-6k MAU (aside from the Dashboard Guide, which illustrates the variables that arise as you scale your user base).
Guide Engagement Metrics:
- Unused Feature Highlight: A strategy combining a video with tooltips activated for an unused feature garnered 230 views from a possible 4-6k MAU and yielded a 5.1% completion rate for the three-step guide, suggesting a targeted approach can effectively draw attention to underutilized features.
- Tool Renaming Alert: For informing users about renamed features, hotspots proved effective with 1k views and a robust 14% completion rate from 600 unique users, underscoring the value of direct and clear notifications within the tool.
Survey Engagement Metrics:
- NPS Banner: This survey type had 1k views with a healthy completion rate of 13-15%. We have seen this completion rate increase when the survey is launched as a banner that doesn't disrupt user workflows. For more information on starting with in-app NPS, check this post.
- Workflow-Specific CES: Focused surveys, such as the workflow-specific CES, demonstrated a higher engagement level with a 20-23% completion rate, albeit from only 99 views, indicating that when surveys are closely aligned with user workflows, they are more likely to elicit a response.
- CSAT Survey: Similarly, a general CSAT survey attracted 1k views and achieved a 17% completion rate, affirming its role in measuring customer satisfaction levels.
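As a rough sketch of how the figures above relate, both reach (views against MAU) and completion (completions against views) can be computed from raw counts. The helper below is illustrative, not a Gainsight PX API, and the completion count of ~12 is back-calculated from the quoted 5.1% rate:

```python
def engagement_rates(views: int, completions: int, mau: int) -> dict:
    """Compute reach (views / MAU) and completion (completions / views) rates."""
    return {
        "reach_rate": views / mau,
        "completion_rate": completions / views,
    }

# Illustrative numbers: the Unused Feature Highlight guide saw 230 views
# against a ~5k MAU base; a ~5% completion rate implies roughly 12 completions.
rates = engagement_rates(views=230, completions=12, mau=5000)
print(f"Reach: {rates['reach_rate']:.1%}, Completion: {rates['completion_rate']:.1%}")
```

Keeping reach and completion separate matters: a guide can complete well while reaching almost nobody, or reach everyone while almost nobody finishes it.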
Larger Customers - Guides with Video:
- Dashboard Guide: A video guide designed to explain dashboard functionalities captured 117k views within a larger customer’s environment, accompanied by a completion rate of only 0.03%.
- This completion rate reveals a tendency for user engagement to diminish as the user base scales. This data point is crucial for larger customers, indicating a need for more engaging or succinct content to maintain user attention throughout the engagement.
- It also speaks to the important question of how success is defined, as even a small increase in dashboard adoption for a user base that size can be a huge win for the teams responsible.
In conclusion, while the benchmarks provided offer a glimpse into the potential of targeted user engagement, the journey does not end with the analysis of completion rates.
The next pivotal step for any company is to track feature adoption meticulously. Establishing baseline adoption metrics tailored to the unique context of your user base will empower you to measure progress accurately and refine strategies accordingly.
By setting these benchmarks internally, you create a customized metric of success that reflects the specific goals and user dynamics of your product. This data-driven approach will ensure that engagement strategies are not just successful by general standards but are truly resonant and effective for your unique user community!
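One lightweight way to operationalize that baseline-tracking advice is to record a feature's adoption rate before an engagement launches and compare later periods against it. This is a hypothetical sketch; the numbers and the `adoption_rate` helper are placeholders, not product data:

```python
def adoption_rate(feature_users: int, mau: int) -> float:
    """Share of monthly active users who used the feature at least once."""
    return feature_users / mau

# Hypothetical before/after measurement for a dashboard feature.
baseline = adoption_rate(feature_users=180, mau=5000)      # pre-guide baseline
after_guide = adoption_rate(feature_users=260, mau=5000)   # post-guide period

lift = (after_guide - baseline) / baseline
print(f"Baseline {baseline:.1%} -> {after_guide:.1%} ({lift:+.0%} relative lift)")
```

Expressing the change as relative lift against your own baseline, rather than against an external benchmark, is what makes the metric meaningful for your specific user base.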