I was asked to provide response rate data to our board earlier this month. During this process, I started to notice some inconsistencies in how the "Response Rate" is calculated under the Surveys section. I think I have identified the issue, and here is my example:
Participants: 1000
# of Survey Emails: 2 (initial and follow up)
Maximum # of Emails: 2000
Say that 100 people reply to the survey and all of them wait to reply until they receive the follow-up email. This would mean we have 2000 emails sent and 100 survey responses. I would expect the response rate for this survey to be 10% (100 submissions divided by 1000 invited participants).
HOWEVER, in a scenario like this I am seeing the response rate reported as 5% (100 responses divided by 2000 emails sent). In my opinion, the denominator should be the number of participants, not the number of emails.
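To make the difference concrete, here is a minimal sketch (plain Python, nothing Gainsight-specific, using the hypothetical numbers above) of the two calculations:

participants = 1000          # people invited to the survey
emails_per_participant = 2   # initial email plus follow-up
responses = 100              # completed surveys

emails_sent = participants * emails_per_participant   # 2000

# Expected: denominator is the number of invited participants.
rate_by_participants = responses / participants        # 0.10 -> 10%

# Observed: denominator is the total number of emails sent.
rate_by_emails = responses / emails_sent               # 0.05 -> 5%

print(f"By participants: {rate_by_participants:.0%}")  # 10%
print(f"By emails sent:  {rate_by_emails:.0%}")        # 5%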
Am I missing something?
Do others see this as well?
Thanks!
You are absolutely correct, Ben. That is how it should work. I'm requesting that Product Management weigh in on plans to address this.
Thanks Tracy! I found a way around it with some reports and Excel magic, but it's not very handy. Our VP wants quick access to Response Rates for our board, so keeping this manageable for me is important.
Hi Ben,
Can you elaborate on where you see this discrepancy?
I have attached a screenshot from Advanced Outreach analytics for a survey, and as you can see, the survey response rate is calculated based on the number of participants, not the number of emails sent. In this case, 20 participants responded out of the 108 participants who were sent the email, which comes out to 19%.
Thanks
Abhishek S
Hey Abhishek,
Thanks for the note! That actually makes a lot more sense when looking at the Advanced Outreach side of things. Based on a review of your notes/screenshot and our system, I think the AO section is correct and easily referenceable.
I had been looking at the "Analyze" tab inside the Survey area. I still feel that this may be broken. The first header there is "Sent"; the next is "Responded". There are no issues for a survey with one email in the chain. However, if a chain can send multiple "Sent" emails to a single participant, you cannot trust the Response Rate calculated here. Most of our survey workflows have an initial email and a delayed follow-up if the participant has not completed the survey. This dilutes the pool and artificially lowers the Response Rate calculated under Surveys >> your survey in question >> Analyze (see the sketch after this reply).
I am very happy to see that, for the time being, I can reference the AO section. I would still encourage more conversation around this because, for our org, we don't always have a 1:1 ratio of Survey : AO. Sometimes we have used the same survey in multiple AOs, which makes the response rate slightly more difficult to calculate. It would be nice if we could get the data from either the AO or the Survey itself (and have it be the same).
Thank you!
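To illustrate the dilution Ben describes, here is a minimal sketch of recovering the rate from raw send and response records by counting unique participants instead of emails. The DataFrame layout and column names (participant_id, email, etc.) are hypothetical, not the actual Analyze tab data model:

import pandas as pd

# Hypothetical email log: one row per email sent, so a participant who
# receives the initial email and a follow-up appears twice.
emails = pd.DataFrame({
    "participant_id": [1, 1, 2, 2, 3],
    "email": ["initial", "follow_up", "initial", "follow_up", "initial"],
})

# Hypothetical survey submissions: one row per completed survey.
responses = pd.DataFrame({"participant_id": [1]})

# Denominator as described for the Analyze tab: total emails sent.
rate_by_emails = len(responses) / len(emails)                # 1 / 5 = 20%

# Denominator most people expect: unique participants emailed.
unique_participants = emails["participant_id"].nunique()
rate_by_participants = len(responses) / unique_participants  # 1 / 3 = 33%

print(f"By emails sent:  {rate_by_emails:.0%}")
print(f"By participants: {rate_by_participants:.0%}")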
+1 to this idea, it would be really helpful if we could pull response rate data and aggregate across multiple Programs.
So I have developed some ways to do this via reports and rules. It's hacky, but doable.
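For anyone doing something similar from an export, here is a rough sketch of one way to aggregate response rates across Programs once sends and responses are in a flat file. The column names are hypothetical, and this is only the arithmetic, not the actual reports-and-rules setup described above:

import pandas as pd

# Hypothetical export: one row per email send, flagged with whether that
# participant ultimately submitted the survey.
sends = pd.DataFrame({
    "program": ["Q1 NPS", "Q1 NPS", "Q1 NPS", "Onboarding", "Onboarding"],
    "participant_id": [1, 1, 2, 7, 8],
    "responded": [True, True, False, True, False],
})

# Collapse to one row per (program, participant) so follow-up emails
# do not inflate the denominator.
per_participant = (
    sends.groupby(["program", "participant_id"], as_index=False)["responded"].any()
)

# Response rate per Program (as a fraction), and one rate across all Programs.
by_program = per_participant.groupby("program")["responded"].mean()
overall = per_participant["responded"].mean()

print(by_program)
print(f"Overall: {overall:.0%}")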