Since the change in functionality where clicks on the inline survey question are no longer partially saved, we are seeing MAJOR drops in survey clicks and submissions. Respondents appear to believe they have responded to the survey just by clicking a number in the inline question, mainly because the number they select carries over to the new screen. 

 

Recommendation: Allow us to choose whether we want those to be partially saved (or mark those responses uniquely). I understand the reason the change happened (to avoid tracking false responses from tools like Mimecast), but it wasn’t widely communicated and is therefore causing a lot of internal challenges. My CX team is unhappy with the rates, and we are seeing the lowest submitted rates in 8 years. At this point, without the option to change this, our CX team is no longer interested in using GS for NPS sends. 

When was this change implemented, exactly? It may explain our drop-off-the-cliff metrics when we tested embedding a question.

Regardless, how are there no system mechanisms that can distinguish bot clicks from human clicks? If you are able to do this to minimize false-positive reporting on email opens and clicks, should it not also be possible to differentiate an automated click from someone responding to the survey? 
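For context, systems that separate security-scanner clicks from human clicks typically rely on simple heuristics such as click latency, user agent, and click pattern. A minimal sketch of that idea, with entirely hypothetical field names and thresholds (not Gainsight's actual implementation):

```python
from datetime import timedelta

# Hypothetical heuristic filter: flag clicks that look like automated
# link scanning rather than a human response. All names are illustrative.
KNOWN_SCANNER_AGENTS = ("Mimecast", "Proofpoint", "Barracuda")

def looks_like_bot_click(click: dict, email_delivered_at) -> bool:
    # Security scanners typically follow links within seconds of delivery,
    # before a human could plausibly have opened the email.
    if click["clicked_at"] - email_delivered_at < timedelta(seconds=10):
        return True
    # Scanner user agents often identify themselves.
    if any(agent in click.get("user_agent", "") for agent in KNOWN_SCANNER_AGENTS):
        return True
    # A scanner tends to fetch every link in the message; a human rarely does.
    if click.get("all_links_in_email_clicked", False):
        return True
    return False
```

In principle, the same filter applied to email-open/click analytics could be applied to inline survey clicks, since the underlying click event is the same.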

This is a terrible change in feature behavior that wasn't widely publicized.


What?! This completely caught me off guard.

Our NPS email survey audience is already pretty small (we do most via in-app), and the response rates have always been tiny, but this is really bad news. We'll definitely have to refrain from using inline questions moving forward.

FYI ​@Molly.McQ ​@vineshgvk 


Well that’s an unwelcome development.

The very least should have been wide, loud, and clear communication about this change!


This is really frustrating, and as ​@alizee noted, it’s surprising that this change wasn’t more widely communicated, especially given how significantly it impacts emailed survey programs. The email audience is tough enough to reach as it is, and based on general community feedback, I know we aren’t alone in this. 

At a minimum, Gainsight should have flagged this more clearly and, ideally, gathered feedback from the product council or admin community before rolling it out. Giving teams the option to capture and identify partial responses, rather than removing that functionality altogether, feels like a more balanced approach that mitigates the risks without losing valuable insights from real users or sacrificing visibility into relevant engagement details. I would really like to understand the decision-making process behind this change. 


“At this point, without the option to change this, our CX is no longer interested in using GS for NPS sends.” 

 ​@dstokowski ​@manu_mittal I don’t know who the PM is for this, otherwise we’d tag them, but this seems like a not insignificant issue and a legit concern. 

I’ve been trying to convince my org to use this instead of SurveyMonkey - now I’m less inclined to do so.


I wanted to let everyone know that my team made a fuss with our CSSC and asked GS engineering to pull the data from the initial inline question click, but we were told they could not. They also mentioned in the email back to us that: 

“This behavior was modified a few years ago due to concerns raised by some customers. Many customers use cybersecurity tools (e.g., Mimecast by GS) to verify email content. These tools may open links within the email to check for phishing threats before the email reaches the actual recipient. If such tools open an inline survey link, the response could be inadvertently recorded in Gainsight before the user even sees the email.

To prevent this, we removed the auto-save feature for responses triggered by clicking an inline survey question from the email. Now, Gainsight only captures the response if the user opens the survey page and either clicks the submit button, answers another question, or changes their answer.”

 

I am surprised by their response that this change happened “a few years ago,” given the reaction from this group. 


Some food for thought regarding inline NPS questions: our team recently did some analysis and found that customers who score us a 0 on an NPS survey are 38% more likely to churn. This data has caused a flurry of new activity and actions in our CS org, making scores of 0 extremely significant and important information for us. 

Who’s most likely to give up on submitting a survey when it opens in a new window?
Probably the angry ones who were most inclined to give us that 0 score. 



Just bouncing off of your last note ​@dcassidy. Global benchmarks show angry people are more vocal than happy people, so I would actually say the angry people will go all the way through regardless of how cumbersome it might be. 



I’d be hard-pressed to believe this timeline and, to reiterate: if you can detect bot behavior and remove it from email analytics, why can’t you do the same for survey engagement (it’s the same click behavior)? The change to email analytics was released in August 2024. One exception to consider: perhaps they didn’t have a detection solution at the time, did in fact remove the feature, and simply ignored email metrics until they had one?


Just to clarify, this change was made in 2019 when we started seeing a spike in fake clicks being saved as NPS responses (mostly ‘0’ since it’s the first option). Now that we have bot-click detection, totally agree we should improve this. We’re on it!


I had no idea about the bot click detection! My assumption was that no such feature existed.

When looking at email click data in Gainsight, I found that some Companies had every recipient click every link in every email that was sent.

Intuitively, I felt that this was highly unlikely, and that the simplest explanation was bot-generated clicks. Perhaps there is another explanation?
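That sanity check is easy to run yourself if you can export click counts per company. A quick sketch, with a hypothetical data shape, that flags companies whose total clicks equal or exceed what "every recipient clicked every link in every email" would produce:

```python
# Flag companies whose click volume matches the "everyone clicked
# everything" pattern -- far more consistent with automated link
# scanning than with human behavior. Data shape is hypothetical.
def suspicious_companies(click_log: dict) -> list:
    flagged = []
    for company, stats in click_log.items():
        # Maximum clicks a fully engaged human audience could produce.
        expected_if_all_clicked = (
            stats["recipients"] * stats["emails"] * stats["links_per_email"]
        )
        if stats["total_clicks"] >= expected_if_all_clicked:
            flagged.append(company)
    return flagged
```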

Best,

Ben Wanless



Probably that bot click detection isn’t working perfectly just yet.