
Hi folks,

Do you use surveys for your visitors or members to improve your community’s user experience? How have they worked?

We do, and I think many of you do as well.

Background:

We have run our community for years. We started with pop-up surveys. The first tweaks we made were to the trigger time, i.e. how quickly the survey appears after a user lands on the community.

After those tweaks we got less feedback that the survey was annoying (“I just wanted to read the topic and the pop-up filled my screen!#%&!”). So the tweaks we made were OK.
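
For anyone wondering what the timing tweak amounts to technically, it is roughly this kind of delayed trigger - a minimal sketch in browser TypeScript, where the delay value and the element id are placeholder examples, not our real setup:

```ts
// Minimal sketch of a delayed survey pop-up trigger.
// The 30-second delay and the #survey-popup element are example values only.
const SURVEY_DELAY_MS = 30_000; // don't interrupt someone who just landed

function showSurveyPopup(): void {
  const popup = document.getElementById("survey-popup"); // hypothetical element
  if (popup) popup.style.display = "block";
}

// Arm the timer only once the page has loaded, so the survey appears a fixed
// time after the user starts reading, not a fixed time after navigation.
window.addEventListener("load", () => {
  window.setTimeout(showSurveyPopup, SURVEY_DELAY_MS);
});
```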

Most of the feedback was about our products; only a little was about the community itself. That is sad if you are the community captain, though of course it is fine for the company.

Then we made new changes to the survey. We chose to get rid of the pop-up survey and added a floating feedback button to each and every community page. After this change, the amount of feedback dropped dramatically. I understand why: on mobile especially, floating buttons are difficult on a small screen, and on desktop you simply don’t pay attention to the button. The feedback we still got was about the products.

Now we have made our latest changes. A couple of months back we asked users what kind of survey they would be most willing to answer. They said embedded is best, so we added an HTML widget to the topic sidebar, and now we collect feedback on each and every topic (discussion, question, article).

It has five smileys the user can choose from. We also ask them to explain their response freely in words, and lastly we ask what people want to read in the community in the future. With this survey we want to improve our content.
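
To picture the widget: in spirit it is something like this sketch (the endpoint and markup here are illustrative, not our actual implementation):

```ts
// Sketch of a 5-smiley topic feedback widget (browser TypeScript).
// The /api/topic-feedback endpoint is made up; any collector would do.
const SMILEYS = ["😞", "🙁", "😐", "🙂", "😄"];

function renderFeedbackWidget(container: HTMLElement, topicId: string): void {
  const textarea = document.createElement("textarea");
  textarea.placeholder = "Tell us more, and what should we write about next? (optional)";

  SMILEYS.forEach((smiley, index) => {
    const button = document.createElement("button");
    button.textContent = smiley;
    button.addEventListener("click", () => {
      // Score 1..5 plus the optional free-text explanation.
      void fetch("/api/topic-feedback", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ topicId, score: index + 1, comment: textarea.value }),
      });
    });
    container.appendChild(button);
  });

  container.appendChild(textarea);
}
```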

In addition to the embedded survey, we added a pop-up survey to the community as well. With it we collect feedback about the overall community experience. We are still adjusting the timing, as we don’t want to annoy users. We brought the pop-up survey back because users said it is OK as long as it doesn’t appear too soon.

We’ll see how this combination works.

 

--

 

I would like to hear your thoughts:

  • Have you used embedded surveys in your community, and how have they worked?
  • What type of survey do you think works best: pop-up, embedded, a floating feedback button, an on-page feedback button, or something else?
  • What kind of feedback do you mostly get from user surveys: feedback about the products, or feedback about the community?

I’d say that even though I ask for feedback about the content and the community, most users come to our community to solve a problem, and that is the only thing they can think about in that moment, so they give feedback about the product. No matter what type of survey is in use and what I ask of them.

Thanks for sharing your insights and lessons learned along the way. 

I ran an “annual” survey for a month at the end of 2022, but missed it last year due to being on leave. 

The survey was about 10 CSAT-style rating scales: validating the mission/vision of the community, evaluating members’ overall experience, and capturing their satisfaction with individual community features.

If I recall correctly, I used a Google Form and just added it as a month-long event on the calendar, featured the event on the home page, and sent an email or two. I also incentivized participation by offering points for completing it.

It was helpful for getting data I could use to 1) show the value/success of the community to stakeholders, and 2) combat straw-man arguments or irrational worries (e.g. “the answers on the community are not good” was disproven by a very high CSAT score for quality and timeliness).

This type of feedback is something I need to do more of, so I’m looking forward to learning from other replies in this thread. 


Thanks @DannyPancratz, a very good hint. I can imagine that at the same time we could extend the survey to collect data about users, to build some kind of user profiles, especially for users who just lurk in the background. It is important to know your audience.

Do you remember whether you got “enough” responses, and were you happy with that?


We got responses from 11% of the users who were active during the 30 days we ran the survey, and a fairly representative sample of the types of users (customers vs. partners, superusers vs. less active, new users vs. longtime users). Superusers were over-indexed a bit, but that’s to be expected.

I was very happy with the sample size. 

 

 


Oh wow, sounds really good 👌


Surveying is an interesting topic!  :)

A bit like you say, @revote - I think it’s really about finding a balance between getting a decent rate of engagement (and valuable insight) while minimising annoyance.  In general, for what we might call ‘experience’ surveys (that aren’t focused on a particular page but on the overall experience and value of the community), I tend to follow these principles:

  1. A pop-up survey that is minimally intrusive. I’ve found that pop-up surveys are needed to get a decent amount of engagement, and it helps massively if the pop-up is elegant and doesn’t get in the way.  I like a small, minimally annoying pop-up on the bottom-right of the screen.  It shouldn’t stop someone from doing what they’re doing at that moment, but should be noticeable.  And should be easy to minimise and come back to.
  2. A maximum of 5 questions.  3 or less is even better.  For these kinds of surveys there are the well-known best practice questions to gauge value (‘did you find your answer’ etc...) and I usually like to have one true experience metric.  I prefer a CSAT-style question over NPS, as NPS is a tricky metric if you’re trying to gauge the community experience.  I’ve dabbled in CES, but I think that’s only really useful if used across the organisation.  Actually, that goes for all of these metrics - ideally we’re using a metric that is used across the organisation so it resonates as a data point.  An open text feedback question is always good too.
  3. Run the survey for a defined period.  I don’t feel the need to run a survey like this all the time.  A few weeks, a couple times a year, is enough data for these kinds of questions for me.  This also minimises the annoyance for regular members (settings to ‘not show for 6 months’ aren’t always reliable).
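
On that last point about suppression settings: the “not show for 6 months” logic is conceptually as simple as the sketch below, and my guess is the unreliability comes from it typically living in localStorage, which is per-browser, per-device, and cleared with site data. The key name and duration here are illustrative.

```ts
// Sketch of a "don't show again for ~6 months" check (browser TypeScript).
const SUPPRESS_KEY = "surveyDismissedAt";        // hypothetical storage key
const SUPPRESS_MS = 1000 * 60 * 60 * 24 * 182;   // roughly 6 months

function shouldShowSurvey(): boolean {
  const dismissedAt = Number(localStorage.getItem(SUPPRESS_KEY));
  return !dismissedAt || Date.now() - dismissedAt > SUPPRESS_MS;
}

function dismissSurvey(): void {
  localStorage.setItem(SUPPRESS_KEY, String(Date.now()));
}
```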

You might remember that we had a survey like this running here a few months ago.  We’ll do another one soon.  We would have done it already, I think, but we’re exploring new tooling options.

One other approach I like is hyper-focused micro-surveys that are placed on a single page to get a very specific insight.  Those can be incredibly useful when making UX decisions.


Thanks @Kenneth R for your thoughts.

Yeah, we also like and use the CSAT question. It is the same across our platforms, so it is easy to compare.

As for the maximum of 5 questions: over the years we used a survey with up to 10 or so questions. First there was the CSAT question, and then we asked: “We have a few more questions, would you like to answer them as well?” If no, the survey ended; if yes, we displayed the rest of the questions.

With those questions we gathered data on our community resolution rate / contact deflection.

It worked pretty nicely, because the user could choose to answer only the CSAT question or the rest of the questions as well.
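
In pseudo-form the branch was as simple as this sketch - window.prompt stands in for the real survey UI, and the question texts are shortened examples:

```ts
// Sketch of the two-step branch: CSAT first, the rest only if the user opts in.
function ask(question: string): string {
  return window.prompt(question) ?? ""; // stand-in for the survey tool's UI
}

const MORE_QUESTIONS = [
  "Did you find what you were looking for?",
  "Did this visit replace a support contact?", // feeds the deflection numbers
];

function runSurvey(): void {
  const csat = ask("How satisfied are you with the community? (1-5)");
  console.log("csat", csat);

  // The branch: end after CSAT unless the user opts in to the longer part.
  if (ask("We have a few more questions - answer them too? (yes/no)") !== "yes") {
    return;
  }

  for (const question of MORE_QUESTIONS) {
    console.log(question, ask(question));
  }
}
```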


Thanks for a very interesting topic, @revote!

I actually did my Master’s thesis based on a user survey I conducted in our community in 2023. For background, our community is an old B2C telco community. Over the years we’ve had different pop-up surveys in use.

The questions in the pop-up surveys have ranged from CES to “Did you get your matter solved” to “Did you find what you were looking for”. There’s not a big difference between the questions, but we’ve felt that the nuance is more positive in the latter. We haven’t really seen a major difference in the answers, but at least we’re not making it sound like we always assume the user has a problem; they could just be looking for experiences 😊 We are also thinking of experimenting with CSAT-type questions, and based on the experiences from @DannyPancratz and @revote we should definitely do that!

In 2023 I conducted a larger user survey with thirty-something questions, including versions of both “Did you get your matter solved” and “Did you find what you were looking for”. In my thesis I made some key findings:

  1. As @revote also pointed out, it seems that in pop-up surveys the replies are quite often related to the situation the member is trying to solve while visiting. In the user survey, which I invited members to answer via email, they answered more broadly about their visits overall.
  2. We got more positive results on the questions “Did you get your matter solved” and “Did you find what you were looking for” in the user survey than in the pop-up survey. This also seems to be related to the fact that the community member is not solving a problem or looking for something right now, so they are not as frustrated 🙈

➡️ With these findings and the experiences @DannyPancratz had with the annual survey, it seems that an “external survey” would be beneficial from time to time 😊


We haven’t really seen a major difference in the answers, but at least we’re not making it sound like we always assume the user has a problem; they could just be looking for experiences 😊

This is an excellent point. You have to think not only about what to ask but also how to ask it. Small nuances make a big difference.

 

In 2023 I conducted a larger user survey with thirty-something questions, including versions of both “Did you get your matter solved” and “Did you find what you were looking for”. In my thesis I made some key findings:

 

➡️ With these findings and the experiences @DannyPancratz had with the annual survey, it seems that an “external survey” would be beneficial from time to time 😊

We ran a user survey for a certain community project back in the day. Thank you @Suvi Lehtovaara, and thank you also @DannyPancratz & @Kenneth R for pointing this out - I have to add it to my to-do list again. And as said, I should repeat it annually or maybe twice per year.

With that user survey we can build, for example, user profiles or user personas, which help us plan things like new content.


We’re not currently surveying beyond the usual deflection pop-up; however, I want to insert more superduper easy surveys where they make sense. For instance, in the best answer email I want to have a quick thumbs up / thumbs down for the “did this actually solve your issue?” question.
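
The mechanics could be as simple as two pre-built links in the email template - a rough sketch, with the domain and query parameters invented for illustration:

```ts
// Sketch: build thumbs-up / thumbs-down links for the best answer email.
// feedback.example.com and the parameter names are invented for illustration.
function buildFeedbackLink(topicId: string, userId: string, solved: boolean): string {
  const params = new URLSearchParams({
    topic: topicId,
    user: userId,
    solved: String(solved), // one click = one recorded answer, no form needed
  });
  return `https://feedback.example.com/solve-check?${params.toString()}`;
}

// In the email template, 👍 links to buildFeedbackLink(topic, member, true)
// and 👎 links to buildFeedbackLink(topic, member, false).
```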

At the same time I’m also thinking about how I can survey each topic based on the (brilliant) intent matrix our good @Kenneth R came up with years ago. In Control, we (our moderators) label each of our topics with one of five “intents”:

  • Problem / troubleshooting
  • Advice / education / how to
  • Interest / pre-sales
  • Feedback / ideas
  • Account / orders

This is all part of the tagging and bagging we do on all the content, so I’m sure there’s a smart way to get some great insights based on those labels, and some willing community members :D 
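
One smart way might be as simple as grouping a per-topic survey score by those intent labels - a rough sketch, with the types made up for illustration:

```ts
// Sketch: average a per-topic feedback score (1..5) by intent label.
type Intent =
  | "Problem / troubleshooting"
  | "Advice / education / how to"
  | "Interest / pre-sales"
  | "Feedback / ideas"
  | "Account / orders";

interface FeedbackRow {
  intent: Intent;
  score: number; // e.g. a 1..5 rating collected on the topic
}

function averageByIntent(rows: FeedbackRow[]): Map<Intent, number> {
  const sums = new Map<Intent, { total: number; count: number }>();
  for (const { intent, score } of rows) {
    const acc = sums.get(intent) ?? { total: 0, count: 0 };
    acc.total += score;
    acc.count += 1;
    sums.set(intent, acc);
  }
  const averages = new Map<Intent, number>();
  for (const [intent, { total, count }] of sums) {
    averages.set(intent, total / count);
  }
  return averages;
}
```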

The one thing I really want to avoid triggering is survey fatigue. I personally hate being bombarded with a ton of things I have to think about, then take a decision on, then click a button somewhere, when I just want to find an answer, information, or read some insightful words from a community leader. It all starts with the cookie blurb…..


We’re not currently surveying beyond the usual deflection pop-up; however, I want to insert more superduper easy surveys where they make sense. For instance, in the best answer email I want to have a quick thumbs up / thumbs down for the “did this actually solve your issue?” question.

Really nice idea! Thanks.

 

In Control, we (our moderators) label each of our topics with one of five “intents”:

  • Problem / troubleshooting
  • Advice / education / how to
  • Interest / pre-sales
  • Feedback / ideas
  • Account / orders

This is all part of the tagging and bagging we do on all the content, so I’m sure there’s a smart way to get some great insights based on those labels, and some willing community members :D

Do you mean that you read the labeled topics regularly, or do you have other uses for this labeling? Building some statistics, perhaps?

 

The one thing I really want to avoid triggering is survey fatigue. I personally hate being bombarded with a ton of things I have to think about, then take a decision on, then click a button somewhere, when I just want to find an answer, information, or read some insightful words from a community leader. It all starts with the cookie blurb…..

Yeah, I totally agree. And this is the reason I adjust the trigger time, to avoid fatigue as much as possible.

Cookie consent, chatbot pop-up, newsletter promotion pop-up, user experience survey pop-up and so on :D


We recently made a change to our surveys as an experiment. We used to have FCR and later “Did you find what you were looking for”, but the results were surprisingly similar - I mean that the percentage of “No” respondents was equally large in both.

So we decided to change it to CSAT - let’s see what happens 😊

 

Now we have made our latest changes. A couple of months back we asked users what kind of survey they would be most willing to answer. They said embedded is best, so we added an HTML widget to the topic sidebar, and now we collect feedback on each and every topic (discussion, question, article).

It has five smileys the user can choose from. We also ask them to explain their response freely in words, and lastly we ask what people want to read in the community in the future. With this survey we want to improve our content.

In addition to the embedded survey, we added a pop-up survey to the community as well. With it we collect feedback about the overall community experience. We are still adjusting the timing, as we don’t want to annoy users. We brought the pop-up survey back because users said it is OK as long as it doesn’t appear too soon.

We’ll see how this combination works.

 

@revote, can you share with us how this combination works? Are people answering the embedded survey?


@revote, can you share with us how this combination works? Are people answering the embedded survey?

We haven’t had much negative feedback because of the pop-up. It is a matter of timing; the pop-up should not activate too early. But I knew this already, since we have used a pop-up survey before. The amount of feedback is OK.

What is surprising is that the amount of feedback from the embedded survey is very, very low. The user can choose to just press a smiley or to add free text as well. We don’t get much of either.

I think this is because most users are on mobile and the embedded survey is at the bottom of the page, so most users don’t notice it.

--

So now, after a few months, I can say: if you want to measure your community’s performance and gather feedback, and you are deciding between an embedded and a pop-up survey, the pop-up survey is better. Just choose (and test) the timing carefully.

 

EDIT: The reason for the embedded survey is to improve the content (if possible). As you can imagine, most users give feedback about their situation, not about the content. So from that perspective as well, I am not a fan of embedded surveys 😀


What a great and interesting topic/thread to read, @revote - thanks for raising this matter for discussion.

 

Like @DannyPancratz, this is something we lean on to satisfy internal business stakeholders regarding the value that our forum brings. However, as I’m learning via this thread, lots of you are running into similar difficulties around the usefulness of pop-out surveys.

 

I’ve noticed that the embedded ‘Yes’/‘No’ survey on each topic garners the most engagement from our online user community - but it is then slightly ‘scuppered’ by the inaccuracies of our pop-out survey (the timing of the pop-up is a major concern, as is the accuracy of the qualitative feedback provided).

 

I’m really keen to know of any survey processes that forum community managers may have adopted beyond those mentioned above. We’ve been getting some incredibly promising email campaign data that indicates utilising this feature could support a more direct survey approach (conscious of doing this infrequently so as not to bombard our users).

Have people tried any other means or experiments around this that yielded surprising results?

 

Thanks all😊


@BradleyOVO can you tell us a bit more about what you mean by this?

We’ve been getting some incredibly promising email campaign data that indicates utilising this feature could support a more direct survey approach (conscious of doing this infrequently so as not to bombard our users).


Apologies @DannyPancratz - I was clearly too eager to reply to this thread without taking the time to explain properly!

 

Hopefully I can share the following image for context: 

[Image: example of email campaign sent data]

I’ve been aware of the Email Campaign feature on the Control platform for some time and have been eager to use it to help support engagement on our platform.

Since September we have been holding monthly competitions where users can sign up to be in the running to win a set of free event tickets to one of our branded venues.

(Here’s one of the competition topics for context: https://forum.ovoenergy.com/ovo-live-164/it-s-october-ready-to-win-some-more-tickets-with-ovo-live-18478)

I set the recipient audience to ‘Registered Users’ (which I’m thankful only targets those who’ve been active over the last year) and sent it away, crossing my fingers and hoping for the best.

I was expecting the spam and unsubscribe rates to be extremely high, especially since the email campaign was about a competition. However, they’ve surprisingly remained low and engagement has remained steady. We’d also see a topic go from a normal footfall of below 100 views to around 1,000 after a targeted email had been sent!

 

This experience has got my cogs turning and my team excited that we could try more targeted email campaigns around surveys, which may give us more reliable and direct feedback on how we’re performing - on top of the pop-out and embedded surveys.

 

Hope this makes much more sense!


@BradleyOVO thanks for the additional details! 


This experience has got my cogs turning and my team excited that we could try more targeted email campaigns around surveys, which may give us more reliable and direct feedback on how we’re performing - on top of the pop-out and embedded surveys.

Have you already thought about how you’ll reach the visitors who just lurk in the background? 😀


This is the double-edged sword issue - driving and maintaining overall engagement while getting real, informative feedback at the same time! 😅

I’ll let you all know how the potential email campaign survey goes when we attempt it, and what sort of engagement we receive... 😶

I do find all of the challenges around this super exciting! 


I got an idea from you, @BradleyOVO: to use Email Campaigns as a tool to gather feedback from the users who have participated. This is a new point of view, for me at least. And it is good that we can automate this message.

Sadly, if users unsubscribe, they won’t receive any other messages from us.

