
Hi folks,

Do you use surveys for your visitors or members to improve your community's user experience? How have they worked?

We do, and I think many of you do as well.

Background:

We have run our community for years, and we started with pop-up surveys. The first tweak we made was adjusting the trigger time: how quickly the survey appears after a user lands in the community.

After those tweaks we got fewer complaints that the survey was annoying ("I just wanted to read the topic and the pop-up filled my screen!#%&"), so the tweaks were worthwhile.

Most of the feedback was about our products; only a little was about the community itself. That is sad if you are the community captain, though it is fine for the company, of course.

Then we made a new change. We chose to get rid of the pop-up survey and added a floating feedback button to every community page. After this change, the number of responses dropped dramatically. I understand why: on mobile, floating buttons are awkward on a small screen, and on desktop you simply don't pay attention to the button. The feedback we still got was about the products.

Now we have made our latest change. A couple of months back we asked users what kind of survey they would be most willing to answer. They said embedded is best, so we added an HTML widget to the topic sidebar, and now we collect feedback on each and every topic (discussion, question, article).

The widget has five smileys the user can choose from. We also ask them to explain their rating freely in words, and lastly what they would like to read in the community in the future. With this survey we want to improve our content.
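If someone is curious what a widget like this can look like under the hood, here is a minimal sketch (not our actual code; the `/api/topic-feedback` endpoint and the field names are invented for illustration):

```ts
// Minimal sketch of an embedded five-smiley feedback widget.
// The endpoint and payload fields are hypothetical examples.
const SMILEYS = ["😠", "🙁", "😐", "🙂", "😍"];

function renderFeedbackWidget(container: HTMLElement, topicId: string): void {
  const comment = document.createElement("textarea");
  comment.placeholder = "Tell us more, or what you want to read next (optional)";

  SMILEYS.forEach((smiley, index) => {
    const button = document.createElement("button");
    button.textContent = smiley;
    button.addEventListener("click", () => {
      // Send the 1-5 score plus the optional free-text explanation.
      void fetch("/api/topic-feedback", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ topicId, score: index + 1, comment: comment.value }),
      });
      container.textContent = "Thanks for your feedback!";
    });
    container.appendChild(button);
  });
  container.appendChild(comment);
}
```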

In addition to the embedded survey, we added a pop-up survey to the community as well. With it we collect feedback about the overall community experience. We are still adjusting the timing, because we don't want to annoy users; they told us a pop-up is fine as long as it doesn't appear too soon.
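The timing part is as simple as it sounds; something like this sketch, where the delay value is just an example we keep tuning:

```ts
// Sketch: delay the pop-up survey so it doesn't cover the topic the
// moment someone lands, and show it at most once per session.
// The 45-second delay is an arbitrary example value.
const SURVEY_DELAY_MS = 45_000;

function scheduleSurveyPopup(showSurveyPopup: () => void): void {
  if (sessionStorage.getItem("surveyShown")) return;
  window.setTimeout(() => {
    sessionStorage.setItem("surveyShown", "1");
    showSurveyPopup();
  }, SURVEY_DELAY_MS);
}
```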

We'll see how this combination works.

 

--

 

I would like to hear your thoughts:

  • Have you used embedded surveys in your community, and how have they worked?
  • What kind of survey works best, in your opinion: pop-up, embedded, floating feedback button, an in-page feedback button, or something else?
  • What kind of feedback do you mostly get from user surveys: feedback about the products or feedback about the community?

I'd say that even though I ask for feedback about the content and the community, most users come to our community to solve a problem, and that is the only thing they can think about in that moment. That is why they give feedback about the product, no matter what kind of survey is in use or what I ask them.

Thanks for sharing your insights and lessons learned along the way. 

I ran an “annual” survey for a month at the end of 2022, but missed it last year due to being on leave. 

The survey had about 10 CSAT-style rating scales, validating the mission/vision of the community, evaluating the overall experience, and capturing satisfaction with individual community features.

If I recall correctly, I used a Google Form and just added it as a month-long event on the calendar, featured the event on the home page, and sent an email or two. I also incentivized participation by offering points for completing it.

It was helpful for getting data I could use to 1.) show the value/success of the community to stakeholders, and 2.) combat straw-man arguments or irrational worries (e.g. "the answers on the community are not good" was disproven by very high CSAT scores for quality and timeliness).

This type of feedback is something I need to do more of, so I’m looking forward to learning from other replies in this thread. 


Thanks @DannyPancratz, a very good hint. I can imagine that at the same time we could extend the survey to collect data about users and build some kind of user profiles, especially for users who just lurk in the background. It is important to know your audience.

Do you remember whether you got "enough" responses, and were you happy with the result?


We got responses from 11% of the users who were active during the 30 days we ran the survey, and a fairly representative sample of the types of users (customers vs. partners, superusers vs. less active, new users vs. longtime users). Superusers were over-indexed a bit, but that's to be expected.

I was very happy with the sample size. 

 

 


Oh wow, sounds really good 👌


Surveying is an interesting topic!  :)

A bit like you say, @revote - I think it's really about finding a balance between getting a decent rate of engagement (and valuable insight) and minimising annoyance.  In general, for what we might call 'experience' surveys (that aren't focused on a particular page but on the overall experience and value of the community), I tend to follow these principles:

  1. A pop-up survey that is minimally intrusive. I’ve found that pop-up surveys are needed to get a decent amount of engagement, and it helps massively if the pop-up is elegant and doesn’t get in the way.  I like a small, minimally annoying pop-up on the bottom-right of the screen.  It shouldn’t stop someone from doing what they’re doing at that moment, but should be noticeable.  And should be easy to minimise and come back to.
  2. A maximum of 5 questions.  3 or less is even better.  For these kinds of surveys there are the well-known best practice questions to gauge value (‘did you find your answer’ etc...) and I usually like to have one true experience metric.  I prefer a CSAT-style question over NPS, as NPS is a tricky metric if you’re trying to gauge the community experience.  I’ve dabbled in CES, but I think that’s only really useful if used across the organisation.  Actually, that goes for all of these metrics - ideally we’re using a metric that is used across the organisation so it resonates as a data point.  An open text feedback question is always good too.
  3. Run the survey for a defined period.  I don't feel the need to run a survey like this all the time.  A few weeks, a couple of times a year, gives enough data for these kinds of questions for me.  This also minimises the annoyance for regular members (settings to 'not show for 6 months' aren't always reliable; a simple client-side fallback is sketched below this list).
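On that last point, if your tooling lets you inject a bit of script, a small client-side snooze can back up the unreliable suppression setting. A rough sketch (the key name and the 180-day window are arbitrary examples):

```ts
// Sketch: suppress the pop-up survey for N days after a member has
// completed or dismissed it.
const SNOOZE_KEY = "surveySnoozedUntil";
const SNOOZE_DAYS = 180;

function snoozeSurvey(): void {
  const until = Date.now() + SNOOZE_DAYS * 24 * 60 * 60 * 1000;
  localStorage.setItem(SNOOZE_KEY, String(until));
}

function shouldShowSurvey(): boolean {
  const until = Number(localStorage.getItem(SNOOZE_KEY) ?? "0");
  return Date.now() > until;
}
```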

You might remember that we had a survey like this running here a few months ago.  We’ll do another one soon.  We would have done it already, I think, but we’re exploring new tooling options.

One other approach I like is hyper-focused micro-surveys that are placed on a single page to get a very specific insight.  Those can be incredibly useful when making UX decisions.


Thanks @Kenneth R for your thoughts.

Yeah, we also like and use a CSAT question. It is the same across our platforms, so it is easy to compare.

As for the maximum of 5 questions: over the years we used a survey with up to 10 or so questions. First there was the CSAT question, and then we asked: "We have a few more questions, would you like to answer them as well?" If no, the survey ended; if yes, we displayed the rest of the questions.

With those questions we gathered data for our community resolution rate / contact deflection.

It worked pretty nicely, because the user could choose to answer only the CSAT question or continue with the rest of the questions.
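In pseudo-form the branching was roughly like the sketch below; the helper functions stand in for whatever UI your survey tool provides, and the follow-up questions are just examples:

```ts
// Sketch of the two-stage flow: CSAT first, then an opt-in branch to the
// longer question set used for resolution-rate / deflection data.
async function runSurvey(
  ask: (question: string) => Promise<string>,
  askYesNo: (question: string) => Promise<boolean>,
): Promise<Record<string, string>> {
  const answers: Record<string, string> = {};
  answers["csat"] = await ask("How satisfied are you with the community? (1-5)");

  const wantsMore = await askYesNo(
    "We have a few more questions, would you like to answer them as well?",
  );
  if (!wantsMore) return answers; // user chose to answer only the CSAT question

  const followUps = [
    "Did you find what you were looking for?",
    "Would you have contacted support without the community?",
    // ...the rest of the ~10 questions
  ];
  for (const question of followUps) {
    answers[question] = await ask(question);
  }
  return answers;
}
```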


Thanks for a very interesting topic @revote !

I actually wrote my Master's thesis based on a user survey I conducted in our community in 2023. For background, our community is an old B2C telco community. Over the years we've had different pop-up surveys in use.

The questions in the pop-up surveys have ranged from CES to "Did you get your matter solved" to "Did you find what you were looking for". There's not a big difference between the questions, but we've felt that the nuance is more positive in the latter one. We haven't really seen a major difference in the answers, but at least we're not making it sound as if we always assume the user has a problem; they could just be looking for experiences 😊 We are also thinking of experimenting with CSAT-type questions, and based on the experiences from @DannyPancratz and @revote we should definitely do that!

In 2023 I conducted a larger user survey with 30-something questions, including versions of both "Did you get your matter solved" and "Did you find what you were looking for". In my thesis I made some key findings:

  1. As @revote also pointed out, it seems that in pop-up surveys the replies are quite often related to the situation the member is trying to solve while visiting. In the user survey, which I invited members to answer via email, they responded more broadly about their visits overall.
  2. We got more positive results on the questions "Did you get your matter solved" and "Did you find what you were looking for" in the user survey than in the pop-up survey. This also seems related to the fact that the community member is not solving a problem or looking for something right at that moment, so they are not that frustrated 🙈

➡️With these findings and the experiences @DannyPancratz had with the annual survey, it seems that an “external survey” would be beneficial from time to time 😊


We haven't really seen a major difference in the answers, but at least we're not making it sound as if we always assume the user has a problem; they could just be looking for experiences 😊

This is an excellent point. You have to think not only about what to ask but also how to ask it. Small nuances make a big difference.

 

In 2023 I conducted a larger user survey with 30-something questions, including versions of both "Did you get your matter solved" and "Did you find what you were looking for". In my thesis I made some key findings:

 

➡️With these findings and the experiences @DannyPancratz had with the annual survey, it seems that an “external survey” would be beneficial from time to time 😊

We had a user survey for a certain community project back in the day. Thank you @Suvi Lehtovaara, and thank you also @DannyPancratz & @Kenneth R for pointing this out - I have to add it to my to-do list again. And as said, I should repeat it annually or maybe twice per year.

With that user survey we can build, for example, user profiles or user personas, which help us plan things like new content.


We're not currently surveying beyond the usual deflection pop-up; however, I want to insert more super-duper-easy surveys where they make sense. For instance, in the best answer email I want to have a quick thumbs up / thumbs down for the "did this actually solve your issue?" question.
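The email part can be as simple as two links whose query string records the vote when clicked, so nothing has to render interactively inside the mail client. A sketch (the URL and parameters are invented):

```ts
// Sketch: build thumbs-up / thumbs-down links for the best-answer email.
// Clicking a link hits a hypothetical endpoint that records the vote.
function buildVoteLinks(topicId: string, userId: string): string {
  const base = "https://community.example.com/feedback/answer-vote";
  const link = (vote: "up" | "down") =>
    `${base}?topic=${encodeURIComponent(topicId)}&user=${encodeURIComponent(userId)}&vote=${vote}`;
  return `
    <p>Did this actually solve your issue?</p>
    <a href="${link("up")}">👍 Yes</a>
    <a href="${link("down")}">👎 No</a>
  `;
}
```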

At the same time I'm also thinking about how I can survey each topic based on the (brilliant) intent matrix our good @Kenneth R came up with years ago. In Control, we (our moderators) label each of our topics with one of five "intents":

  • Problem / troubleshooting
  • Advice / education / how to
  • Interest / pre-sales
  • Feedback / ideas
  • Account / orders

This is all part of the tagging and bagging we do on all the content, so I’m sure there’s a smart way to get some great insights based on those labels, and some willing community members :D 
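For example, if each feedback response were stored together with its topic's intent label, even a simple aggregation would show whether, say, "Problem / troubleshooting" topics score worse than "Advice / education / how to" ones. A sketch (the types are invented; the labels mirror the list above):

```ts
// Sketch: average feedback score per intent label.
type Intent =
  | "Problem / troubleshooting"
  | "Advice / education / how to"
  | "Interest / pre-sales"
  | "Feedback / ideas"
  | "Account / orders";

interface FeedbackResponse {
  intent: Intent;
  score: number; // e.g. a 1-5 rating from a topic survey
}

function averageScoreByIntent(responses: FeedbackResponse[]): Map<Intent, number> {
  const sums = new Map<Intent, { total: number; count: number }>();
  for (const { intent, score } of responses) {
    const entry = sums.get(intent) ?? { total: 0, count: 0 };
    entry.total += score;
    entry.count += 1;
    sums.set(intent, entry);
  }
  const averages = new Map<Intent, number>();
  for (const [intent, { total, count }] of sums) {
    averages.set(intent, total / count);
  }
  return averages;
}
```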

The one thing I really want to avoid triggering is survey fatigue. I personally hate being bombarded with a ton of things I have to think about, make a decision on, and then click a button somewhere, when I just want to find an answer, some information, or read some insightful words from a community leader. It all starts with the cookie blurb…


We're not currently surveying beyond the usual deflection pop-up; however, I want to insert more super-duper-easy surveys where they make sense. For instance, in the best answer email I want to have a quick thumbs up / thumbs down for the "did this actually solve your issue?" question.

Really nice idea! Thanks.

 

In Control, we (our moderators) label each of our topics with one of five "intents":

  • Problem / troubleshooting
  • Advice / education / how to
  • Interest / pre-sales
  • Feedback / ideas
  • Account / orders

This is all part of the tagging and bagging we do on all the content, so I’m sure there’s a smart way to get some great insights based on those labels, and some willing community members :D

Do you mean that you read the labeled topics regularly, or do you have other uses for this labeling? Do you build some statistics or something like that?

 

The one thing I really want to avoid triggering is survey fatigue. I personally hate being bombarded with a ton of things I have to think about, make a decision on, and then click a button somewhere, when I just want to find an answer, some information, or read some insightful words from a community leader. It all starts with the cookie blurb…

Yeah, I totally agree. And this is the reason I keep adjusting the trigger time: to avoid fatigue as much as possible.

Cookie consent, chatbot pop-up, newsletter promotion pop-up, user experience survey pop-up, and so on :D

