Hello everyone,
I would like to announce that, although we have been online for only 2 months, we are starting our Beta Testing Program on the platform!
We are really excited to do so, but why not hear from all of you who have more experience?
Could you please share your experience on this matter? These points in particular would be very helpful for us:
1) How do you moderate the pilot tests?
2) How do you collect and analyze the qualitative feedback you receive?
3) Have you had any "leak" issues? How did you handle them?
4) Do you have colleagues from other areas (such as Product, IT, or Innovation) helping the moderation team? If so, what kind of access do you give them?
Please feel free to add any other points to this discussion :)
Looking forward to hearing from you!
Hi Tomas,
We have tested several services and products with our community. Moderation was handled by a dedicated moderator and the community manager. All feedback was collected in an Excel sheet, and along the way we categorized it. You could also work with post-field analysis to collect the feedback. We gave several colleagues from Product and IT access to the test forum, but most of them just read. I gave them a custom user title stating their job title. It's really good to include them, and to let them answer questions about the choices they made.
Good luck and have fun!
Jolien
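To make the spreadsheet approach Jolien describes concrete, here is a minimal sketch of what such a feedback log could look like in code; the column names, file name, and helper function are assumptions for illustration, not her actual template:

import csv
import os

# Illustrative columns for a beta-feedback log; adjust to your own pilot.
FIELDNAMES = ["date", "participant", "thread_url", "category", "summary", "status"]

def log_feedback(path, entry):
    """Append one feedback item to a shared CSV (an Excel sheet works the same way)."""
    write_header = not os.path.exists(path)  # write the header row only once
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
        if write_header:
            writer.writeheader()
        writer.writerow(entry)

log_feedback("beta_feedback.csv", {
    "date": "2018-03-01",
    "participant": "user123",
    "thread_url": "https://community.example.com/topic/42",
    "category": "usability",
    "summary": "Checkout button is hard to find on mobile",
    "status": "forwarded to Product",
})

Categorizing "along the way", as Jolien mentions, then amounts to filling in the category column as items come in and pivoting on it later.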
Hi Jolien,
Thanks a lot for sharing your experience :)
May I ask you to be more specific about the post fields you use? Could you give me some real examples?
We've been doing beta testing in the community for a couple of years already, and there are a few important lessons we learned along the way :-)
A few things to keep in mind:
- Make sure you clearly set the scope of the pilot in advance, both internally and with the community, to avoid any confusion or unanticipated situations around the product during the pilot.
- Try to agree on a way of working with the product owner/team involved beforehand, e.g. how quickly they should respond to questions or feedback from the community during the pilot.
- Create clear paths for participants: try to create a number of sticky posts in advance that summarize relevant information about the pilot and keep track of feedback already reported in a structured manner (this is especially important if you expect a large number of participants).
- Consider using one or a couple of threads in the board for the pilot instead of letting participants open a new topic for every feedback item, to avoid the board getting 'cluttered'.
As for your questions:
1) Usually a dedicated moderator per pilot, with moderation based on the regular modus operandi of the forum. Maybe a little extra 'nudging' of the users if needed to elicit more feedback from them.
2) Much like Jolien said, Excel always works ;-)
3) If it's a product that you don't want out in the open yet, I recommend using an NDA. We've never had any leaks, by the way.
4) This one depends on the teams and people involved; they need to be comfortable with the idea of interacting on the forum. When they were, we always encouraged direct participation and gave them a rank that made their position clear to the participants.
Hello Thomas,
Great post! Thanks a lot.
We are meeting with our internal teams today to define the boundaries of the “beta testing” program, and I will take your comments into that meeting. Very useful indeed.
Hi Tomas,
At the moment we use post-field analysis to tag the 'social drivers' (the subject of the conversation), so we can compare them with the contact drivers from other channels such as calls and social media.
For collecting feedback, you could use post fields for the sentiment of the feedback, but also for which department the feedback is meant (IT, marketing, etc.), the priority level, and the severity.
Jolien
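As a rough illustration of the tagging scheme Jolien outlines, each post's fields can be modeled as a small record, and the drivers can be compared across channels by simple counting; the field names and values below are assumptions for the sake of example, not the platform's actual post-field configuration:

from collections import Counter
from dataclasses import dataclass

@dataclass
class PostFields:
    driver: str      # the "social driver": the subject of the conversation
    sentiment: str   # e.g. "positive", "neutral", "negative"
    department: str  # who the feedback is meant for: "IT", "marketing", ...
    priority: str    # e.g. "low", "medium", "high"
    severity: str    # e.g. "cosmetic", "major", "blocking"

tagged_posts = [
    PostFields("billing", "negative", "IT", "high", "major"),
    PostFields("onboarding", "positive", "marketing", "low", "cosmetic"),
    PostFields("billing", "negative", "IT", "medium", "major"),
]

# Comparing community drivers with drivers from other channels (call, social, ...)
# then becomes a simple count per subject:
print(Counter(p.driver for p in tagged_posts))  # Counter({'billing': 2, 'onboarding': 1})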