Volunteer Satisfaction Surveys: How To Get Them Right
Gathering candid opinions from your volunteers will help you gain insight into what you need to improve to keep volunteers satisfied at your organization. Despite our best efforts, however, sometimes our surveys raise more questions than insights and don’t help us understand the specific actions we need to take next. If that’s the case for you, here are some suggestions that I hope will help generate the rich, actionable feedback you need.
10 Ways to Build a Bang-Up Volunteer Satisfaction Survey
1) Keep it simple. Show respect for your volunteers’ time and the suggestions they are about to share. Keep your survey short. Even though it’s tempting to add a potpourri of inquiries, keep it to fewer than ten questions (or a few minutes to complete). You’ll also help yourself out: the longer the survey, the greater the respondent burden, and the less likely volunteers are to finish it at all.
2) Keep it easy. To make it manageable for both you and your volunteers, use free or low-cost online survey software, such as SurveyMonkey. Send out invitations via email so responses are captured and reported automatically. Make paper surveys available to those few who don’t have internet access. And, just like other communications, stay away from jargon or acronyms that will confuse readers, regardless of how long volunteers have been on board.
3) Always “bookend” your survey with a short introduction at the beginning and an authentic thank-you at the end. Satisfaction surveys are another way to build relationships with your volunteers. Let them know why they’ve been invited, what you hope to gain, and when they will hear back from you with your next steps. Thank them and give them a specific contact person in the event they have questions or experience problems.
4) Focus on using structured questions. Rather than relying primarily on open-ended questions, ask a few volunteers to help you brainstorm all the possible answers to a question. Then categorize those answers into neat, multiple-choice options. Add an “other” category so respondents can write in anything you missed, but try to make your list as exhaustive as possible. This also reduces respondent burden and makes it much, much easier to analyze results on your end.
5) Always ask volunteers to rate how likely they would be to recommend their volunteer experience to friends and family. This question gets at perceived value, which is the core of what drives our satisfaction with people, things, and experiences. If a person believes something is valuable to them, they’ll rave about it. If you track this metric over time, you’ll see whether your value goes up or down in the volunteers’ eyes (the first sketch after this list shows one way to track it). Also, if you ask why they chose this rating in a subsequent open-ended question, you can find the root cause of their delight or disdain.
6) Always invite volunteers to share additional comments in an open-ended question at the end. That sends a clear message to volunteers that your ears are open and you are willing to hear candid comments, even if they don’t fit into the neat boxes in your survey.
7) Test, test, test. Have a few volunteers test your online and paper survey instruments. Make sure they understand what your questions are really asking. Also, you can confirm with your testers that your list of multiple-choice options includes a broad enough selection.
8) Make sure your stats stack up. You want to be able to defend your survey results and have confidence in your analysis, so you need to make sure you have enough data. First, make sure you have collected enough responses to generalize your results (a “statistically valid” sample size): the smaller the population, the larger the percentage of responses you must collect. Free online calculators, like this one, can do the math for you (note: when using such a tool, the industry standard is a 95% confidence level with a +/- 5% margin of error); the second sketch after this list walks through the same calculation. Also, if you are asking volunteers to rate something, give them at least five points to choose from (called a “Likert scale”). Click here for a few sample scales to choose from.
9) Be prepared for the “halo effect.” In the nonprofit world, volunteers and service beneficiaries alike are reluctant to give negative feedback to nonprofit staff. If they are bothered by something, volunteers will often leave the organization before they will tell staff about their frustrations. To gather as much information as possible, collect responses anonymously and let volunteers know their privacy will be protected. When reporting the results, choose representative statements from the open-ended comments and do not associate them with an individual’s name. Finally, pay attention to complaints; they are likely the tip of the iceberg.
10) When you have a survey that works, stick with it through several survey seasons. Your survey is successful when it gets you the info you need to take action, not necessarily just because it generates a glowing report. Once you have a design that is feeding you valuable insights, resist the temptation to tweak your core questions each time. That way you can build a baseline of trustworthy trending data to work from. If you change up your survey every year, you’ll never know whether the changes you make to your volunteer program are having an impact.
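About the recommend-likelihood rating in tip 5: if you collect responses electronically, a few lines of code can show the trend across survey waves. Here is a minimal sketch in Python, assuming a 0-10 rating scale and a simple list of (wave, rating) pairs; the data and names are made up purely for illustration.

    from collections import defaultdict

    # Hypothetical responses: (survey wave, recommend-likelihood rating on a 0-10 scale)
    responses = [
        ("2023 Spring", 9), ("2023 Spring", 7), ("2023 Spring", 10),
        ("2023 Fall", 8), ("2023 Fall", 6), ("2023 Fall", 9),
    ]

    # Group ratings by survey wave, then report each wave's average so you can
    # see whether perceived value is rising or falling in your volunteers' eyes.
    ratings_by_wave = defaultdict(list)
    for wave, rating in responses:
        ratings_by_wave[wave].append(rating)

    for wave, ratings in ratings_by_wave.items():
        average = sum(ratings) / len(ratings)
        print(f"{wave}: average rating {average:.1f} from {len(ratings)} responses")

Swap in whatever export your survey tool gives you; the point is simply to compare the same number wave over wave.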
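And for the sample-size math in tip 8: the arithmetic behind those online calculators is straightforward. Here is a minimal sketch, assuming the standard sample-size formula with a finite-population correction, a 95% confidence level (z = 1.96), a +/- 5% margin of error, and a worst-case 50/50 response split; the function name and defaults are illustrative, not taken from any particular tool.

    import math

    def required_sample_size(population, z=1.96, margin_of_error=0.05, p=0.5):
        # Sample size for an effectively infinite population
        n_infinite = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
        # Finite-population correction: smaller volunteer pools need a larger
        # percentage of responses, which is why the population size matters.
        n_adjusted = n_infinite / (1 + (n_infinite - 1) / population)
        return math.ceil(n_adjusted)

    # A pool of 200 volunteers needs about 132 completed surveys (66 percent),
    # while a pool of 2,000 needs about 323 (roughly 16 percent).
    print(required_sample_size(200))
    print(required_sample_size(2000))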
One More Thing…
Surveys aren’t the only way to evaluate volunteer satisfaction. You can use social media to facilitate conversations within your volunteer team about their needs. You can take a look at your volunteer retention numbers, analyze where and when people are leaving, and ask why. You can also do exit interviews with volunteers to get their final suggestions on their way out the door.
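If you do dig into retention numbers, a rough calculation is usually enough to start the conversation. Here is a minimal sketch, assuming you can export each volunteer's start date and, if they've left, an end date; every name and field below is hypothetical.

    from datetime import date

    # Hypothetical export: (volunteer name, start date, end date or None if still active)
    volunteers = [
        ("Ana", date(2022, 1, 10), None),
        ("Ben", date(2022, 3, 5), date(2022, 9, 1)),
        ("Carla", date(2023, 2, 20), None),
    ]

    # Count who is still active and report a simple retention rate.
    active = sum(1 for _, _, end_date in volunteers if end_date is None)
    retention_rate = active / len(volunteers)
    print(f"Retention: {retention_rate:.0%} ({active} of {len(volunteers)} still active)")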
Also, recognize that no satisfaction survey will get you all the information you need. If your survey results raise additional questions, think about facilitating a volunteer focus group to get at some of the deeper drivers of satisfaction. If you’re worried they won’t be honest with you or your staff, hire an outside facilitator to manage it for you.