Episode #072: Volunteer Surveys – Are You Making These Mistakes?

Welcome to the Volunteer Nation Podcast, bringing you practical tips and big ideas on how to build, grow, and scale volunteer talent. I’m your host, Tobi Johnson, and if you rely on volunteers to fuel your charity cause, membership, or movement, I made this podcast just for you. Hey there, everybody.

Welcome to another episode of the Volunteer Nation Podcast. Today I want to talk about volunteer surveys and walk through some mistakes you might be making as you poll your volunteers and solicit feedback about their experiences and their satisfaction levels. I’ve been running surveys for years. Our Volunteer Management Progress Report survey, our annual state-of-the-industry survey, has been providing the field with key trends and data for the past eight years. It’s been a journey, and along the way I’ve been able to get broad participation and actionable information that helps me make management decisions.

But it wasn’t always this way. There was a time when I was sending out surveys, getting very little participation, and generating data that I didn’t know I could count on; it just wasn’t reliable. So I began to educate myself on how to design and deploy feedback surveys that could supply the information I needed to make a real difference. I went to trainings, I practiced, and little by little my surveys got better every year. Our Volunteer Management Progress Report survey now generates between 1,100 and 1,600 responses from around the world every year, so we know a thing or two about surveys. I wanted to ask you about this because I know there’s been a lot of interest inside our VolunteerPro membership community in developing surveys that actually work and create actionable information. I wonder if you’re also struggling to develop these questionnaires and make sure you’re getting the right information so you can make truly informed management decisions.

So I thought I’d go through a few mistakes I see people making when I’m coaching them on building surveys. Let’s get into it. As I go through this list, ask yourself: are you making any of these mistakes with your volunteer surveys? Mistake number one: not being clear on your end goal. What is it you actually want to know from your survey? And I’m not just talking about basic volunteer satisfaction data. Is there something you suspect is not working well for volunteers? Maybe that’s something to explore in your survey. Is there a complaint you’ve heard that you’d like to confirm is a broader perception than just one person’s? Maybe that’s a reason to run a survey. Maybe you want to float an idea out to your volunteers and see whether they would back it or get behind it. There are many reasons to run surveys. You might want to check the demographics of your volunteers and see whether you have broad participation from people from all walks of life.

There are lots of different ways to use surveys. So the first thing to know, before you start writing a survey questionnaire, is: what do I actually want to know? If we’re not clear on our end goal, we can’t work backwards and design a questionnaire that will generate the data we need. It’s also important to know what format you need the information to come to you in. Sometimes I’ve developed surveys where, at the end, it’s like: oops, I can’t use the data this way; for example, I can’t cross tab this data, so I can’t compare the answers to two questions against each other. You also want to test out the reporting capabilities once your survey is drafted and loaded into your survey software. So not being clear on your goal: huge mistake. To show what I mean by a cross tab, there’s a small sketch below.
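Here is a minimal sketch of cross-tabbing two survey questions in Python with pandas. The column names (role, satisfaction) and the sample answers are hypothetical stand-ins for questions from your own survey export:

```python
# A minimal sketch of cross-tabbing two survey questions with pandas.
# The column names ("role", "satisfaction") and the sample data are
# hypothetical stand-ins for questions from your own survey export.
import pandas as pd

responses = pd.DataFrame({
    "role": ["Mentor", "Driver", "Mentor", "Driver", "Mentor"],
    "satisfaction": ["High", "Low", "High", "High", "Low"],
})

# Counts of each satisfaction level broken out by volunteer role.
print(pd.crosstab(responses["role"], responses["satisfaction"]))
```

Each cell counts how many volunteers gave a particular combination of answers, which is exactly the comparison a single-question summary can’t give you.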

And don’t just take anybody else’s survey. I see people saying, hey, does anybody have a sample survey? That’s great, but check that survey and make sure it’s gathering the information you need. Another mistake people make is not separating the nice-to-know from the need-to-know. When you’re developing a survey questionnaire, there are a lot of people with a lot of opinions about what needs to be in it. But the fact of the matter is, you’re never able to capture all the data you need in one pass, so don’t even try. Plus, too many questions means low completion rates, because volunteers run out of time or just don’t feel like answering that many questions. We’ve got to keep these surveys very focused. So ask yourself as you develop your survey: what info, if I had it, would I be willing to act upon? If you’re collecting information you are not willing or able to act on once you have it, it shouldn’t go in your survey, because asking also sets up the expectation that it will be acted upon. So separate the need-to-know from the nice-to-know and keep your survey very focused.

You don’t have to do only one survey a year, either, and there are lots of ways to gather information beyond a questionnaire. You can do focus groups, listening sessions, comment cards, and idea mashups. You can interview folks. There are tons of ways to get the information you need to make decisions. Another mistake I see people making is not tracking changes over time. Benchmarking a baseline number and then asking the same question in a subsequent survey down the road can help you see whether your management interventions are actually improving things for volunteers. So establish benchmarks and resurvey to see if your interventions are making an impact.

Think about net promoter score, for example. The question you ask to track net promoter is: how likely are you to recommend volunteering with us to friends and colleagues? It’s answered on a scale of zero to ten (or one to ten in some instances). If you’re going to track a net promoter score, don’t ask it just once; the whole point is to see the trend over time. So plan ahead, think about how you’re going to keep tracking it, and ask the question the same way every time, or you’ll get skewed data and you won’t be comparing apples to apples. If you’re curious how the score itself is calculated, there’s a short sketch below.
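As a quick aside, here is a minimal sketch of the standard net promoter arithmetic in Python, assuming the usual zero-to-ten convention where 9s and 10s are promoters and 0 through 6 are detractors; the sample scores are made up:

```python
# A minimal sketch of the standard net promoter score calculation.
# Scores are answers to "How likely are you to recommend volunteering
# with us?" on a 0-10 scale; the sample list below is made up.

def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)   # 9s and 10s
    detractors = sum(1 for s in scores if s <= 6)  # 0 through 6
    return 100 * (promoters - detractors) / len(scores)

print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))  # 25.0
```

The score runs from -100 (all detractors) to +100 (all promoters), which is why asking the question the same way each time matters: the trend is only meaningful if the scale stays put.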

Another mistake people make is not designing a communications plan to accompany the survey. Often folks just send out one email saying, please, everybody fill out the survey, and then they don’t get very much response. But you’ve got to remember that only about 25% to 50% of your volunteers are probably opening that email. If you send it via an email service provider like Mailchimp or ConvertKit, you can check what percentage of recipients actually opened it. You need a plan for communicating: why the survey? Develop messaging around why the survey is important and what will be done with the data, encourage people to take part, give them a deadline, and then communicate in multiple ways: through a volunteer portal, through your emails, during meetings, by texting your volunteers. However you communicate with your volunteers, you’ll need to do it more than once, and you’ll need a pretty compelling reason why people should stop and take a moment out of their day for your survey. I’ve got a few more tips, but right now we’re going to take a break from my list of the biggest mistakes people make when designing volunteer surveys.

And after this break, we will be right back.

If you enjoyed this week’s episode of Volunteer Nation, we invite you to check out the VolunteerPro Premium Membership. This community is the most comprehensive resource for attracting, engaging, and supporting dedicated, high-impact volunteer talent for your good cause. VolunteerPro Premium Membership helps you build or renovate an effective, what’s-working-now volunteer program with less stress and more joy, so that you can ditch the overwhelm and confidently carry your vision forward. It is the only implementation program of its kind that helps your organization build maturity across the five phases of our proprietary system, the Volunteer Strategy Success Path. If you’re interested in learning more, visit volpro.net/join.

We are back, and I’ve got a few more errors that people make with their volunteer surveys. As I said before the break, I’ve had a fair amount of experience developing volunteer surveys over the years, and I’ve had the opportunity to learn from some experts in the field of survey development and design. It’s really helped me see not only how to design a survey, but also how to look at the data.

So there are a few other mistakes, and a few tips, I want to mention. Here’s one: not protecting volunteer data and ensuring privacy. To get truly candid responses, you need to ensure complete and total anonymity for your survey respondents. They need to be assured that no one will know who said what; that’s the only way to make sure people are really being upfront with you. I see a lot of volunteer surveys where folks are collecting names, and it just doesn’t work: people are going to be nice and say what they think you want to hear.

So keep your survey responses anonymous. That also means not collecting IP addresses. An IP address identifies the computer the person is taking the survey on, and inside your survey software you’ll have the option not to collect them; turn that off. If your survey is truly anonymous, you must not collect IP addresses, and you don’t need them anyway. And let volunteers know in your messaging, in the introduction to the survey as well as in the email body copy, something like: we’d like your candid feedback, all answers will be anonymous, and we want to hear what you think about your volunteer experience. This is simply the ethical way to do it: if you’re going to do research on human beings, you need to follow ethical principles and guidelines.

Another mistake people make is nagging people who have already completed the survey, and there’s an easy way to get around this. Some people send only one email and don’t get enough responses. Others send too many emails, e-blasting the whole group every time. But some of your volunteers have already taken the survey, so why should they be bothered with email that isn’t relevant to them? It’s just not cool to keep bombarding people with nagging reminders when they’ve already participated, and sometimes they’ll get grumpy about it and respond, “I already took it.” Right?

The easy way to get around this is to use survey software that lets you upload your volunteers’ email addresses; the software will keep track of who has completed the survey. You can load all of your reminder emails into the software, and it will send them out at the times you choose, and only to the people who haven’t completed the survey yet. It’s a really simple way to avoid antagonizing the folks who have been so good and gracious as to complete your survey for you. Make sure the software you’re using has that capability; the underlying logic is as simple as the sketch below.
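Just to illustrate, here is a minimal sketch in Python of the “remind only non-completers” logic that good survey software handles for you; the email addresses are made up:

```python
# A minimal sketch of the "remind only non-completers" logic that good
# survey software handles for you. The email addresses are made up.
all_volunteers = {"amy@example.org", "ben@example.org", "cal@example.org"}
completed = {"ben@example.org"}

# Everyone who still needs a reminder: set subtraction does the work.
reminder_list = sorted(all_volunteers - completed)
print(reminder_list)  # ['amy@example.org', 'cal@example.org']
```

It really is just subtraction: everyone, minus whoever has already finished.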

Another mistake people make is not including the right mix of question types. We’ve talked about surveys being too long; you also create what’s called, in the survey world, high respondent burden when you have too many open-ended questions, because people have to stop, think, and write. I’ve seen surveys where 100% of the questions are open-ended, and that creates a really high respondent burden. It’s very challenging to inspire people to complete a survey like that; they usually open it up and close it right back up, or only complete the first question. So mix it up with some multiple choice, fill-in-the-blank, and open-ended items, and I would not include more than two, maybe three, open-ended questions in any survey of volunteers. That means doing the work up front. With a multiple choice question, people can pick one option; with a multiple response question, people can pick more than one option in the answer to a question, or what the survey world calls a survey item. Either way, you’ve got to distill the possibilities down and use your best estimate of what those answer options should be.

You can always include an “other” option, but that takes extra data analysis at the end. So how do you nail down the options? By asking people. You can run focus groups so that you’re offering up a very informed set of options for people to pick from in that survey item. So: mix up your question types, don’t let your survey run too long, and don’t include so many open-ended questions that you create high respondent burden. Another mistake is that folks don’t analyze the open-ended questions, or rather the comments people leave in response to them, properly. If we just glance through and read them, our minds are biased; we’ll skim a set of responses and say, okay, I get it, I understand what people are saying here, while the hidden biases in our minds look for the responses we think we’re supposed to be reading. If we already think we know the answer, that’s confirmation bias.

We end up looking for confirmation of what we already believe when we read open-ended comments. A proper analysis is to create categories, go through the comments and categorize each one, and then go back and count how many comments fell into each category. That will reduce the amount of bias in your analysis. This is a mistake people often make innocently; nobody has any mal intent here. But it’s important to analyze your open-ended comments properly rather than just reading through them and going with your gut reaction. The counting step looks something like the sketch below.
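Once a human reader has assigned each comment a category, the tallying is easy to do in code. Here is a minimal sketch in Python; the category labels are hypothetical examples:

```python
# A minimal sketch of tallying hand-assigned categories for open-ended
# comments. The category labels below are hypothetical examples.
from collections import Counter

# One category label per comment, assigned by a human reader.
coded_comments = ["scheduling", "training", "scheduling",
                  "communication", "scheduling", "training"]

# Counting by category surfaces the most common themes and guards
# against confirmation bias from just skimming the comments.
for category, count in Counter(coded_comments).most_common():
    print(f"{category}: {count}")
```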

All right, another mistake people make is not sharing the results with both volunteers and leadership. When we ask people to participate and give their perceptions, especially our volunteers, supporters, and coworkers, it’s not only ethical and the right thing to share that information back, because they helped crowdsource it; it’s also important for folks to see what the results were. You can aggregate things; people don’t need to know all the nitty gritty. What’s even more important is reporting back what you’re going to do about the information you gathered. Sometimes you’re putting proposals to your leadership: here’s what we learned, here’s what we propose to do, what do you think? With volunteers, if you have an advisory group, you can do the same, providing proposed responses or asking what they would propose. At some point, decisions will be made based on the survey results, and you want to report those out to volunteers. It’s also really important to point back to the survey when you make changes to your programming, your support, or the volunteer experience; with those changes, you’re hoping to make improvements.

When those changes land, point back to your survey results. Often changes don’t come about until six months or a year later, and volunteers have long forgotten the survey that helped you decide to make them. So volunteers will say, “You don’t listen to us; our feedback is never taken into account,” when in fact you made the change precisely because of that survey. Early in my career working with volunteers, I realized that because they’re not at our organization all the time, I needed to point it out explicitly: hey, we’re making this change because we learned from our last survey that this is a preference, or a sticking point, for you.

We’re making this change because of your feedback; thank you again so much for participating in our survey. In addition to building goodwill, this also inspires people to participate in your next survey, so you’re baking in support for your next round of internal research inside your organization. It’s important that people know you don’t just pull data in, analyze it, sit in your office, make decisions, and move on. It’s got to be more public than that, an ongoing conversation where you keep letting people know: hey, we’re making this change because we heard from you, and that was your preference, or that was a problem you identified. The final mistake I want to mention around developing and deploying volunteer surveys is asking for feedback without being willing to change any of your practices.

If all you really want is to gather data from your volunteers, find other ways to do it. If it’s demographic information, perhaps collect it through the volunteer application, or through a separate supplement to the application so it doesn’t appear discriminatory. If volunteer demographics are all you want, a perception survey is not the way to go, because it doesn’t build goodwill; people don’t want to feel like widgets. So we need to be ready and willing to act. And if your leadership is asking for a survey, ask them: to what extent are we willing to make changes based on what we learn? Now, not everything is possible to change. Sometimes it’s unrealistic: we don’t have the budget, or there are rules we need to follow as an organization, et cetera.

So not everything can change, and that’s okay too; that should be communicated back as well. But when you put out a survey asking for people’s honest opinions, you must be willing to make changes. People love change, but they don’t like to be changed. And improving the volunteer experience usually means that, at some point, staff need to change some behavior: doing more of something, doing less of something, or doing something a different way. Everybody has to get on board with that. There’s nothing worse than surveying folks and then nothing happening; it’s just about the fastest path to poor morale that I can think of.

So just make sure you’re willing to change. Deploying surveys and actively responding to volunteer feedback can build goodwill and improve your volunteer engagement. Set a plan for regular volunteer input throughout the year, gauge satisfaction over time, and you will start to reap the benefits. You can do an annual volunteer perception survey. You can do surveys at the 30-day mark. You can do short check-in surveys after key moments; these don’t have to be long, just a way of understanding how people are doing after their first shift, after a training, or after a big project is over and done.

When you’re thinking about creating a new program or a new support system for volunteers, you want to get feedback and ideas ahead of time. There are all kinds of ways, and times, you can survey your volunteers. So think about whether you’re making any of these mistakes, take some steps to make adjustments, and you’ll do better with your surveys. I also wanted to let you know that surveys are on my mind because on September 6 we are offering a Build It Half-Day Volunteer Survey Boot Camp. It runs from noon to 4:00 p.m. Eastern, or 9:00 a.m. to 1:00 p.m. Pacific, here in the US.

Now, if you’re not able to make that time frame, that’s okay; as always, we will be recording and providing a replay. With the Build It workshop, we’re going to take you through a process. I want participants in this program to walk away with a survey that’s drafted and ready to go, a plan of action, and an understanding of how they’re going to analyze and present the data so it can be acted upon. I’m excited about it; it should be fun. I’m going to share all of my best secret tips about developing surveys, and I’ll have a treasure trove of downloadable tools, templates, swipe files, et cetera, that can save you a lot of time, including a template for a sample volunteer survey as well as a survey project plan. So consider joining us.

You’re probably wondering: well, where do I go, and how do I sign up? The way to get access to the Build It Half-Day Volunteer Survey Boot Camp is to become a member of VolunteerPro, because this workshop is exclusively for VolunteerPro members. The good news is you can join for as little as $59 a month on our monthly membership; you can join as an annual member or a monthly member, and either way you’ll get access to this live boot camp. It’s going to be a lot of fun. In our own member surveys, which we run a couple of times a year, developing volunteer surveys was a top need and request from our members, as was having a longer workshop. We do a lot of one-hour seminars and coaching calls, but members let us know they wanted a more intensive deep dive. So here you have it.

This is absolutely a response to my members’ needs, so there you go. I hope this has been helpful as you think about your volunteer surveys, and I hope you’ll join us for the Build It Half-Day Volunteer Survey Boot Camp. I think you’ll find the time well worth the investment: just sitting down, getting to it, and getting a survey that’s really going to work and, most important, give you actionable feedback. It’s so important that you have reliable data, and I’m going to talk about how to get that data, how to analyze it, and how to present it to others.

All right, we will see you next time, same time, same place, on Volunteer Nation. As always, we appreciate every one of you, and if you liked what you learned today, share it with a friend; that’s how we reach more people. Have a great rest of your week, and we’ll see you next time on Volunteer Nation. Thanks for listening to this episode of the Volunteer Nation Podcast. If you enjoyed it, please be sure to subscribe, rate, and review so we can reach people like you who want to improve the impact of their good cause. For more tips and notes from the show, check us out at Tobijohnson.com. We’ll see you next week for another installment of Volunteer Nation.