
This is How to Ask Better Volunteer Survey Questions

No matter what you need to learn about volunteers – their satisfaction, their impact on clients, their needs, or the reasons they leave – crafting the right volunteer survey questions can help you make smart management choices.

What’s more, the data you gather will allow you to confidently demonstrate and communicate the impact volunteers have on your nonprofit.

But, too often survey questions aren’t designed to capture real, actionable feedback. Sure, the results might feel good, but what are you really learning that can help you improve your volunteer strategy?

As they say in the research biz, “garbage in, garbage out.” This couldn’t be more true when it comes to volunteer survey questions. So, take the time to learn how to design your research to provide rock-solid information.

Structuring Your Volunteer Survey Questions

Your survey questions are the foundation for gathering reliable data. They must be written in a way that not only encourages volunteers to complete your survey from top to bottom, but also inspires them to give good, candid answers.

You might not like everything you hear, but gathering a range of volunteer feedback, both positive and negative, will make all the difference.

Would you rather spin your wheels on tactics that have zero impact or make well-informed investments based on real, actionable data? We’ll assume the latter. 🙂

Volunteer Survey Questions: 14 Quick Fixes

Below are some of the top mistakes we see people make when writing their volunteer survey questions. After you draft your next volunteer survey questionnaire, review this list to see where you can improve clarity and gather more reliable data.

1) Ambiguous questions

These are more common than you think. Your goal should be that readers understand the question the same way across the board, which means you need to be crystal clear about what you mean. Ambiguous questions include words or phrases that can be interpreted in different ways.

Example: “Are you respected by fellow volunteers?”

The word “respect” can mean different things to different people, thus you won’t be comparing apples to apples.

2) Double-Barrel Questions

These kinds of survey questions ask the reader to respond to two things at once. Consequently, you never know which item your reader is really reacting to.

Example: “How would you rate your volunteer training and support?”

In this case, the reader is asked to assess two separate items that might happen at different times and be delivered by different people. It’s also not clear what “support” means or who provides it.

3) Leading Questions

Leading survey questions are worded in a way that sways the reader to one side of the argument. You can usually tell a question is leading if it includes non-neutral wording or offers an example.

Example: “How would you like us to communicate with you, for example via Facebook?”

If this were an open-ended comment, you’d be surprised how many people would write in Facebook, and you’d miss out on all of the other more creative answers. Unfortunately, we humans are swayed that easily. So, keep suggestions out of the mix, unless you are offering a complete list for a multiple choice question.

4) Loaded Questions

Similar to leading questions, loaded questions influence the reader to answer in a certain way. In addition, they make assumptions about the reader and force them into an answer that doesn’t necessarily reflect their circumstances.

Example: “Which weekend shift do you prefer?”

This assumes the reader has the time to volunteer on the weekend. If there are no other options, they may select one of the available shifts but never be able to commit to it.

5) Absolutes

Absolutes in survey questions force respondents into a corner where they can’t give useful feedback. These are usually Yes/No questions and often include wording such as “always,” “all,” “every,” “ever,” etc.

Example: “Do you log your hours right after every shift? Yes or No.”

This seems like a simple enough question, but how many people actually do it every time? And if they occasionally miss a shift and log their hours the next week, they would truthfully have to answer “no” even if they are prompt 95% of the time.

6) Asking Readers to Remember

When we ask readers to remember things from too far in the past, we risk getting erroneous answers. A good guideline for reasonable recall is to keep it within the last three months.

Example: “Do you use the information you learned in training last year?”

This requires the reader to remember what was included in training, specifically. Although they may have now incorporated what they learned into their service, they will only be able to remember a few tidbits of the coursework.

7) Jargon or Acronyms

In nonprofits, we are fond of the shortcuts we take with language. But we shouldn’t assume that people who don’t work with us on a daily basis understand them.

Example: “Are you happy with our NPO’s parking reimbursement policy?”

If you use acronyms that are common at your agency, spell them out and then include the abbreviation in parentheses. That ensures no one feels left out or has to gamble on an educated guess.

8) Not Sequencing Questions Properly

Volunteer survey questions are kind of like dating — you don’t ask the tough ones right off the bat. Give your readers time to warm up to the task and build trust in the process. It’s always a best practice to include your most important “need to know” questions at the beginning to make sure most people answer them.

Tip: Start with easy questions and leave personal questions, including demographics, until the end.

9) Too Many Options

Although it’s hard to categorize all responses, multiple choice questions need to have a limit so the reader doesn’t become overwhelmed and quit halfway through. Remember, your goal is to have them complete the whole thing.

Tip: Keep choices to no more than 10-12 options and use columns to organize them in a symmetrical way. This also reduces the amount of scrolling the reader must do to get through an online volunteer survey, making it appear shorter and easier.

10) Not Limiting Answers

Along the same lines, if readers are allowed to check as many options as they’d like, they’ll check off many, and you’ll have a hard time prioritizing what is the most important.

Tip: Have readers prioritize by limiting the number of boxes they can check off to no more than 3-5, depending on the total number of options available and the number of volunteers you have. You’ll thank yourself when it comes time to analyze the data.
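If you tally results in a spreadsheet or a short script, that cap pays off quickly. Here is a minimal Python sketch, with made-up option names and responses, of how a capped checkbox question turns into a clean priority ranking:

```python
# A quick, illustrative tally of capped checkbox responses.
# Option names and responses are hypothetical, not from any real survey.
from collections import Counter

# Each list holds the options one volunteer checked (capped at 3 by the form).
responses = [
    ["flexible scheduling", "better training", "recognition"],
    ["flexible scheduling", "parking reimbursement"],
    ["better training", "flexible scheduling", "recognition"],
]

counts = Counter(option for checked in responses for option in checked)

print("Top priorities across all volunteers:")
for option, count in counts.most_common():
    print(f"  {option}: {count}")
```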

11) Overlapping Answers

This mistake is more common than you might think: we often neglect to double-check that we aren’t offering the same value in two answers. It happens most often when the choices are numbered ranges.

Example: “How many hours do you volunteer each week (0-3, 3-5, 5 or more)?”

If someone volunteers 3 or 5 hours a week, which option should they choose? Along the same lines, don’t leave a gap in your ranges (for example, “0-25%, 26-50%, 76-100%” skips 51-75% entirely). It’s embarrassing to notice after the fact, and your data will be incomplete.
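A quick sanity check before the survey goes out can catch these slips. Below is a minimal Python sketch, assuming whole-number buckets, that flags overlaps and gaps in a set of ranges; the hours example above is used for illustration, with 10 standing in for the open-ended “5 or more”:

```python
def check_ranges(ranges):
    """Flag overlapping or missing values in a list of (low, high) buckets.

    Assumes whole-number buckets, so a gap means an integer no bucket covers.
    """
    problems = []
    ordered = sorted(ranges)
    for (low1, high1), (low2, high2) in zip(ordered, ordered[1:]):
        if low2 <= high1:
            problems.append(f"Overlap: {low1}-{high1} and {low2}-{high2} both include {low2}")
        elif low2 > high1 + 1:
            problems.append(f"Gap: nothing covers {high1 + 1} to {low2 - 1}")
    return problems

# The "0-3, 3-5, 5 or more" example (10 stands in for the open top end):
print(check_ranges([(0, 3), (3, 5), (5, 10)]))   # flags overlaps at 3 and at 5
# A cleaned-up version with no overlaps or gaps:
print(check_ranges([(0, 2), (3, 5), (6, 10)]))   # prints []
```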

“Your questionnaire must encourage volunteers to complete it AND inspire them to give candid answers.”

12) Not Including “Other”

While this is not always the case, it makes sense to give readers the chance to add their own special sauce. They may not fit into your neatly constructed boxes, and it may interest you to be aware of these outliers.  

Tip: That said, while you shouldn’t offer too many options, if 10% or more of respondents choose “Other,” you probably don’t have enough categories.
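If you’re curious how that 10% rule of thumb looks in practice, here is a tiny Python sketch with made-up responses that flags when “Other” is doing too much of the work:

```python
# Hypothetical answers to a single multiple choice question.
answers = ["Email", "Other", "Facebook", "Email", "Other", "Text message",
           "Email", "Other", "Facebook", "Email"]

other_share = answers.count("Other") / len(answers)
print(f"'Other' share: {other_share:.0%}")  # 30% in this made-up sample

if other_share >= 0.10:
    print("Consider adding more answer categories.")
```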

13) Not Including “Not Applicable”

Similar to not including “Other” as an option in some questions, offering the option to select “Not Applicable” is also a smart practice when your reader may not be qualified to respond in an informed way. Doing so allows volunteers to opt out of one question without abandoning the survey altogether because they worry they can’t be truthful.

Tip: Use “N/A” when you suspect that not every volunteer has had an experience and, therefore, may not have an informed opinion about it.

Example: “Rate the level of fun you had at last month’s annual volunteer celebration.”

You would include N/A because it’s likely that some of your volunteers did not attend and could only comment based on what others said about it, not their own direct perception.

14) Making All Questions Required

In volunteer surveys, if a question is required, the reader cannot continue without answering it. Making every question required risks respondents clicking anything just to get to the end, or quitting because they’re put off by the inflexibility. They may have a legitimate reason they don’t want to, or can’t, answer a particular question. Honor this.

Tip: In some cases, it makes sense to require certain volunteer survey questions. For example, a preliminary question that qualifies the reader to take the survey (or screens them out), or a general satisfaction question that is critical to your data analysis. But choose wisely, and make some questions (e.g., demographics) optional.

Free [Tip Sheet] How to Ask Smarter Demographic Questions in Your Volunteer Survey

Ask smarter questions. Get richer survey results. Benefit your program!

Keep Refining Your Volunteer Survey Questions

Being human is an imperfect science, which makes studying us even harder. But that doesn’t mean we shouldn’t try to understand behaviors and perceptions that might help us learn and improve our programs.

Homing in on the right volunteer survey questions is one way to quantify highly subjective qualitative information so you can track trends over time.

Don’t expect perfection right from the start. Keep refining your survey questions, and you’ll find they get better and better as you develop them over time.