10 Common Mistakes that Keep Respondents from Completing Your Survey

Developing survey questions is harder than it looks. Asking questions is easy, but asking direct, unbiased, and valid questions is more of an art form. A lot goes into it, including flow, how the questions tie into what your program evaluation wants to answer, and keeping your respondents engaged enough to complete the survey.

Here are 10 common mistakes and my tips for avoiding them:

Not knowing your target audience: Understanding who your audience is can help you craft survey questions that are pertinent to them. Avoid using words or phrases that your respondents may not know; instead, tailor your language to your target audience. Are you surveying nurses, social workers, or teachers? It’s OK to use words or phrases that are common in those fields. On the other hand, if you’re not sure your audience will understand what you mean by “reproductive justice,” it’s best to check with the program coordinator or workshop facilitator to see if this term has been discussed.

Not explaining WHY: Believe it or not, most respondents are willing to help you if you share the value in completing your survey. When respondents know what’s in it for them, they’re more likely to finish. If respondents know that their responses can help determine pay raises or restructure an under-performing program’s activities, they’re more likely to complete it. If you’re offering an incentive (e.g., a gift card to the respondent’s favorite retail store, coffee shop, or anywhere Visa is accepted) for completing the survey, say so at the very beginning, before respondents begin.

Including extensive demographic questions: Too many demographic questions take up room that could have gone to other questions. Before adding questions about a respondent’s income level, religion, socioeconomic status, etc., consider whether they are appropriate and relevant to the overall survey and the basis of the evaluation. Also, unless the rest of your survey depends on these answers, consider leaving demographic questions for the end of the survey, as they tend to be the least interesting part for respondents to complete.

Asking too many questions: Tying into the previous point, asking too many questions can be the downfall of your survey. There are a variety of question types—open-ended, multiple choice, Likert or interval (very satisfied, satisfied, neutral, dissatisfied, very dissatisfied), ratio (“How many days do you spend studying?”), and dichotomous (true/false, yes/no, agree/disagree)—but what matters more is the intent behind each question. My recommendation is to keep a survey to 15 questions at most. Keep in mind that engagement wanes, especially during an online survey where there are more distractions (e.g., social media, videos, online shopping).

Overusing question types: Be mindful of overusing any one question type. A survey of 15 open-ended questions to gather deeper insights on a program’s activities sounds good in theory, but giving that survey to a 12-year-old may not yield the results you want. That’s not to say a preteen can’t complete such a survey, but an appropriate mix of multiple choice, open-ended, and dichotomous questions makes the time spent completing a survey fly by. Also, when using open-ended questions, consider the intent of the question and whether a narrative response would add insight.

Asking double-barreled questions: “When you go out on a date, do you prefer to go to dinner and a movie?” This may seem like a straightforward question and a common dating scenario, but many respondents may prefer one option and not the other. Appropriate answer choices could include:

a) Dinner and movie

b) Dinner

c) Movie

d) Other

Not including “other” or “prefer not to answer”: Including “prefer not to answer” may not help much with gathering data, but it shows your respondents that you value their privacy and gives them the option of opting out. If respondents notice that you’re not allowing them to opt out, they may become discouraged and stop entirely. And “other” responses may indicate that an answer choice wasn’t obvious to you. If you decide to include “other” as an answer choice, consider letting respondents explain their response.

Including answer choices that overlap:

What is your age range?

a) 13-18

b) 18-24

c) 24-30

See the problem here? An 18-year-old could pick (a) or (b), and a 24-year-old could pick (b) or (c). Mutually exclusive answer choices give each respondent a single clear choice. To avoid ambiguity, adjust the answer choices to 13-18, 19-24, and 25-30.
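This overlap check is mechanical enough to automate when you build longer surveys. Here’s a minimal sketch in Python; the `check_ranges` helper is my own illustration (not from any survey tool), and it assumes integer-valued answers such as age in whole years:

```python
def check_ranges(ranges):
    """Flag overlaps and gaps in a list of (low, high) answer ranges.

    Assumes integer-valued answers and ranges sorted by their low end.
    Returns a list of human-readable problem descriptions.
    """
    problems = []
    for (lo1, hi1), (lo2, hi2) in zip(ranges, ranges[1:]):
        if lo2 <= hi1:
            # A respondent at the boundary could pick either choice.
            problems.append(f"overlap: {lo1}-{hi1} and {lo2}-{hi2}")
        elif lo2 > hi1 + 1:
            # A respondent between the ranges has no choice at all.
            problems.append(f"gap: {lo1}-{hi1} and {lo2}-{hi2}")
    return problems

# The overlapping choices from the example above:
print(check_ranges([(13, 18), (18, 24), (24, 30)]))
# → ['overlap: 13-18 and 18-24', 'overlap: 18-24 and 24-30']

# The corrected, mutually exclusive choices:
print(check_ranges([(13, 18), (19, 24), (25, 30)]))
# → []
```

The same idea extends to income brackets or any other numeric answer choices: adjacent ranges should neither overlap nor leave gaps.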

Not asking direct questions: Indirect questions do not communicate your intent behind asking the question.

Example: What suggestions do you have for improving this program?

In this example, respondents will give you all types of answers, which would all be helpful. But if the intent is to discover suggestions for improving a particular activity within a program, make that known.

Leading respondents to an answer choice: Don’t introduce bias by creating questions that steer your respondents in a particular direction.

Example: How would you rate the career of legendary musician Michael Jackson?

Depending on whom you ask, Michael Jackson may or may not be “legendary.” This type of wording creates bias. To avoid it, eliminate “legendary.” Here’s another example:

The United States government should force all states to mandate comprehensive sex education in public school systems.

Does everyone support comprehensive sex education? Who likes to be forced? Words like “force” can suggest you’re exerting control over your respondents. An alternative could be “The United States government should mandate comprehensive sex education in public school systems.” The intent is the same, but words like “could,” “should,” and “would” are less controlling.

All in all, surveys are a great way to gather feedback on your program or service. Now that you know what common mistakes to avoid, you’re on your way to developing higher-quality surveys. If you have further questions, or you’d like me to lead your group or organization in improving your survey development skills, contact me and let’s get started.

RAISE YOUR VOICE: What are some common mistakes you’ve made as a survey developer, and how have you corrected those mistakes? 

If you like this post, subscribe to the Raise Your Voice newsletter to receive resources, advice, and tips to help you raise your voice for women and girls of color.