Data Collection Methods: Survey Research

48 Designing Effective Survey Questions

To this point we have considered several general points about surveys including when to use them, some of their pros and cons, and how often and in what ways to administer surveys. In this section we will get more specific and take a look at how to pose understandable questions that will yield usable data and how to present those questions on your survey.

Asking Effective Survey Questions

The first thing you need to do in order to write effective survey questions is identify what exactly it is that you wish to know. While that should go without saying, we cannot stress enough how easy it is to forget to include important questions when designing a survey. For example, suppose you want to understand how students at your school made the transition from high school to college. You wish to identify which students were comparatively more or less successful in this transition and which factors contributed to students’ success or lack thereof. To understand which factors shaped successful students’ transitions to college, you will need to include questions in your survey about all the possible factors that could contribute. Consulting the literature on the topic will certainly help, but you should also take the time to do some brainstorming on your own and to talk with others about what they think may be important in the transition to college. Perhaps time or space limitations will not allow you to include every single item you have come up with, so you will need to think about ranking your questions to be sure to include those that you view as most important.

Although we have stressed the importance of including questions on all topics you view as important to your overall research question, you do not want to take an everything-but-the-kitchen-sink approach by uncritically including every possible question that occurs to you. Doing so puts an unnecessary burden on your survey respondents. Remember that you have asked your respondents to give you their time and attention and to take care in responding to your questions; show them your respect by only asking questions that you view as important.

Once you have identified all the topics about which you would like to ask questions, you will need to actually write those questions. Questions should be as clear and to the point as possible. This is not the time to show off your creative writing skills; a survey is a technical instrument and should be written in a way that is as direct and succinct as possible. The best way to show your appreciation for your respondents' time is to not waste it. Ensuring that your questions are clear and not overly wordy will go a long way toward showing your respondents the gratitude they deserve.

Related to the point about not wasting respondents’ time, make sure that every question you pose will be relevant to every person you ask to complete it. This means two things: first, that respondents have knowledge about whatever topic you are asking them about, and second, that respondents have experience with whatever events, behaviours, or feelings you are asking them to report. In our example of the transition to college, heeding the criterion of relevance would mean that respondents must understand what exactly you mean by “transition to college” if you are going to use that phrase in your survey and that respondents must have actually experienced the transition to college themselves.

When developing survey questions, a researcher must consider the following aspects:

  • Context effects: These can arise through funnelling or inadvertently, but either way, earlier questions can prime (i.e., make more salient) certain views or thoughts that then shape how respondents answer subsequent questions. For example, if we ask you a number of questions about harm reduction and the Insite Safe Injection Site, and then ask you whether you support the Safe Injection Site, you may be more likely to support the site than if we had first asked you several questions about crime in the area around the site.
  • Context-appropriate wording: It is important that the wording you choose is appropriate for the people who will be answering your questions. Age, language, jargon, and the like should all be considered. You do not want to ask people questions they cannot understand because of their age or because of language barriers; your vocabulary should suit the people answering your survey.
  • Minimizing bias: Avoid questions with loaded terms (e.g., adjectives like disgusting, dangerous, and wonderful, and terms like always and never) and other non-neutral wording. Such questions ultimately lead people to the “correct” answer. The tone of the wording also affects how people answer: respondents should not feel judged for their response or their opinion. If they do, they are less likely to answer honestly and will instead answer the way they think you want them to respond.
  • Ambiguity: Questions can be ambiguous in many ways, and this is one area that benefits from pilot testing (or pre-testing) your questions to find out which ones can be interpreted differently from your intended meaning. In particular, words like “often” or “sometimes” invite different interpretations. Even words that seem clear to the researcher can be misinterpreted by respondents and make a question difficult to answer. Acronyms can also make questions difficult to answer if respondents do not know them. As noted above, wording should suit the audience answering the questions, so acronyms may be appropriate for some audiences but not others.
  • Meaningless responses: People can and do respond to questions about things they have no knowledge of. As a researcher, you want responses from people who have some knowledge of the topic and can answer the question meaningfully.
  • Double-barrelled questions: This type of question should be avoided at all costs; essentially, it is a single item that contains more than one question. For example: “I enjoy biking and hiking in my free time.” If a respondent enjoys biking but not hiking, how do they respond?

If you decide that you do wish to pose some questions about matters with which only a portion of respondents will have had experience, it may be appropriate to introduce a filter question into your survey. A filter question is designed to identify some subset of survey respondents who are asked additional questions that are not relevant to the entire sample.
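In a computer-administered survey, a filter question is typically implemented as simple branching (or skip) logic. The sketch below is a minimal, hypothetical illustration in Python: the question wording, option lists, and function names are invented for this example and do not come from any particular survey platform.

```python
# Minimal sketch of filter (skip) logic for a hypothetical survey.
# All question wording and variable names are invented for illustration.

def ask(prompt, options):
    """Display a closed-ended question and return the chosen option text."""
    print(prompt)
    for i, option in enumerate(options, start=1):
        print(f"  {i}. {option}")
    choice = int(input("Enter the number of your answer: "))
    return options[choice - 1]

def run_transition_survey():
    """Walk one respondent through a short, hypothetical questionnaire."""
    responses = {}

    # Filter question: identifies the subset who get the follow-up item.
    worked = ask("Did you hold a paid job during your first year of college?",
                 ["Yes", "No"])
    responses["worked_first_year"] = worked

    if worked == "Yes":
        # Only respondents who answered "Yes" see this follow-up question.
        responses["hours_per_week"] = ask(
            "On average, how many hours per week did you work?",
            ["1-10", "11-20", "21-30", "More than 30"])
    # Respondents who answered "No" skip straight past it.

    return responses

if __name__ == "__main__":
    print(run_transition_survey())
```

The point is simply that respondents who answer “No” never see the follow-up item, so no one is asked a question that is irrelevant to them.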

There are some ways of asking questions that are bound to confuse survey respondents. Researchers should take great care to avoid these kinds of questions. These include questions that pose double negatives, those that use confusing or culturally specific terms, and those that ask more than one question but are posed as a single question. Any time respondents are forced to decipher questions that utilize two forms of negation, confusion is bound to ensue. In general, avoiding negative terms in your question wording will help to increase respondent understanding. You should also avoid using terms or phrases that may be regionally or culturally specific (unless you are absolutely certain all your respondents come from the region or culture whose terms you are using).

Another thing to avoid when constructing survey questions is the problem of social desirability. We all want to look good, right? And we all probably know the politically correct response to a variety of questions, whether we agree with it or not. In survey research, social desirability refers to the idea that respondents will try to answer questions in a way that presents them in a favourable light. Perhaps we decide that to understand the transition to college, we need to know whether respondents ever cheated on an exam in high school or college. We all know that cheating on exams is wrong, so it may be difficult to get people to admit to cheating on an exam in a survey. But if you can guarantee respondents’ confidentiality, or even better, their anonymity, chances are much better that they will be honest about having engaged in this socially undesirable behaviour. Another way to avoid problems of social desirability is to phrase difficult questions in the most benign way possible. Babbie (2010) offers a useful suggestion for doing this: simply imagine how you would feel responding to your survey questions. If you would be uncomfortable, chances are others would be as well.

Finally, it is important to get feedback on your survey questions, in a pre-test, from as many people as possible, especially people who are like those in your sample. Now is not the time to be shy. Ask your friends for help, ask your mentors for feedback, ask your family to take a look at your survey as well. The more feedback you can get on your survey questions, the better the chances that you will come up with a set of questions that are understandable to a wide variety of people and, most importantly, to those in your sample.

In sum, in order to pose effective survey questions, researchers should do the following:

  1. Identify what it is they wish to know.
  2. Keep questions clear and succinct.
  3. Make questions relevant to respondents.
  4. Use filter questions when necessary.
  5. Avoid questions that are likely to confuse respondents such as those that use double negatives, use culturally specific terms, or pose more than one question in the form of a single question (double-barrelled).
  6. Imagine how they would feel responding to questions.
  7. Get feedback, especially from people who resemble those in the researcher’s sample.

Response Options

While posing clear and understandable questions in your survey is certainly important, so too is providing respondents with unambiguous response options. Response options are the answers that you provide to the people taking your survey. Generally, respondents will be asked to choose a single (or best) response to each question you pose, though certainly it makes sense in some cases to instruct respondents to choose multiple response options. One caution to keep in mind when accepting multiple responses to a single question, however, is that doing so may add complexity when it comes to tallying and analyzing your survey results.
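To see why allowing multiple responses complicates tallying, consider the small, hypothetical example below: because each respondent can contribute to several counts, the totals no longer sum to the number of respondents, and percentages have to be reported per option rather than per person. The response data are invented for illustration.

```python
from collections import Counter

# Hypothetical "choose all that apply" responses: each respondent may
# select more than one option, so each record is a list of choices.
responses = [
    ["Campus tour", "Orientation week"],
    ["Orientation week"],
    ["Campus tour", "Peer mentoring", "Orientation week"],
]

# Tally every selected option across all respondents.
tally = Counter(option for person in responses for option in person)

print(tally)
# Counter({'Orientation week': 3, 'Campus tour': 2, 'Peer mentoring': 1})
# The counts sum to 6 even though there are only 3 respondents, which is
# what makes multi-select items trickier to report and analyze.
```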

Offering response options assumes that your questions will be closed-ended questions. In a quantitative written survey, which is the type of survey we have been discussing here, chances are good that most if not all your questions will be closed ended. This means that you, the researcher, will provide respondents with a limited set of options for their responses. To write an effective closed-ended question, there are a couple of guidelines worth following. First, be sure that your response options are mutually exclusive. For example, look at the age categories depicted in Examples 1 & 2.

Mutually exclusive example 1

How old are you?

  • 19-29
  • 29-39
  • 39-49
  • 49-59
  • 59 or older

Mutually exclusive example 2

How old are you?

  • 20-29
  • 30-39
  • 40-49
  • 50-59
  • 60 or older

What do you notice in Example #1? If you are 39 years old, do you choose option 2 or option 3? In other words, the options are not mutually exclusive. If you look at Example #2, you will see that the options are now mutually exclusive. Another thing to remember is to keep the span of numbers the same for each category: with the exception of the last, open-ended category, all categories should represent the same number of years. In Example #2, each choice represents a span of 10 years.
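If you build or check your response categories with software, the two properties discussed above, non-overlapping brackets and equal spans, are easy to verify. The short Python sketch below is one way to do so, using the bracket boundaries from Example #2; the variable names and checks are illustrative assumptions rather than a prescribed procedure.

```python
# Age brackets from Example #2, written as (lower, upper) pairs.
# The open-ended top category ("60 or older") is handled separately.
brackets = [(20, 29), (30, 39), (40, 49), (50, 59)]

# Mutually exclusive and exhaustive within the listed range: each bracket
# must start exactly one year after the previous one ends, so no age
# (e.g., 39) can fall into two categories and no age is skipped.
for (lo1, hi1), (lo2, hi2) in zip(brackets, brackets[1:]):
    assert lo2 == hi1 + 1, f"Brackets {lo1}-{hi1} and {lo2}-{hi2} overlap or leave a gap"

# Equal spans: every closed bracket should cover the same number of years.
spans = {hi - lo + 1 for lo, hi in brackets}
assert len(spans) == 1, f"Brackets cover unequal spans: {spans}"

print("Brackets are mutually exclusive and cover equal 10-year spans.")
```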

Surveys need not be limited to closed-ended questions. Sometimes survey researchers include open-ended questions in their survey instruments as a way to gather additional details from respondents. An open-ended question does not include response options. Rather, respondents are asked to reply to the question in their own way, using their own words. These questions are generally used to find out more about a survey participant’s experiences or feelings about whatever they are being asked to report in the survey. If, for example, a survey includes closed-ended questions asking respondents to report on their level of physical activity on a weekly basis, an open-ended question could ask respondents what physical activities they participated in. While responses to such questions may also be captured using a closed-ended format, allowing participants to share some of their responses in their own words can make the experience of completing the survey more satisfying to respondents and can also reveal new motivations or explanations that had not occurred to the researcher.

Other things to avoid when it comes to response options include fence-sitting and floating. Fence-sitters are respondents who choose neutral response options, even if they have an opinion. This can occur if respondents are given, say, five rank-ordered response options, such as strongly agree, agree, no opinion, disagree, and strongly disagree. Some people will be drawn to respond “no opinion” even if they have an opinion, particularly if their true opinion is the non-socially desirable opinion. Floaters, on the other hand, are those that choose a substantive answer to a question when really they do not understand the question or do not have an opinion. If a respondent is only given four rank-ordered response options, such as strongly agree, agree, disagree, and strongly disagree, those who have no opinion have no choice but to select a response that suggests they have an opinion.
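In practice, the difference comes down to whether your response scale includes a neutral midpoint. A minimal sketch of the two hypothetical scales, written out as Python lists for concreteness:

```python
# Two hypothetical response scales for the same agree/disagree item.

# Including a neutral midpoint lets genuine "no opinion" respondents say so,
# but it also invites fence-sitting from people who do hold an opinion.
scale_with_midpoint = [
    "Strongly agree", "Agree", "No opinion", "Disagree", "Strongly disagree",
]

# Dropping the midpoint forces a substantive choice, which prevents
# fence-sitting but pushes true "no opinion" respondents into floating.
forced_choice_scale = [
    "Strongly agree", "Agree", "Disagree", "Strongly disagree",
]
```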

As you can see, floating is the flip side of fence-sitting. Thus, the solution to one problem is often the cause of the other. How you decide which approach to take depends on the goals of your research. Sometimes researchers actually want to learn something about people who claim to have no opinion. In this case, allowing for fence-sitting would be necessary. Other times researchers feel confident all respondents will be familiar with every topic in their survey. In this case, perhaps it is fine to force respondents to choose an opinion. There is no always-correct solution to either problem. Table 8.2 provides examples of the various types of research questions, including their content, structure and wording.

Table 8.2 Survey question examples: Content, Structure and Wording

1. Open-ended question: What do you like most about your job?
   Closed-ended version: Rate the following statement: “I like my job.”
     1 Strongly agree
     2 Agree
     3 Neither agree nor disagree
     4 Disagree
     5 Strongly disagree
   Type of closed-ended question: Rating scale (Likert)

2. Open-ended question: What is your income?
   Closed-ended version: How much did you earn in 2018?
     1. $0 – $20,000
     2. $20,001 – $40,000
     3. $40,001 – $60,000
     4. $60,001 – $80,000
     5. $80,001 or more
   OR: What was your income for 2018? _______________
   Type of closed-ended question: Categorical response; single response

3. Open-ended question: What do you think of the Vancouver Police Department?
   Closed-ended version: How would you rate the Vancouver Police Department on the following dimensions?
     Fair _ _ _ Unfair
     Respectful _ _ _ Disrespectful
     Knowledgeable _ _ _ _ Lacking knowledge
   Type of closed-ended question: Semantic differential

Question Wording Examples

1. Question: Agree or Disagree: Hookers on the streets are a threat to public safety.
   Critique: The use of the term “hookers” is inflammatory and indicates to the respondent what the “expected” response should be.
   Type of issue: Loaded terms

2. Question: Agree or Disagree: I support the legalization of street drugs and their taxation.
   Critique: This question asks two questions (legalization and taxation), so respondents who feel differently about the two issues will have difficulty answering. It is also ambiguous: what is a street drug, and what is meant by legalization and taxation? Not everyone knows what legalization involves, and taxation could be applied and used in many different ways.
   Type of issue: Double-barrelled; ambiguous language

3. Question: Agree or Disagree: I believe that the VPD should increase the number of NCOs by increasing the number of Cpls.
   Critique: This question assumes respondents know what VPD, NCO, and Cpl stand for. It also asks two questions: you may believe the number of NCOs should increase, but not by increasing the number of Cpls.
   Type of issue: Use of acronyms; double-barrelled

4. Question: Agree or Disagree: Canada has good immigration policies.
   Critique: Anyone could answer this question, but the answer does not indicate whether the respondent has any knowledge of the topic. It might be a good question after a series of questions establishing that the person has such knowledge. It is also somewhat ambiguous: what does “good” mean in this context?
   Type of issue: Ambiguous language


License


An Introduction to Research Methods in Sociology Copyright © 2019 by Valerie A. Sheppard is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
