Laura Colosi

Designing an Effective Questionnaire


Where to Begin
Questionnaires are the most commonly used method for collecting information from program participants when evaluating educational and extension programs. When designing a questionnaire, it is important to keep some key points in mind in order to best meet the evaluation needs of a specific program or intervention. First, what type of information is needed both to capture the important objectives of your program and to fulfill the purpose of your evaluation? Second, what types of questions and responses will best capture the information you seek? Finally, in what format should the questionnaire be designed to make it user-friendly and also to capture the breadth of information needed to measure a program's impact?

Kinds of Information
The first question to ask is: "What type of information do I need to collect?" The answer depends on the goals and content of your program. For example, if you have a program that teaches parents age-appropriate discipline for children, you will need to design a questionnaire that collects information on the basic content areas covered by this curriculum. A good way to start is to go through your curriculum and make a list of the key concepts or skills that parents should learn in each section. Then, you can design questions to capture whether parents learned new skills, changed their attitudes, and/or ultimately changed their parenting practices in each of the areas covered by your program.
Frequently, survey questions capture information in several categories, including, but not limited to, knowledge, attitudes, and behavior.
Knowledge refers to what participants understand about program content, ideas, or concepts. When capturing this information, it is usually best to offer response choices that are factual in nature to measure content comprehension. For example, "it is true that…"
Attitudes can include participants' perceptions, feelings, or judgments, collected in an effort to assess participant views on a subject or on the program itself. For example, "the biggest challenge parents face is disciplining their child(ren)…"
Behavior can include what people do, will do, or have done in areas related to the topic of a program. For example, in a parent program about appropriate discipline, one behavior to measure is the frequency of yelling at home. Behavior change is often hard to capture, especially if the duration of a program or intervention is short. Accurate assessment of behavior change frequently requires the opportunity to implement a pre- and post-test and, ideally, some follow-up data after the program has been completed. In essence, behavioral changes illustrate the translation of knowledge and attitudinal changes into skill sets and actions, which can take some time to happen depending on the area of intervention.

Types of Questions
Once you have identified the type of information you want to collect, based on the content areas and intended outcomes of your program, the next step is to draft questions for each piece of relevant information you hope to capture with your questionnaire.
Open-ended questions are those that do not place restrictions on the answers respondents can provide. One such example is, "What are the biggest challenges you face when disciplining your child?" This question allows participants to answer in their own words. As a result, open-ended questions yield more varied responses than closed-ended questions, and may highlight responses that evaluators could not have anticipated. However, the information yielded by open-ended questions takes much longer to read through and to code in order to identify common responses, which can slow down the reporting of results. As such, one must weigh the importance of free expression of responses against the time required to get useful information from each question.
In addition, because open-ended questions capture only one person's experience on a given question, it is difficult to report results for the entire group. This is due to the many possible interpretations of each respondent's views, and the inability to state with certainty that each participant's interpretation of both the question and their responses is consistent. However, open-ended data can be useful when exploring the range of possible responses to a question or topic, especially when pilot testing a program prior to its implementation.
Another type of question is the closed-ended question, in which respondents must choose among specific response options for each question. For example, when asking "What are the biggest challenges you face when disciplining your child?" a closed-ended question may offer the following choice of answers: 1) getting my child to listen; 2) losing my temper; or 3) there are no challenges to disciplining my child. There are many possible ways to structure responses to closed-ended questions, including forced choices, agree/disagree, and Likert scales (discussed below).
One advantage of closed-ended questions is that carefully chosen response options provide the same frame of reference for all participants when choosing an answer. The answers to a closed-ended question are pre-determined and, as a result, are both more specific than open-ended answers and more likely to promote consistency among respondents in understanding both the question and the responses.
For example, Converse and Presser (1986) provide a classic scenario: consider the open-ended question, "People look for different things in a job. What would you most prefer in a job?" They found that the most common answer was "pay." However, upon further analysis it appeared that some respondents meant "high pay" and some meant "steady pay." Because the wording of answer choices was not specific enough, it was not possible to know which person meant which. Contrast this with a closed-ended question, "People look for different things in a job. Which one of the five following things would you most prefer in a job?" that offers the following response choices:
1. work that pays well;
2. work that I am proud of;
3. autonomy;
4. friendly co-workers; or
5. steady pay/job security.
Because the respondents who meant "steady pay" could choose "steady pay/job security," and those who meant "high pay" could choose "work that pays well," the distinction among them was apparent from the answer choices available for the question. As a result, areas in which an answer to an open-ended question can be unclear are easily solved by using a closed-ended question with greater specificity in its response choices. It is the greater specificity and consistency yielded by closed-ended questions that allows one to generalize results across respondents. In addition, the use of closed-ended choices allows for a timelier and more systematic analysis of the data collected: one only needs to compute the frequency of responses for each question, or means on a Likert scale, to gauge the impact of the program itself. The disadvantage is that there is no room for participants to express responses in their own words, as they must use only pre-determined choices to answer a question.
When deciding whether to use open- or closed-ended questions, it is important to think about your goals. A general rule of thumb is: if an educator knows the specific information needed to answer a question, and requires a single frame of reference among respondents, closed-ended responses are preferred (Converse and Presser, p. 33). If, however, an educator is not sure what the range of possible responses to a question is, and hopes to conduct a preliminary exploration of a topic, open-ended questions will work better. Note also that previous research [1] shows that respondents are more willing to offer sensitive information on a survey using an open-ended response.

[1] Converse and Presser 1986; Taylor-Powell 1998; and Bradburn, et al. 2004.

Common Pitfalls in Designing Questions
Leading questions: Regardless of whether a question is open- or closed-ended, it is important that it not lead a participant to your preferred response. An example of a leading question is, "Do you believe it is okay to spank young children despite the AAP recommendations not to do so?" in which it is clear that the acceptable answer is "no."
Double-barreled questions: Make sure each question addresses only one issue. For example, do NOT ask, "Did this program teach you how to discipline your child and manage your home finances?" It is possible that the program was successful in one domain but not the other, and this

question does not allow for that distinction to be made.
Unclear or ambiguous questions: Each question needs to be written very clearly. For example, asking "Do you think children require strict discipline?" requires each respondent to answer based on their own definition of "strict." In contrast, asking "Is it appropriate to spank a child who breaks your rules?" is much more specific. It is also important to provide as much information as possible to make sure that each question is clear and interpreted the same way by everyone. For example, "Do you think youth should be spanked?" is not as clear as "Do you think it is appropriate to spank a child between 12 and 16 years old?"
Invasive/personal questions: Questions about personal information are difficult to phrase in a non-intrusive way. One possible solution is to provide broad categories of responses from which to choose. For example, a participant may be more comfortable reporting his or her income as part of a category (e.g., $20,000 – $30,000) rather than as a specific number.
It is helpful to re-read your questionnaire to see each question through the survey respondent's eyes by asking whether each question is: too personal; judgmental; ambiguous or unclear; leading; or capturing more than one idea. Finally, it is very important to collect demographic data (such as age, sex, educational level, or race) from participants when possible, preferably at the end of the survey instrument. This data allows educators to better understand their audience and also allows researchers to examine whether a program may affect different groups in different ways. For example, one program may have better outcomes for women than for men, or for older versus younger parents. This ability to correlate outcomes with participant characteristics allows for the identification of shortcomings or strengths in a given curriculum for certain audiences, and thus can guide any needed modifications of the curriculum.

Choosing Response Options
When using closed-ended questions, you will need to carefully consider what response options to provide to respondents. Generally, responses should be a set of mutually exclusive categorical choices. For example, when asking about income, a range of choices may be presented such as (1) less than $20,000 a year; (2) $21,000 – $40,000 a year; (3) $41,000 – $60,000 a year; or (4) more than $61,000 a year. These types of categories remove the possibility that more than one answer may apply to a participant. Note also that Bradburn (2004) cautions against the use of categories such as (1) never, (2) rarely, (3) occasionally, (4) fairly often, and (5) often, as these types of choices "can confuse and annoy respondents," and also because a researcher has no way of knowing what "often" means to each participant. It is very important that the categories chosen as responses capture all possible choices, but are also carefully worded to avoid ambiguity of interpretation, to reduce the likelihood of invalid responses on your questionnaire.
Another popular way to structure response choices is to use a Likert scale, which rates each item on a response scale. For instance, participants are asked to answer each question by rating each item on a 1-to-3, 1-to-4, 1-to-5, 1-to-6, or 1-to-7 response scale. In general, a 5-point Likert scale is an appropriate choice and would include the items "strongly agree," "agree," "neutral," "disagree," and "strongly disagree." Using this type of scale allows you to pose many questions as statements, like "Children should be yelled at as infrequently as possible," with which participants indicate how strongly they agree or disagree.

Circle the number that corresponds to your feelings about each statement…

                                                Strongly          No               Strongly
                                                agree     Agree   opinion Disagree disagree
  Children should be yelled at
  as infrequently as possible.                     5        4        3       2        1

  Reading books with a child under age 3
  is important to foster early literacy skills.    5        4        3       2        1

One primary reason to use a Likert scale is that the data are easy to code and report by simply assigning codes to the responses (for example, strongly agree = 5, agree = 4, neutral = 3, disagree = 2, and strongly disagree = 1) so that a higher score reflects a higher level of agreement with each item. This is important because, after you enter the individual scores, you can easily calculate an average, or mean score, for the whole group for each survey question. In the case of assigning higher values to stronger agreement,

higher mean scores for each question will translate into levels of agreement with each item and, thus, lower scores will reflect participants' disagreement with each item asked. [2]

[2] For more information, see Measuring Evaluation Results with Microsoft Excel, available at http://www.parenting.cit.cornell.edu/Excel%20Tutorial1_RED3.pdf

Questionnaire Format
The layout of a questionnaire determines how easy it is for respondents to read, understand, and answer each question you ask. As such, questionnaire format heavily influences the quality of the data you collect. Specific things to consider in design include general appearance, typeface, blank space, order of questions, and the placement and clarity of instructions on the questionnaire. Finally, it is also recommended that, if time permits, the instrument be piloted before implementation to identify any problem areas in both the format and the content of the questionnaire.
Here are some general tips [3] on format:
1. Start with a clear introduction that explains the purpose of the questionnaire, explains how the information collected will be used (e.g., to improve the program), and assures participants that their personal information will remain confidential (alternatively, you could number instruments and remove names altogether; you must, however, match the pre and post instruments by participant number if using that design).
2. Use an easy-to-read typeface, and allow for some blank space between questions.
3. Do not break question text or instructions across pages; keep all text together for each question.
4. Use italics or bold for instructions to distinguish them from the question itself.
5. Arrange the answers vertically under each question. For example:
   How often do you read to your elementary-school-aged child?
   __Every day
   __4-6 times a week
   __1-3 times a week
   __Less than once a week
6. If needed, place any explanatory text or definitions in parentheses immediately after the question.
7. The first questions should be easy to read, cover the more important topics of interest, and, if possible, be closed-ended.
8. It is also good to start with more general questions and then move to greater specificity towards the end of the questionnaire.
9. Pay attention to the flow of questions; the instrument should be logical and easy to follow. If possible, keep questions on similar topics close together. Check and re-check all instructions and/or skip patterns.
10. When switching to a different topic area, use a transitional statement to help guide the respondent through the questionnaire.

[3] For more detail, see the reference by Ellen Taylor-Powell, 1998, cited at the end of this brief.

Note also that asking too many questions is burdensome both to the program's participants and to the staff who need to analyze and report the evaluative data. As a result, it is important that each question included gathers essential information (Taylor-Powell, 1998). Remember also to think about the educator's time constraints for implementing the questionnaire, as well as possible variation in literacy levels or any language barriers among participants that may affect the educator's ability to administer the survey.

Summary
When writing questions for an evaluative survey, it is important to recall the objectives of the program delivered and the information required to measure success in meeting those objectives. Educators also need to focus on the type of information needed to assess program impact, deciding whether to include knowledge, attitudes, skills, and/or behavior. Good survey design also includes attention to the audience completing the questions (e.g., literacy, language) and to the purpose of each question (outcome data, qualitative feedback, satisfaction with the program, and/or audience demographics). The flow of the entire questionnaire is also critical, and educators need to ensure that each question is clear and the directions are equally easy to understand. Most importantly, remember that the quality of the information gathered from a survey instrument depends on the clarity of each question asked. This reinforces the importance of thoughtful questionnaire design to capture and report the true experience and change of participants in programs.

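The coding-and-averaging approach described under Choosing Response Options can be sketched in a few lines of Python. This is a minimal illustration, not part of the original brief: the response labels match the 5-point scale above, but the sample data and the function names are hypothetical.

```python
from collections import Counter
from statistics import mean

# Codes for a 5-point Likert scale, assigned so that a higher score
# reflects stronger agreement (strongly agree = 5 ... strongly disagree = 1).
LIKERT_CODES = {
    "strongly agree": 5,
    "agree": 4,
    "neutral": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

def response_frequencies(answers):
    """Tally how often each closed-ended response choice was selected."""
    return Counter(answers)

def mean_score(answers):
    """Group mean score for one Likert item, rounded for reporting."""
    return round(mean(LIKERT_CODES[a] for a in answers), 2)

# Hypothetical responses from four participants to one statement.
answers = ["strongly agree", "agree", "agree", "neutral"]
print(response_frequencies(answers))  # frequency of each response choice
print(mean_score(answers))            # group mean; nearer 5 = stronger agreement
```

A mean near 5 indicates strong group agreement with the statement, and a mean near 1 indicates strong disagreement, mirroring the interpretation described in the text.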
Steps to Developing an Effective Questionnaire

• Decide what information you need to collect for your program;

• Search for previous evaluative efforts on programs similar to yours to see what others may have done (and review their questionnaires if possible, but be sure to obtain permission if using a questionnaire or items developed by someone else);

• Draft your questions or modify questions on an existing survey to fit your own program needs;

• Place the questions in logical order;

• Re-read the entire questionnaire and add specific instructions, transitions, and any clarifying information for questions or responses (in parentheses where applicable);

• Focus on the format of the questionnaire with attention to layout, readability, time demands on respondents, and logic and clarity of content;

• If time allows, have a few co-workers complete the questionnaire to identify problem areas in your survey;

• Revise the instrument as needed based on feedback provided;

• Prepare educators on the protocol for implementing the questionnaire; and

• You are ready to go!


References
Bradburn, N., Sudman, S. and Wansink, B. (2004) Asking Questions. Jossey-Bass; San Francisco, Ca.

Converse, J. and Presser, S. (1986) Survey Questions: Handcrafting the Standardized Questionnaire. Sage University Press; Newbury Park, Ca.

Schwarz, N. and Oyserman, D. (2001) Asking Questions about Behavior: Cognition, Communication, and Questionnaire Construction. American Journal of Evaluation, Vol. 22, No. 2, pp. 127-160.

Taylor-Powell, E. (1998) Questionnaire Design: Asking questions with a purpose. University of Wisconsin Extension.

For more information about the Parenting in Context project at Cornell, visit our website at:
http://www.parenting.cit.cornell.edu

Laura Colosi is an Extension Associate in the Department of Policy Analysis and Management at Cornell University.

© 2006 Cornell Cooperative Extension
