Research Design in
Qualitative/Quantitative/
Mixed Methods
Michael R. Harwell
University of Minnesota
Research Design
In educational research, it is usually possible
(and certainly popular) to characterize a research
study's methodology as qualitative, as quantitative, or as involving both qualitative and quantitative methods, in which case it is typically
referred to as mixed methods. The term research
design is widely used in education, yet it takes on
different meanings in different studies. (The
terms research method and research design will be
used interchangeably in this chapter.) For example, in one study, research design may reflect the
entire research process, from conceptualizing a
problem to the literature review, research questions, methods, and conclusions, whereas in
another study, research design refers only to the
methodology of a study (e.g., data collection and
analysis). Perhaps not surprisingly, there is variation within and between methodologies in how
research design is defined. However, this variation does not affect an examination of the role
of research design in promoting rigorous study
of promising ideas and, thus, a single definition
of research design is not adopted in this chapter.
I assume that research questions are the driving
force behind the choice of a research design and
any changes made to elements of a design as a
study unfolds.
Identifying a study's research design is important because it communicates information about
key features of the study, which can differ for
qualitative, quantitative, and mixed methods.
However, one common feature across research
designs is that at one or more points in the
research process, data are collected (numbers,
words, gestures, etc.), albeit in different ways and
for different purposes. Thus, qualitative studies
are, among other things, studies that collect and
Central to this inquiry is the presence of multiple truths that are socially constructed (Lincoln
& Guba, 1985). Qualitative research is usually
described as allowing a detailed exploration of a
topic of interest in which information is collected
by a researcher through case studies, ethnographic
work, interviews, and so on. Inherent in this
approach is the description of the interactions
among participants and researchers in naturalistic
settings with few boundaries, resulting in a flexible
and open research process. These unique interactions imply that different results could be obtained
from the same participant depending on who
the researcher is, because results are created by a
participant and researcher in a given situation
(pp. 39–40). Thus, replicability and generalizability are not generally goals of qualitative research.
Qualitative research methods are also described
as inductive, in the sense that a researcher may
construct theories or hypotheses, explanations,
and conceptualizations from details provided by a
participant. Embedded in this approach is the
perspective that researchers cannot set aside their
experiences, perceptions, and biases, and thus
cannot pretend to be objective bystanders to the
research. Another important characteristic is that
the widespread use of qualitative methods in education is relatively new, dating mostly to the 1980s,
with ongoing developments in methodology and
reporting guidelines (Denzin, 2006). The relative
newness of this methodology also means that
professional norms impacting research, including evidence standards, funding issues, and editorial practices, are evolving (see, e.g., Cheek,
2005; Freeman, deMarrais, Preissle, Roulston, &
St. Pierre, 2007). Good descriptions of qualitative
methods appear in Bogdan and Biklen (2003),
Creswell (1998), Denzin and Lincoln (2005),
Miles and Huberman (1994), and Patton (2002).
There are several categorizations of research
designs in qualitative research, and none is universally agreed upon (see, e.g., Denzin & Lincoln,
2005). Creswell (2003) listed five strategies of
inquiry in qualitative research that I treat as synonymous with research design: narratives, phenomenological studies, grounded theory studies,
ethnographies, and case studies. Creswell also
described six phases embedded in each research
design that are more specific than those suggested
by Crotty (1998), but still encompass virtually all
aspects of a study: (1) philosophical or theoretical
perspectives; (2) introduction to a study, which
includes the purpose and research questions;
(3) data collection; (4) data analysis; (5) report writing; and (6) standards of quality and verification.
Journals that publish qualitative methodology
papers and qualitative research studies in education include Qualitative Research, Qualitative
Inquiry, Field Methods, American Educational
Research Journal, Educational Researcher, and the
International Journal of Qualitative Studies in
Education. Examples of the use of qualitative
research designs are provided by Stage and Maple
(1996), who used a narrative design to describe
the experiences of women who earned a bachelor's or master's degree in mathematics and opted
to earn a doctorate in education; Gaines
(2005), who explored the process of interpreting
example, some authors insist that a mixed methods study is any study with both qualitative and
quantitative data, whereas other authors say a
mixed methods study must have a mixed methods question, both qualitative and quantitative
analyses, and integrated inferences (Tashakkori,
2009). There is also disagreement regarding
various aspects of mixed methods, such as when
mixing should occur (e.g., at the point of designing a study, during data collection, during data
analyses, and/or at the point of interpretation).
Still other authors have criticized the whole
idea of mixed methods (Denzin, 2006; Sale,
Lohfeld, & Brazil, 2002; Smith & Hodkinson,
2005), criticism which is sometimes framed in
terms of the response of advocates of a particular stance to arguments for mixing methods:
the purist stance, the pragmatic stance, and the
dialectical stance (Greene & Caracelli, 1997;
Johnson & Onwuegbuzie, 2004; Lawrenz &
Huffman, 2002). Those adopting a purist stance
argue that mixed methods are inappropriate
because of the incompatibility of the worldview
or belief system (paradigms) (Tashakkori &
Teddlie, 2003) underlying qualitative and quantitative methods, i.e., qualitative and quantitative methods are studying different phenomena
with different methods (Smith & Hodkinson,
2005). Some purists have also raised concerns
that mixed methods designs leave qualitative
methods in the position of being secondary to
quantitative methods (Denzin, 2006; Giddings,
2006; Yin, 2006).
Researchers who adopt a pragmatic stance
argue that paradigm differences are independent
of, and hence can be used in conjunction with,
one another in the service of addressing a question (Johnson & Onwuegbuzie, 2004; Morgan,
2007). Wheeldon (2010) summarizes this view:
Instead of relying on deductive reasoning and
general premises to reach specific conclusions, or
inductive approaches that seek general
conclusions based on specific premises,
pragmatism allows for a more flexible abductive
approach. By focusing on solving practical
problems, the debate about the existence of
objective truth, or the value of subjective
perceptions, can be usefully sidestepped. As such,
pragmatists have no problem with asserting both
that there is a single real world and that all
individuals have their own unique interpretations
of that world. (p. 88)
However, the pragmatic stance also has its critics (Mertens, 2003; Sale et al., 2002). Dialectical
researchers argue that multiple paradigms are
compatible and should be used, but their differences and the implications for research must be
made clear (Greene & Caracelli, 1997). It is important to emphasize that a researcher who adopts a
dialectic stance would, other things being equal,
draw on the same procedures for mixing as one
adopting a pragmatic stance. The issue of stances
is certainly not settled, and additional developments on this topic continue to appear in the
mixed methods literature (e.g., design stance of
Creswell & Plano Clark, 2007).
Mixed methods designs seem especially firmly
rooted in the evaluation literature. An early
important paper in this area was Greene, Caracelli,
and Graham (1989), which highlighted five major
purposes of (or justifications for) a mixed methods evaluation. One is triangulation, which examines the consistency of findings, such as those
obtained through different instruments, and
which might include interviews and surveys.
According to Greene et al., triangulation improves
the chances that threats to inferences will be controlled. A second purpose is complementarity,
which uses qualitative and quantitative data
results to assess overlapping but distinct facets of
the phenomenon under study (e.g., in-class
observations, surveys), and a third is development, in which results from one method influence subsequent methods or steps in the research;
for example, interviews with teachers might suggest that an additional end-of-year assessment be
added. A fourth purpose of a mixed methods
evaluation is initiation, in which results from one
method challenge other results or stimulate new
directions for the research; for example, teacher
interviews might challenge results provided by
administrators in a school district. The fifth and
last purpose is expansion, which may clarify
results or add richness to the findings.
A number of frameworks for mixed methods have appeared in this literature, many of
which have built on the work of Greene et al.
(1989) (Caracelli & Greene, 1997; Creswell,
2003; Johnson & Onwuegbuzie, 2004; Lawrenz
& Huffman, 2002; Leech & Onwuegbuzie, 2007;
Morse, 2010; Newman, Ridenour, Newman, &
DeMarco, 2003; Tashakkori & Teddlie, 2003).
These frameworks differ in many ways, but
they all successfully convey a sense of the large
number of methodological tools available to
process of how academic counselors make decisions, and used this model and a case study
approach (Creswell, 1998). Norman purposively
sampled 6 counselors and 24 students and collected data via in-depth interviews with the
counselors and students that included their
observations of the advising process, and high
school and college records of students' mathematics course taking and grades. Norman
reported that the counselors generally misinterpreted the mathematics portions of high school
transcripts, which had important implications
for advising a student on which college mathematics course to begin with. For example, a student
whose transcript indicated that his or her highest
completed high school mathematics course was
Integrated Mathematics IV, a standards-based
course, was generally advised to start with a
precalculus mathematics course, even though
Integrated Mathematics IV is a precalculus
course and the student should have been advised
to enroll in Calculus I. Norman also reported
that counselors who looked at transcripts of
students who completed a traditional high
school mathematics curriculum, in which the
highest mathematics course completed was
listed as precalculus, were more likely to recommend that a student enroll in Calculus I. Norman
suggested that counselor views toward standards-based curricula may be related to working in a
mathematics department, because mathematics
departments have generally been quite critical of
standards-based curricula (Roitman, 1999;
Schoenfeld, 2004).
Norman (2008) also collected and analyzed
quantitative data for a sample of more than
1,000 college freshmen that included information on the high school mathematics curriculum they completed; their score on a college
mathematics placement exam; and the difficulty
of their first college mathematics course, which
was captured using a 4-point Likert variable
(1 = a course that should have been completed
in high school, which is sometimes referred to as
a developmental course; 4 = a course whose difficulty exceeded that of Calculus I). The results
of these analyses suggested that the curriculum
a student completed was related to his or her
mathematics placement score and the difficulty
level of the student's first college mathematics
course. In particular, students who completed a
standards-based high school mathematics curriculum were more likely to enroll in a less difficult
with it. Here, the study could focus on discovering and understanding students' experiences with
the advising process and its implications for their
college experience. A case study approach, using a
purposively chosen sample of students who
began their college mathematics course taking at
different difficulty levels, would be appropriate.
Information obtained from interviews and student academic records could be used to inform
the construction of a survey to be sent to a representative sample of students. The survey results
could be used to improve decision making by
counselors and to enhance generalizability.
A fourth mixed methods approach is the concurrent triangulation design, which is used when
the focus is on confirming, cross-validating, or
corroborating findings from a single study.
Qualitative and quantitative data are collected
concurrently, such that weaknesses of one kind of
data are ideally offset by strengths of the other
kind. Typically, equal weight is given to the two
kinds of data in mixing the findings, although
one kind of data can be weighted more heavily.
The qualitative and quantitative data are analyzed
separately, and mixing takes place when the findings are interpreted. Important strengths of this
approach are the ability to maximize the information provided by a single study, for example,
when interest is in cross-validation, and a shorter
data collection period compared to the sequential
data collection approaches. Important weaknesses include the additional complexity associated with collecting qualitative and quantitative
data at the same time and the expertise needed to
usefully apply both methods. Discrepancies
between the qualitative and quantitative findings
may also be difficult to reconcile.
In the Howell et al. (2002) study, the primary
finding was that the achievement of African
American students who received a voucher to
attend private school was on average higher than
that of African American students who did not
receive a voucher, and that this difference did not
emerge for other student groups. Adopting a concurrent triangulation design could provide an
explanation for these findings by collecting qualitative data in the form of interviews with parents
of students who did and did not receive a voucher,
and quantitative data in the form of student test
scores and background information. This would
offer an opportunity to corroborate findings
from this study with respect to the improved
achievement of African American students but
not other students. For example, a plausible outcome of concurrent data collection is that the
qualitative data suggest that parents of African
American students appeared to be more committed to, and enthusiastic about, their students' education in general and the voucher program in
particular, than parents of other students, and
that this enthusiasm persisted throughout the school year. (Achievement tests used in this study were given at the end of the school year.) In this
instance, the qualitative and quantitative information would provide corroborating evidence
that the improved achievement of African
American students could be attributed in part to
receiving a voucher and enrolling in a private
school, and in part to the support, encouragement, and motivation of students parents.
The fifth approach is the concurrent nested
design, in which qualitative and quantitative
data are collected concurrently and analyzed
together during the analysis phase. Greater
weight is given to one kind of data, in the sense
that one kind of data is typically embedded in
the other. However, there may or may not be a
guiding theoretical perspective. A popular application of this approach is with multilevel structures (Tashakkori & Teddlie, 2003), in which
different levels or units of an organization are
studied. Strengths of this approach include the
shorter data collection period and the multiple
perspectives embedded in the data, whereas
weaknesses include the level of expertise needed
to execute the study successfully, especially in
mixing the qualitative and quantitative data
within the data analysis, and difficulties in reconciling conflicting results from the qualitative
and quantitative analyses.
In this design, qualitative and quantitative data
are mixed in the analysis phase, a process that can
take many different forms (see, e.g., Bazeley, 2009;
Tashakkori & Teddlie, 2003). Caracelli and Greene
(1993) described four strategies to mix qualitative
and quantitative data in the analysis. One is data
transformation, in which qualitative data are
transformed to quantitative data or qualitative
data are transformed into narrative, and the
resulting data are analyzed. In Norman's (2008) study, this could involve transforming (i.e., rescaling) qualitative data in the form of interviews, field notes, and so on to a quantitative form that
captures key themes in these data. Typically, the
transformed qualitative data exhibit a nominal or
ordinal scale of measurement.
study to compare learning outcomes of a treatment group whose members receive a promising
intervention against those of a control group
using longitudinal data. However, in the service
of developing a better understanding of the
intervention, the researcher may decide to add a
preliminary component to the study in which
single-subject methods (Kratochwill & Levin,
1992) will be used to examine the learning trajectories of a small number of purposively sampled participants. This would require expertise
in single-subject methodology that a researcher
may or may not possess. Similar examples can
be constructed for qualitative and mixed methods studies.
Still, the greater challenge for many researchers is likely to be modifying personal norms
defining scholarship and the adequacy of a
contribution to the field. This may require a
researcher to step outside the methodological
boundaries he or she has been trained to honor
and, in some cases, enforce, and to embrace
research designs the researcher may have been
taught, and have taught others, are inferior.
Embracing other methodologies in ways that
cross disciplinary, funding, and publication
lines, for example, serving as a member of a
multidisciplinary research team that includes
(and values) individuals with expertise in qualitative or quantitative methods, will require a
certain amount of risk taking but will help to
move the field toward more flexible designs.
Second, researchers can work (or continue to
work) to modify professional norms in their
roles as authors, manuscript reviewers, journal
editors, panel members evaluating grant proposals, and so forth, to allow a greater range of
studies and findings to be supported. This will
require an artful balancing between encouraging risk taking (e.g., studies employing innovative interventions or emerging methods of
qualitative inquiry) and the need to satisfy
standards of research design and reporting that
help to ensure the integrity of study results.
Interestingly, there is growing evidence of support for doing just this. Some of this evidence
appears in studies published in journals, such as
the American Educational Research Journal,
Educational Evaluation and Policy Analysis, and
the Journal of Mixed Methods Research, that rely
on multidisciplinary research teams. Other evidence is in the form of new funding programs
at the U.S. Department of Education (primarily
Conclusion
The pursuit of promising ideas in educational
research sounds noble, and it is. In this spirit, the
task of this chapter was not to reexamine or
reignite the conflict between qualitative and
quantitative methods, nor to assess the peacemaking capacity of mixed methods, but rather
to examine the role of research design in supporting the rigorous study of ideas that are
believed to be worth studying.
An examination of qualitative and quantitative methods in education suggests that singular
applications of these methodologies will continue to play an important role in research studying new ideas. Still, there is good reason to
believe that mixed methods do indeed represent,
as Teddlie and Tashakkori (2003) argued, a "third methodological movement" (p. 5), which is only now beginning to mature as a well-established "methodological alternative with agreed-on foundations, design, and practices" (p. 287).
In arguing for the value of mixed methods,
Creswell and Plano Clark (2007) wondered,
what would happen if
quantitative researchers paid more attention to
the incredible range of hypotheses that
qualitative researchers have generated for them?
And what if qualitative researchers spent more
References
Alise, M. A., & Teddlie, C. (2010). A continuation of
the paradigm wars? Prevalence rates of
methodological approaches across the social/
behavioral sciences. Journal of Mixed Methods
Research, 4, 103–126.
Balnaves, M., & Caputi, P. (2001). Introduction to
quantitative research methods: An investigative
approach. Thousand Oaks, CA: Sage.
Bazeley, P. (2009). Mixed methods data analysis. In
E. Halcomb & S. Andrew (Eds.), Mixed methods
research for nursing and the health sciences
(pp. 84–118). London: Wiley-Blackwell.
Berkenkotter, C. (1991). Paradigm debates, turf wars,
and the conduct of sociocognitive inquiry in
composition. College Composition and
Communication, 42, 151–169.
Bogdan, R. C., & Biklen, S. K. (2003). Qualitative
research for education: An introduction to
theories and methods (4th ed., pp. 7–42).
Boston: Allyn & Bacon.
Borman, G. D., Hewes, G. M., Overman, L. T., &
Brown, S. (2003). Comprehensive school reform
and achievement: A meta-analysis. Review of
Educational Research, 73, 125–230.
Bozarth, J. D., & Roberts, R. R. (1972). Signifying
significant significance. American Psychologist,
27(8), 774–775.
Brown, S. (2009). Learning to read: Learning disabled
post-secondary students talk back to special
education. International Journal of Qualitative
Studies in Education, 22, 85–98.
Bryman, A. (2004). Social research methods (2nd ed.).
Oxford, UK: Oxford University Press.
Bryman, A. (2007). Barriers to integrating
quantitative and qualitative research. Journal of
Mixed Methods Research, 1, 8–22.
11
Intellect, Light, and
Shadow in Research Design
John P. Bean
Indiana University
IV. Findings
a. A description of the sample actually
analyzed in the study
b. Description of the data
c. Treatment of missing cases
d. Possible or known biases
e. Description of how the researcher met
the assumptions required to use the
chosen statistics
f. Presentation of the data
g. Support or lack of support for the
hypotheses or theories used
h. Discussion of the findings
i. Conclusions
I. Topic to be studied
a. The overall interest focusing on what
will be explained or described
b. Organizing metaphor (like grounded
theory)
c. The mystery and the detective
d. Hermeneutic elements
e. The significance of the study
f. Why the reader should be interested.
II. Getting information away from the source
a. Relevant literature
b. Theories
c. Findings for content area, themes, foci,
or analogous situations
Generalizability
Conclusion
The reporting of research can be viewed as storytelling, as part of a mythic process of identifying who we are. In storytelling, we seek to remember the past, invent the present, and envision the future (Keen & Valley-Fox, 1989). Research can be viewed as a similar process in remembering the past by examining the literature; inventing the present by conducting the study and describing the findings; and envisioning the future where this research influences thought, policy, and practice.
References