Qualitative Method - Case Study
CASE STUDY
Introduction
To graduate students and researchers unfamiliar with case study methodology, there is often misunderstanding about what a case study is and how it is conducted as a form of qualitative research.
Steps Involved in a Case Study
4. Collect Data
• Gather all relevant documents.
• Set up interviews/surveys with stakeholders.
• Seek informed consent of each respondent (written or documented oral).
• If the respondent has consented, conduct the interview/survey.
5. Analyze Data
• Review all relevant documents.
• Review all interview/survey data.
6. Disseminate Findings
• Write report.
• Solicit feedback.
• Revise.
• Disseminate.
What are the potential sources of information?
Case studies typically rely on multiple sources of
information and methods to provide as complete a picture as
possible. Information sources could include:
• Project documents
• Project reports, including quarterly reports, midterm reviews
• Monitoring visits
• Mystery client reports
• Facility assessment reports
• Interviews
• Questionnaire/survey results
• Evaluation reports
• Observation
• Others
Elements of a Case Study
A case study does not have a fixed set of elements that must be included. These elements will vary depending on the case or story chosen and the data collected. However, case studies typically describe a program or intervention put in place to address a particular problem.
Here are some elements that you could draw on in order to conduct your case study:
Definitions
❑ Evaluation research, sometimes called program evaluation, refers to a research purpose rather than a specific method.
❑ This purpose is to evaluate the impact of social interventions such as new treatment methods, innovations in services, and a host of others.
❑ Evaluation research is a form of applied research: it is intended to have some real-world effect.
❑ Many methods, such as surveys and experiments, can be used in evaluation research.
❑ In recent years, the field of evaluation research has become an increasingly popular and active research specialty, as reflected in textbooks, courses, and projects.
Research vs. Evaluation
[Diagram: research and evaluation overlap in their shared use of systematic methods (M. Q. Patton)]
Surveillance & Monitoring vs. Program Evaluation
Topics Appropriate to Evaluation Research
◆ Evaluation research is appropriate whenever some social intervention occurs or is planned. Examples include:
• Community mobilization efforts
• Communication systems
• Research initiatives
• Infrastructure-building
• Surveillance & monitoring systems
• Training and educational activities
• Policy development
• Services & staff qualifications
• Problem/crisis investigations
• Administrative systems
When to Conduct Evaluation?
Evaluation can be conducted at any point in a program's life, from conception to completion.
Why Evaluate Programs?
Identifying Stakeholders
❖ Persons involved in program operations
➢ Staff and partners
❖ Persons affected or served by the program
➢ Clients, their families and social networks, providers, and community groups
❖ Intended users of the evaluation findings
➢ Policy makers, managers, administrators, advocates, funders, and others
❖ Be sure to include both supporters and skeptics!
Engaging Stakeholders
Stakeholders should be involved in…
✓ Describing program activities, context, and priorities
✓ Defining problems
✓ Selecting evaluation questions and methods
✓ Serving as data sources
✓ Defining what constitutes the “proof” of success
✓ Interpreting findings
✓ Disseminating information
✓ Implementing results
Working with Stakeholders
Identify stakeholders for your program:
✓ Those involved in program operations
✓ Persons served or affected by the program
✓ Intended users of evaluation findings
Think about which ones you need most for…
✓ Credibility
✓ Implementation
✓ Advocacy
✓ Funding
List ways to keep them engaged.
Formulating the Problem: Issues of Measurement
◆ Problem: What is the purpose of the intervention to be evaluated? This question often produces vague results.
◆ A common problem is measuring the “unmeasurable.”
◆ Evaluation research is a matter of finding out whether something is there or not there, whether something happened or did not happen.
◆ To conduct evaluation research, we must be able to operationalize, observe, and measure.
What is the outcome, or the response variable?
❑ If a social program is intended to accomplish something, we must be able to measure that something.
❑ It is essential to achieve agreement on definitions in advance.
❑ In some cases you may find that the definitions of a problem and a sufficient solution are set by law or by agency regulations; if so, you must be aware of such specifications and accommodate them.
Operationalizing Success/Failure
◆ Definitions of “success” and “failure” can be rather difficult; these are usually not binary, but on a scale.
Cost-Benefit Analysis
How much does the program cost in relation to what it returns in benefits?
✓ If the benefits outweigh the costs, keep the program going.
✓ If the reverse, change it or ‘junk it’.
✓ Unfortunately, this is not an appropriate analysis to make if thinking only in terms of money.
Ultimately, the criteria of success and failure are often a matter of agreement. The people responsible for the program may commit themselves in advance to a particular outcome that will be regarded as an indication of success.
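The decision rule above can be sketched in a few lines of code. This is a minimal illustration, not a real evaluation tool: the benefit and cost figures are hypothetical, everything is assumed to be already monetized, and, as the slide notes, a purely monetary ratio ignores non-monetary criteria of success.

```python
# Minimal cost-benefit sketch. All figures are hypothetical;
# real evaluations must also weigh non-monetary outcomes.

def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    """Return the ratio of monetized benefits to program costs."""
    if total_costs <= 0:
        raise ValueError("total_costs must be positive")
    return total_benefits / total_costs

def recommendation(ratio: float) -> str:
    """Apply the simple decision rule from the slide:
    benefits outweigh costs -> keep the program going;
    otherwise -> change it or discontinue it."""
    return "continue program" if ratio > 1.0 else "revise or discontinue"

if __name__ == "__main__":
    ratio = benefit_cost_ratio(total_benefits=120_000, total_costs=80_000)
    print(f"ratio = {ratio:.2f} -> {recommendation(ratio)}")
    # prints: ratio = 1.50 -> continue program
```

In practice, the threshold of 1.0 is itself a matter of agreement among stakeholders, which is the slide's larger point.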
Measurement in Evaluation
❖ Researchers must take measurement quite seriously in evaluation research, carefully determining all the variables to be measured and getting appropriate measures for each.
❖ Such decisions are often not purely scientific ones.
❖ Evaluation researchers often must work out their measurement strategy with the people responsible for the program being evaluated.
❖ There is also a significant political aspect.
Additional Issues and Implications
The Social Context
✓ Evaluation research has a special propensity for running into problems:
✓ Logistical problems
✓ Ethical problems
There are three important reasons why the implications of evaluation research results are not always put into practice:
✓ The implications may not always be presented in a way that non-researchers can understand.
✓ Evaluation results sometimes contradict deeply held beliefs.
✓ Vested interests in the programs assert their influence.