
Monitoring and Evaluation in Emergencies


Monitoring and evaluation in emergencies
Session overview
• Introduction to M&E in emergencies
• The project cycle:
– Monitoring
– Evaluation
– LogFrames
– Indicators
• Evaluation of humanitarian action and the
DAC criteria
Learning objectives
By the end of this session, you should be able to:
• Understand the basic concepts of monitoring and evaluation
• Describe the key evaluation parameters and the importance of each
• Appreciate the importance of monitoring and evaluation for nutrition interventions in emergencies
• Recognise the current gaps in practice in monitoring and evaluating interventions in emergencies.
Introduction
• Monitoring and evaluation in emergency contexts has two functions:
– Accountability – assessing the extent to which humanitarian actions meet humanitarian standards
– Reporting to donors – based on specific project indicators
Sphere standard
• Common standard 5: monitoring
The effectiveness of the programme in responding to
problems is identified and changes in the broader context
are continually monitored, with a view to improving the
programme, or to phasing it out as required.
• Common standard 6: evaluation
There is a systematic and impartial examination of
humanitarian action, intended to draw lessons to improve
practice and policy and to enhance accountability.
• Replaced by the Core Standard "Performance, transparency and learning"
Project cycle

[Diagram: Disaster → Assess → Analyse → Plan/Design → Implement → Monitor → Evaluate → Re-design, with Advocacy alongside]
What is Monitoring?
• Monitoring = the routine oversight of the
implementation of an activity / intervention
• Aim = to establish the extent to which an
activity is proceeding according to plan and
allow timely corrective action, as necessary
• Process (performance) monitoring: measuring progress in programme delivery / implementation
• Impact (situation) monitoring: measuring change in a condition / situation as a result of the intervention
Example – Monitoring GFD
What are the main elements of a programme that need monitoring during a General Food Distribution?
(note: performance monitoring)
– Number and identification of beneficiaries
– Food management
– Management of food distribution
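The performance-monitoring idea above can be sketched as a simple calculation comparing planned against actual figures for one distribution round. This is an illustrative sketch only: the function name, field names, and figures are invented, not a standard reporting format.

```python
# Hypothetical sketch: process monitoring of one General Food
# Distribution round. All names and numbers are illustrative.

def distribution_performance(planned_beneficiaries, actual_beneficiaries,
                             planned_tonnage, actual_tonnage):
    """Return simple process-monitoring ratios for one distribution round."""
    return {
        "beneficiary_coverage": actual_beneficiaries / planned_beneficiaries,
        "tonnage_delivered": actual_tonnage / planned_tonnage,
    }

round_1 = distribution_performance(
    planned_beneficiaries=12000, actual_beneficiaries=10800,
    planned_tonnage=180.0, actual_tonnage=171.0,
)
print(round_1)  # 90% of planned beneficiaries reached, 95% of tonnage delivered
```

Tracking such ratios round by round is what allows the timely corrective action that monitoring is meant to enable.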


Importance of monitoring
• Monitoring is important in its own right and is an implicit part of any evaluation, yet it is often done badly:
– Routine data collection not done routinely!
– Data collection done poorly
– Information not processed or used in a timely manner
Guiding principles for monitoring
• Focus on minimal information required for
each level of responsibility
• Include all forms of communication: verbal,
written, formal, informal
• Use participatory methods
• Create an obligation to act
What is Evaluation?
• Evaluation = the process to determine as
systematically and objectively as possible, the
significance/worth of a programme
intervention.
• Aim = to provide evidence to improve current
activities and future planning
• Process evaluation: measures the performance of an intervention in terms of the availability, quality, etc. of a service
• Impact evaluation: measures the level of success in terms of programme outcomes
Evaluation Parameters
• EFFECTIVENESS – Are we doing the right thing? Often measured in the negative.
• EFFICIENCY – Are we doing the thing right? Cost-effectiveness versus expediency.
• RELEVANCE – Are we doing the best thing in this context? Example: food aid versus cash.
• IMPACT – Are we changing the situation long term? Baseline information is key.
• COVERAGE – Are we reaching the target population/area? Geographical versus beneficiary coverage.
Example – Evaluating IYCF
What does an IYCF programme evaluation need to assess to see whether it had the expected impact?
(note: impact evaluation)
– Changes in IYCF practices (breastfeeding, bottle feeding, use of BMS, etc.)
In Summary
              MONITORING                  EVALUATION
Frequency     Routinely                   Periodically
Management    Internally managed          Externally managed
Purpose       Management tool             Learning tool
Horizon       Short term                  Long term
Main action   Keep track                  Appraise, assess
Focus         Inputs, process, outputs    Also outcomes and impact
Reference     Workplans, performance      Programme objectives,
              targets                     external standards
Project cycle

[Diagram: Disaster → Assess → Analyse → Plan/Design → Implement → Monitor → Evaluate → Re-design, with Advocacy alongside]
Approaches to evaluation in emergencies

• No evaluation!
• Single-agency post-intervention evaluation
• Increasing move towards:
– Inter-agency evaluations: the objective is to evaluate responses as a whole and the links between interventions
– Real-time evaluations: carried out 8 to 12 weeks after the onset of an emergency, with findings processed within one month of data collection
Good practice
• Describe methods used
• Use a multi-method approach and cross-check
• Talk to primary stakeholders
• Disaggregate findings
• Ensure a focus on social process and
causality
• Make clear any evaluator bias
• Integrate the DAC criteria!
Challenges to M&E in emergencies

• Lack of standard methodologies and indicators
• Lack of time to establish baseline
• Rapidly changing environment
• Perceived opportunity cost of M&E
• Insufficient information on cost effectiveness
of nutrition interventions
M&E for specific nutrition responses

• Throughout the rest of the course, pay particular attention to the approaches used for M&E:
– Monitoring food assistance
– Reporting formats for CMAM/SFPs
– Indicators for IFE
– M&E for food security/livelihoods
What is a Log Frame?
• The logical framework or
logframe is an analytical
tool used to plan, monitor,
and evaluate projects.
• Originally developed by the
United States Department
of Defense, and adopted
by USAID in the late 1960s.
LogFrames
A management tool used to improve the design of interventions, most often at the project level.
It involves identifying:
• strategic elements (inputs, outputs, outcomes and impact)
• their causal relationships
• indicators
• assumptions and risks that may influence success and failure.
[Diagram: results hierarchy – outputs feed into outcomes, which feed into impact]

Logframes take the analysis further, to the identification of indicators and the means of verification (or sources of data) for those indicators.

Logframes push a discipline of identifying indicators for each component in the logic model.

Assumptions are an integral part of the matrix. Assumptions may also include the outcomes of other projects or of the activities of other organisations.
Indicators
• An indicator is a measure used to demonstrate change in a situation, or the progress in, or results of, an activity, project, or programme. Indicators:
– are measures used to demonstrate changes over time
– point to the results
– enable us to be "watchdogs"
– are essential instruments for monitoring and evaluation
Types of indicators
Indicators exist in many different forms:
• Direct indicators correspond precisely to results at any performance level.
• Indirect or "proxy" indicators demonstrate the change or results when direct measures are not feasible.
• Quantitative indicators are the most common, expressed as a percentage or share, as a rate, or as a ratio.
• Qualitative indicators are based on qualitative observations.
• Standardised global indicators are comparable in all settings.
• Locally developed indicators tend to be context specific and must be developed locally.
Performance indicators
Across the results chain (input → process → output → outcome → impact), performance indicators:
• are measures that show results relative to what was planned at each level of the "results chain"
• are tools for performance-based decisions about programme strategies and activities
• can also be used later to evaluate project/programme success
• Impact indicators measure the quality and quantity of long-term results generated by programme outputs.
• Outcome indicators measure the intermediate results generated by programme outputs. They often correspond to changes in people's behaviour as a result of the programme.
• Output indicators measure the quantity, quality and timeliness of the products (goods or services) that are the result of an activity/project/programme.
• Process indicators measure the progress of activities in a programme/project and the way these are carried out.
• Input indicators measure the quantity, quality and timeliness of resources provided for an activity/project/programme.
Example – indicators for a SAM treatment programme:
• Impact: contribution to an increase in lives saved
• Outcome: death rate, recovery rate, defaulter rate, coverage
• Output: number of SAM cases admitted to inpatient centres; number of SAM cases admitted to outpatient centres, etc.
• Process: number of inpatient centres established and functional; number of outpatient centres for SAM established and functional
• Input: quantity of RUTF used, funds used, human resources, etc.
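The outcome indicators in this chain (death, recovery, and defaulter rates) are conventionally computed as shares of programme exits. The sketch below is illustrative only: the function name and figures are invented, and the thresholds in the comment are the commonly cited Sphere figures for management of acute malnutrition, which should be verified against the current handbook before use.

```python
# Hedged sketch: outcome indicators for a SAM treatment programme,
# computed from programme exits over a reporting period.
# Commonly cited Sphere thresholds: recovered > 75%, died < 10%,
# defaulted < 15% -- verify against the current handbook.

def outcome_rates(recovered, died, defaulted, transferred=0):
    """Return outcome indicators as shares of total programme exits."""
    exits = recovered + died + defaulted + transferred
    return {
        "recovery_rate": recovered / exits,
        "death_rate": died / exits,
        "defaulter_rate": defaulted / exits,
    }

rates = outcome_rates(recovered=410, died=12, defaulted=58, transferred=20)
for name, value in rates.items():
    print(f"{name}: {value:.1%}")
```

Note that coverage, the remaining outcome indicator, cannot be computed from programme records alone; it needs an estimate of all eligible cases in the population, typically from a survey.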
Choosing indicators
• Where will the data be collected from?
• Who will collect it?
• When will it be collected, and how frequently?
• How will the data be collected and stored?
• Who will analyse the data?
• How will it be reported?
• How will management decisions be made based on the monitoring report?
Example indicators
• Proposed core indicators for Sphere standard
related to IFE:
– % exclusive breastfeeding in infants <6 months
– % children aged 6 to 23 months consuming diet of
appropriate diversity and frequency
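The first core indicator above can be estimated from individual survey records as the share of infants under 6 months reported as exclusively breastfed. This is an illustrative sketch only: the function name, record fields, and data are invented, not a standard survey format.

```python
# Hypothetical sketch: estimating % exclusive breastfeeding among
# infants under 6 months from individual survey records.

def exclusive_bf_rate(records):
    """Share of surveyed infants < 6 months who are exclusively breastfed."""
    infants = [r for r in records if r["age_months"] < 6]
    if not infants:
        return None  # no eligible infants in the sample
    exclusive = sum(1 for r in infants if r["exclusively_breastfed"])
    return exclusive / len(infants)

survey = [
    {"age_months": 2, "exclusively_breastfed": True},
    {"age_months": 4, "exclusively_breastfed": False},
    {"age_months": 5, "exclusively_breastfed": True},
    {"age_months": 9, "exclusively_breastfed": False},  # excluded: >= 6 months
]
print(exclusive_bf_rate(survey))  # 2 of the 3 eligible infants
```

The denominator matters: only infants under 6 months are eligible, so the indicator is a proportion of that age group, not of all surveyed children.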
Any questions?
