EVALUATION
IPDET GBEPM 740 / MBAH 760 / MPA 720
Lecturer: Dr Joseph Mwape Chileshe
PhD (Agricultural Engineering), SCAU, 2012
MSc (Agricultural Mechanization), SCAU, 1996
B.Eng. (Mechanical Engineering), UNZA, 1991
Email: mwapejc@gmail.com
Cell: +260969538805
Section I
IPDET © 2009 2
Section II – Monitoring
• the monitoring process
• types of monitoring
• purpose of monitoring
• significance/importance of monitoring projects
• when monitoring is initiated
• the time frame for monitoring projects
Monitoring Cont’d
• which institutions or individuals undertake monitoring in project management
• roles of these monitoring institutions or individuals
• project types and monitoring procedures
• outcomes of monitoring, i.e. monitoring outputs
• communication lines
• case study of a specific project
SECTION III – EVALUATION OF PROJECTS
• Concept of Evaluation
• Context of Application
• Types of Evaluation
• Purpose of Evaluation
• Significance of Evaluation
Evaluation Cont’d.
EMERGING CONTEMPORARY APPROACHES TO PROJECT EVALUATION
• Participatory Monitoring and Evaluation
• Stakeholder involvement
• Roles of stakeholders
• The process of participatory Evaluation
SECTION V – MONITORING AND EVALUATION IN THE DEVELOPMENT PLANS
a. SNDP
b. MDG
SECTION VI – Students’ Project on Monitoring & Evaluation
• Group work
Project...
A collection of linked activities, carried out in an organised manner, with a clearly defined START POINT and END POINT, to achieve specific results that satisfy the needs of the organisation at the current time.
What is a Project?
Definition:
A temporary endeavor undertaken to create a unique product or service.
Monitoring or Evaluation?
Monitoring:
• Data collected on program activities
• Ongoing, routine
• Focus on activities and outputs, compared to targets
Evaluation:
• Data collected to answer specific questions
• Periodic
• Focus on outcomes and impact
It Starts with QUESTIONS
INPUTS
Examples
• Money
• Staff
• Policies, guidelines
• Equipment
• Partners
ACTIVITIES
• The work that we do, what we implement
• Also called “processes”
Examples
• Training events
• Meetings
• Events
• Outreach
• Home visits
OUTPUTS
Examples
• OUTPUT: Number of officials attending the meeting
  INDICATOR: Number of officials attending the meeting compared to number invited
• OUTCOME: Increased funding
  INDICATOR: Percentage of NTP budget covered by the Ministry of Health
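As an illustrative sketch only (the function names and all figures below are invented, not part of the course materials), these two example indicators reduce to simple ratios:

```python
# Hypothetical sketch: computing the two example indicators above.
# Function names and all figures are invented for illustration.

def attendance_indicator(attended: int, invited: int) -> float:
    """Output indicator: officials attending compared to number invited."""
    if invited <= 0:
        raise ValueError("number invited must be positive")
    return attended / invited

def budget_coverage_indicator(moh_contribution: float, ntp_budget: float) -> float:
    """Outcome indicator: % of the NTP budget covered by the Ministry of Health."""
    if ntp_budget <= 0:
        raise ValueError("NTP budget must be positive")
    return 100.0 * moh_contribution / ntp_budget

# Hypothetical data: 18 of 25 invited officials attended;
# the Ministry covers 1.2m of a 4.0m NTP budget.
print(attendance_indicator(18, 25))        # 0.72
print(budget_coverage_indicator(1.2, 4.0)) # 30.0
```

Tracking such ratios over successive reporting periods is what turns a raw count into a monitoring indicator: the value is compared against a target rather than read in isolation.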
Focus on INDICATORS
Indicators
• An indicator is a measure that is used to show
change in a situation, or the progress in/results of an
activity, project, or programme.
• Indicators:
  – enable us to be “watchdogs”;
  – are essential instruments for monitoring and evaluation;
  – are objectively verifiable measurements.
Types of indicators
Indicators exist in many different forms:
• Direct: direct indicators correspond precisely to results at any performance level.
• Indirect/proxy: indirect or "proxy" indicators demonstrate the change or results if direct measures are not feasible.
• Quantitative: indicators are usually quantitative measures, expressed as a percentage or share, as a rate, etc.
• Qualitative: indicators may also be qualitative observations.
• Global/standardised: standardised global indicators are comparable in all settings.
• Locally developed: other indicators tend to be context specific and must be developed locally.
Impact – Outcome – Output – Input (results chain example)
• Impact (related to goal): malnutrition rates amongst young children reduced
• Outcome (related to objectives or purposes): % of young children getting appropriate complementary food
• Output (related to outputs): X number of mothers know about good complementary food and how to prepare it
• Input (related to activities/resources): nutritional education to mothers on complementary food
IMPACT
Examples
• Higher rate of treatment success
• Reduction in deaths among MDR-TB patients
M&E terminology – a wasp nest?
• performance, efficiency, effectiveness, appropriateness
• inputs, outputs, outcomes
• assessment, coverage, connectedness, accountability, timeliness
• quantitative and qualitative indicators
• do no harm
Definition
Monitoring
‘The systematic and continuous assessment of the progress of a piece of work over time…’
Information collected for monitoring
must be:
• Useful and relevant
• Accurate
• Regular
• Acted upon
• Shared
• Timely
Monitoring is an implicit part of an evaluation.
Can you give examples of Monitoring
in your current work?
For example
- From a CMAM programme?
- From a Micronutrient programme?
- From a General Food Distribution?
- From a Health programme?
- From a Livelihoods programme?
Monitoring
• Monitoring compares intentions with results
• It guides project revisions, verifies targeting criteria
and whether assistance is reaching the people
intended.
• It checks the relevance of the project to the needs.
• It integrates and responds to community feedback
• It enhances transparency and accountability
Difference between monitoring of:
• Process/activities
• Impact/results
The project cycle:
Disaster → Assessment → Programme design → Implementation → Evaluation, with monitoring throughout.
Why would you do an evaluation of a programme?
Definitions
Evaluation
The aim is to determine relevance and fulfilment of
objectives, as well as efficiency, effectiveness, impact
and sustainability of a project.
There has been an increased focus on evaluation of humanitarian action as part of efforts to improve quality and standards.
Monitoring versus Evaluation
MONITORING =
EVALUATION =
It aims to
– Improve policy and practice
– Enhance accountability
Evaluations are done when / because:
– Monitoring highlights unexpected results
– More information is needed for decision making
– Implementation problems or unmet needs are identified
– Issues of sustainability, cost effectiveness or relevance arise
– Recommendations for actions to improve performance are needed
– Lessons learned are necessary for future activities
Evaluations
• Evaluation involves the same skills as assessment
and analysis
• Evaluation should be done impartially and ideally
by external staff
• Evaluation can also occur during (e.g. mid-term)
and after implementation of the project
Evaluation looks at
• Relevance/Appropriateness: Doing the right thing in the right way at the right time.
• Connectedness (and coordination): Was there any replication, or were gaps left in programming due to a lack of coordination?
• Coherence: Did the intervention make sense in the context of the emergency and the mandate of the implementing agency? Are there detrimental effects of the intervention in the long run?
• Coverage: Who has been reached by the intervention, and where? Linked to effectiveness.
• Efficiency: The extent to which results have been delivered in the least costly manner possible.
• Effectiveness: The extent to which an intervention has achieved its objectives.
• Impact: Doing the right thing, changing the situation more profoundly and in the longer term.
Example on General Food Distribution
M&E in emergencies?
The “M” and the “E”…
Real-time evaluations (2)
• It is an improvement-oriented review; it can be regarded
more as an internal function than an external process.
• It helps to bring about changes in the programme, rather
than just reflecting on its quality after the event.
• A real-time “evaluator” is a “facilitator”, working with staff
to find creative solutions to any difficulties they encounter.
• It helps to get closer to the people affected by crisis, and this enables improved accountability to ‘beneficiaries’.
Monitoring & Evaluation systems
• Main components of M&E systems:
– M&E work plan for data collection and analysis, covering baseline and ongoing M&E
– Logical framework, including indicators and
means/source of verification
– Reporting flows and formats
– Feedback and review plan
– Capacity building design
– Implementation schedule
– Human resources and budget
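As a minimal sketch, the components above can be laid out as a plain data structure. Every name and value here is hypothetical, intended only to show how the pieces of an M&E system fit together:

```python
# Hypothetical sketch of an M&E system's main components as plain data.
# All names and values below are invented for illustration.

me_system = {
    "work_plan": {
        "baseline_survey": "before implementation",
        "ongoing_monitoring": "monthly data collection and analysis",
    },
    "logical_framework": {
        "indicator": "number of mothers trained",        # invented example
        "means_of_verification": "training attendance registers",
    },
    "reporting": {
        "flow": ["field team", "programme manager", "donor"],
        "format": "monthly narrative plus data tables",
    },
    "feedback_and_review": "quarterly review meetings with stakeholders",
    "capacity_building": "training of field staff in data collection",
    "implementation_schedule": "12 months",
    "resources": {"human": "1 M&E officer", "budget": "5% of project budget"},
}

# Quick completeness check against the component list above.
required = {"work_plan", "logical_framework", "reporting",
            "feedback_and_review", "capacity_building",
            "implementation_schedule", "resources"}
missing = required - me_system.keys()
print(sorted(missing))  # [] – all main components are present
```

A check like this mirrors the key message later in the deck: keep the framework simple enough that its completeness can be verified at a glance.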
What are the Qualities of a Good Indicator?
• Specific
• Measurable
• Achievable
• Relevant
• Time-bound

The Sphere Project provides the most accepted indicators for nutrition and food security interventions in emergencies: see Module 21.

And there is also the SMART initiative: the Standardised Monitoring and Assessment in Relief and Transition Initiative, an interagency initiative to improve the M&E of humanitarian assistance.
What is a Log Frame?
The logical framework or logframe is an analytical tool
used to plan, monitor, and evaluate projects.
[Diagram: results hierarchy – inputs lead to outputs, outputs to an outcome, and the outcome to impact]
Other terms that can be found in a logframe:
• Goal: if the OBJECTIVES are achieved, then this should contribute to the overall GOAL
• Objectives/outcomes: if OUTPUTS/RESULTS are produced, then the OBJECTIVES are accomplished
• Deliverable outputs: if adequate ACTIVITIES are conducted, then OUTPUTS/RESULTS can be produced
• Activities: if adequate RESOURCES/INPUTS are provided, then activities can be conducted
Activities versus Results
Completed activities are not results.
• e.g. that a hospital was built does not mean that injured and sick people can be treated in it; perhaps the hospital has no water and the beds have not been delivered.
Logframes
[Example logframes shown on slides]
Key messages
• The monitoring of nutrition interventions in emergencies is an integral
part of saving lives and maintaining nutrition status of the affected
population.
• Successful monitoring systems allow for improvements in
interventions in ‘real time’.
• Evaluations are important tools for learning, assessing interventions, and comparing the costs of interventions with their impact. Essential evaluation parameters are: effectiveness, efficiency, relevance/appropriateness, impact and coverage.
• Involving communities in M&E places the affected population at the
heart of the response, providing the opportunity for their views and
perceptions to be incorporated into programme decisions and
increases accountability towards them.
• A common mistake in designing M&E systems is creating a framework which is overly complex. Always make an M&E system practical and doable.
• The logical framework or logframe is an analytical tool used to plan,
monitor, and evaluate projects.
Monitoring for Community based
Management of Acute Malnutrition
(CMAM) interventions
• Types of monitoring, e.g.
– Individual case monitoring,
– Programme / activities monitoring
• Routine supervision
• External evaluations
• Coverage surveys are one of the most important
tools for evaluation of CMAM interventions
• Distance: > 90 % of the target population is within less than one day’s return walk (including time for treatment) of the programme site for dry-ration SFP (supplementary feeding programme), and no more than one hour’s walk for on-site wet SFP
• Coverage is > 50 % in rural areas, > 70 % in urban areas and > 90 % in
a camp situation
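The coverage thresholds above can be expressed as a small check. This is a minimal sketch: the thresholds are taken from the slide, while the function name and interface are invented:

```python
# Illustrative sketch: checking CMAM coverage against the thresholds
# quoted above (> 50 % rural, > 70 % urban, > 90 % in a camp).

THRESHOLDS = {"rural": 50.0, "urban": 70.0, "camp": 90.0}

def coverage_adequate(setting: str, coverage_percent: float) -> bool:
    """Return True if measured coverage exceeds the threshold for the setting."""
    try:
        return coverage_percent > THRESHOLDS[setting]
    except KeyError:
        raise ValueError(f"unknown setting: {setting!r}") from None

print(coverage_adequate("rural", 62.0))  # True  (62 % exceeds the 50 % threshold)
print(coverage_adequate("camp", 85.0))   # False (85 % is below the 90 % threshold)
```

Note that the thresholds are strict inequalities, so a camp programme measuring exactly 90 % coverage would not yet meet the standard.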
• Formative: focus on improved performance before and during implementation (of a project, program or policy)
• Summative: focus on outcomes (results, consequences)
• Prospective: assess the likely outcomes of proposed interventions
• Is this program/project/policy worth evaluating?
• Will the gains be worth the effort/resources expended?
• Ethical
• Managerial
• Decisional
• Educative and Motivational
• Projects
• Programs
• Policies
• Organizations
• Sectors
• Themes
• Country assistance
• Strategy –
• Are the right things being done?
• Operations –
• Are things being done right?
• Learning –
• What are we learning?
• Are there better ways?
• Monitoring
• routine, ongoing, and internal activity of tracking key
indicators
• internal activity
• used to collect information on a program’s activities,
outputs, and outcomes to measure performance of the
program
• Evaluation
• periodic and time bound
• can be internal, external, or participatory
• periodic feedback to key stakeholders
• Internal evaluators
• External evaluators
• Participatory evaluators
• Each has advantages and disadvantages
Progression of Evaluation
• 2000 BC: Egypt and China
• 1600: Modern evaluations
• 1880: Education and social programs
• 1900: Medical schools (US and Canada)
• 1940: Reconstruction after WWII – banks
• 1957: Sputnik
• 1950s-60s: US and Europe
• mid 1970s-80s: More routine; professions status
• 1980s to now: Expansion; global activity
Development Evaluation:
• Audit tradition
• Social science tradition
• Relevance
• Effectiveness
• Efficiency
• Impact
• Sustainability
• Promote accountability
• Facilitate comparability
• Enhance reliability and quality of services provided
• Organizational independence
• Behavioral independence
• Protection from external influence
• Avoidance of conflicts of interest
“Evaluation of the past is the first step towards vision for the future.”
-- Chris Widener, motivational speaker and author
Questions?