ADVANCED
MONITORING AND EVALUATION
Workshop Manual
© 2012
Disclaimer
This publication was supported by the Cooperative Agreement Number 5U2GPS001914 from
The Centers for Disease Control and Prevention. Its contents are solely the responsibility of the
Caribbean Health Research Council and do not necessarily represent the official views of the
Centers for Disease Control and Prevention.
The material contained herein is the product of a collaboration between the Caribbean Health
Research Council (CHRC), the U.S. Centers for Disease Control and Prevention, and the Regional
Monitoring and Evaluation Technical Working Group. Please seek permission from the CHRC to
copy, modify, publish or transmit this material, which was compiled specifically for the purposes
of this workshop.
Table of Contents
Acknowledgements  v
Preface  vii
Workshop Objectives  xi
1 Why Evaluate?  1
2 Evaluability Assessment  11
3 Major Types of Programme Evaluation  21
4 Developing Evaluation Questions: Key Considerations  35
5 Measuring Programme Outcomes  47
6 Answering Evaluation Questions using Quantitative and Qualitative Methods  63
7 Data Analysis  75
8 Analyzing Data  85
10 Managing and Resource Requirements for Evaluations  121
11 Ethical Considerations  129
12 Challenges to Conducting Evaluations  139
13 Writing an Evaluation Report  147
Bibliography  159
Acknowledgements
The publication of the Advanced Monitoring and Evaluation (M&E) Workshop Manual is the result of a collaborative effort by a number of partners, coordinated and led by the Caribbean Health Research Council (CHRC).
CHRC expresses gratitude to the principal collaborators, i.e., the members of the Caribbean
M&E Technical Working Group. The latter comprises the Caribbean HIV/AIDS Alliance,
US Centers for Disease Control and Prevention (CDC), PAHO HIV Caribbean Office, HIV/
AIDS Programme Unit of the Organization of Eastern Caribbean States (OECS), the Pan
Caribbean Partnership against HIV/AIDS (PANCAP), and UNAIDS.
CHRC acknowledges the critical role of the scores of stakeholders from Ministries
of Health, National AIDS Programmes and non-governmental organisations (NGOs)
throughout the Caribbean who participated in the various specialized M&E workshops
and provided the feedback that guided the development of the current manual. CHRC
is also grateful to the consultants who made useful contributions to the content of the
workshop materials.
It is also important to recognize the hardworking CHRC team led by Ms Elizabeth Lloyd
and including Mr. Erin Blake, Ms Shelly-Ann Hunte, Ms Keren Wilson, Ms Candice
McKenzie and Ms Jaselle Neptune for their role in the finalization of the content and
format of the Manual.
The development of the Advanced M&E Workshop Manual would not have been possible
without the funding received from the CDC, through a Co-operative Agreement with
CHRC.
Preface
The Advanced Monitoring and Evaluation (M&E) Workshop Manual is the sequel to the highly regarded Basic M&E Manual, which was published in 2011. This specialised Manual was developed by the Caribbean Health Research Council to build on the fundamental M&E concepts introduced in the Basic Workshop, with a sharp focus on building the capacity to conduct programme evaluations in the Caribbean.
The Advanced Workshop was designed to address the needs identified by Caribbean
health professionals who had benefitted from the basic M&E training and wanted to
further enhance their skills. Although CHRC had been hosting specialized M&E workshops
since 2006, both the content and the format were repeatedly revised in response to the
feedback received from the participants who attended the various ‘pilot’ workshops.
We are now very pleased with the eventual product, which has also been receiving very
positive reviews.
Persons who complete the training are expected to be well tooled for the conduct of
programme evaluations. Using the usual CHRC format of PowerPoint presentations
and facilitated group work sessions, they are first reminded as to why it is important
to evaluate programmes. Subsequent topics cover determining the readiness of the
programme for evaluation, choosing the right evaluation questions as well as the most
appropriate evaluation methodology, i.e., whether to use quantitative and/or qualitative
approaches. There is also a strong focus on data analysis with both academic and hands-
on sessions. Indeed, the participants are taught to use the Epi Info software, which is not
only a powerful data analysis package but is available free of charge over the internet
through the support of the Centers for Disease Control and Prevention (CDC). Critically,
the workshop also addresses some of the realities and challenges that practitioners face
in the conduct of evaluations such as ethical and budgeting considerations, managing
the evaluation and crucially, writing the evaluation report. The workshop is designed to
be hosted over five days but given its modular format, it can be customised to meet
stakeholders’ requirements and available resources, including time.
It is important to appreciate how the Advanced Workshop relates to other M&E capacity
development efforts. CHRC in collaboration with the Caribbean M&E Technical Working
Group has recently published the M&E Training Strategy for the Caribbean, which is
already being implemented. The Strategy comprises ten key elements designed to
ensure that M&E is institutionalized in the Caribbean, including the spawning of an M&E
culture as well as establishing capacity-building systems that range from in-service training
to Masters Degrees in Monitoring and Evaluation at universities. The development of
standardised training materials such as this workshop manual is probably the most
prominent element. Others include participants being strategically selected for training
and receiving structured, sequenced tuition through an expanded range of modalities,
with post-workshop support and mentorship.
It is also our intention that the Advanced M&E Workshop Manual would be used as a
critical reference guide for persons interested in and conducting programme evaluations.
This is the experience with other similar CHRC publications such as the Basic M&E
Workshop Manual and the flagship Basic Research Skills Workshop Manual. The
Advanced Manual also includes an extensive bibliography so that persons who have
further interest in evaluation can access additional resources.
CHRC expects that the Advanced M&E training workshop and this Manual would
continue to play a critical part in the development of M&E capacity in the Caribbean
and facilitate the achievement of our goal, which is to institutionalize M&E in all health
programmes. We also envision that CHRC will be producing additional specialised M&E
workshop manuals, in response to other identified needs.
Workshop Agenda

Time         Day 1                                   Days 2–5
8:30 - 9:00  Introduction & outline of the training  Recap of previous day
Workshop Objectives
1 Why Evaluate?
Learning Objectives:
Monitoring Questions
• How many condoms were distributed?
• How many tests were done?
• How many prevention workshops were conducted for secondary school students?

Evaluation Questions
• What is the impact of the condom distribution programme on the number of persons testing positive for sexually transmitted diseases?
• Have the prevention workshops helped to promote safe sexual practices among secondary school students?
INDICATORS
Programme Components
• Outputs: Activities or services that the project is providing. Outputs lead to outcomes. (Timeframe: e.g. quarterly, in the 5th month, in the 2nd year, etc.)
• Outcomes: Changes in behaviors or skills as a result of the intervention. Outcomes are anticipated to lead to impacts. (Timeframe: short, medium, long-term)
• Impacts: Measurable changes in health status, e.g. reduced STI/HIV prevalence. Impacts are the effect of several interventions and are achieved over time. (Related to long-term outcomes)
Types of Indicators
• Input: Financial, human, material, and technical resources. Examples:
  – Total budget for activities in 2009
  – # of staff members required to execute activities
  – # of posters, TV slots required to promote programme activities over the next 4 months
• Outcome example: % of young women aged 15–24 who never had sex
• Impact: The longer term improvements you are aiming at; may be needed for national or global reporting. Example: % of HIV-infected infants born to HIV-infected mothers
• What is the purpose of the programme or project?
• What do you want to know?
• What do you need to answer the question?
• How will you measure it?
• Where are you going to get it?
• How are you going to get it?
• What will you do with it once you get it?
• What can't you do (caveats)?
Percent of women and men aged 15–49 who had sex with more than one partner in the last 12 months
• Concern #4: Evaluation may produce negative results and lead to information that will make the programme look bad or lose funding
We evaluate to…
• Ascertain if you are doing the right things, and doing them right (Ten Steps to a Results-based M&E System, Kusek and Rist, 2004)
• Take stock of where you are (Empowerment evaluation principles in practice, Fetterman & Wandersman, 2005)
• Distinguish what works from what doesn't (Utilization-focused evaluation, Patton, 2008)
• Participate in the evaluation and show programme staff that you think it is important
• Involve as many of the programme staff as much as possible and as early as possible
• Be aware of the ethical and cultural issues in an evaluation
GROUP WORK: Creating an Evaluation Case Study
• Think about a project/programme that you are/were involved in
• Introduction: Brief statement to serve as a 'lead in' to your case. Area you are working in. Why is this an issue.
• Background: Information on the context (length of time project has been implemented, history of programme, etc).
2 Evaluability Assessment
Learning Objectives:
• Helps clarify goals and objectives as well as the theory of change for the programme

Other Benefits
• Helps ensure evaluation resources are used judiciously:
  – To address needs of stakeholders
  – To focus on aspects of the programme that are developmentally appropriate for evaluation
  – To address questions that can, in fact, be answered given design feasibility and data availability
[Flowchart: evaluability assessment decision sequence]
1. Is the intervention promising? If yes:
2. Does the intervention have programme design integrity and realistic, achievable goals? If yes:
3. Is the intervention implemented as intended and at an appropriate developmental level? If yes:
4. To answer questions: (1) Is there a feasible design? (2) Are data available or feasible to collect? If yes, the intervention is evaluable.
At each step, a "no" leads to assisting in the improvement of programme design, implementation, and evaluation characteristics.
Caution
• Can delay evaluation unnecessarily…
  – if applied to all programmes before evaluation
  – if the assessment process is too long
• Determine the programme-specific issues which could be examined in the evaluation
• Programme Description
• Identification of Evaluation Issues and Questions
• Evaluation Options
• Programme Profile – need, programme rationale, structure, governance, clients, etc.
• Operating environment – risks
Using Logic Models
• Generates clear, concrete statements of programme goals/objectives and …
[Diagram: nested spheres of Control, Direct Influence, and Indirect Influence]
Programme Success
• Q2: To what extent are students involved in the programme improving their skills?
[Evaluation matrix excerpt: issues, their questions, and the data sources (x) addressing each]
Issue #1
  Questn #1:  x x x x
  Questn #2:  x x x
  Questn #3:  x x x
Issue #2
  Questn #4:  x x x
  Questn #5:  x x x
Issue #3
  Etc.
Finding a balance…
The evaluator must balance programme considerations with scientific considerations.
Working Session
Instructions:
Template
• Programme Title
• Programme Description
• Documents Collected
• Logic Model?
• Data collection on intervention reality
• Stakeholder Consulted
• Goal Agreement
• Logic Model Agreement
• Produce Assessment Report
• Feedback
3 Major Types of Programme Evaluation
Learning Objectives:
Evaluation Uses
• Evaluation can provide information on:

Evaluation Spectrum: from Learning to Accountability
• The
most
important
thing,
though,
is
to
evaluate!
• It
is
beKer
to
do
an
average
evalua4on
than
not
to
evaluate
at
all
• The
focus
should
be
on
u4lity,
relevance
and
prac4cality
–
aiming
for
rigour
and
validity
where
possible
(not
the
reverse)
• Successful
interven&ons
depend
on
regular
feedback
and
adjustments
Evaluation Approaches
• Goal-based evaluation: Evaluation that measures the extent to which a programme reaches clear and specific objectives
• Goal-free evaluation: Evaluation in which the evaluators ignore the rhetoric of the programme and base the evaluation purely on the degree to which the programme meets participants' needs
• Multi-site evaluation: Evaluation that examines interventions at different locations
• Cluster evaluation: Evaluation that looks at similar or related interventions
Evaluation Approaches
• Social Assessments: Look at social structures, processes and changes within a community or group
• Environmental Assessments: Explore the effects of the programme on the environment
• Participatory Evaluation: Evaluations in which responsibilities for planning, implementing and reporting are shared with stakeholders, who may help define the evaluation questions, collect and analyse data, and draft and review the report
• Outcome mapping: Mapping of behaviour change (at the outcome level – not the impact level)
Evaluation Approaches
• Rapid Assessment: Systematic semi-structured approach used in the field, typically by a team of evaluators (seen as a compromise between speed and rigour)
• Evaluation Synthesis: Approach in which an evaluator looks across interventions addressing a similar issue or theme to determine causality
• Meta-evaluation: Expert review of one or more evaluations against professional quality standards to determine the credibility of the conclusions (evaluation of evaluation)
Goal-Based Evaluations
• Goal-based evaluations are the most common type of evaluation for programmes
• Goal-based evaluations look at whether the programme is achieving its stated goals and/or objectives
• They are closely tied to results-based M&E
• Efficiency: Is the programme making the most of the available resources?
• Impact: What are the long term effects of the programme going to be?
Evaluation Process
1. Define the evaluation question and parameters
2. Articulate how the evaluation will be conducted
3. Collect, review (clean) and analyze data
4. Report findings and recommendations
5. Use evaluation findings
Needs Assessments
• What need is there for a programme?
• Defining 'need' is essentially a political process
  – "Discrepancy between what is and what should be" (Posavac & Carey, 1992)
  – "Actual state vs. a) ideal, b) desired, c) expected, d) norm, e) minimum" (Scriven & Roth, 1990)
• Defining conditions vs. problems
• Role of politics, public policy, culture, society

Needs Assessments
[Diagram: a Condition becomes a Problem, which creates a Programme Need]
Process Evaluations
• Process-based evaluations are geared to fully understand how a programme works
  – How does it produce the results that it does?
  – Is it structured correctly to achieve the results it aims to produce?
• Useful if programmes are long standing and have changed over the years (warranted if clients are dissatisfied)
• Demonstrates how a programme truly operates (useful in portraying the programme to outside partners)
• There are numerous questions that could be addressed in a process evaluation. Questions are selected by carefully considering what is important to know about the programme.
Outcome Evaluations
• Evaluate the extent to which programmes are meeting predetermined goals or objectives
• This type of evaluation is increasingly important (for both accountability and learning)
• Key question: Is your organization really doing the right programme activities to bring about the outcomes it expects?
Engaging Stakeholders
• Membership on an Evaluation Advisory or Steering Committee
• Through client feedback as part of the methodology – surveys, focus groups, key informant interviews, etc.
• Participatory Evaluation methods built into the study design
Notes
• There is no "perfect" evaluation design
• Work hard to include some interviews in your evaluation methods. The story is usually the most powerful depiction of the benefits of your services.
• Don't interview just the successes. You'll learn a great deal about the programme by understanding its failures, dropouts, etc. Look at the unintended…
• Don't throw away evaluation data once a report has been generated. Data can provide precious information later when trying to understand changes in the programme.
Session Summary
• It is more important to evaluate than to worry about doing it right
• There are many different types of evaluation
• The evaluation approach is determined by:
  – What is being evaluated
  – The purpose of the evaluation
  – How the evaluation results will be used
• Who does the evaluation will be determined by the evaluation approach and whether there is a need for independence
Session Summary
• Common evaluations:
  – Needs Assessment – explores the need for a programme
  – Process Evaluation – explores the implementation of a programme
  – Outcome Evaluation – explores whether a programme is achieving its goals and objectives
• Proving that a programme worked (Outcome Evaluation) has become more and more important with increased competition for limited resources
Acknowledgements
The Road to Results: Designing and Conducting Effective Development Evaluations, by Linda G. Morra-Imas and Ray C. Rist. Commissioned by the World Bank.
Working Session
• Scenario 1: How would you evaluate?
• A five-year 'Condom Promotion Programme' that uses the media to promote condom usage coupled with the free distribution of condoms. The programme is in its 2nd year and management want to know if their strategies are working.
Working Session
• Scenario 2: How would you evaluate?
• The Ministry of Health has completed a ten-year programme focused on child obesity and now is seeking to understand if the programme has made a difference.
Working Session
• Scenario 3: How would you evaluate?
• The Government is considering merging HIV and chronic disease programmes with a view to improving treatment and lowering costs. The Minister has asked for more information with which to make this decision.
4 Developing Evaluation Questions: Key Considerations
Learning Objectives:
• Intended participants
• Intended outcomes
• Programme impact

Evaluation Questions
• Broad categories (often with sub-categories)
  – Need for the programme
  – Programme Theory
  – Programme Process or Implementation
  – Programme Impacts – sometimes referred to as Programme Success, especially when not a complete impact assessment
  – Programme Efficiency – includes cost-effectiveness

[Diagram: Programme Theory and Programme Process fall under Formative Evaluations; Programme Impacts and Programme Efficiency fall under Summative Evaluations]
Evaluation Question: Has the service provision for PLHIV clients improved?
Intended Participants
• Evaluation objective focused on: MEETING NEEDS OF INTENDED CLIENTS, e.g.: How many trained staff members are working at the clinic?
• Context: need for adequately staffed clinics to provide service to clients
Evaluation Question: Does the programme have adequate, appropriately trained staff to meet its intended objective?

Intended Outcomes
• Evaluation objective focused on: INTENDED OUTCOMES, e.g.: Is the Night Health Centre being utilized by PLHIV?
• Context: ascertaining the impact of a programme to engage loss-to-follow-up clients by providing night clinics
Evaluation Question: Is there an increase in the number and variety of PLHIV accessing services in the health sector?
• What types of decisions or judgments need to be made?
• By when?

Types of evaluation questions: Descriptive, Normative, and Cause & Effect.
Descriptive questions:
• Seek to determine what is.
Normative questions:
• Compare what is with what should be
• Compare the current situation with a specified target, goal, or benchmark
• Similar to those often asked in performance auditing
• Ask the following:
  – Are we doing what we are supposed to be doing?
  – Are we hitting our target?
  – Did we accomplish what we said we would accomplish?
Type of Evaluation Questions: Cause & Effect
Examples:
• As a result of the training programme, do participants have higher paying jobs than they otherwise would have?
• Did the microenterprise programme reduce the poverty rate in the community in which it operated?
• Did the increased tax on gasoline improve air quality?
• What other impacts or side effects (positive or negative) did this intervention have on the wider community?
Descriptive Questions
• Evaluation questions that describe aspects of a process, a condition, a set of views, or a set of organizational relationships or networks. Describes inputs, activities, and outputs.
• Examples: What are the primary activities of the programme? How do people get into the programme? Where has the programme been implemented? What services does the programme provide to men?

Normative Questions
• Questions that compare what is with what should be. They compare the current situation with a specified target, goal, or benchmark.
• Examples: Did we achieve the targets we set ourselves? Are practitioners adhering to agreed protocols?

Cause & Effect Questions
• Determine what difference the intervention makes. Often referred to as outcome, impact, or attribution questions.
• Example: As a result of the job training programme, do participants have higher paying jobs than they otherwise would have?
• Specific methodology and evaluation design depends on the nature of the questions being asked; the resources available for the evaluation; and time constraints
• Need to balance methodological rigour with practical realities of resource and time constraints
Discussion Questions
• In any evaluation experience that you may have had or seen, has the eventual evaluation generally: …
[Two-column slide]
Column 1: Clear and specific; Measurable; Doable; Key terms defined; Scope; Timeframe; Population; Funding levels
Column 2: Programme goals; Evidence; Programme criteria; Participant rates; Cost information
5 Measuring Programme Outcomes
Learning Objectives:
Got Outcomes?
Demand for Outcomes:
• Did the programme work?
• Were the programme objectives achieved?
• Accountability: Are scarce resources being used most efficiently and effectively?

[Diagram: Effective Programme Effort ("Hit the Mark") vs. Ineffective Programme Effort ("Did not hit the Mark")]
Outcome Indicators
• Effective programmes also establish well conceived indicators that directly measure programme performance at ALL levels (activity, output, outcome and impact)
• Good indicators and objectives are Specific, Measurable, Appropriate, Realistic and Time-based (SMART)
• Data collection for outcomes (objectives) and goals (impact) should occur in parallel with the programme activities
Programme Lifecycle
[Diagram: Formative Evaluation accompanies Programme Development; Process Evaluation accompanies Programme Implementation; Outcome Evaluation accompanies Programme Effect. Importance increases from less to more across the lifecycle.]
Evidence-Based Programmes
• Funding agencies are pushing for the use of evidence-based programmes as a mechanism for ensuring that organizations are implementing programmes that HIT THE MARK!
• Rigorous, scientific outcome evaluations have determined a programme to be effective and, thus, recommended for widespread adoption
• Numerous terms, criteria, and evidence are used to identify "evidence-based" programmes:
  – Best practices
  – Model programme
  – Effective programme
  – Science-based
  – Promising programme
  – Guidelines
Question
If a programme has already been proven effective, then is it necessary to conduct an outcome evaluation when adopted?

It depends…
1. Positive results from evidence-based programmes tested in "ideal settings" may not be replicated when adopted in "real world" settings*
[Diagram: Cause (Programme) → Effect (Outcome), with Confounding Factors acting on the relationship]

Example:
Cause: School-based conflict resolution programme
Effect: Decreased violent events from baseline to follow-up
Confounders:
• Greater police enforcement
• Youth violence getting attention in the media
[Diagram: evaluation designs on a continuum. A Single Group Pre-Post design requires fewer resources but provides lower evidence for inferring causality; a Quasi-Experimental design requires more resources and provides higher evidence.]
Assess Readiness
• Conduct an Evaluability Assessment
• Is the programme well-designed?
• Is there evidence the programme was implemented as planned?
• Is there a plan on how the results can be used?
• Does your department have the resources?
  – Commitment
  – Person power
  – Expertise
Gather resources
1. Obtain commitment from higher levels
2. Assign an evaluation coordinator
  – Should not be the person responsible for planning/implementing the programme
3. Convene a stakeholder evaluation team
  – Should include (at a minimum) a higher level administrator, the programme director, programme delivery staff, and the evaluation coordinator
4. Gain access to the following expertise
  – Experienced evaluation expert
  – Topic experts/professionals
  – Logic model developer
  – Data collector/s (for whatever outcomes are selected)
  – Data programmer/statistician
Considerations
• Which outcomes in the logic model are important to stakeholders?
• Has research already demonstrated causal links?
• Are comparison groups readily available?
• Will there be enough "events" to "rule out chance" for any changes found in outcome?
Session Summary
• The programme type and the evaluation importance are key considerations when deciding whether to undertake an outcome evaluation
Session Summary
• Outcome evaluation designs:
  – Single group pre and post test
  – Quasi-experimental
  – Randomized control
• Key steps in planning an outcome evaluation:
  – Assess readiness
  – Assemble needed resources
  – Develop a logic framework
  – Select the outcomes to evaluate
Working Session
• In your groups look back over the evaluation matrix that you developed earlier and outline how you would go about undertaking an evaluation for the programme you have selected.
  – What steps do you need to take to make the evaluation happen?
  – What are some of the key considerations for the evaluation design?
6 Answering Evaluation Questions using Quantitative and Qualitative Methods
Learning Objectives:
• "It aims to produce information that has direct relevance to subsequent decisions about improvements to or the continuation of a particular action programme" (Hall and Hall 1996).

[Comparison table: Quantitative vs. Qualitative methods]

Quantitative Methods
• Surveys
• Exit Interviews
• Record Abstraction
• Checklists
• Observation
• Experiments: Experimental design and Quasi-experimental design
• Cost Benefit and Cost-Effectiveness analyses
• Most were covered in the Basic M&E workshop

Surveys
• Since there are no natural prices for healthy states, cost-benefit analysis requires the creation of artificial ones by assigning a dollar value to human life. Economists create artificial prices for health benefits by looking at what people are willing to pay for them.
• Addresses technical efficiency issues only (doing it the right way)
Qualitative Methods
• What a researcher finds in focus groups is interaction among the members that is not found in other forms of research.
Discussion Session
• How does the evaluation question drive the selection of each approach?
7 Data Analysis
Learning Objectives:
Introduction
1. The appropriate analysis depends on: …
Example:
Indicator: Percent of women and men aged 15–24 who had sex with more than one partner in the last 12 months

1. In the last 12 months, have you had sexual intercourse with a non-regular partner?
2. If the answer to question 1 is "yes": How many non-regular partners have you had sex with in the last 12 months?
3. If the answer to question 1 is "yes": Did you (or your partner) use a condom?
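The indicator can be computed directly from these questionnaire items. The following is a minimal sketch in Python (pandas) with hypothetical column names; it assumes, as one plausible reading, that respondents reporting more than one non-regular partner in question 2 count toward the numerator.

import pandas as pd

# Hypothetical survey extract; column names are illustrative only.
survey = pd.DataFrame({
    "age":               [17, 22, 19, 24, 16, 23],
    "nonreg_partner":    ["yes", "no", "yes", "yes", "no", "yes"],  # question 1
    "n_nonreg_partners": [2, 0, 1, 3, 0, 1],                        # question 2
})

# Denominator: all respondents aged 15-24.
eligible = survey[survey["age"].between(15, 24)]
# Numerator: those reporting more than one partner in the last 12 months.
numerator = (eligible["n_nonreg_partners"] > 1).sum()
indicator = 100 * numerator / len(eligible)
print(f"Percent with more than one partner: {indicator:.1f}%")  # 33.3%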
Research Questions
Primary Research Question: What is the percentage of young persons (15-24 years) who had sex with more than one partner in the last 12 months?

Hypotheses
The first three hypotheses assume we have baseline data.
• The percentage of young persons (15-24 years) reporting sex with more than one partner within the last 12 months is less than 20%.
• The prevalence of condom use reported by young persons (15-24 years) having sex with non-regular partners within the last 12 months is 75%.
• More males than females have multiple sex partners among young persons (15-24 years).

Specific Aims/Objectives
• To determine the percentage of young persons (15-24 years) reporting multiple sex partnering within the last 12 months.
• To determine if there is an association between sex and multiple sex partnering among young persons (15-24 years) within the last 12 months.
Study Designs
• Quantitative Methods
  – Experimental
    • Randomized controlled trial/Quasi-experiment
• Qualitative Methods
  – Focus group discussion
  – In-depth interviews

• Standardization
  – Standard data collection methods (calibration) – written protocol
• Pilots/pre-tests

Data Processing
Validation

Coding
• Grouping and assigning numeric codes to the responses to open questions
  – e.g., types of physical activity performed: Walk=1; Jog=2; Aerobics=3; etc.
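As a sketch of what a codebook looks like in practice, the mapping can be held in a simple Python dictionary and applied to the raw responses (the labels and codes below are illustrative):

# Codebook for an open question on physical activity (illustrative codes).
codebook = {"walk": 1, "jog": 2, "aerobics": 3}

responses = ["Walk", "jog", "Aerobics", "walk"]
codes = [codebook[r.strip().lower()] for r in responses]
print(codes)  # [1, 2, 3, 1]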
Data Entry
• Process whereby information is transformed into a format that can be read by a computer
• Develop coding notes/codebook
  – Ensure consistency and standard interpretation of codes
Cleaning/Editing
Objective: Identify and correct errors
1. Use of programmed data entry software
  – E.g. Validate in Epi Info
2. Print out and check each entry
3. Data cleaning techniques such as frequencies or cross-tabulations; look for:
  – Out of range values, e.g. ages of secondary students – values of 4 or 28 would be out-of-range, i.e. errors
  – Consistency of the data, e.g. errors in skip patterns – men answering a question about having PAP smears in the last 12 months
  – Find the relevant questionnaire and enter the correct code on the computer
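The out-of-range and consistency checks above can be automated. A minimal sketch in Python (pandas), with illustrative column names and cut-offs:

import pandas as pd

df = pd.DataFrame({
    "id":       [1, 2, 3, 4],
    "age":      [14, 4, 16, 28],            # ages of secondary students
    "sex":      ["F", "M", "F", "M"],
    "pap_12mo": ["no", "yes", None, None],  # PAP smear in last 12 months
})

# Out-of-range check: flag ages outside a plausible secondary-school range.
out_of_range = df[~df["age"].between(11, 19)]

# Consistency (skip-pattern) check: men should not have answered the PAP question.
skip_errors = df[(df["sex"] == "M") & (df["pap_12mo"].notna())]

# Report record IDs so the relevant questionnaires can be pulled and corrected.
print("Out-of-range ages:", out_of_range["id"].tolist())   # [2, 4]
print("Skip-pattern errors:", skip_errors["id"].tolist())  # [2]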
Data Storage
• Secure storage of hard copies of questionnaires
  – 3-5 years
• Storage of data entered in the computer
  – hard drive of computer
  – back up – external drives/other data storage mediums (CDs, flash drives)
• Ethical issues
  – Must ensure confidentiality of the data collected:
    1. Proper storage of data sheets and records
    2. Limiting access to identifiable data
    3. Adequately securing research records
    4. Removing identifiers from human specimens and data
8 Analyzing Data
Learning Objectives:
Data Analysis
• Ensure that data entry is completed and data cleaned
• May be:
  – Descriptive
  – Inferential
• Dummy tables
  – Should be prepared at the design phase

Dummy Tables
• Constructed to guide the presentation of results
• Based on evaluation questions
• Guides the approach to analysis

[Dummy table rows: Education – Primary, Secondary, Tertiary]
[Dummy table rows: Type of illness – Diabetes, Hypertension, Asthma, Dengue fever, Depression, Other]
• Categorical (qualitative) – measures characteristics that have no numerical value
  – e.g. presence of disease, gender, occupation
• Inferential Statistics
  – Use the data collected from samples to make generalizations which go beyond the sample
    • i.e. predictions about the population from which they came
  – Include the determination of confidence intervals and hypothesis testing
Descriptive Statistics
• Dependent on the type of data, i.e. categorical or quantitative data

Categorical
• frequencies or percentages
• tables or charts

Quantitative
• Usually presented in terms of:
  – Central tendency
  – Variability / dispersion
• Graphically:
  – Bar charts
  – Pie charts
[Bar chart: Number of Patients]
• Mean: numerical data with a symmetric distribution
• Median: numerical data with skewed distributions

[Figures: Normal Distribution; Skewed Distributions]
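A quick numerical illustration of the rule: with skewed data, a single extreme value pulls the mean but barely moves the median.

import numpy as np

symmetric = np.array([4, 5, 5, 6, 6, 7])
skewed    = np.array([4, 5, 5, 6, 6, 60])   # one extreme value

print(np.mean(symmetric), np.median(symmetric))  # 5.5 5.5
print(np.mean(skewed), np.median(skewed))        # 14.33... 5.5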
INFERENTIAL STATISTICS
Statistical Inference
• Process by which inferences and generalizations are made about population parameters
• Sample Statistics – measures obtained from a subset of the population, e.g. sample prevalence

Estimation
• Data collected from a random sample to estimate the characteristics of the population

Confidence Interval
Example (cont'd)
• Proportion anemic: p = 263/586 = 0.449
  – SE(p) = √[0.449 × (1 − 0.449) / 586] = 0.021
  – 95% CI(p) = 0.449 ± (2 × SE(p)) = 0.449 ± 0.042 = 0.407 to 0.491
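The arithmetic can be checked in a few lines of Python; the result matches the figures above up to rounding.

import math

n, anemic = 586, 263
p = anemic / n                    # proportion anemic, ~0.449
se = math.sqrt(p * (1 - p) / n)   # standard error, ~0.021
lower, upper = p - 2 * se, p + 2 * se
print(f"p = {p:.3f}, 95% CI = {lower:.3f} to {upper:.3f}")
# p = 0.449, 95% CI = 0.408 to 0.490 (the manual rounds to 0.407-0.491)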
HYPOTHESIS TESTING
Hypothesis Testing / Significance Testing
• Hypothesis – a refutable prediction

Note:
• In hypothesis testing, a significance test can never prove that a null hypothesis is true or false, but can give information about the strength of the evidence against it.
• Type II error
  – did not reject the Null when false – concluded that there was no difference but there really was one
  – related to sample size
• Power of the study – the ability to show an association or difference when there really is one
Statistical Significance
• hypothesis test ≡ significance test
• p < 0.05 – difference/association is statistically significant
• p > 0.05 – not significant
Categorical Variables
2 × 2 Tables

             Independent
               1     2
Outcome 1      a     b
Outcome 2      c     d

Conditions:
• n < 20
• n is between 20 and 40 and the expected frequency of any cell is < 5
Example
An investigation was carried out to determine if a new vaccine is effective against TB. The results of a randomised trial are shown below.

TB      vaccine   placebo
yes        30        90
no        230       150

Vaccine: 30/260 = 11.5% infected
Placebo: 90/240 = 37.5% infected

Was the vaccine effective? Or are the observed differences due to chance?
Null hypothesis: no difference (no association), H0: p1 = p2
Alternative hypothesis: a difference (association), H1: p1 ≠ p2

χ² = 46.11, P < 0.001
Therefore, reject the null hypothesis (as P < 0.05).

Interpretation: There is a significant difference between the proportion of persons contracting TB when those vaccinated were compared with those given the placebo (p < 0.001). 11.5% of the persons given the vaccine contracted TB compared with the 37.5% given the placebo.
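The test can be reproduced in Python with scipy; correction=False requests the uncorrected chi-square statistic quoted above (by default scipy applies Yates' continuity correction to 2 × 2 tables).

from scipy.stats import chi2_contingency

observed = [[30, 90],     # TB yes: vaccine, placebo
            [230, 150]]   # TB no:  vaccine, placebo

chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.2g}")
# chi-square = 46.12, dof = 1, p < 0.001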
Choosing the Right Analysis
Descriptive
• One categorical variable
  – e.g. prevalence of smoking, satisfied with services
  – Estimate (95% CI)

Analytical
• 2 categorical variables – statistical test
  – Cross-tabulation
  – Chi square test
• Multiple Regression
  – Very powerful analysis
  – Ensure conditions are satisfied

Compare two paired groups:               Paired t test | Wilcoxon test | McNemar's test
Compare three or more unmatched groups:  One-way ANOVA | Kruskal-Wallis test | Chi-square test
Always:
• Get feedback/support
• Consult a (bio)statistician
Epi Info Tutorial
Learning Objectives:
Main Modules

Enter Data
• The Enter module automatically creates the database from the questionnaire in MakeView.
• Users enter data, modify existing data, or search for records.
• The Views are displayed and users perform the data entry while the Check Code validates the data or performs any automatic calculations that were specified in MakeView.
Analyze Data
• The Analysis module is used to analyze data entered with the Enter module or data imported from 24 different data formats.
• Statistics, tables, graphs, and maps are produced with simple commands such as READ, FREQ, LIST, TABLES, GRAPH, and MAP.
• As each command is run, it is saved to the program editor where it can be customized and saved, shared, and used in the future as data are revised.
Epi Reports
• The Epi Report module is a user-friendly tool to create professional custom reports that include results from the Analysis output.
• Can combine Analysis output with data from Enter as well as other sources such as Access.
• Reports can be saved as HTML files for easy distribution or web publishing.
Epi Maps
• The Epi Map module displays geographic maps with data from Epi Info.
• Epi Map displays files containing geographic boundaries layered with data results from the Analysis module.

Creating a View
Field Names
Field Names are the unique "variable names" used when analyzing the data.
• Field names may have a number in them, but cannot start with a number
• Field names may not have symbols or spaces in them
• Field names should be logical and easy to recall for later analyses
Field Names are formed when entering text in the "Question or Prompt" box, but can be edited.
Before you go to the Enter Data program, make sure you are happy with the field names, because once the data entry screen is opened, field names cannot be changed unless the data is deleted (DANGEROUS).
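The naming rules above can be expressed as a simple check; this is an illustration in Python, not part of Epi Info itself.

import re

def is_valid_field_name(name: str) -> bool:
    # Letters, digits and underscores only, and not starting with a digit.
    return re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", name) is not None

print(is_valid_field_name("Age2"))       # True
print(is_valid_field_name("2Age"))       # False - starts with a number
print(is_valid_field_name("Age Group"))  # False - contains a space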
Check Code
• Check Code is a series of commands that tell Epi Info that you want to do certain "checks" of your data as it is being entered.
• By using Check Code, you can protect your data against many common types of errors and make data entry easier.
• It is helpful when you have more than one person entering data.
• There are two ways to create Check Code in Epi Info:
  – Set the code in the Field Definition dialog box
  – In MakeView, click on the "Program" button and begin to build (write) the code
• Check Code should be created when you are creating and modifying your "view" using the MakeView program (i.e., before data entry).
Tab Order
• The default order for data entry is the order in which the fields were created (and then modified).
• Thus, you might want to change the order of fields for data entry:
  – Click "Edit" from the MakeView menu
  – Select the "Order of Field Entry (Taborder)" option
  – By clicking on the "Up" or "Down" key, you can change the order of fields for data entry.
Entering Data

Retrieving Records
• You can quickly find records by one of three ways:
  – In the "Choose search field(s)" box, select the field(s) that define(s) your criteria.
  – In the box next to the field you selected, type in the criteria.
Analyzing Data

READ Command
• The READ command imports your data so that it can be analyzed.
• Specify the:
  – data source (project)
  – data format
  – data table
Frequency Distributions
• The FREQ command produces a frequency table for specified variable(s).
• The resulting table shows:
  – how many records have each value of the field
  – the percentage of the total
  – cumulative percentage
  – 95% confidence intervals for each value
TABLES Command
• The TABLES command produces a cross-tabulation of two or more categorical variables.

                          OUTCOME (Dependent Variable)
                              +       -
EXPOSURE (Independent   +    20      40
Variable)               -    37      59

MEANS Command
• The MEANS command produces descriptive statistics for one continuous variable.
• The statistics include:
  – Mean
  – Median
  – Mode
  – Min/max
  – Quantiles
  – Variance/standard deviation
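For readers who want to mirror these commands outside Epi Info, the equivalent logic in Python (pandas) is sketched below. This is not Epi Info syntax, and the file name is illustrative; the column names come from the mock database that follows.

import pandas as pd

df = pd.read_csv("survey.csv")  # stands in for READ; file name is illustrative

# FREQ: frequency table (counts and percentages) for one variable
freq = df["Sex"].value_counts()
print(freq)
print(100 * freq / len(df))

# TABLES: cross-tabulation of two categorical variables
print(pd.crosstab(df["Sex_Nonreg_partner"], df["STI"]))

# MEANS: descriptive statistics for one continuous variable
print(df["Age"].describe())  # count, mean, std, min, quartiles, max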
Mock Database
• Age – Age
• Sex – Sex
• Marital Status – Married
• Education Level – Educ
• Ethnicity – Ethnicity
• Sexual intercourse with a non-regular partner within the last 12 months? – Sex_Nonreg_partner
• How many non-regular partners have you had sex with in the last 12 months? – No_reg_partners
• Did you or your partner use a condom the last time you had sex with your most recent non-regular partner? – Condom_nonregpartner
• Have you had an STI within the last 12 months? – STI
Questions
• What should be our plan for analysis?
• What type of statistical analysis should we perform to answer our research questions?
10 Managing and Resource Requirements for Evaluations
Learning Objectives:
From start to finish, there are at least 8 key steps to observe:
1. Getting started
2. Initial planning
3. Terms of Reference (TOR)
4. Selection of the Evaluator/Evaluation Team
5. Evaluation Work Plan
6. Fieldwork Phase
7. Reporting on Evaluation Results (Oral & Written)
8. Follow-up to the Evaluation
• The terms of reference (ToR) document defines all aspects of how a consultant or a team will conduct an evaluation
• Provides an explicit statement of the objectives of the evaluation, roles and responsibilities of the evaluators and the evaluation client, and resources available for the evaluation
• Does the delivery mode need to be modified or alternatives to the programme developed? (Alternatives)
• How well is the programme performing? Is it meeting its objectives? (Programme success)
• What results are the programme achieving, both intended & unintended? (Programme Impacts & Effects)
Design Considerations
• Advisable to use multiple lines of evidence
• Use both quantitative and qualitative methods
  – Literature review
  – Administrative records – key source of information on activities, costs, outputs, etc.
• Nature & complexity of the programme – multi-site? National vs local; etc.
• Sensitivity of evaluation issues and the decisions to be taken with the results
Additional Reading
• OECD DAC Summary of Key Norms & Standards
  – Reference: Morra Imas, L. & Rist, R. (2009)
• Websites
  – CES www.evaluationcanada.ca
  – AEA www.eval.org
  – AES www.aes.asn.au
11 Ethical Considerations
Learning Objectives:
Ethics
• The study of the fundamental principles that define values and determine moral duty and obligation
• The "science (study) of morality"
• A set of moral principles or values
• The principles of conduct governing an individual or group; concerns for what is right or wrong, good or bad
International Guides
• Helsinki Declaration (1964, …, 2008)
  – Developed by the World Medical Association
  – Sets fundamental principles:
    • Design and performance of research
    • Informed consent
    • Ethical review
  – http://www.wma.net/en/30publications/10policies/b3/
• CIOMS International Ethical Guidelines
  – Council of International Organizations of Medical Sciences
  – Established by WHO and UNESCO to serve the interest of the biomedical community
    • 2002 – Biomedical research guidelines
    • 2008 – Epidemiological studies guidelines
  – http://www.cioms.ch/
Informed Consent/Assent
• Individuals have the right to choose whether to participate (un-coerced) and to stop at any time without any penalty
Consent Form
• Explanation of the purpose of the study
• Duration of participation
• Description of procedures, potential risks and benefits
• Disclosure of alternative procedures
• Description of how confidentiality will be maintained
• Whether compensation will be given or if treatment is available if injured
• Person to contact for additional info about the study – address, phone numbers etc.
• Participation is voluntary and no penalty or loss of benefits if refuse
• Participant has read the form and understands it and signature indicates agreement
• Signature of witness also needed
Confidentiality/Privacy
• Need for identifiers such as name, address?
  – Potential for social and other harm
Vulnerable Populations
• Diminished ability to protect interests
• Need special justification
• Special safeguards
  – Children
  – Pregnant women
  – Fetus
  – Mentally disabled
  – Terminally ill
  – Prisoners
  – Dependent positions
  – Disadvantaged – economically, educationally
Incentives
• To encourage people to participate
  – Need to recruit adequate numbers in a timely fashion

• Submit application
  – Follow guidelines
  – Allow sufficient time for review
  – If necessary, address concerns and resubmit
Application Document
• Title page
  – Name of Applicants, Institution, Dates (application, start of study, duration)
On-line Tutorial
• For NIH funded grantees
• Human Participant Protections Education for Research Teams
• http://bioethics.od.nih.gov/casestudies.html
• Tutorial and exercise
• Certificate
Regional Initiatives
• Bioethics Society of the English Caribbean – www.bioethicscaribe.org.jm
• Caribbean Research Ethics Initiative – www.caribbeanethics.com
Session Summary
• All research and evaluations must be ethically conducted
• Ethical research/evaluation:
  – Demonstrates the value of the research/evaluation
  – Is well designed (scientifically valid)
  – Demonstrates fair participant selection
  – Respects persons
  – Has a favourable risk-benefit ratio
  – Acquires informed consent
  – Undergoes independent review
Session Summary
• Ethics approval must be sought before the field work is undertaken
• Recommendations from the ethical review must be incorporated into the research/evaluation design
• The ethics review process differs from country to country
• Ethics review may be done by the Ministry of Health, an Ethics Committee or an Institutional Review Board
12 Challenges to Conducting Evaluations
Learning Objectives:
Suggested Solution
• The RealWorld Evaluation Approach developed by Jim Rugh and Samuel Bickel (based on joint work with Michael Bamberger and Linda Mabry). African Evaluation Association (AfrEA)
Step 2: Addressing budget constraints
• Modify evaluation design

Step 3: Addressing time constraints
• All Step 2 tools plus: …

Step 4: Addressing data constraints
• Reconstructing baseline data
• Multiple methods
Step 5: Addressing political constraints
• Accommodate pressures from funding agencies or clients on evaluation design

Step 6: Assessing the strengths and weaknesses of the evaluation design
• Identify threats to validity of quasi-experimental designs
• Objectivity/confirmability
• Replicability/dependability
• Internal validity/credibility/authenticity
• External validity/transferability/fit
• Utilization/application/action orientation
In Conclusion - 1
Evaluators must be prepared to: …

In Conclusion - 2
• Evaluators must be prepared for real-world evaluation challenges
• There is considerable experience to draw on
• A toolkit of rapid and economical "RealWorld" evaluation techniques is available
• Never use time and budget constraints as an excuse for sloppy evaluation methodology
• A "threats to validity" checklist helps keep you honest by identifying potential weaknesses in your evaluation design and analysis
13 Writing an Evaluation Report
Learning Objectives:
• Formal Record: What you discovered in conducting an evaluation, in terms of both process and evaluation results, may be applicable to future programmes. An evaluation report is assurance that lessons learned are available for future application.
• Your work can help others: Sharing your evaluation report with peers who may be considering the development of similar programmes may help them to more effectively design their programmes.
• Everything in the Executive Summary should be based directly on what is in the report. No new information should be presented in the Executive Summary.
• Generally, an Executive Summary should be between three and five pages (depending on the length of the report).
• The average busy reader should come away with an understanding of what the project was about, the main evaluation questions, key findings, and major conclusions and recommendations from the Executive Summary.
• What was the activity or programme about? In this section the evaluator must provide the reader a concise picture of:
  a. What the project was going to do
  b. What the objectives were
  c. How it was to be done
  d. Where it was to be done
  e. Who was going to do it
  f. At what cost
• The description should include the unit of analysis, selection of samples, data collection instruments, types of data collected, analytic techniques used, who did it, and when it was done.
• Design and Methodology continued…
  – Questionnaires, observation checklists, descriptions of sampling procedures, data analysis procedures, and other supporting materials should be included in the Annex.
  – Here, if space permits, a very useful summary chart can be displayed which aligns the evaluation questions with the data type and source used to answer each question.
Working Session
• Using the components outlined in this presentation, for a programme/project of your choice, generate a template for an evaluation report.
• Please note that a description of what ought to be involved under each component should be included.
Bibliography
American Evaluation Association. Guiding principles for evaluators. http://www.eval.org/publications/
guidingprinciples.asp (accessed September 2nd, 2012).
Bamberger M, Rugh J, Mabry L. 2012. RealWorld Evaluation: Working under budget, time, data, and
political constraints. Sage Publications, Inc.
Becker H. 1970. Problems of inference and proof in participant observation. In H.S. Becker Sociological
work: Method and substance. Chicago: Aldine (Reprinted from American Sociological Review 1958 23:
652 – 660)
Boulmetis J, Dutwin J. 2011. The ABCs of Evaluation: Timeless techniques for program and project
managers. John Wiley & Sons, Inc.
Bradford-Hill A. 1965. The environment and disease: Association or causation? Proceedings of the Royal
Society of Medicine 58: 295–300.
Centre of Excellence for Evaluation (Treasury Board of Canada Secretariat). 2004. The art and
architecture of writing evaluation reports. www.tbs-sct.gc.ca/cee/career-carriere/workshops-ateliers/
aawer-amrre-eng.pdf (accessed September 2nd, 2012)
Cook TD, Reichardt CR. 1979. Qualitative and quantitative methods in evaluation research. Sage
Publications, Inc
Creswell JW. 2002. Research Design: Qualitative, quantitative and mixed methods approaches.
Thousand Oaks, CA: Sage Publications Inc.
Creswell JW, Plano Clark V. 2006. Designing and conducting mixed methods research. Thousand Oaks,
CA: Sage Publications Inc.
Epi Info Workshop. University of California, Berkeley – Center for Infectious Disease and Emergency
Preparedness [Online]. 2011 [cited 2011 Aug 24]. Available from URL : http://www.idready.org/epi/
Epi Info Tutorials. Centers for Disease Control and Prevention [Online]. 2011 [cited 2011 Aug 24].
Available from URL : http://wwwn.cdc.gov/epiinfo/html/tutorials.htm
Evaluation Support Scotland. Evaluation Support Guide 11: Report writing.
www.evaluationsupportscotland.org.uk/downloads/SupportGuide11aug08.pdf (accessed September 2nd,
2012)
Fetterman DM, Wandersman A. 2005. Empowerment evaluation principles in practice. New York: The
Guilford Press.
Glasgow RE, Lichtenstein E, Marcus AC. 2003. Why don’t we see more translation of health promotion
research to practice? American Journal of Public Health 93 (8):1261-1267.
Groves RM, Fowler FJ, Couper MP, Lepkowski JM, Singer E, Tourangeau R. 2004. Survey Methodology.
New Jersey: John Wiley and Sons.
160 Advanced Monitoring and Evaluation Workshop Manual
Hall D, Hall I. 1996. Practical social research: project work in the community. Hampshire: Palgrave
Macmillan.
Hatry H, van Houten T, Plantz MC, Taylor M. 1996. Measuring programme outcomes: A practical
approach. Alexandria, VA: United Way of America.
Kusek JZ, Rist RC. 2004. Ten Steps to a results based Monitoring and Evaluation system. Washington
DC.: The International Bank for Reconstruction and Development/ The World Bank.
Leviton LC, Collins CB, Laird BL, Kratt PP. 1998. Teaching evaluation using evaluability assessment.
Evaluation 4 (4): 389 -409.
Morgan DL. 1996. Focus Groups as Qualitative Research. Thousand Oaks, CA: Sage Publications Inc.
Morra - Imas LG, Rist RC. 2009. The road to results: Designing and conducting effective development
evaluations. Washington DC.: The International Bank for Reconstruction and Development/ The World
Bank.
Owen J. 2007. Program evaluation: Forms and approaches. Taylor and Francis Group.
Patton MQ. 2002. Qualitative research and evaluation methods. Thousand Oaks, CA: Sage Publications
Inc.
Patton MQ. 2008. Utilization-focused evaluation. Thousand Oaks, CA: Sage Publications Inc.
TCPS 2. 2010. Tri-Council Policy Statement: Ethical conduct for research involving humans. Ottawa.
Interagency Secretariat on Research Ethics.
Posavac EJ, Carey RG. 1992. Program evaluation: methods and case studies. Prentice Hall.
Reichardt CS, Cook TD. 1979. ‘Beyond qualitative versus quantitative methods’. In TD Cook and CS
Reichardt (eds). Qualitative and quantitative methods in evaluation research. Beverly Hills, CA,. Sage.
Pp 7-32.
Scriven M, Roth J. 1990. Needs assessment: concepts and practice. Reprinted in Evaluation Practice
11: 135-44.
Thompson NJ, McClintock HO. 2000. Demonstrating your programme’s worth: A primer on evaluation
for programmes to prevent unintentional injury. Georgia: CDC. www.cdc.gov/ncipc/pub-res/demonstr.htm
(accessed September 2nd, 2012)
W.K. Kellogg Foundation. 1998. The W.K. Kellogg Foundation evaluation handbook. Battle Creek, MI: W.K.
Kellogg Foundation. www.wkkf.org/documents/WKKF/EvaluationHandbook/EvalHandbook.pdf
(accessed September 2nd, 2012)
World Bank. 2004. Monitoring and Evaluation: Some tools methods and approaches. Washington D.C.:
The International Bank for Reconstruction and Development/World Bank.
World Bank. 2011. Writing terms of reference for an evaluation: A how-to guide. Washington D.C.: The
International Bank for Reconstruction and Development/World Bank.
Yarbrough D, Shulha L, Hopson R, Caruthers F. 2011. The program evaluation standards: A guide for
evaluators and evaluation users. Sage Publishers Inc.
© 2012