
Caribbean Health Research Council

ADVANCED
MONITORING AND EVALUATION
Workshop Manual

© 2012

Disclaimer

This publication was supported by the Cooperative Agreement Number 5U2GPS001914 from
The Centers for Disease Control and Prevention. Its contents are solely the responsibility of the
Caribbean Health Research Council and do not necessarily represent the official views of the
Centers for Disease Control and Prevention.

The material contained herein is the product of a collaboration between the Caribbean Health Research Council (CHRC), the U.S. Centers for Disease Control and Prevention, and the Regional Monitoring and Evaluation Technical Working Group. Please seek permission from the CHRC to copy, modify, publish or transmit this material, which was compiled specifically for the purposes of this workshop.

Table of Contents
Acknowledgements ........ v
Preface ........ vii
Workshop Agenda ........ ix
Workshop Objectives ........ xi
1 Why Evaluate? ........ 1
2 Evaluability Assessment ........ 11
3 Major Types of Programme Evaluation ........ 21
4 Developing Evaluation Questions: Key Considerations ........ 35
5 Measuring Programme Outcomes ........ 47
6 Answering Evaluation Questions using Quantitative and Qualitative Methods ........ 63
7 Data Analysis ........ 75
8 Analyzing Data ........ 85
9 Epi Info Tutorial ........ 107
10 Managing and Resource Requirements for Evaluations ........ 121
11 Ethical Considerations ........ 129
12 Challenges to Conducting Evaluations ........ 139
13 Writing an Evaluation Report ........ 147
Bibliography ........ 159

Acknowledgements

The publication of the Advanced Monitoring and Evaluation (M&E) Workshop Manual is the result of a collaborative effort by a number of partners, coordinated and led by the Caribbean Health Research Council (CHRC).

CHRC expresses gratitude to the principal collaborators, i.e. the members of the Caribbean M&E Technical Working Group, which comprises the Caribbean HIV/AIDS Alliance, the US Centers for Disease Control and Prevention (CDC), the PAHO HIV Caribbean Office, the HIV/AIDS Programme Unit of the Organization of Eastern Caribbean States (OECS), the Pan Caribbean Partnership against HIV/AIDS (PANCAP), and UNAIDS.

CHRC acknowledges the critical role of the scores of stakeholders from Ministries
of Health, National AIDS Programmes and non-governmental organisations (NGOs)
throughout the Caribbean who participated in the various specialized M&E workshops
and provided the feedback that guided the development of the current manual. CHRC
is also grateful to the consultants who made useful contributions to the content of the
workshop materials.

It is also important to recognize the hardworking CHRC team led by Ms Elizabeth Lloyd
and including Mr. Erin Blake, Ms Shelly-Ann Hunte, Ms Keren Wilson, Ms Candice
McKenzie and Ms Jaselle Neptune for their role in the finalization of the content and
format of the Manual.

The development of the Advanced M&E Workshop Manual would not have been possible without the funding received from the CDC, through a Cooperative Agreement with CHRC.

Preface

The Advanced Monitoring and Evaluation (M&E) Workshop Manual is the sequel to the highly regarded Basic M&E Manual, which was published in 2011. This specialised Manual was developed by the Caribbean Health Research Council to build on the fundamental M&E concepts introduced in the Basic Workshop, with a sharp focus on building the capacity to conduct programme evaluations in the Caribbean.

The Advanced Workshop was designed to address the needs identified by Caribbean
health professionals who had benefitted from the basic M&E training and wanted to
further enhance their skills. Although CHRC had been hosting specialized M&E workshops
since 2006, both the content and the format were repeatedly revised in response to the
feedback received from the participants who attended the various ‘pilot’ workshops.
We are now very pleased with the eventual product, which has also been receiving very
positive reviews.

Persons who complete the training are expected to be well equipped to conduct programme evaluations. Using the usual CHRC format of PowerPoint presentations and facilitated group work sessions, they are first reminded why it is important to evaluate programmes. Subsequent topics cover determining the readiness of the programme for evaluation, choosing the right evaluation questions, and selecting the most appropriate evaluation methodology, i.e. whether to use quantitative and/or qualitative approaches. There is also a strong focus on data analysis, with both academic and hands-on sessions. Indeed, participants are taught to use the Epi Info software, which is not only a powerful data analysis package but is also available free of charge over the internet through the support of the Centers for Disease Control and Prevention (CDC). Critically, the workshop also addresses some of the realities and challenges that practitioners face in the conduct of evaluations, such as ethical and budgeting considerations, managing the evaluation and writing the evaluation report. The workshop is designed to be hosted over five days but, given its modular format, it can be customised to meet stakeholders' requirements and available resources, including time.

It is important to appreciate how the Advanced Workshop relates to other M&E capacity
development efforts. CHRC in collaboration with the Caribbean M&E Technical Working
Group has recently published the M&E Training Strategy for the Caribbean, which is
already being implemented. The Strategy comprises ten key elements designed to

ensure that M&E is institutionalized in the Caribbean, including the fostering of an M&E culture as well as the establishment of systems to build capacity, ranging from in-service training to Masters degrees in Monitoring and Evaluation at universities. The development of
standardised training materials such as this workshop manual is probably the most
prominent element. Others include participants being strategically selected for training
and receiving structured, sequenced tuition through an expanded range of modalities,
with post-workshop support and mentorship.

It is also our intention that the Advanced M&E Workshop Manual will be used as a critical reference guide for persons interested in, or already conducting, programme evaluations. This has been the experience with other similar CHRC publications such as the Basic M&E Workshop Manual and the flagship Basic Research Skills Workshop Manual. The Advanced Manual also includes an extensive bibliography so that persons who have further interest in evaluation can access additional resources.

CHRC expects that the Advanced M&E training workshop and this Manual will continue to play a critical part in the development of M&E capacity in the Caribbean and facilitate the achievement of our goal, which is to institutionalize M&E in all health programmes. We also envision that CHRC will produce additional specialised M&E workshop manuals in response to other identified needs.

Donald T. Simeon, PhD.


Director
Caribbean Health Research Council

Workshop Agenda
8:30 - 9:00
  Day 1: Introduction & outline of the training
  Days 2-5: Recap of previous day

9:00 - 9:30
  Day 1: Workshop Pretest
  Day 2: Major Types of Programme Evaluation
  Day 3: Measuring Programme Outcomes
  Day 4: Data Analysis
  Day 5: Managing and Resource Requirements for Evaluations

9:30 - 10:45
  Day 1: Why Evaluate?
  Day 5: Ethical Considerations

10:45 - 11:00
  All days: BREAK

11:00 - 12:30
  Day 1: Working Session: Why Evaluate?
  Day 2: Working Session: Major Types of Programme Evaluation
  Day 3: Working Session: Measuring Programme Outcomes
  Day 4: Analyzing Data
  Day 5: Challenges to Conducting Evaluations

12:30 - 1:30
  All days: LUNCH

1:30 - 3:00
  Day 1: Evaluability Assessment
  Day 2: Developing Evaluation Questions: Key Considerations
  Day 3: Answering Evaluation Questions using Quantitative and Qualitative Methods
  Day 4: Working Session: Data Analysis
  Day 5: Writing an Evaluation Report

3:00 - 3:15
  All days: BREAK

3:15 - 4:30
  Day 1: Working Session: Evaluability Assessment
  Day 2: Working Session: Evaluation Matrix
  Day 3: Working Session: Answering Evaluation Questions using Quantitative and Qualitative Methods
  Day 4: Working Session: Data Analysis
  Day 5: Workshop Post-test; Workshop Evaluation & Wrap Up

Workshop Objectives

By the end of this workshop, participants will be able to:

• Understand the terms, theory and practice of evaluation;

• Understand when and how to use the different evaluation tools;

• Identify the strengths and weaknesses of evaluation approaches; and

• Solicit and manage an evaluation process.



1 Why Evaluate?

Learning Objectives:

By the end of this session participants will be able to:

• Review core M&E concepts

• Understand the role of indicators in evaluation

• Understand the scope and intention of evaluation activities

What is Monitoring?

o  A continuous function that uses the systematic collection of data on specified indicators to document the extent of progress towards the realization of intended programme or project outcomes

o  Monitoring is the routine process of data collection to measure progress toward programme objectives

Monitoring activities provide answers to the following questions:

1. Is the programme achieving its goals?
2. Is the programme being implemented as intended?
3. What factors are facilitating/hindering success?
4. What are the unintended outcomes?
5. What are the lessons learned up to this point?
6. Are stakeholders' priorities being addressed?

What is Evaluation?

•  Evaluation is the determination of the value of a project, programme or policy

•  "Evaluation should be seen as a process of knowledge production which rests on the use of rigorous empirical inquiry. The evaluator must make a series of interrelated decisions in order to make a judgment of worth" (Owen 2007)

Types of Evaluation

•  Summative evaluation
   Occurs upon the completion of the programme to determine the extent to which results have been achieved. Guides decisions regarding the adoption, expansion or continuation of a programme

•  Other types of evaluation

   Formative evaluation
   Examines the ways in which a programme, policy, or project is implemented. This type of evaluation is conducted during the implementation phase of a project or programme. Formative evaluations are sometimes called process evaluations, because they focus on operations.

   Prospective evaluation
   Assesses the likely outcomes of proposed projects, programmes or policies. Also known as an evaluability assessment, it answers the question "is this programme or project worth evaluating?"

Monitoring data feeds evaluation activities

Monitoring Questions
•  How many condoms were distributed?
•  How many tests were done?
•  How many prevention workshops were conducted for secondary school students?

Evaluation Questions
•  What is the impact of the condom distribution programme on the number of persons testing positive for sexually transmitted diseases?

•  Have the prevention workshops helped to promote safe sexual practices among secondary school students?
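To make the distinction concrete, the short sketch below (written in Python, with invented record layouts and figures) contrasts a routine monitoring count with the kind of before-and-after comparison that an evaluation question demands.

```python
# Illustrative sketch only: the record layouts and numbers below are invented
# to show how monitoring counts differ from an evaluation comparison.

# Routine monitoring records: one row per distribution event
distribution_log = [
    {"quarter": "Q1", "condoms_distributed": 1200},
    {"quarter": "Q2", "condoms_distributed": 1550},
]

# Monitoring question: "How many condoms were distributed?"
total_distributed = sum(r["condoms_distributed"] for r in distribution_log)
print("Condoms distributed to date:", total_distributed)

# Evaluation question: "What is the effect of the programme on STI positivity?"
# This needs a comparison, e.g. positivity before vs. after the programme.
tests_before = {"tested": 400, "positive": 48}
tests_after = {"tested": 420, "positive": 29}

rate_before = tests_before["positive"] / tests_before["tested"]
rate_after = tests_after["positive"] / tests_after["tested"]
print(f"STI positivity before: {rate_before:.1%}, after: {rate_after:.1%}")
# An evaluation would go further and ask whether this change can be
# attributed to the programme (design, comparison group, etc.).
```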
 
 

INDICATORS  

Why Are Programme Indicators Important?

•  Used to describe the progress made towards meeting the programme objectives

•  Allow the identification of "kinks" in the programme implementation process

•  Monitor the progress of the programme

•  Provide the information/data that answer monitoring and evaluation questions

•  Provide clues, signs, and markers that tell us how close we are to our intended path

Programme Components

Level      Description                                                        Timeframe
Inputs     Resources that are put into the project. Lead to the               Throughout programme
           achievement of the outputs
Outputs    Activities or services that the project is providing.              E.g. quarterly, in the 5th
           Outputs lead to outcomes                                           month, in the 2nd year, etc.
Outcomes   Changes in behaviors or skills as a result of the intervention.    Short, medium, long-term
           Outcomes are anticipated to lead to impacts
Impacts    Measurable changes in health status, e.g. reduced STI/HIV          Related to long-term outcomes
           prevalence. Impacts are the effect of several interventions
           and are achieved over time

Results-Based Chain



Types of Indicators

•  Indicators are used at all levels of the programme implementation process

   -  Monitoring indicators (input, process and output)
      e.g. # of staff members trained in PMTCT during the first 6 months of the programme

   -  Outcome indicators
      e.g. % of schools that provided life skills-based HIV education in the last academic year (UNGASS)

   -  Impact indicators
      e.g. % of most-at-risk populations who are HIV-infected (UNGASS)
 

 
Types of Indicators

Input
Financial, human, material, and technical resources

Example:
o  Total budget for activities in 2009
o  # of staff members required to execute activities
o  # of posters, TV slots required to promote programme activities over the next 4 months
 

Types of Indicators

Output
Activities completed

Example:
o  Training material distributed
o  Number of sensitization sessions conducted on diabetes and hypertension in the past 6 months

Types of Indicators

Outcome
o  Improvements targeted by the end of the project
o  Consequence of activities
o  Used for strategic and programmatic reporting to national authorities and donors

Example:
% of young women aged 15-24 who never had sex

Impact
o  The longer term improvements you are aiming at
o  May be needed for national or global reporting

Example:
% of HIV-infected infants born to HIV-infected mothers
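To show how an indicator value of this kind is actually derived from raw data, here is a minimal Python sketch; the field names and records are invented for illustration and are not part of any official indicator definition.

```python
# Hypothetical survey records; the fields and values are invented purely to
# illustrate how an outcome indicator value is computed from raw data.
respondents = [
    {"sex": "F", "age": 17, "ever_had_sex": False},
    {"sex": "F", "age": 22, "ever_had_sex": True},
    {"sex": "F", "age": 19, "ever_had_sex": False},
    {"sex": "M", "age": 20, "ever_had_sex": True},
]

# Outcome indicator: % of young women aged 15-24 who never had sex
denominator = [r for r in respondents if r["sex"] == "F" and 15 <= r["age"] <= 24]
numerator = [r for r in denominator if not r["ever_had_sex"]]

value = 100 * len(numerator) / len(denominator)
print(f"Indicator value: {value:.1f}% (n={len(denominator)})")
```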
   

Role of Indicators in Evaluation: THE EVALUATION MATRIX

Evaluation Objective: To assess the effect of a programme aimed at encouraging safe sexual practices among persons 15-49 in the community of St. Pierre
Type of Evaluation: Impact (Cause & Effect Evaluation Question)

The matrix has one row per evaluation question, with the following columns (guiding question in parentheses):
•  Programme/Project Objective (What is the purpose of the programme or project?)
•  Evaluation Question(s) (What do you want to know?)
•  Information Required (What do you need to answer the question?)
•  Indicator (How will you measure it?)
•  Information Source(s) (Where are you going to get it?)
•  Data Collection Methods (How are you going to get it?)
•  Data Analysis Methods (What will you do with it once you get it?)
•  Limitations (What can't you do? Caveats)

Example indicator: Percent of women and men aged 15-49 who had sex with more than one partner in the last 12 months
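For participants who like to keep their planning notes in electronic form, a single row of the matrix can also be sketched as a simple data structure, for example in Python. The entries below are illustrative assumptions only, not prescribed content.

```python
# One illustrative row of the evaluation matrix captured as a Python dict.
# The field names and example text are assumptions for illustration.
matrix_row = {
    "programme_objective": "Encourage safe sexual practices among persons 15-49 in St. Pierre",
    "evaluation_question": "Did the programme reduce risky sexual behaviour?",
    "information_required": "Sexual behaviour of residents before and after the programme",
    "indicator": "Percent of women and men aged 15-49 who had sex with more than "
                 "one partner in the last 12 months",
    "information_sources": ["Behavioural surveillance survey"],
    "data_collection_methods": ["Household questionnaire"],
    "data_analysis_methods": ["Compare baseline and endline percentages"],
    "limitations": "Self-reported behaviour; no control community",
}

for column, entry in matrix_row.items():
    print(f"{column}: {entry}")
```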

COMMON CONCERNS ABOUT EVALUATION

•  Concern #1: Evaluation diverts resources away from the programme and therefore harms participants

•  Concern #2: Evaluation increases the burden for programme staff

•  Concern #3: Evaluation is too complicated

•  Concern #4: Evaluation may produce negative results and lead to information that will make the programme look bad or lose funding

•  Concern #5: Evaluation is just another form of monitoring

•  Concern #6: Evaluation requires setting performance standards, and this is too difficult

Evalua&ons  help  to:  


•  Generate  knowledge  about  your  programme  
or  project’s  
–  Goals  
–  Recipients  
–  Personnel  
–  Physical  Resources  
–  Outputs  
–  Outcomes  

Evalua&ons  help  to:  


 
•   Generate  knowledge  about  your  processes  
–  Data  sources  
–  Data  collec&on  methods  
–  Data  collec&on  tools  
–  Methods  of  analyzing  and  interpre&ng  the  data  
you  collect  

Evalua&ons  help  to:    


•   Make  a  determina&on  of  worth  
–  Con&nue  as  is  
–  Con&nue  with  revisions  
–  Stop  

•  Make  the  link  between  the  intended  goals  of  


the  programme,  the  priori&es  of  stakeholders,  
and  the  actual  outputs  and  outcomes  of  the  
programme  

Why Evaluate Your Programme?

•  To determine the effectiveness of programmes
   -  Were the intended outcomes achieved for the intended users?
•  To strengthen financial responses and accountability
•  To promote a culture of learning which is focused on service improvement through evidence-based practices
•  To promote replication of successful interventions (using evidence-based practices)
•  To determine the impact of programmes by reporting on the intended as well as unintended outcomes

We evaluate to…
•  Ascertain if you are doing the right things, and doing them right
   Ten Steps to a Results-Based M&E System, Kusek and Rist, 2004
•  Take stock of where you are
   Empowerment evaluation principles in practice, Fetterman & Wandersman, 2005
•  Distinguish what works from what doesn't
   Utilization-focused evaluation, Patton, 2008

GUIDELINES FOR CONDUCTING A SUCCESSFUL EVALUATION

•  Invest heavily in planning

•  Integrate the evaluation into ongoing activities of the programme

•  Participate in the evaluation and show programme staff that you think it is important

•  Involve as many of the programme staff as much as possible and as early as possible

•  Be realistic about the burden on you and your staff

•  Be aware of the ethical and cultural issues in an evaluation

GROUP WORK:
Creating an Evaluation Case Study

•  Think about a project / programme that you are/were involved in

•  Please refer to the handout as you work to complete the following template for an evaluation case study

Evaluation Case Study

Title         What is the title of your case?

Introduction  Brief statement to serve as a 'lead in' to your case.
              Area you are working in.
              Why is this an issue?

Background    Information on the context (length of time the project has been implemented, history of the programme, etc.)

Description   Full description of the programme
              Participants, target audience
              Programme effort
              Problem being faced

Conclusion    Statements that conclude the case description

2 Evaluability
Assessment

Learning Objectives:

By the end of this session, participants will be able to:

• Know what an evaluability assessment (EA) is

• Know the goals and benefits of an EA

• Be familiar with the process to conduct an EA



Evalua&on  Assessment  –  What  is  it?  


A  brief,  preliminary  study  to:  

•  serve  as  front-­‐end  planning  for  the  eventual  evalua&on  

•  determine  whether  a  programme  is  ready  to  be  evaluated    

•  see  if  an  evalua&on  would  be  useful  and  feasible  

•  help  scope  the  evalua&on  

•  determine  possible  op&ons  for  conduc&ng  the  evalua&on  

Why is an Evaluability Assessment important?
Direct Benefits

•  Helps clarify goals and objectives as well as the theory of change for the programme

•  Identifies data sources

•  By extension, identifies data gaps and needs for data collection

•  Clarifies information needs of key stakeholders

•  Helps focus the evaluation

Why is an Evaluability Assessment important?
Indirect Benefits

•  Programme managers can benefit through clearer articulation of programme goals and objectives

•  May actually re-define the purpose of the evaluation

•  Facilitates communication between evaluators and stakeholders

•  Increases likelihood that evaluation results would be used

Other Benefits
•  Help ensure evaluation resources are used judiciously
   -  To address needs of stakeholders
   -  To focus on aspects of the programme that are developmentally appropriate for evaluation
   -  To address questions that can, in fact, be answered given design feasibility and data availability

•  Improve programme design and implementation

[Flowchart: Is the intervention evaluable?]

1. Is the intervention promising? If yes:
2. Does the intervention have programme design integrity and realistic, achievable goals? If yes:
3. Is the intervention implemented as intended and at an appropriate developmental level? If yes:
4. To answer the questions: (1) Is there a feasible design? (2) Are data available or feasible to collect? If yes:
   Evaluable Intervention

Where the answer at a step is not yet "yes", the assessment is used to assist in improvement of programme design, implementation, and evaluation characteristics.

Caution
•  Can delay evaluation unnecessarily…
   … if applied to all programmes before evaluation
   … if the assessment process is too long

•  Senior officials could become impatient if the 'planning stage' is seen as too long
   -  could discredit the role and use of evaluation in the organization

Goals of an Evaluability Assessment

•  Develop an understanding of programme rationale and structure and its operating environment

•  Identify expected use(s) of the evaluation findings

•  Review previous studies (evaluation, research, internal audit, etc.)

•  Determine the programme-specific issues which could be examined in the evaluation

•  Determine, analyze and cost evaluation options

•  Recommend an appropriate evaluation approach

Overview of Assessment Process

•  May be more or less formal
   -  May involve a Terms of Reference

•  Should involve programme personnel

•  Should not be lengthy

•  Products of the Assessment:
   -  Assessment report
      •  may be limited to a short note for straightforward cases
      •  a major evaluation should have a substantial assessment
   -  Options and recommendation for evaluation
   -  Basis for TOR for eventual evaluation
   -  Updated profile of the programme

Key Components of an Assessment

•  Programme Description

•  Identification of Evaluation Issues and Questions

•  Evaluation Options

Describing the Programme

Working with stakeholders (including the intended users), develop and document:

•  Programme Profile
   -  Need, programme rationale, structure, governance, clients, etc.

•  Theory of change – Logic Model
   -  Expectations re: outputs and outcomes

•  Operating environment
   -  risks

 
Using Logic Models
•  Generates clear, concrete statements of programme goals/objectives and links programme activities to desired/expected outcomes
•  A tool to gain consensus on what a programme is expected to achieve, how best to measure 'performance', and what would be deemed 'success'
•  A basis for addressing:
   *  Are we measuring the right things?
   *  Do we have the right indicators?
   *  What are the key questions that need to be answered?

A 'Results Chain' Logic Model of a Programme: From Activities to Outcomes

[Diagram: Inputs → Activities → Outputs → Immediate Outcomes → Intermediate Outcomes → Final Outcomes (Impacts). Inputs, activities and outputs lie within the programme's control; outcomes are subject first to its direct influence and ultimately to indirect influence only.
Process: Are we doing things right?  Outcomes: Are we doing the right things?]
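A results chain can also be written down as a simple data structure, which some teams find handy when drafting a logic model. The sketch below uses Python and a hypothetical condom distribution programme; the example entries are assumptions for illustration, not part of the manual.

```python
# A results chain for a hypothetical condom distribution programme, written as
# a simple mapping from level to example entries (all entries are invented).
results_chain = {
    "inputs": ["budget", "outreach staff", "condom stock"],
    "activities": ["distribute condoms", "run prevention workshops"],
    "outputs": ["condoms distributed", "workshops held"],
    "immediate_outcomes": ["increased knowledge of safe sex"],
    "intermediate_outcomes": ["increased condom use"],
    "final_outcomes_impacts": ["reduced STI/HIV incidence"],
}

# Process questions ("Are we doing things right?") concern the left of the chain;
# outcome questions ("Are we doing the right things?") concern the right.
for level, examples in results_chain.items():
    print(f"{level}: {', '.join(examples)}")
```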

Identifying Evaluation Issues and Questions
•  Through consultation, document review and environmental scanning:
   -  identify relevant issues and questions for possible study, e.g.
      *  Management needs
      *  External requirements (e.g. funding, reporting)
      *  Other issues determined to be of value

•  Develop an Issues × Indicators Matrix

•  Align with data sources and methodologies
   -  important and useful since it clearly and concisely summarizes the research strategy for the future evaluation

Issues, Indicators & Methods

[Diagram: an Issue is broken down into Questions; each Question is measured by one or more Indicators; each Indicator is addressed by one or more Methods.]

Matching Issues, Indicators, Data Sources and Methods

Issue: Programme Rationale
Q1. Is there a need for the after-school tutoring programme?
   Indicators: length of waiting list; no. of referrals from teachers; teacher, school admin and parent opinion
   Standard (optional): waiting list longer than 2 months; more than 75% of teachers, parents and admin indicate strong need for programme
   Data Sources: admin data; environmental scan; teachers; school administrators
   Methods: admin data analysis; doc/lit review; survey of teachers; parent focus groups; key informant interviews with school admin

Issue: Programme Success
Q2. To what extent are students involved in the programme improving their skills?

Matrix of Issues, Questions and Methods

[Table: rows are the issues and their questions (Issue #1: Questions 1-3; Issue #2: Questions 4-5; Issue #3: etc.); columns are the methods (Key Informant Interview, Focus Groups, Client Survey, Document/File Review, Programme Records). An 'x' in a cell indicates that the method will be used to help answer that question.]

Identifying Options for Evaluation

•  Analysis would provide an understanding of what is required re fieldwork for a potential evaluation:
   *  data sources and data gaps
   *  time and cost implications

•  Articulate a small number of options, identifying
   *  evaluation issues and questions
   *  methodologies
   *  cost
   *  time requirements
   *  advantages/disadvantages, i.e. strengths and weaknesses of the option

•  Identify the recommended option along with a rationale

Evaluability Assessment Process

Finding a balance…

[Diagram: the evaluator balances programme considerations against scientific considerations.]

Working Session
Instructions:

•  Think about a project or programme that may be in need of an evaluation

•  Determine whether or not it is ready to be evaluated

Template
•  Programme Title
•  Programme Description
•  Documents Collected
•  Logic Model?
•  Data collection on intervention reality
•  Stakeholders Consulted
•  Goal Agreement
•  Logic Model Agreement
•  Produce Assessment Report
•  Feedback

3 Major Types
of Programme
Evaluation

Learning Objectives:

By the end of this session, participants will be able to:

• Understand that there are many factors that contribute to choosing an evaluation approach

• Be aware of the many different approaches to evaluations

• Understand the key elements of the most common evaluations that they will face (needs assessment, process and outcome evaluation)

• Practice choosing an evaluation approach and justifying it



Types of Evaluation

•  Three broad types of evaluation:
   -  Formative: evaluation of ways in which a programme, project or policy is being implemented (process, mid-term evaluation)
   -  Summative: evaluation conducted at the end of a programme or project to determine the extent to which it achieved desired results (outcome, impact evaluations)
   -  Prospective: evaluation of the likely outcomes of a programme, project or policy (similar to an evaluability assessment)

Purposes for Evaluation

•  Ethical: reporting, accountability

•  Managerial: decisions on allocation of resources, improve management

•  Decisional: continue, terminate, reshape the programme

•  Educational and Motivational: help people understand and identify with the programme

Evaluation Uses
•  Evaluation can provide information on:

   -  Strategy: Are the right things being done at the right time?

   -  Operation: Are things being done the right way?

   -  Learning: Are there better ways to be doing things?

Evaluation Spectrum

[Diagram: evaluations sit on a spectrum ranging from learning to accountability.]

Evaluation and Evaluators

•  Credibility is important and will determine the type of evaluation and evaluator – independent experts applying rigorous methodologies are often seen as the gold standard...

•  But evaluations can be:
   -  Internal: Conducted by people from within the organisation
   -  External: Conducted by people outside the organisation
   -  Participatory: Conducted by stakeholders

Evaluate!
•  The most important thing, though, is to evaluate!
•  It is better to do an average evaluation than not to evaluate at all
•  The focus should be on utility, relevance and practicality – aiming for rigour and validity where possible (not the reverse)
•  Successful interventions depend on regular feedback and adjustments

Evaluation Design Influenced by...

•  The reason for the evaluation
•  The evaluation questions posed
•  The subject/s being evaluated
•  Your target audience for the evaluation results
•  Time
•  Budget
•  Geography
•  Importance and scale of the programme

Evaluation Approaches
•  Goal-based evaluation: Evaluation that measures the extent to which a programme reaches clear and specific objectives
•  Goal-free evaluation: Evaluation in which the evaluators ignore the rhetoric of the programme and base the evaluation purely on the degree to which the programme meets participants' needs
•  Multi-site evaluation: Evaluation that examines interventions at different locations
•  Cluster evaluation: Evaluation that looks at similar or related interventions

Evaluation Approaches
•  Social Assessments: Look at social structures, processes and changes within a community or group
•  Environmental Assessments: Explore the effects of the programme on the environment
•  Participatory Evaluation: Evaluations in which responsibilities for planning, implementing and reporting are shared with stakeholders, who may help define the evaluation questions, collect and analyse data, and draft and review the report
•  Outcome mapping: Mapping of behaviour change (at the outcome level – not impact level)

Evaluation Approaches
•  Rapid Assessment: Systematic semi-structured approach used in the field, typically by a team of evaluators (seen as a compromise between speed and rigour)
•  Evaluation Synthesis: Approach in which an evaluator looks across interventions addressing a similar issue or theme to determine causality
•  Meta-evaluation: Expert review of one or more evaluations against professional quality standards to determine the credibility of the conclusions (evaluation of evaluation)

Goal-Based Evaluations
•  Goal-based evaluations are the most common type of evaluation for programmes
•  Goal-based evaluations look at whether the programme is achieving its stated goals and/or objectives
•  They are closely tied to results-based M&E

Common Programme Evaluation Concerns

•  Relevance: Is this the right programme at the right time?

•  Efficiency: Is the programme making the most of the available resources?

•  Effectiveness: Is the programme achieving its objectives?

•  Impact: What are the long term effects of the programme going to be?

•  Sustainability: Are the programme outcomes likely to continue without further input and assistance?

Programme Evaluation can...

•  Verify or increase the impact of activities on clients
•  Help find out if your service is actually helping its clients (leave out the guesswork!)
•  Improve delivery mechanisms to ensure that you are more efficient and less costly
•  Verify that you are doing what you think you are doing
•  Produce strategic data for future decision making
•  Produce valid comparisons between programmes to decide which should be retained
•  Fully describe effective programmes for replication elsewhere or scale up

Evaluation Process

Define the evaluation question and parameters → Articulate how the evaluation will be conducted → Collect, review (clean) and analyze data → Report findings and recommendations → Use findings

Planning the Evaluation

•  Preparation is key to an effective evaluation!
•  Good planning ensures focus and a useable outcome, cuts down on costs and time wasting
•  Consider the following:
   -  From what sources should the info be collected (employees, clients, documents)?
   -  How can the info be collected in a reasonable fashion (questionnaire, interview, focus groups, document review)?
   -  When is the info needed?
   -  What resources are available to collect the info?

Programme Evaluation Types

Area of Programme Interest    Type of Evaluation
Relevance                     needs assessment
Efficiency                    process evaluation
Effectiveness                 mid-term evaluation, outcome evaluation
Impact                        impact evaluation
Sustainability                sustainability assessment (environmental, social)

Needs Assessments
•  What need is there for a programme?
•  Defining 'need' is essentially a political process
   -  "Discrepancy between what is and what should be" (Posavac & Carey, 1992)
   -  "Actual state vs. a) ideal, b) desired, c) expected, d) norm, e) minimum" (Scriven & Roth, 1990)
•  Defining conditions vs. problems
•  Role of politics, public policy, culture, society

Role of Evaluators in Needs Assessments

•  Apply research techniques to measure conditions

•  Assist major stakeholders in precisely defining the "problem" in a manner that they can agree on
•  Identify specific needs associated with the defined problem
•  Assist in development of appropriate interventions to address needs
•  Determine the extent to which a programme is addressing needs

Needs Assessments

[Diagram relating the concepts of Condition, Problem, Need and Programme.]

Timing of Needs Assessments

•  Assessment of need can be conducted at various times during the lifecycle of a programme…
   -  Very important prior to programme development
   -  Role of pilot programmes/projects
   -  Should be revisited during programme implementation
   -  Target groups
   -  Appropriateness of intervention
   -  Also has a role when assessing programme impacts and efficiency

Process Evaluations
•  Process-based evaluations are geared to fully understand how a programme works
   -  how does it produce the results that it does?
   -  is it structured correctly to achieve the results it aims to produce?
•  Useful if programmes are long standing and have changed over the years (warranted if clients are dissatisfied)
•  Demonstrates how a programme truly operates (useful in portraying the programme to outside partners)
•  There are numerous questions that could be addressed in a process evaluation. Questions are selected by carefully considering what is important to know about the programme.

Process Evaluation Questions

•  On what basis do employees/clients decide that services are needed?
•  What is required of employees in order to deliver the programme?
•  How are employees trained on delivery of the programme?
•  How are clients recruited?
•  What is required of clients?
•  How do employees select which products or services will be provided to the client?

Process Evaluation Questions

•  What is the general process for clients who access the services?
•  What do clients consider to be strengths of the programme?
•  What do staff consider to be strengths of the programme?
•  What typical complaints are heard from employees and clients?
•  What do employees and customers recommend to improve the programme?
•  On what basis do managers decide that the product is no longer needed?
•  Who are the unintended clients?

Outcome Evaluations
•  Evaluate the extent to which programmes are meeting predetermined goals or objectives
•  This type of evaluation is increasingly important (for both accountability and learning)
•  Key question: Is your organization really doing the right programme activities to bring about the outcomes it expects?

Steps to Develop an Outcome Evaluation
1.  Identify the major outcomes you want to examine or verify for the programme under evaluation
2.  Choose the outcomes that you want to prioritize
3.  For each outcome, specify the observable measures or indicators
4.  Specify the target goals and intended clients
5.  Identify what information is required for each indicator
6.  Decide how data will be collected (realistically and efficiently)
7.  Analyze and report the findings
8.  Use the findings!

Outcome Evaluation Questions

•  Did the programme have the right goals and objectives?
•  How should priorities be changed to put more focus on achieving goals?
•  How should timelines/budgets be adjusted to enable accomplishment of objectives?
•  How should goals be established in the future?
•  What can be learnt for future/other programmes?

Engaging Stakeholders
•  Membership on an Evaluation Advisory or Steering Committee
•  Through client feedback as part of the methodology – surveys, focus groups, key informant interviews, etc.
•  Participatory evaluation methods built into the study design

Advantages of using Stakeholders

•  Provide first-hand knowledge of how programmes actually operate – useful in shaping questions
•  Useful in helping to interpret findings
•  Useful in helping to develop reasonable recommendations
•  Increases likelihood of follow-through on the implementation of recommendations

Disadvantages of using Stakeholders

•  Risk that objectivity is lost
•  Perception by decision-makers that objectivity is tainted
•  Potential for domination and misuse by some stakeholders
•  Increase in overall time required due to increased consultation and stakeholder involvement

Notes
•  There is no "perfect" evaluation design
•  Work hard to include some interviews in your evaluation methods. The story is usually the most powerful depiction of the benefits of your services.
•  Don't interview just the successes. You'll learn a great deal about the programme by understanding its failures, dropouts, etc. Look at the unintended…
•  Don't throw away evaluation data once a report has been generated. Data can provide precious information later when trying to understand changes in the programme.

Session Summary
•  It is more important to evaluate than to worry about doing it right
•  There are many different types of evaluation
•  The evaluation approach is determined by:
   -  What is being evaluated
   -  The purpose of the evaluation
   -  How the evaluation results will be used
•  Who does the evaluation will be determined by the evaluation approach and whether there is a need for independence

Session Summary
•  Common evaluations:
   -  Needs Assessment – explores the need for a programme
   -  Process Evaluation – explores the implementation of a programme
   -  Outcome Evaluation – explores whether a programme is achieving its goals and objectives
•  Proving that a programme worked (Outcome Evaluation) has become more and more important with increased competition for limited resources

Acknowledgements
The Road to Results: Designing and Conducting Effective Development Evaluations
By Linda G. Morra-Imas and Ray C. Rist
Commissioned by the World Bank

Working Session
•  Scenario 1: How would you evaluate?
•  A five-year 'Condom Promotion Programme' that uses the media to promote condom usage coupled with the free distribution of condoms. The programme is in its 2nd year and management want to know if their strategies are working.

Working Session
•  Scenario 2: How would you evaluate?
•  The Ministry of Health has completed a ten-year programme focused on child obesity and now is seeking to understand if the programme has made a difference.

Working Session
•  Scenario 3: How would you evaluate?
•  The Government is considering merging HIV and chronic disease programmes with a view to improving treatment and lowering costs. The Minister has asked for more information with which to make this decision.

4 Developing
Evaluation Questions:
Key Considerations

Learning Objectives:

By the end of this session, participants will be able to:

• Understand the importance of evaluation questions in planning an evaluation

• Think through the process of establishing evaluation objectives

• Learn to develop evaluation questions



Focus of Evaluation Questions

•  Problem or need

•  Intended participants

•  Strategy or implementation

•  Intended outcomes

•  Programme impact

Evaluation Questions
•  Broad categories (often with sub-categories)
   -  Need for the programme
   -  Programme Theory
   -  Programme Process or Implementation
   -  Programme Impacts – sometimes referred to as Programme Success, especially when not a complete impact assessment
   -  Programme Efficiency – includes cost-effectiveness

[Diagram: evaluation question categories mapped to types of evaluation. Need for the programme → needs assessments and programme development; programme theory and programme process → formative evaluations; programme impacts and programme efficiency → summative evaluations.]

An Important Starting Point –
Understanding Programme Theory & Context
•  What is the 'problem' that initiated this programme?

•  Clarifying programme objectives. What is the programme intended to achieve (results)?

•  Clients – who are the intended clients of the programme (primary, secondary)? Other stakeholders?

•  NOTE: Programme origins or theory not always well documented – need for senior-level consultations

Examples of Evaluation Questions

Problem or Need

•  Evaluation Objective Focused on: PROBLEM/NEED
   e.g.: In the last six months were more services for PLHIV implemented at community health centres?

•  Perceived problem: insufficient services

Evaluation Question:
Has the service provision for PLHIV clients improved?

Intended Participants
•  Evaluation Objective Focused on: MEETING NEEDS OF INTENDED CLIENTS
   e.g.: How many trained staff members are working at the clinic?

•  Context (need for adequately staffed clinics to provide service to clients)

Evaluation Question:
Does the programme have adequate, appropriately trained staff to meet its intended objective?

Strategy or Implementation

•  Evaluation Objective Focused on: STRATEGY/IMPLEMENTATION
   e.g.: Are our staff members capable of providing effective VCT services?

•  Context (any implementation has to consider human capacity)

Evaluation Question:
Are the VCT services being delivered as intended?

Intended Outcomes
•  Evaluation Objective Focused on: INTENDED OUTCOMES
   e.g.: Is the Night Health Centre being utilized by PLHIV?

•  Context (ascertaining the impact of a programme to engage loss-to-follow-up clients by providing night clinics)

Evaluation Question:
Is there an increase in the number and variety of PLHIV accessing services in the health sector?

•  Evaluation Objective Focused on: IMPACT
   e.g.: Is there greater community empathy for PLHIV?

•  Context (establishing policy to address stigma and discrimination toward PLHIV)

Evaluation Question:
What policies are in place to reduce Stigma & Discrimination against PLHIV?

Clarifying Evaluation Objectives

Questions to ask at the outset:
•  What is the 'purpose' and what are the priorities of the evaluation?

•  Who is asking for the evaluation?

•  What questions need to be answered?

•  What types of decisions or judgments need to be made?

•  By when?

•  For what purpose?

Types of Evaluation Questions

•  Descriptive

•  Normative

•  Cause & Effect

Type of Evaluation Questions – Descriptive

Descriptive questions:
•  Seek to determine what is.

•  May describe aspects of a process, a condition, a set of views, or a set of organizational relationships or networks.

•  Patton (2002) refers to descriptive questions as the foundation of evaluations.

Type of Evaluation Questions – Descriptive

•  seek to understand or describe a programme or process; provide a "snapshot" of what is

•  are straightforward (who, what, where, when, how, how many)

•  can be used to describe inputs, activities, and outputs

•  are frequently used to gather opinions from programme clients.

Type of Evaluation Questions – Descriptive

Examples:

•  What are the goals of the programme from the perspectives of different stakeholders?
•  What are the primary activities of the programme?
•  How do people get into the programme?
•  Where has the programme been implemented?
•  What services will the programme provide to men?

Type of Evaluation Questions – Normative

Normative questions:
Compare what is with what should be
•  Compare the current situation with a specified target, goal, or benchmark.
•  Similar to those often asked in performance auditing.
•  Ask the following:
   •  Are we doing what we are supposed to be doing?
   •  Are we hitting our target?
   •  Did we accomplish what we said we would accomplish?

Type of Evaluation Questions: Cause & Effect

Cause & Effect questions:

•  Determine what difference the intervention makes

•  Often referred to as outcome, impact, or attribution questions

•  Attempt to measure what has changed because of the intervention

Type of Evaluation Questions: Cause & Effect

Cause & Effect questions:

•  Seek to determine the effects of a project, programme, or policy. They are the "so what" questions

•  Ask whether the desired results have been achieved as a result of the programme

 
Type of Evaluation Questions: Cause & Effect
Examples:
•  As a result of the training programme, do participants have higher paying jobs than they otherwise would have?
•  Did the microenterprise programme reduce the poverty rate in the community in which they operated?
•  Did the increased tax on gasoline improve air quality?
•  What other impacts or side effects (positive or negative) did this intervention have on the wider community?

Summary: Types of Evaluation Questions

Descriptive Questions
  Definition: Evaluation questions that describe aspects of a process, a condition, a set of views, or a set of organizational relationships or networks. Describe inputs, activities, and outputs.
  Examples: What are the primary activities of the programme? How do people get into the programme? Where has the programme been implemented? What services does the programme provide to men?

Normative Questions
  Definition: Questions that compare what is with what should be. They compare the current situation with a specified target, goal, or benchmark.
  Examples: Did we achieve the targets we set ourselves? Are practitioners adhering to agreed protocols?

Cause & Effect Questions
  Definition: Determine what difference the intervention makes. Often referred to as outcome, impact, or attribution questions.
  Example: As a result of the job training programme, do participants have higher paying jobs than they otherwise would have?

Determining the Appropriate Questions for an Evaluation
•  May have already been identified via an evaluation framework

•  Use two key methods:
   *  Review of programme documents
   *  Interviews with key informants – senior officials, programme manager, staff & (if appropriate/feasible) clients

Determining the Appropriate Questions for Evaluation
•  Review of programme documents
   *  search for documentation about programme objectives, theory and the overall context for the programme intervention

•  Interviews with senior officials & programme manager/staff to gather intelligence on the following:
   *  a description of the programme activities & process
   *  their expectations
   *  strengths & weaknesses of the programme & expected obstacles to programme implementation
   *  sources of information that could be used for the evaluation

Examples of types of problems in early implementation of a programme
•  Incomplete or inadequate intervention – programme not being used or implemented properly, i.e. as intended

•  Wrong intervention (e.g. unanticipated obstacles)

•  Non-standardized intervention – inequality across programme sites

Consideration of Stakeholder Interest in Framing the Evaluation
Key Stakeholders
•  Policy makers/sponsors
•  Programme developers
•  Programme administrators/managers
•  Programme implementers
•  Intended programme beneficiaries
•  Special interest groups
•  Others

Useful Information from Stakeholders in the Planning Stage
-  Perception of the programme
-  Programme purposes/goals
-  Programme theory
-  Concerns
-  Evaluation questions
-  Intended uses of evaluation
-  Other stakeholders and their stake

Other Considerations for Planning & Framing the Evaluation
•  Resource considerations: financial resources, human resources, expertise, time

•  Use multiple lines of evidence

•  Generally use both quantitative and qualitative methods

•  Specific methodology and evaluation design depend on the nature of the questions being asked, the resources available for the evaluation, & time constraints

•  Need to balance methodological rigour with practical realities of resource & time constraints

•  Role of the evaluator or evaluation team

•  Level of participation of stakeholders

•  Variety among stakeholders

Discussion  Ques.ons  
•  In  any  evalua.on  experience  that  you  may  have  
had  or  seen,  has  the  eventual  evalua.on  
generally:  

•  Met  the  client’s  informa.on  needs?    

•  Addressed  the  issues  raised  by  the  client?  If  not,  


could  something  have  been  done  differently  at  
the  front-­‐end  when  scoping  the  evalua.on?  
What?  

29  
46 Advanced Monitoring and Evaluation Workshop Manual

GROUP WORKING SESSION: THE EVALUATION MATRIX

Using your Case Study from Day 1

Complete Columns 1 & 2 of the Evaluation Matrix

30  

THE EVALUATION MATRIX

Evaluation Objective: _______________________________

Type of Evaluation: _______________________________

Columns of the matrix: Evaluation Question(s) | Information Required | Information Source(s) | Data Collection Methods | Data Analysis Methods | Limitations | Conclusions

Column 1 – Evaluation Question(s): WHAT DO YOU WANT TO KNOW?
• Clear and specific
• Measurable
• Doable
• Key terms defined
• Scope
• Timeframe
• Population

Column 2 – Information Required: WHAT DO YOU NEED TO ANSWER THE QUESTION?
• Programme goals
• Evidence
• Programme criteria
• Participant rates
• Cost information
• Funding levels
Measuring Programme Outcomes 47

5 Measuring Programme
Outcomes

Learning Objectives:

By the end of this session, participants will be able to:

• Understand the key steps when planning an outcome


evaluation

• Appreciate the different outcome evaluation designs

• Understand the key steps when undertaking an outcome


evaluation
48 Advanced Monitoring and Evaluation Workshop Manual
Measuring Programme Outcomes 49

Got Outcomes?

Demand for Outcomes:
• Did the programme work?
• Accountability
• Were the programme objectives achieved?
• Are scarce resources being used most efficiently and effectively?

Outcome evaluation seeks to determine whether the programme was…

Effective Programme Effort ("Hit the Mark")  OR  Ineffective Programme Effort ("Did not hit the Mark")

Effective programmes tend to be:
• Well-designed and conceptually sound
• Well-implemented with fidelity

Outcome Indicators
• Effective programmes also establish well-conceived indicators that directly measure programme performance at ALL levels (activity, output, outcome and impact)
• Good indicators and objectives are Specific, Measurable, Appropriate, Realistic and Time-based (SMART)
• Data collection for outcomes (objectives) and goals (impact) should occur in parallel with the programme activities
50 Advanced Monitoring and Evaluation Workshop Manual

Programme Lifecycle

[Cycle diagram: Programme Development (Formative Evaluation) → Programme Implementation (Process Evaluation) → Programme Effect (Outcome Evaluation)]

Outcome evaluation may not be warranted for every programme at any time!

Outcome  evalua4ons  are  


resource  intensive  
• Time  
• Money  
• Exper4se  
 
Some  programmes  may  be  
too  low  dose    for  an  outcome  
evalua4on  
 

Outcome Evaluation Decision Making

Programme type and the corresponding importance of an outcome evaluation:
• Low dose/intensity → less important
• Evidence-based: replicated or adapted → in between
• New, untested or reinvented → more important
Measuring Programme Outcomes 51

Low  Dose/Intensity  Programmes  


•  Disseminate  informa4on  to   Disseminate  Informa4on  
increase  awareness  or  
knowledge  only  
•  Limited  exposure  to  
message  
•  Examples:  
–  Posters  
–  Brochures  
–  Health  fairs  
–  Fact  sheets  
–  Magnets/key  chains  
–  Informa4on  sessions   Increase  awareness  and/or    
knowledge  

Outcome  evalua4ons  may  be  less  


important  for  low  dose  programmes  
•  Increasing  awareness  or  knowledge  (by  itself)…  
–  May  be  an  important  founda4onal  ac4vity  or  one  
ac4vity  of  a  larger  strategy    
–  BUT  research  indicates  it  does  not  change  behavior  or  
socio-­‐environmental  condi4ons  ALONE  
•  Therefore….  
–  Are  the  resources  required  for  an  outcome  evalua4on  
jus4fied?  
–  Might  it  be  more  prudent  to  save  scarce  resources  for  
other  programmes  or  evalua4ons?  

Evidence-­‐Based  Programmes  
   Numerous  terms,     Funding  agencies  are  pushing  for  
criteria,  and  evidence   the  use  of  evidence-­‐based  
used  to  iden4fy   programmes  as  a  mechanism  for  
“evidence-­‐based”   ensuring  that  organiza4ons  are  
programmes     implemen4ng  programmes  that  
–  Best  prac4ces   HIT  THE  MARK!  
–  Model  programme  
–  Effec4ve  programme   Rigorous,  scien4fic  outcome  
–  Science-­‐based     evalua4ons  have  
–  Promising  programme   determined  a  programme  
–  Guidelines   to  be  effec4ve  and,  thus,  
  recommended  for  
widespread  adop4on  
 
 
52 Advanced Monitoring and Evaluation Workshop Manual

Ques4on  

 
 If  a  programme  has  already  been  proven  
effec4ve,  then  is  it  necessary  to  conduct  
an  outcome  evalua4on  when  adopted?  

It  depends…  
1.  Posi4ve  results  from  evidence-­‐based  
programmes  tested  in  “ideal  seangs”  
may  not  be  replicated  when  adopted  in  
“real  world”  seangs*  

2.  Adop4on  of  evidence-­‐based  


programme  varies  greatly*  

Ways evidence-based programmes are adopted*

Replication
– Reproducing a programme with complete fidelity to protocol and delivered to a similar population as in the efficacy trial

Adaptation
– Tailoring a programme to meet the needs of different populations or delivery channels
– Core elements remain the same

Re-invention
– Adding or removing core elements

*Taken from Collins, C. (2006) Evaluating interventions that have already been determined to be efficacious. CDC/AEA Summer Evaluation Institute
Measuring Programme Outcomes 53

New,  Untested  or  Re-­‐invented  Programmes  

Outcome  evalua4ons  are  ALWAYS  


warranted  for  new,  untested  or    
re-­‐invented  programmes  

Logic of Outcome Evaluation

Outcome evaluation is all about attempting to determine whether a programme caused an effect in an intended outcome

Cause (Programme) → Effect (Outcome)

How do we infer that when a hand "flips a switch" it turns on a light?
 
54 Advanced Monitoring and Evaluation Workshop Manual

How do we infer a cause and effect relationship?

Criterion 1: Is there evidence that the cause preceded the effect?
YES: The hand moves before the light turns on.

Criterion 2: Is there evidence that the cause changed the effect?
YES: When the hand moves, the light turns on too.

Criterion 3: Is there evidence that any other factor (e.g., a confounder) caused the effect?
NO: We do not observe anything else turning on the light. But…
 

Confounding Factors

Cause: school-based conflict resolution programme → Effect: decreased violent events from baseline to follow-up

Is the observed association causal, or is it due to the effect of another (confounding) factor?

Possible confounders:
– Greater police enforcement
– Youth violence getting attention in the media
Measuring Programme Outcomes 55

General Outcome Evaluation Designs

Single Group Pre-Post Test: one group receives the programme. Compares the outcome before and after the programme.

Quasi-Experimental: one group receives the programme and another group serves as a comparison. Compares the outcome between programme and comparison groups.

Randomized Control Trial: participants are randomly allocated to one of two groups; one group receives the programme and the other group serves as a control. Compares the outcome between programme and control groups.

Designs and Criteria for Inferring Causality

Evidence for inferring causality | Single Group Pre-Post Test | Quasi-Experimental | Randomized Control Trial

Did the programme come before the outcome? | YES (baseline) | YES (baseline) | YES (baseline)

Did the outcome change in the expected direction? | YES (pre vs. post) | Stronger (experiment vs. control group) | Stronger (experiment vs. control group)

Was something besides the programme (e.g., a confounder) responsible for the outcome? | Weaker (no control group) | Stronger (non-random comparison group) | Strongest (randomization to control group)

A Single Group Pre-Post Evaluation
56 Advanced Monitoring and Evaluation Workshop Manual

A Single Group Multiple Year Evaluation

A Comparison Group Multiple Year Evaluation

Why  comparison  groups  provide  greater  


evidence  for  inferring  causality  

•  Helps  to  rule  out  alterna4ve  explana4ons  (e.g.,  


confounders)  for  changes  found  in  outcome    
•  groups  assumed  to  be  similar  except  for  exposure  
to  programme  

•  If  posi4ve  change  found  when  comparison  


groups  are  aken  into  account,  then  greater  
evidence  that  the  programme  was  responsible  
for  that  change  
Measuring Programme Outcomes 57

Stronger evaluation designs tend to require greater resources

[Chart: resources required (fewer → greater) plotted against evidence for inferring causality (lower → higher); the Single Group Pre-Post design requires the fewest resources, Quasi-Experimental more, and the Randomized Control Trial the most]

Key  steps  for  planning  an  outcome  evalua4on  

1.  Assess  readiness  (Evaluability  Assessment)  

2.  Gather  needed  resources  

3.  Array  possible  outcomes  by  developing  logic  


model  (or  Evalua4on  Matrix)  

4.  Select  outcomes  to  evaluate      

Assess  Readiness    
•  Conduct  an  Evaluability  Assessment    
•  Is  the  programme  well-­‐designed?    
•  Is  there  evidence  the  programme  was  implemented  
as  planned?    
•  Is  there  a  plan  on  how  the  results  can  be  used?      
•  Does  your  department  have  the  resources?  
–  Commitment  
–  Person  power  
–  Exper4se  
58 Advanced Monitoring and Evaluation Workshop Manual

Gather  resources  
1.  Obtain  commitment  from  higher  levels  
2.  Assign  evalua4on  coordinator  
–  Should  not  be  person  responsible  for  planning/implemen4ng  the  
programme  
3.  Convene  stakeholder  evalua4on  team  
–  Should  include  (at  min)  higher  level  administrator,  programme  
director,  programme  delivery  staff,  and  evalua4on  coordinator    
4.  Gain  access  to  the  following  exper4se  
–  Experienced  evalua4on  expert  
–  Topic  experts/professionals  
–  Logic  model  developer  
–  Data  collector/s  (for  whatever  outcomes  selected)  
–  Data  programmer/sta4s4cian  

Establish  the  Programme  Logic  


•  Review  exis4ng  programme  documents:  
–  Has    a  logic  model  been  ar4culated?  If  not,  is  
there  a  clear  logic  expressed  in  the  indicators  or  
the  narra4ve?    
–  Are  the  outcome  indicators  well  conceived  and  
directly  related  to  the  programme’s  stated  
objec4ves?  
•  In  some  cases  you  may  need  to  establish  a  
logic  model  and  ar4culate  programme  
objec4ves  

Logic model elements

Inputs
– Resources: money, staff, or facilities available to implement activities
– Strategy/Activities: what the programme does with resources aimed at influencing outcomes

Outputs
– Indicators of the quality and quantity of activities implemented

Outcomes
– Short-term: What should change immediately from the activity?
– Intermediate: What is influenced by the short-term and influences the long-term outcome?
– Long-term: What is the ultimate problem to be addressed by the programme?
Measuring Programme Outcomes 59

Outputs vs Outcomes

Outputs
• Assess programme implementation
• Assess the quantity and quality of programme activities implemented
• E.g. # attending a workshop, participant satisfaction

Outcomes
• Assess programme effectiveness
• Assess changes in individuals, groups, or environments during or after exposure to programme activities
• E.g. increased knowledge, reduced injury

Outcome Indicator Example

• Poorly written:
   – ART services improved
• Well written:
   – By the end of 2015, 150 health care workers will have the ability to deliver ART services according to national and/or international standards

Selec4on  of  Outcomes  to  evaluate  


• Not  all  outcomes  in  logic  model  must  be  
evaluated;    Select  outcomes  carefully  

• “Not  everything  that  counts  can  be  counted  


and  not  everything  that  can  be  counted  
counts”.   Albert Einstein
60 Advanced Monitoring and Evaluation Workshop Manual

Considera4ons  
•  Which  outcomes  in  the  logic  model  are  
important  to  stakeholders?  
 
•  Has  research  already  demonstrated  causal  
links?  
 
•  Are  comparison  groups  readily  available?    
 
•  Will  there  be  enough  “events”  to  “rule  out  
chance”  for  any  changes  found  in  outcome?    

Session  Summary  
•  The  programme  type  and  the  evalua4on  importance  
are  key  considera4ons  when  deciding  whether  to  
undertake  an  outcome  evalua4on  

•  Outcome  evalua4on  seeks  to  determine  whether  a  


programme  caused  an  effect  

Session  Summary  
•  Outcome  evalua4on  designs:  
–  Single  group  pre  and  post  test  
–  Quasi-­‐experimental    
–  Randomized  control  
•  Key  steps  in  planning  an  outcome  evalua4on:  
–  Assess  readiness  
–  Assemble  needed  resources  
–  Develop  a  logic  framework  
–  Select  the  outcomes  to  evaluate  
Measuring Programme Outcomes 61

Working Session
• In your groups, look back over the evaluation matrix that you developed earlier and outline how you would go about undertaking an evaluation for the programme you have selected.
   – What steps do you need to take to make the evaluation happen?
   – What are some of the key considerations for the evaluation design?
62 Advanced Monitoring and Evaluation Workshop Manual
Answering Evaluation Questions using Quantitative and Qualitative Methods 63

6 Answering Evaluation
Questions using
Quantitative and
Qualitative Methods
Learning Objectives:

By the end of this session participants will be able to:

• Identify the various methods available to answer evaluation


questions
• Understand when to use quantitative or qualitative methods
for evaluation
• Understand the challenges that each method presents and
develop creative solutions
• Select the most appropriate methods for answering
evaluation questions for your programme
64 Advanced Monitoring and Evaluation Workshop Manual
Answering Evaluation Questions using Quantitative and Qualitative Methods 65

Definition of Evaluation Research

• Research that is specifically intended for the purpose of undertaking assessments and evaluations, as distinct from conventional basic and other kinds of research.

• "It aims to produce information that has direct relevance to subsequent decisions about improvements to or the continuation of a particular action programme" (Hall and Hall 1996).

• Evaluation research requires the specification of goals and purpose and serves to enable the collection of data to answer evaluation questions.

3  

Why choose a particular method?

• The method(s) you choose are determined by:
   • The purpose(s) of your evaluation
   • Your evaluation questions
   • The resources you have to design and conduct it

• Quantitative methods produce "numbers", i.e. numeric data
• Qualitative methods capture descriptions, opinions, feelings
4  

Quantitative and Qualitative Methods

Quantitative
• Useful for collection of factual information
• Focus is on variables
• Reliability is a primary virtue
• Many research participants
• Statistical analysis
• Detachment of researcher (objectivity highlighted)

Qualitative
• Constructs social reality, cultural meaning
• Focus is on interactive processes, events
• Authenticity is a primary virtue
• Contextual circumstances prevail
• Few research participants
• Thematic analysis
• Immersion of researcher

5  
66 Advanced Monitoring and Evaluation Workshop Manual

Quan/ta/ve  and  Qualita/ve  Methods  

•  Debate  over  the  rela/ve  usefulness  of  qualita/ve  


and  quan/ta/ve  methods  for  conduc/ng  
evalua/ons  (Reichardt  and  Cook  1979)  

•  In  prac/ce,  it  is  agreed  that  combining  


quan/ta/ve  and  qualita/ve  methods  (some/mes  
called  “mixed  method”  evalua/ons)  produces  a  
richer  and  more  comprehensive  understanding  of  
a  project’s  accomplishments  and  lessons  learned.    

6  

Quantitative Methods

• The quantitative method is the one more frequently encountered in the research literature.

• The dominant factor that characterizes the method is its numerical nature and units of measurement, for example rates, ratios, proportions, percentages, means, etc.
7  

STRENGTHS  of  Quan/ta/ve  Methods  

•  Precision  -­‐  through  quan/ta/ve  and  reliable  


measurements  
•  Control  -­‐  through  sampling  and  design  
•  Ability  to  produce  causality  statements,  through  
the  use  of  controlled  experiments  
•  Sta/s/cal  techniques  allow  for  sophis/cated  
analyses  
•  Replicable  

8  
Answering Evaluation Questions using Quantitative and Qualitative Methods 67

LIMITATIONS  of  Quan/ta/ve  Methods  


(1)  

•  Due  to  the  complexity  of  human  experience,  it  


is  difficult  to  rule  out  or  control  all  the  
variables.  
•  People  do  not  all  respond  in  the  same  ways  as  
inert  mager  in  the  physical  sciences.  
•  Its  mechanis/c  ethos  tends  to  exclude  no/ons  
of  freedom,  choice  and  moral  responsibility.  

9  

LIMITATIONS  of  Quan/ta/ve  Methods  


(2)  

•  Quan/fica/on  can  become  an  end  in  itself  


•  Unable  to  take  account  of  people's  unique  
ability  to  interpret  their  experiences,  
construct  their  own  meanings  and  act  on  
these  
•  Assump/on  that  facts  are  true  and  the  same  
for  all  people  all  of  the  /me  
•  Focus  on  numerical  data  and  variables  
10  

Quantitative Methods Used for Answering Evaluation Questions

• Surveys
• Exit interviews
• Record abstraction
• Checklists
• Observation
• Experiments: experimental design and quasi-experimental design
• Cost-benefit and cost-effectiveness analyses
• Most were covered in the Basic M&E workshop
11  
68 Advanced Monitoring and Evaluation Workshop Manual

Surveys (1)

• A survey can be defined as a systematic method for gathering information from a sample of entities for the purposes of constructing quantitative descriptions of the attributes of the larger population of which the entities are members (Groves et al 2004).

• Surveys often collect comparable information for a relatively large number of people in particular target groups.

• Can be used to evaluate the outcomes of programme/project interventions

12  

Advantages and Disadvantages of Surveys

Advantages
• Multiple data collection methods are available
• Ensures confidentiality and anonymity
• Useful for collecting data on sensitive topics
• Good for describing characteristics of larger populations
• Wide range of topics

Disadvantages
• Rely on self-reports of behaviour, an avenue for biases
• A high level of literacy may be required
13  

Cost-Benefit Analysis (1)

• Cost-benefit analysis estimates the total expected benefits of a programme, compared to its total expected costs (World Bank, 2004).

• Costs and effects are valued in monetary terms
   – Monetary valuation of health states
   – Monetary savings for averted treatment
   – Productivity losses averted

• Can assess economic feasibility: if total benefits > total costs, the programme is considered "economically feasible"

14  
Answering Evaluation Questions using Quantitative and Qualitative Methods 69

Cost-Benefit Analysis (2)

• CBA allows comparison of programmes between health care and other sectors (allocative efficiency)

• Ethical and methodological issues (willingness-to-pay vs. ability-to-pay methodology): many consider it unethical to place a value on life, and it is difficult to estimate a correct value for health states.

• Since there are no natural prices for healthy states, cost-benefit analysis requires the creation of artificial ones by assigning a dollar value to human life. Economists create artificial prices for health benefits by looking at what people are willing to pay for them.

15  

Cost-Effectiveness Analysis (1)

• Cost-effectiveness analysis compares the relative performance of two or more programmes or programme alternatives in reaching a common outcome.

• Probably the most used approach.

• Used when the alternative interventions have a significant difference in effect as well as in costs.

• Evaluates alternative interventions for the same measure of outcome (unit)

• Addresses technical efficiency issues only (doing it the right way)
 
16  

Cost-Effectiveness Analysis (2)

• The consequences of the different interventions are different but can be measured in identical natural units. In this case the inputs are costed. Competing interventions are compared in terms of cost per unit of consequence.

• The outcomes can be in natural units, intermediate or final units, and can include indirect as well as direct benefits.
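To make the "cost per unit of consequence" idea concrete, here is a minimal illustrative sketch; the intervention names and all figures are hypothetical and are not drawn from the manual.

```python
# Hypothetical example: cost-effectiveness comparison of two interventions.
# All numbers are invented for illustration only.
interventions = {
    "Clinic-based counselling": {"cost": 120_000.0, "cases_averted": 300},
    "Community outreach":       {"cost":  80_000.0, "cases_averted": 250},
}

for name, d in interventions.items():
    cost_per_case = d["cost"] / d["cases_averted"]
    print(f"{name}: ${cost_per_case:,.2f} per case averted")

# The option with the lower cost per unit of outcome is the more
# cost-effective one for that outcome (a technical-efficiency comparison only).
```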
17  
70 Advanced Monitoring and Evaluation Workshop Manual

Qualitative Methods

• Constitute an interpretative approach to data or to empirical material.

• Apply to the exploration and investigation of social processes, relationships and phenomena.

18  

STRENGTHS  of  Qualita/ve  Methods  


•  The  researcher  gains  an  insider's  view  of  the  field.  This  
allows  the  researcher  to  find  issues  that  are  olen  missed  
(such  as  subtle/es  and  complexi/es)  by  quan/ta/ve  
methods.  
•  Qualita/ve  descrip/ons  can  play  the  important  role  of  
sugges/ng  possible  rela/onships,  causes,  effects  and  
dynamic  processes.    
•  Turn  to  qualita/ve  reports  in  order  to  examine  forms  of  
knowledge  that  might  otherwise  be  unavailable,  thereby  
gaining  new  insight.    
•  Adds  a  “dose  of  reality”  as  it  gives  the  actual  self  account  
of  their  social  experience.  

19  

LIMITATIONS  of  Qualita/ve  Methods  


(1)  

•  The  concern  for  validity  or  reliability  is  a  major  


cri/cism  because  of  the  subjec/ve  nature  of  qualita/ve  
data  and  its  origin  in  single  contexts,  it  is  difficult  to  
apply  conven/onal  standards  of  reliability  and  validity.    

•  Contexts,  situa/ons,  events,  condi/ons  and  


interac/ons  cannot  be  replicated  to  any  extent  nor  can  
generalisa/ons  be  made  to  a  wider  context  than  the  
one  studied  with  any  confidence.  

20  
Answering Evaluation Questions using Quantitative and Qualitative Methods 71

LIMITATIONS  of  Qualita/ve  Methods  


(2)  

•  The  viewpoints  of  both  researcher  and  


par/cipants  have  to  be  iden/fied  and  elucidated  
because  of  issues  of  bias.  
•  The  /me  required  for  data  collec/on,  analysis  
and  interpreta/on  is  lengthy.  
•  Researcher's  presence  has  a  profound  effect  on  
the  subjects  of  study.  
•  Issues  of  anonymity  and  confiden/ality  present  
problems.  

21  

Qualitative Methods Used for Answering Evaluation Questions

• Key informant interviews
• Focus group discussions
• Participant observation
• Case studies
• Ethnographic studies
• The first two are the most widely used

22  

Key  Informant  Interviews  (1)  

•  A  series  of  open-­‐ended  ques/ons  posed  to  


individuals  selected  for  their  knowledge  and  
experience  in  a  topic  of  interest.  Interviews  are  
qualita/ve,  in-­‐depth,  and  semi-­‐structured.  They  
rely  on  interview  guides  that  list  topics  or  
ques/ons.  

•  Are  op/mal  for  collec/ng  data  on  individuals,  


personal  histories,  perspec/ves  and  experiences.    
23  
72 Advanced Monitoring and Evaluation Workshop Manual

Key  Informant  Interviews  (2)  


•  Provides  rich  data,  details,  insights  from  
programme  par/cipants  and  stakeholders  
about  their  experiences,  behaviors  and  
opinions  

•  Par/cularly  suited  for  complex  or  sensi/ve  


subjects  

24  

Focus  Group  Discussions  (1)  

•  Focus  group  sessions  can  be  defined  as  "basically  


group  interviews”  (Morgan  1996)  

•  The  fundamental  data  that  focus  groups  produce  


are  transcripts  of  the  group  discussion  

•  Data  collected  from  focus  groups  provide  


informa/on  that  would  otherwise  be  unknown.    

25  

Focus Group Discussions (2)

• Use group dynamics to generate data and insights

• Useful for generating ideas and strategies, defining problems in project implementation, and assisting with the interpretation of quantitative findings

• Open-ended questions or topics designed to stimulate discussion; topics are usually broader than interview questions

• What a researcher finds in focus groups is interaction among the members that is not found in other forms of research.

26  
Answering Evaluation Questions using Quantitative and Qualitative Methods 73

Discussion Session
• How does the evaluation question drive the selection of each approach?

• Think about what information is needed to answer the question.
• What sources of information do you need to answer the question?
• Will you use quantitative or qualitative methods?
• Which type of method or combination of methods will you use?

27  
74 Advanced Monitoring and Evaluation Workshop Manual
Data Analysis 75

7 Data Analysis

Learning Objectives:

By the end of this session participants will be able to:


• Identify the various factors to consider that would determine
the most appropriate approach to data analysis
• Appreciate key data processing and analysis concepts and
issues
• Use the Epi Info software for data processing and basic
analyses
• Identify the most appropriate analyses for different types of
data
76 Advanced Monitoring and Evaluation Workshop Manual
Data Analysis 77

Introduction

The appropriate analysis depends on:
1. Evaluation/research questions or objectives
2. Study design/data collection methods
3. Type of data collected
   • Quantitative or qualitative

The  Evalua3on  Ques3on  


•  Cri3cal  part  of  the  Evalua3on  Process  

–  Forms  the  basis  for  the  hypotheses,  aims  and  


objec5ves  
–  Informs  the  evalua3on  design  and  methods  
–  Influences  the  type  of  data  collected  
–  Determines  the  method  of  analysis  used  

Example:
Indicator: Percent of women and men aged 15–24 who had sex with more than one partner in the last 12 months

• How to measure it
• Survey respondents aged 15–24 years are asked the following questions:

1. In the last 12 months, have you had sexual intercourse with a non-regular partner?
2. If the answer to question 1 is "yes": How many non-regular partners have you had sex with in the last 12 months?
3. If the answer to question 1 is "yes": Did you (or your partner) use a condom?
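As a hedged illustration of how such an indicator might be tabulated once the survey data are entered, the sketch below (hypothetical column names; it simplifies by counting only non-regular partners) computes the percentage of 15–24 year-olds reporting more than one partner:

```python
# Hypothetical sketch: computing the multiple-partner indicator from survey data.
# Column names (age, num_nonregular_partners) are assumptions for illustration.
import pandas as pd

survey = pd.DataFrame({
    "age": [16, 19, 22, 24, 23, 18, 21],
    "num_nonregular_partners": [0, 2, 1, 3, 0, 1, 2],
})

youth = survey[(survey["age"] >= 15) & (survey["age"] <= 24)]
multiple = youth["num_nonregular_partners"] > 1   # True where more than one partner reported

indicator = 100 * multiple.mean()                 # percent of the 15-24 group
print(f"{indicator:.1f}% of respondents aged 15-24 reported more than one partner")
```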
78 Advanced Monitoring and Evaluation Workshop Manual

Research Questions

Primary Research Question
What is the percentage of young persons (15-24 years) who had sex with more than one partner in the last 12 months?

Secondary Research Questions
• What is the average number of non-regular partners reported by young persons (15-24 years) within the last 12 months?
• What is the prevalence of condom use reported by young persons (15-24 years) having sex with a non-regular partner within the last 12 months?
• Is there an association between sex and multiple partnering among young persons (15-24 years)?

Hypotheses
The first three hypotheses assume we have baseline data.
• The percentage of young persons (15-24 years) reporting sex with more than one partner within the last 12 months is less than 20%.

• The mean number of non-regular partners reported by young persons (15-24 years) who are sexually active within the last 12 months is zero.

• The prevalence of condom use reported by young persons (15-24 years) having sex with non-regular partners within the last 12 months is 75%.

• More males than females have multiple sex partners among young persons (15-24 years).

Specific  Aims/Objec3ves  
 
•  To  determine  percentage  of  young  persons  (15-­‐24  years)  repor3ng  
mul3ple  sex  partnering  within  the  last  12  months.  

•  To  determine  mean  number  of  non-­‐regular  partners  reported  by  


young  persons  (15-­‐24  years)  within  the  last  12  months.  

•  To  determine  the  prevalence  of  condom  use  reported  by  young  


persons  (15-­‐24  years)  having  sex  with  non-­‐regular  partners  within  
the  last  12  months.  

•  To  determine  if  there  is  an  associa3on  between  sex  and  mul3ple  
sex  partnering  among  young  persons  (15-­‐24  years)  within  the  last  
12  months.  
Data Analysis 79

Study  Designs  
•  Quan5ta5ve  Methods  
–  Experimental  
•  Randomized  controlled  trial/Quasi  Experiment  

–  Observa3onal  (non-­‐experimental)  studies  


•  Surveys  
•  Observa3ons  
•  Data  extrac3on  from  records/documents  

•  Qualita5ve  Methods  
–  Focus  group  discussion  
–  In-­‐depth  interviews  

What is the most appropriate Data Collection Method?
• Depends on the evaluation questions

• Quantitative research methods

• Qualitative methods

• Other issues that determine the appropriate data analysis include:
   – Sampling methods
   – Sample size
   – Measurements taken

What  is  the  most  appropriate  sampling  


methodology?  
}  Dependent on the evaluation methods

Non-­‐probability  sampling   Probability    


 methods   sampling  methods  

•  convenience  sampling   •  simple  random  sampling  


•  quota  sampling   •  systema5c  random  sampling  
•  Purposive  sampling   •  stra5fied  sampling  
•  cluster  sampling  
80 Advanced Monitoring and Evaluation Workshop Manual

How  many  par3cipants  to  include?  


Ø For  quan3ta3ve  methods,  must  be  determined  
objec3vely  using  the  relevant  calcula3ons  
–  Must  be  adequate  but  not  excessive  
–  Available  resources  must  be  considered  
•  financial    
•  personnel    
•  Time  

Ø For  qualita3ve  methods,  should  include  enough  


par3cipants  to  ensure  that  there  is  adequate  
representa3on  

Sample  size  implica3ons  for  quan3ta3ve  methods  

•  Sample  size  too  large  


–  Wastage  of  resources  

•  Sample  size  too  small  


–  Spurious  findings  
–  Selec3on  bias  
–  Difference  between  groups/Associa3ons  
•  Lack  of  power  
–  Prevalence/es3mates  
•  reduced  precision  

13

Measurement  Issues  -­‐  Quality  Control  


Ø  Must  be  reliable  and  valid  
–  Reliability    
•  Level  of  agreement  in  replicate  measurements  under  iden3cal  condi3ons  
–  Validity  
•  measure  what  it  is  supposed  to  

Ø  Standardiza3on  
–  Standard  data  collec3on  methods  (calibra3on)  
–  wrigen  protocol  

Ø  Training  is  cri3cal  

Ø  Pilots/pre-­‐tests  
Data Analysis 81

Data  Processing  

15

What happens after Data Collection

1. Preparatory procedures for analysis
2. Data analysis
3. Report preparation
4. Dissemination of the report
5. Use of the report

CHRC Data Analysis Workshop 16

Data Analysis Preparatory Procedures

1. Validation (quality control)
2. Coding
3. Data entry
4. Cleaning (quality control)
5. Storage

17
82 Advanced Monitoring and Evaluation Workshop Manual

Valida3on  

Ø A  quality  control  ac3vity  


Ø Includes:    
a.  determining  if  interviews  actually  were  conducted  
as  specified    
–  e.g.,  by  calling  par3cipants  to  ask  about  the  interview    
b.  edi3ng  i.e.  checking  for  interviewer  mistakes    
–   have  all  the  ques3ons  been  asked  and      
 responses  recorded?  
–   have  the  skip  pagerns  been  followed?  
–   have  the  responses  to  the  open-­‐ended      
 ques3ons  been  checked?  

18

Coding
• Grouping and assigning numeric codes to the responses to open questions
   – e.g., types of physical activity performed: Walk=1, Jog=2, Aerobics=3, etc.

• The coding process is as follows:
   – Obtain a list of responses
   – Group similar responses
   – Assign numeric codes
      • May have a code for 'Other' where infrequent responses are grouped
   – Enter codes
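A minimal sketch of this coding step in Python; the activity labels and codes are simply the example values above applied to hypothetical responses:

```python
# Hypothetical sketch: applying numeric codes to open-ended responses.
code_book = {"Walk": 1, "Jog": 2, "Aerobics": 3}
OTHER = 9  # infrequent responses grouped under 'Other'

responses = ["Walk", "Aerobics", "Swim", "Jog", "Walk"]
coded = [code_book.get(r, OTHER) for r in responses]

print(coded)  # [1, 3, 9, 2, 1]
```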

19

Data  Entry  
•  Process  whereby  informa3on  is  transformed  into  
a  format  that  can  be  read  by  a  computer  
•  Develop  coding  notes/codebook  
–  Ensure  consistency  &    standard  interpreta3on  of  
codes  

–  Data  is  entered  in  computer  soDware  


•  Spreadsheets  e.g.  Excel  
•  Sta3s3cal  soDware    
–  EpiInfo,  SPSS,  Stata,  SAS  
–  Beger  data  retrieval,  management,  valida3on  and  analysis  

20
83

Cleaning/Editing
Objective: identify and correct errors
1. Use of programmed data entry software
   • e.g. Validate in Epi Info
2. Print out and check each entry
3. Data cleaning techniques such as frequencies or cross-tabulations, looking for:
   – Out-of-range values, e.g. ages of secondary students: values of 4 or 28 would be out of range, i.e. errors
   – Consistency of the data, e.g. errors in skip patterns: men answering a question about having PAP smears in the last 12 months
   – Find the relevant questionnaire and enter the correct code on the computer
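As a hedged illustration, the two checks described above might look like this in pandas; the column names and plausible age range are assumptions, not part of the manual:

```python
# Hypothetical sketch: simple data-cleaning checks after data entry.
import pandas as pd

df = pd.DataFrame({
    "id": [1, 2, 3, 4],
    "age": [14, 4, 17, 28],              # secondary students: expect roughly 11-19
    "sex": ["M", "F", "M", "F"],
    "pap_smear_12m": [None, "Yes", "Yes", "No"],
})

# Out-of-range check: flag implausible ages
out_of_range = df[(df["age"] < 11) | (df["age"] > 19)]

# Consistency (skip-pattern) check: men should have no PAP smear response
inconsistent = df[(df["sex"] == "M") & df["pap_smear_12m"].notna()]

print(out_of_range[["id", "age"]])
print(inconsistent[["id", "sex", "pap_smear_12m"]])
# Flagged records are traced back to the original questionnaire and corrected.
```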

21

Data  Storage  
•  Secure storage of hard copies of questionnaires"
•  3-5 years"
•  Storage of data entered in computer "
•  hard drive of computer "
•  back up -- external drives/other data storage mediums
(CDs, flash drives)"
"

•  Ethical issues"
•  Must ensure confidentiality of the data collected"
1.  Proper storage of data sheets and records
2.  Limiting access to identifiable data
3.  Adequately securing research records
4.  Removing identifiers from human specimens and data 22

Final  checks  before  analysis  


•  Consider prior to beginning data analysis:"
–  All data collected, accurately entered and verified for
correctness"
–  No missing questionnaires / data collection forms"
"

23
84 Advanced Monitoring and Evaluation Workshop Manual
Analyzing Data 85

8 Analyzing Data

Learning Objectives:

By the end of this session participants will be able to:


• Identify the various factors to consider that would determine
the most appropriate approach to data analysis
• Appreciate key data processing and analysis concepts and
issues
• Use the Epi Info software for data processing and basic
analyses
• Identify the most appropriate analyses for different types of
data
86 Advanced Monitoring and Evaluation Workshop Manual
Analyzing Data 87

Data  Analysis  
•  Ensure  that  data  entry  is  completed  and  data  cleaned    

•  Analysis  depends  on    


–  Evalua8on  ques8on  
–  Variable  Type  

•  May  be:  
–  Descrip8ve    
–  Inferen8al      

•  Dummy  tables  
–  Should  be  prepared  at  the  design  phase  

Dummy  Tables  
•  Constructed  to  guide  the  presenta8on  of  
results  
•  Based  on  evalua8on  ques8ons  
•  Guides  the  approach  to  analysis  

Example of Dummy Table (1)
Demographic characteristics of the sample

                     Male n (%)    Female n (%)
Age
  15-24y
  25-34y
  35-44y
  45-54y
  55-65y

Education
  Primary
  Secondary
  Tertiary
4
88 Advanced Monitoring and Evaluation Workshop Manual

Example of Dummy Table (2)
Health status of the sample

                               Male n (%)    Female n (%)
Had illness in past 3 months

Type of illness
  Diabetes
  Hypertension
  Asthma
  Dengue fever
  Depression
  Other

Types  of  Data  


•  Quan8ta8ve  (numerical)  
–  measure  characteris8cs  that  can  be  quan8fied  –  have  
real  numerical  value  
•  Discrete  -­‐  finite  values  e.g.  #  pregnancies  
•  Con8nuous  -­‐  any  value  with  a  certain  range  e.g.  height,  
weight  

•  Categorical  (qualita8ve)  
–  measures  characteris8cs  that  have  no  numerical  value  
•  e.g.  presence  of  disease,  gender,  occupa8on  

Descrip8ve  vs.  Inferen8al  Sta8s8cs  


•  Descrip8ve  Sta8s8cs    
–  summarize  or  describe  important  features  about  the  
data    
–  does  not  infer  anything  that  goes  beyond  the  data  
themselves  

•  Inferen8al  Sta8s8cs    
–  Use  the  data  collected  from  samples  to  make  
generaliza8ons  which  go  beyond  the  sample    
•  i.e.  predic8ons  about  the  popula8on  from  which  they  came    
•  Include  the  determina8on  of  confidence  intervals  and  
hypothesis  tes8ng  

7
Analyzing Data 89

Descrip8ve  Sta8s8cs  
–  Dependent  on  the  type  of  data  i.e.  categorical  or  
quan8ta8ve  data  

Categorical    
–  frequencies  or  percentages      
–  tables  or  charts  

Quan/ta/ve  
–  Usually  presented  in  terms  of:  
•  Central  tendency    
•  Variability  /  dispersion  

Summarizing  categorical  data  


•  Tabular  form  
–  Frequency  tables:  list  of  categories  and  their  
numbers/percentages  
–  Con8ngency  table:  characteris8cs  are  examined,  
by  a  grouping  variable  such  as  sex  

•  Graphically:    
–  Bar  charts  
–  Pie  charts  

Educational Attainment, by Sex

Educational attainment    Males n (%)    Females n (%)
No schooling               38   (8)      100  (14)
Primary                   314  (69)      468  (67)
Secondary or higher        91  (20)      108  (16)
Not known                  11   (2)       19   (3)

TOTAL                     454            695
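A grouped bar chart like the one on the following slide could be produced with a short script along these lines; this is only a sketch using matplotlib, with the counts taken from the table above:

```python
# Sketch: grouped bar chart of educational attainment by sex (data from the table above).
import matplotlib.pyplot as plt
import numpy as np

categories = ["No schooling", "Primary", "Secondary+", "Not known"]
males = [38, 314, 91, 11]
females = [100, 468, 108, 19]

x = np.arange(len(categories))
width = 0.35

plt.bar(x - width / 2, males, width, label="Males")
plt.bar(x + width / 2, females, width, label="Females")
plt.xticks(x, categories)
plt.ylabel("Number of respondents")
plt.legend()
plt.show()
```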

10
90 Advanced Monitoring and Evaluation Workshop Manual

Educational Attainment by Gender

[Grouped bar chart of the table above: y-axis "Number of Patients", x-axis categories No Schooling, Primary, Secondary, Not Known, with separate bars for males and females]

Summarizing  quan8ta8ve  data  


•  Measures  of  central  tendency  (average):    
–  A  summary  number  /  index  that  gives  an  idea  of  
the  whole  data  set  
–  Tells  what  the  middle  or  average  value  is  
–  Mean,  Median  
•    Measures  of  dispersion  /  spread  
–    Standard  devia8on,  range  

12

Measures  of  central  tendency  


 
•  Mean (average) = sum of observations /
number of observations
–  Most frequently used
–  May be affected by extreme values
•  Median
–  Centre most value
–  Less sensitive to extreme values

13
Analyzing Data 91

Which  to  use?  


•  Examine  the  distribu8on  
–  Symmetric  or  bell  shaped?  
–  Skewed:  presence  of  excess  outlying  observa8ons  in  
one  direc8on  only?  

•  Mean:    
–  numerical  data  and  symmetric  distribu8on  

•  Median:    
–  numerical  data  with  skewed  distribu8ons  

14

Normal Distribution

The bell-shaped (Symmetric) curve. This represents a


distribution that is the basis of many of the tests of inference
that we use to draw conclusions or make generalizations.

Skewed Distributions
15

Measures  of  dispersion  


 
•  Standard deviation (SD): spread of the data
about the mean
•  ‘Average deviation’ from the mean
•  Used with the mean
•  Range
•  Lowest to highest value
•  Used with the median

16
92 Advanced Monitoring and Evaluation Workshop Manual

The haemoglobin of 8 children (mg/dl)

  5  5  6  9  9  9  10  11

Mean Hb = (5+5+6+9+9+9+10+11) / 8
        = 64/8 = 8.0 mg/dl

Median = (n+1)/2 th observation (in order)
       = (8+1)/2 = 4.5th value
         (i.e. the average of the 4th and 5th observations)
       = (9+9)/2 = 18/2 = 9.0 mg/dl
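The same calculation can be checked with Python's standard library; a quick sketch using the eight values above:

```python
# Sketch: verifying the mean and median of the 8 haemoglobin values above.
from statistics import mean, median

hb = [5, 5, 6, 9, 9, 9, 10, 11]   # mg/dl

print(mean(hb), median(hb))       # 8 and 9 mg/dl, matching the hand calculation
```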
               

17

Points  to  Note  


•  Mean  and  Standard  Devia8on  
–  most  useful  measures  for  summarizing  
quan8ta8ve  variables  
–  u8lize  all  informa8on  in  data  
–  used  when  data  are  normally  distributed  

•  Median  and  Range  


–  used  when  data  are  skewed  

18

INFERENTIAL  STATISTICS  

19
Analyzing Data 93

Sta8s8cal  Inference  
•  Process  by  which  inferences  and  generaliza8ons  
are  made  about  popula8on  parameters  

•  Two  main  categories:  


–  Es8ma8on:  obtaining  an  es8mate  of  the  popula8on  
parameter  using  data  from  a  sample  

–  Hypothesis  tes8ng:  arriving  at  a  decision  about  a  


stated  hypothesis  

20

Popula8ons  and  Samples  


•  Popula/on  Parameters      
–    characteris8cs  of  the  popula8on  e.g.  popula8on  
prevalence  

•  Sample  Sta/s/cs    
–    measures  obtained  from  a  subset  of  the  
popula8on  e.g.  sample  prevalence  
       

21

Es8ma8on  
•  Data  collected  from  a  random  sample  to  
es8mate  the  characteris8cs  of  the  popula8on  

•  Sample  mean/propor8on  es8mates  the  


popula8on  mean/propor8on    

•  Generaliza8ons  have  doubts  


–  Es8mate  not  likely  to  be  iden8cal  to  popula8on  
value  
22
94 Advanced Monitoring and Evaluation Workshop Manual

Confidence  Interval  

•  Important  to  know  accuracy  of  sample  


es8mate                      
–  how  close  to  popula8on  parameter  

•  useful  to  have  an  interval  within  which  we  


expect  popula8on  value  to  lie  

23

95%  Confidence  Interval  


•  95%  sure  that  popula8on  parameter  will  lie  
within  this  interval  
 
•  It  is  an  expression  of  faith  placed  in  sample  
sta8s8c  to  correctly  es8mate  popula8on  
parameter  

•  Results  expressed  as  mean  (95%  CI)  


 Results  expressed  as  propor8on  (95%  CI)  

24

Formula: 95% CI (Mean)

95% CI (mean) = mean ± 2 × SE(mean)
• Standard error (SE)
   – SE(mean) = SD / √n

   SD = standard deviation
   n = sample size
25
Analyzing Data 95

Example: 95% CI (Mean)

Survey of the birth weights of n=100 children
  mean = 3.50 kg, SD = 0.50 kg

SE(mean) = 0.50/√100 = 0.05
95% CI (mean) = mean ± (2 × 0.05)
             = 3.50 ± 0.10 = 3.40 to 3.60

Therefore: there is a 95% probability that the mean birth weight of the population is between 3.40 kg and 3.60 kg
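A minimal sketch of this calculation in Python, using the figures from the example above (the ±2 multiplier is the manual's approximation to 1.96):

```python
# Sketch: 95% CI for a mean, using the birth-weight example above.
import math

n, mean, sd = 100, 3.50, 0.50
se = sd / math.sqrt(n)                     # 0.05
lower, upper = mean - 2 * se, mean + 2 * se

print(f"95% CI: {lower:.2f} to {upper:.2f} kg")   # 3.40 to 3.60 kg
```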

26

Formula: 95% CI (Proportion)

• 95% CI (p) = p ± 2 × SE(p)
   SE(p) = √[ p (1 - p) / n ]
   where p = proportion

27

Example: 95% CI (Proportion)

A survey was conducted to determine the prevalence of anemia among rural schoolchildren
  n = 586; of these, 263 were anemic

We want to know the prevalence of anemia among all rural schoolchildren

28
96 Advanced Monitoring and Evaluation Workshop Manual

Example (cont'd)
• Proportion anemic (p) = 263/586 = 0.449
   – 95% CI (p) = 0.449 ± (2 × SE(p))
   – SE(p) = √[0.449 × (1 - 0.449)/586] = 0.021
   – 95% CI = 0.449 ± 0.042 = 0.407 to 0.491

• Therefore, we are 95% sure that the prevalence of anemia among rural schoolchildren is between 40.7% and 49.1%
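The same proportion calculation as a quick Python sketch, again using the manual's ±2 approximation; unrounded intermediate values give limits a fraction of a percentage point away from the hand-rounded 40.7%-49.1% above:

```python
# Sketch: 95% CI for a proportion, using the anemia example above.
import math

n, anemic = 586, 263
p = anemic / n                          # ~0.449
se = math.sqrt(p * (1 - p) / n)         # ~0.021
lower, upper = p - 2 * se, p + 2 * se

print(f"Prevalence {p:.1%}, 95% CI {lower:.1%} to {upper:.1%}")
```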

29

Example of Dummy Table (3)
Health status of the sample

                               n (%)    95% CI
Had illness in past 3 months

Type of illness
  Diabetes
  Hypertension
  Asthma
  Dengue fever
  Depression
  Other

30

HYPOTHESIS  TESTING  
 
Analyzing Data 97

Hypothesis  Tes8ng/  
Significance  Tes8ng  
•  Hypothesis  
–  refutable  predic8on  

•  Is  the  observed  difference  /  associa8on  due  to  


chance  (i.e.  sampling  error)  or  is  it  real?  

Steps in hypothesis testing

1. State the null hypothesis (H0)
   No difference between the groups being compared,
   i.e. mean1 = mean2  or  p1 = p2

2. State the alternative hypothesis (H1)
   There is a difference, i.e.
   mean1 ≠ mean2  or  p1 ≠ p2

3. Set the Type I error = 0.05

4. Do the relevant statistical test
   Is there evidence to reject H0?
   If so, conclude that the difference was 'significant'

Note:  
•  In  hypothesis  tes8ng  ,  a  significance  test  can  
never  prove  that  a  null  hypothesis  is  true  or  
false,  but  can  give  informa8on  about  the  
strength  of  the  evidence  against  it.  

•  In  making  decision  about  whether  to  reject  a  


null  hypothesis  we  can  make  2  errors  
–  Type  I  
–  Type  II  

34
98 Advanced Monitoring and Evaluation Workshop Manual

Types of Error

• Type I
   – Null hypothesis rejected when true
      • There is really no difference but you concluded that there was
      • Significance level
      • P value
      • Set at 0.05: accept a 1 in 20 chance of being wrong

• Type II
   – Null hypothesis not rejected when false
   – Concluded that there was no difference but there really was one
   – Related to sample size
      • Power of the study: the ability to show an association or difference when there really is one

Hypothesis  tests  -­‐  Concepts  


ü P  value  
•  probability  that  we  obtained  the  observed  data  if  the  
null  hypothesis  were  true  i.e.  there  was  no  
difference  
•  indicates  whether  observed  difference  /  associa8on  
was  due  to  chance  
•  smaller  value  implies  less  dependable  null  hypothesis  
•  determines  whether  we  reject  the  null  hypothesis  
•  Max=1;  Min=0  
•  The  0.05  threshold  is  commonly  used  in  health  
research  i.e.  a  5%  significance  level  

36

Sta8s8cal  Significance  
•  hypothesis  test  ≡  significance  test  

•  p<0.05  
–  difference/associa8on  is  sta8s8cally  significant  

•  p>0.05  
–  not  significant  
Analyzing Data 99

Comparison of Means

• Student's t-test
   Examines differences between 2 means
   - Paired t-test (matched data)
   - Unpaired t-test (unmatched data)

• Analysis of Variance (ANOVA)
   Compares differences among means
   - When there are more than 2 groups
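For illustration only, these comparisons might be run in Python with SciPy; the sketch below uses made-up data, not any of the manual's examples:

```python
# Sketch: comparing group means with SciPy on hypothetical data.
from scipy import stats

group_a = [5.1, 6.0, 5.5, 6.2, 5.8]
group_b = [6.5, 7.1, 6.8, 7.4, 6.9]
group_c = [7.8, 8.1, 7.5, 8.4, 8.0]

# Unpaired (independent samples) t-test: two groups
t, p = stats.ttest_ind(group_a, group_b)
print(f"t = {t:.2f}, p = {p:.4f}")

# One-way ANOVA: more than two groups
f, p = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f:.2f}, p = {p:.4f}")
```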

T-test - Assumptions

1. Variances are equal
   Very important
   Test for equality of variances

2. Data are normally distributed
   Examine the distribution or do a test
   – Can tolerate moderate departures from normality

If assumptions are not satisfied

• The tests are not valid. You may try transforming the data, or
• Use a non-parametric (distribution-free) test

   Parametric                    Non-parametric
   – Paired t-test               Wilcoxon signed rank test
   – Unpaired t-test             Mann-Whitney U test
   – Analysis of variance        Kruskal-Wallis test

• Non-parametric tests are not as powerful as parametric tests
   • Less likely to find a significant difference
100 Advanced Monitoring and Evaluation Workshop Manual

Categorical  Variables  

Hypothesis  Tes8ng:  Categorical  


Variables  

•  Comparison  of  differences  in  observed  


frequencies  or  percentages  
–  Chi  square  test  

Chi  Square  (χ2)  Test  


•  used  to  analyze:  
–   differences  between  propor8ons  (percentages)    
–  associa8ons  between  observed  frequencies  

•  raw  data  are  grouped  into  a  con8ngency  table  

•  Always  calculate  the  percentages    


–  Assist  in  interpre8ng  the  results  
Analyzing Data 101

2 × 2 Tables

                 Independent variable
                    1        2
Outcome     1       a        b
            2       c        d

• The test compares the observed frequency in each cell with the expected (theoretical) frequency if there were no association between the 2 variables

Validity  (for  2×2  tables)  


•    χ2  is  not  valid  if:    

–  n  <  20      
–  n  is  between  20    and  40    and  the  Expected  
Frequency  of  any  cell  is  <5  

•  if  condi8ons  do  not  hold    


–  Use  Fisher’s  Exact  Test    
–  Use  Con8nuity  Correc8on  

Example
An investigation was carried out to determine if a new vaccine is effective against TB. The results of a randomised trial are shown below.

  TB        vaccine    placebo
  yes          30         90
  no          230        150

Vaccine:  30/260 = 11.5% infected
Placebo:  90/240 = 37.5% infected

Was the vaccine effective?
Or are the observed differences due to chance?
102 Advanced Monitoring and Evaluation Workshop Manual

Null hypothesis:
  No difference (no association)
  H0: p1 = p2

Alternative hypothesis:
  A difference (association)
  H1: p1 ≠ p2

Type I error = 0.05

Conduct the chi-square test:
Determine the P-value for the χ²

χ² = 46.11, P < 0.001

Therefore, reject the null hypothesis (as P < 0.05)

Interpretation:
There is a significant difference between the proportion of persons contracting TB when those vaccinated were compared with those given the placebo (p < 0.001).

11.5% of the persons given the vaccine contracted TB compared with 37.5% of those given the placebo.
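This test can be reproduced in Python with SciPy; the sketch below uses the uncorrected statistic so the result matches the χ² = 46.11 quoted above:

```python
# Sketch: chi-square test for the TB vaccine trial above.
from scipy.stats import chi2_contingency

#            vaccine  placebo
table = [[ 30,  90],   # TB: yes
         [230, 150]]   # TB: no

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.2e}")   # chi2 ≈ 46.1, p < 0.001
```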

 
 
Choosing  the  Right  Analysis    
 
 
Analyzing Data 103

•  Should  be  addressed  in  planning  stage  


•  Cri8cal  part  of  protocol  
–  Data  analysis  plan  
–  Dummy  tables,  etc.  
•  Seek  advice  from  biosta8s8cian  

Descrip8ve  
•  One  categorical  variable  
§  E.g.  prevalence  of  smoking,  sa8sfied  with  services  
§  Es8mate  (95%CI)  

•  One  quan8ta8ve  variable  


§  E.g.  mean  birth  weight,  body  mass  index  (BMI)  
§  Es8mate  (95%CI)  

Analy8cal  
•  2  categorical  variables  
–  Sta8s8cal  test  
•  Cross-­‐tabula/on  
•  Chi  square  test  

–  E.g.  associa8on  between  smoking  and  prostate  


cancer  
104 Advanced Monitoring and Evaluation Workshop Manual

1  Categorical  and  1  Quan8ta8ve    


•  E.g.  age  group  and  blood  pressure  
–  Compare  the  mean  BP  of  persons  of  various  age  
groups  
•  T-­‐test  
–  Paired  or  unpaired  
•  Depends  on  design  
•  Analysis  of  Variance  (>2  means)  
•  Assump8ons  sa8sfied?  
–  Normal  distribu8on,  equal  variances  
–  If  not,  use  non-­‐parametric  tests  
•  Mann-­‐Whitney  U  test,  Kruskal-­‐Wallis  

More than 2 Variables

• 1 dependent and > 1 independent variables
   • E.g. BP, BMI, age
   • Useful when there is a need to control for confounders
   • Association between BP and BMI, after controlling for age

• Multiple regression (see the sketch below)
   • Very powerful analysis
   • Ensure conditions are satisfied
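As a hedged illustration of the BP/BMI/age example, a multiple regression could be fitted with the statsmodels package; the data and variable names below are invented for the sketch:

```python
# Sketch: multiple regression of blood pressure on BMI, controlling for age (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({"age": rng.uniform(20, 70, n), "bmi": rng.uniform(18, 35, n)})
df["sbp"] = 90 + 0.5 * df["age"] + 1.2 * df["bmi"] + rng.normal(0, 8, n)

X = sm.add_constant(df[["bmi", "age"]])   # independent variables plus intercept
model = sm.OLS(df["sbp"], X).fit()        # dependent variable: systolic BP

# The bmi coefficient now reflects the BP-BMI association after adjusting for age.
print(model.summary())
```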

Choosing a test by goal and data type:

Goal: Describe one group
  Scale, normal distribution: Mean, SD
  Rank, score, or non-normal scale: Median, interquartile range
  Categorical: Proportion / percentage

Goal: Compare one group to a hypothetical value
  Scale, normal distribution: One-sample t test
  Rank, score, or non-normal scale: Wilcoxon test
  Categorical: Chi-square goodness of fit

Goal: Compare two unpaired groups
  Scale, normal distribution: Unpaired t test
  Rank, score, or non-normal scale: Mann-Whitney test
  Categorical: Chi square (Fisher's for small samples)

Goal: Compare two paired groups
  Scale, normal distribution: Paired t test
  Rank, score, or non-normal scale: Wilcoxon test
  Categorical: McNemar's test

Goal: Compare three or more unmatched groups
  Scale, normal distribution: One-way ANOVA
  Rank, score, or non-normal scale: Kruskal-Wallis test
  Categorical: Chi-square test

55
Analyzing Data 105

Always:
• Get feedback/support
• Consult a (bio)statistician
106 Advanced Monitoring and Evaluation Workshop Manual
Epi Info Tutorial 107

9 Epi Info Tutorial

Learning Objectives:

By the end of this session participants will be able to:


• Identify the various factors to consider that would determine
the most appropriate approach to data analysis
• Appreciate key data processing and analysis concepts and
issues
• Use the Epi Info software for data processing and basic
analyses
• Identify the most appropriate analyses for different types of
data
108 Advanced Monitoring and Evaluation Workshop Manual
Epi Info Tutorial 109

About  Epi  Info  


•  Epi  Info  is  public  domain  sta5s5cal  so6ware  
for  epidemiology  developed  by  the  Centers  
for  Disease  Control  and  Preven5on  (CDC)  

•  The  program  allows  for  electronic  


ques5onnaire  crea5on,  data  entry,  and  
analysis,  report  wri5ng  and  mapping    

Main  Modules  

Make  View  Module  


•  Allows  users  to  create  ques5onnaires  and  
data  entry  forms  called  Views.    
•  Users  place  ques5ons  and  data  entry  fields  on  
one  or  more  pages  of  a  View  and  tailor  the  
data  entry  process  with  condi5onal  skip  
paJerns,  data  valida5on,  and  custom  
calcula5ons  programmed  by  the  user  using  
MakeView's  Check  Code.  
110 Advanced Monitoring and Evaluation Workshop Manual

Enter  Data  
•  The  Enter  module  automa5cally  creates  the  
database  from  the  ques5onnaire  in  
MakeView.    
•  Users  enter  data,  modify  exis5ng  data,  or  
search  for  records.    
•  The  Views  are  displayed  and  users  perform  
the  data  entry  while  the  Check  Code  validates  
the  data  or  performs  any  automa5c  
calcula5ons  that  were  specified  in  MakeView.    

Analyze  Data    
•  The  Analysis  module  is  used  to  analyze  data  
entered  with  the  Enter  module  or  data  imported  
from  24  different  data  formats.    
•  Sta5s5cs,  tables,  graphs,  and  maps  are  produced  
with  simple  commands  such  as  READ,  FREQ,  LIST,  
TABLES,  GRAPH,  and  MAP.    
•  As  each  command  is  run,  it  is  saved  to  the  
program  editor  where  it  can  be  customized  and  
saved,  shared,  and  used  in  the  future  as  data  are  
revised.  

EPI  Reports  
•  The  Epi  Report  module  is  a  user-­‐friendly  tool  
to  create  professional  custom  reports  that  
include  results  from  the  Analysis  output.    
•  Can  combine  Analysis  output  with  data  from  
Enter  as  well  as  other  sources  such  as  Access.    
•  Reports  can  be  saved  as  HTML  files  for  easy  
distribu5on  or  web  publishing  
Epi Info Tutorial 111

Epi Map
• The Epi Map module displays geographic maps with data from Epi Info.
• Epi Map displays files containing geographic boundaries layered with data results from the Analysis module.

Creating a View

Creating a New Questionnaire

1. Create/Open Project:
   Select File>New, type a Project File Name and click Open.

2. Create a new View in the Name the View window:
   Note: You can also create a New View in an existing Project by selecting File>New, typing the new View name and clicking Ok.

3. Type a View name and click Ok.

4. Right-click in the View; the Field Definition dialog box opens to begin creating fields.

Field Definition Box

• The Field Definition dialog box offers options for entering the Question or Prompt, the Field or Variable Type, the Field Name and other characteristics.

Field Names
Field Names are the unique "variable names" used when analyzing the data.
– Field names may have a number in them, but cannot start with a number
– Field names may not have symbols or spaces in them
– Field names should be logical and easy to recall for later analyses
Field Names are formed when entering text in the "Question or Prompt" box, but can be edited. Before you go to the Enter Data program, make sure you are happy with the field names, because once the data entry screen has been opened, field names cannot be changed unless the data are deleted (DANGEROUS).

Selecting Field Types

Field/Variable Types summary:
– Label/Title – For display only; these fields do not hold data.
– Number – Receives numeric data (this field type must be used for later calculations, such as averages). Number patterns can be selected, such as #, ##, ###, ##.##
– Text – Receives text, numbers (these numbers cannot be used for later calculations), or symbols.
– Multiline – For more than one line of text, e.g. addresses, comments.
– Phone Number – Special field in phone number format.
– Date – Special field in various date formats.

Field Types cont'd

Text (Uppercase) – Converts text entries into capital letters to ensure correct analysis later. For example, to avoid confusing 'female' with 'Female', this function will convert all entries to 'FEMALE' and analyze them as such.
Check Box – Used with 'check all that apply' questions.

Check Code
• Check Code is a series of commands that tell Epi Info that you want to perform certain "checks" of your data as they are being entered.
• By using Check Code, you can protect your data against many common types of errors and make data entry easier.
• It is helpful when you have more than one person entering data.
• There are two ways to create Check Code in Epi Info:
  – Set the code in the Field Definition dialog box
  – In MakeView, click on the "Program" button and begin to build (write) the code.
• Check Code should be created when you are creating and modifying your "view" using the MakeView program (i.e., before data entry).

Formatting Data Entry Screens

• Aligning data entry fields
  – Use the mouse to draw a dotted-line box around the field(s) to be aligned.
  – Under the "Format" heading, select "Alignment" and choose whether you want horizontal or vertical alignment.
• Using background grids
  – Under the "Format" heading on the menu, choose "Settings".
  – The dialog box allows you to:
    • turn the "snap to grid" feature on or off
    • turn the grid on or off
    • specify character widths between grid lines
    • choose whether you want the prompt or field to "snap to grid"

Formatting Data Entry Screens

• Text and their corresponding fields can be cut, copied and pasted between pages of a View.
• To use the cut/copy/paste techniques:
  – use the mouse to draw a dotted-line box around the fields to manipulate (both text AND field)
  – under the "Edit" heading on the menu, choose the desired action, or use the short-cut keys (e.g., to cut, press Ctrl+X).

Tab Order
• The default order for data entry is the order in which the fields were created (and then modified).
• Thus, you might want to change the order of fields for data entry:
  – Click "Edit" from the MakeView menu
  – Select the "Order of Field Entry (Taborder)" option
  – By clicking the "Up" or "Down" key, you can change the order of fields for data entry.

Entering Data

Steps for Entering Data

• From the Epi Info™ main page, click Enter Data. The Enter window opens.
• Select File>Open.
• Open the project you created in MakeView. The Select a Table dialog box opens.
• Select the relevant View.
• Click OK. The Data Table dialog box opens.
• Click OK to create a New Data Table. The New Data Table dialog box opens.

Steps for Entering Data cont'd

• Click OK to keep the Data Table Name and Unique ID set as the default. The Enter page opens with the view you made in MakeView.
• Notice you are in Record 1 of 1 on Page 1, titled Personal Information.
• The Enter Page Panel displays a list of pages created in the project. Data entered into a page are automatically saved; however, you should use the Save Data button or File>Save after making changes to existing records.

Retrieving Records
• You can quickly find records in one of three ways:

1. Using selected criteria – i.e. using the "Find" button.
   • In "Choose search field(s)", select the field(s) that define(s) your criteria.
   • In the box next to the field you selected, type in the criteria.

2. Specifying a record number

3. Browsing through the records

Analyzing Data

READ Command
• The READ command imports your data so that they can be analyzed.
• Specify the:
  – data source (project)
  – data format
  – data table
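
For illustration, a READ command typed in the Analysis program editor might look something like the line below; the project path and View name here are hypothetical placeholders, not part of the workshop dataset:

    READ {C:\Projects\ClientSurvey.mdb}:viewClientSurvey

Running it opens the specified View so that the commands that follow (FREQ, TABLES, MEANS, etc.) operate on those records.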

Data Formats to Analyze

• Access 97, Access 2000
• DBASE III, DBASE IV, DBASE 5.0
• Epi 2000
• Epi6, Epi6 Direct Read
• Excel 3.0, 4.0, 5.0, 8.0
• FoxPro 2.0, 2.5, 2.6, 3.0
• HTML
• ODBC
• Paradox 3.x, 4.x, 5.x
• Text, Text (delimited), Text (fixed)

Frequency Distributions
• The FREQ command produces a frequency table for the specified variable(s).
• The resulting table shows:
  – how many records have each value of the field
  – the percentage of the total
  – the cumulative percentage
  – 95% confidence intervals for each value

TABLES Command
• The TABLES command produces a cross-tabulation of two or more categorical variables, for example:

                                         OUTCOME (Dependent Variable)
                                                +         -
  EXPOSURE (Independent Variable)       +      20        40
                                        -      37        59

MEANS Command
• The MEANS command produces descriptive statistics for one continuous variable.
• The statistics include:
  – Mean
  – Median
  – Mode
  – Min/max
  – Quantiles
  – Variance/standard deviation
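
For illustration only, a short Analysis session combining the commands described above might look like the sketch below; the project path and the variable names (Sex, Exposure, Outcome, Age) are hypothetical placeholders:

    READ {C:\Projects\ClientSurvey.mdb}:viewClientSurvey
    FREQ Sex
    TABLES Exposure Outcome
    MEANS Age

Here READ opens the data, FREQ produces a frequency table for Sex, TABLES cross-tabulates Exposure by Outcome (a 2 x 2 table like the one shown above), and MEANS summarizes the continuous variable Age.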

Mock Database
• Age – Age
• Sex – Sex
• Marital Status – Married
• Education Level – Educ
• Ethnicity – Ethnicity
• Sexual intercourse with a non-regular partner within the last 12 months? – Sex_Nonreg_partner
• How many non-regular partners have you had sex with within the last 12 months? – No_reg_partners
• Did you or your partner use a condom the last time you had sex with your most recent non-regular partner? – Condom_nonregpartner
• Have you had an STI within the last 12 months? – STI

Questions
• What should be our plan for analysis?
• What type of statistical analysis should we perform to answer our research questions?
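
One possible starting point, sketched here using the mock database's own field names, is shown below; the project path is a hypothetical placeholder and the commands are only an illustration of how the analysis could begin, not a complete analysis plan:

    READ {C:\Projects\MockDatabase.mdb}:viewMockDatabase
    MEANS Age
    FREQ Sex Married Educ Ethnicity
    FREQ Sex_Nonreg_partner Condom_nonregpartner STI
    TABLES Condom_nonregpartner STI
    MEANS No_reg_partners Sex

MEANS Age describes the age distribution, the FREQ commands profile the categorical variables, TABLES cross-tabulates condom use at last sex with a non-regular partner against reported STI, and MEANS No_reg_partners Sex compares the number of non-regular partners across the sex categories. Whichever tests appear in the output should be checked against the test-selection table from the previous session (e.g. chi-square for the cross-tabulation; a t test or its non-parametric equivalent for the comparison of means).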

10 Managing
and Resource
Requirements for
Evaluations
Learning Objectives:

By the end of this session, participants will be able to:

• Identify the Key Steps in Managing the Evaluation Process

• Develop Terms of Reference for an Evaluation

• Estimate Time and Resource Requirements



The 'Process' for Conducting an Evaluation – An Overview

• 'Process' is critical for ultimate acceptance of evaluation results

• Important for both 'small' and 'large' scale studies

• Requires good project planning and management skills

Key Steps in Managing the Evaluation Process

From start to finish, there are at least 8 key steps to observe:
1. Getting started
2. Initial planning
3. Terms of Reference (TOR)
4. Selection of the Evaluator/Evaluation Team
5. Evaluation Work Plan
6. Fieldwork Phase
7. Reporting on Evaluation Results (Oral & Written)
8. Follow-up to the Evaluation

Developing Terms of Reference for an Evaluation

• Often overlooked, often poorly conceptualized

• The terms of reference (ToR) document defines all aspects of how a consultant or a team will conduct an evaluation

• Presents an overview of the requirements and expectations of the evaluation

• Provides an explicit statement of the objectives of the evaluation, the roles and responsibilities of the evaluators and the evaluation client, and the resources available for the evaluation

Developing Terms of Reference for an Evaluation

A ToR provides clearly detailed parameters for:

• Why and for whom the evaluation is being done
• What it intends to accomplish
• How it will be accomplished
• Who will be involved in the evaluation
• When milestones will be reached and when the evaluation will be completed
• What resources are available to conduct the evaluation

Outline of a ToR for an Evaluation

• Content of the Terms of Reference
• Background Information and Rationale
• Specific Objectives of the Evaluation
• Approach and Methodology
• Professional Qualifications
• Deliverables and Schedule
• Budget and Payment
• Submission Guidelines

Clarifying the Rationale for the Evaluation

• Ask basic questions about why the study is needed – What kind of advice is needed? What questions need to be answered?

• Is the evaluation intended to:
  * advise on the operation or delivery of the programme? (process or implementation kinds of issues)
  * advise on the success or continuation of the programme? (outcome or impact kinds of issues)

• Who needs this information?

• What level of detail is required?

• By when is the information needed? In what format?

Objectives of the Evaluation

Range of possible evaluation questions:

• Does the programme rationale still hold? Is there a need for the programme? (Programme rationale)

• Does the delivery mode need to be modified or alternatives to the programme developed? (Alternatives)

• How well is the programme performing? Is it meeting its objectives? (Programme success)

• What results is the programme achieving, both intended and unintended? (Programme impacts & effects)

• Management-oriented issues re programme delivery, efficiency, etc. (Programme process)

• Cost-effectiveness of the programme

Approach and Methodology

• What resources ($, people) have been/can be brought to the evaluation study?
• Is this a good time to evaluate? Why or why not?
• Data issues: availability of data; need for primary research (e.g. a survey); complexity of field research (e.g. many sites, regionally distributed); etc.
• Time constraints in delivering on the results of the evaluation?
• Is programme management supportive and knowledgeable about the evaluation?
• Prior evaluation or research in this area?

Design Considerations
• Advisable to use multiple lines of evidence

• Use both quantitative and qualitative methods

• Build a team with different specialties (social sciences, economics, procurement, accounting, etc.)

• The specific methodology and evaluation design depend on the nature of the questions being asked, the resources available for the evaluation, and time constraints

• Need to balance methodological rigor with the practical realities of resource and time constraints


Address the following questions for determining the appropriate approach:

1. What are the possible sources of data, i.e. where is the data/information located?
2. What are the most appropriate data collection methods, i.e. how will we get the data?
3. Will existing data be sufficient to answer the evaluation questions OR will new data collection need to be undertaken?
4. Will the study be designed so as to collect information at one point in time (post-programme intervention) OR will data be collected at two points (pre- and post-)?
5. Will a control or comparison group be used to compare the impact of the programme on clients?
6. What are the cost, technical and time considerations (reality check)?

Methods Appropriate for the Particular Study

• It depends…

• Rely on existing data as much as possible:

  * Programme document review – generally a starting point
  * Literature review
  * Administrative records
      – a key source of information on activities, costs, outputs, etc.
      – depends on the quality of information systems
  * Programme case files
  * Past evaluations or other reviews
  * Studies currently ongoing (e.g. internal audits or policy reviews)


 

Time and Resources for an Evaluation

• Time and resource requirements can vary widely across different evaluations

• Need to find the right balance between a methodology/design that will provide credible results AND one that will yield results in a timely fashion and within budget

• Any one study can generally be conducted via different approaches – with implications for methodological purity, credibility of results, as well as timing and costs

• Evaluation = science + art

Factors Influencing the Timing & Resourcing

• What is driving the need for the evaluation? – Interim or impact? Is more than one organization implicated?

• Nature & complexity of the programme – multi-site? National vs local? etc.

• Accessibility to programme clients

• Availability & quality of programme administrative records

• Availability of in-house expertise for the evaluation

• Need for specialized expertise

• Need for data collection

• Sensitivity of evaluation issues and the decisions to be taken with the results

Additional Reading

• Using Evaluation – A Practical Guide for Non-Evaluators. Lahey, R. (2008)

• Canadian Evaluation Society (CES) Guidelines for ethical conduct

• American Evaluation Association (AEA) Guiding principles for evaluators

• Joint Committee on Standards for Educational Evaluation programme evaluation standards

• OECD DAC Summary of Key Norms & Standards – Reference: Morra Imas, L. & Rist, R. (2009)

• Websites
  • CES: www.evaluationcanada.ca
  • AEA: www.eval.org
  • AES: www.aes.asn.au

11 Ethical
Considerations

Learning Objectives:

By the end of this session, participants will be able to:

• Understand the important ethical considerations for


evaluations

• Know when and where to obtain ethics approval



Ethics
• The study of the fundamental principles that define values and determine moral duty and obligation

• The "science (study) of morality"

• A set of moral principles or values

• The principles of conduct governing an individual or group; concerns for what is right or wrong, good or bad

History of Research Ethics

• Nazi medical war crimes (WWII)
  – Experiments on prisoners in concentration camps
  – Injection with viruses, immersion in ice, etc.

• Nuremberg Code (1947)
  – Developed after the trial of 23 persons for 'crimes against humanity'
  – Basis of international ethics codes
  – Voluntary consent of the participants is absolutely essential

• Tuskegee Syphilis Study
  – In 1932, 400 Black men in Alabama, USA and 200 controls were followed to examine the natural history of untreated syphilis
  – By 1936, more complications were observed in the syphilis group
  – By 1946, mortality was 2 times greater
  – Penicillin was discovered in the 1940s but patients were not informed or treated
  – Publicized in the press in 1972

International Guides
• Helsinki Declaration (1964…2008)
  – Developed by the World Medical Association
  – Sets fundamental principles:
    • Design and performance of research
    • Informed consent
    • Ethical review
  – http://www.wma.net/en/30publications/10policies/b3/

• CIOMS International Ethical Guidelines
  – Council for International Organizations of Medical Sciences
  – Established by WHO and UNESCO to serve the interests of the biomedical community
    • 2002 – Biomedical research guidelines
    • 2008 – Epidemiological studies guidelines
  – http://www.cioms.ch/

Four Guiding Principles

• Respect for persons
  – Informed consent, privacy, confidentiality of information
• Beneficence
  – Maximize benefits for individual participants and society while minimizing risks
• Non-maleficence
  – Above all, do no harm
• Justice
  – Fair selection of participants
  – Protection of vulnerable groups – special safeguards

Overview – Ethical Requirements for Studies

• Value of the research
  – Only justified if society would gain from the exercise
• Scientific validity
  – Design and methods produce valid results
• Fair selection of participants
  – Vulnerable, 'over-studied' groups
• Respect for persons
  – Participants must be treated with dignity, respect and humanity, and efforts must be made to safeguard their welfare
• Favourable risk-benefit ratio

• Informed consent

• Independent (local) review of the protocol
  – Sensitivity to local culture

Risk vs. Benefit

• Must estimate and minimize harm
  – Physical, psychological, social, legal, economic
• Must enhance benefits
  – Individual – improved health
  – Society – generation of knowledge
• Compare risks and benefits
  – No simple formula, but
  – The greater the risk, the greater the benefit should be
  – Consider benefit to the individual and society
    • Sometimes there is no benefit to the individual

Informed Consent/Assent
• Individuals have the right to choose whether to participate (un-coerced) and to stop at any time without any penalty

• Must be informed about purpose, methods, risks, benefits, alternatives

• A proxy decision-maker is needed for persons with diminished capacity

• Children (below 18 years)
  – need the permission of a guardian
  – must also give assent

Consent Form
• Explanation of the purpose of the study
• Duration of participation
• Description of procedures, potential risks and benefits
• Disclosure of alternative procedures
• Description of how confidentiality will be maintained
• Whether compensation will be given or if treatment is available if injured
• Person to contact for additional information about the study – address, phone numbers, etc.
• Participation is voluntary, with no penalty or loss of benefits if the person refuses
• Participant has read the form and understands it, and their signature indicates agreement
• Signature of a witness is also needed

Confidentiality/Privacy
• Is there a need for identifiers such as name and address?
  – Potential for social and other harm

• Use of medical records
  – All identifiable data must be blocked out
  – Data collected for clinical or administrative purposes, not research
    • If no consent is received and IDs are not blocked out, then privacy is breached by the investigator
    • If no consent and identifiable data are released to a third party, confidentiality is breached

Fair Selection of Participants

• Be aware of social biases
  – reduced autonomy

• Selection should be related to the topic being studied, not simply due to availability

• All relevant groups should be given the opportunity to participate
  – Women and children

• Participants should have access to the benefits of the research
  – those who benefit should also share the risks

Vulnerable Populations
• Diminished ability to protect their interests
• Need special justification
• Special safeguards
  – Children
  – Pregnant women
  – Fetuses
  – Mentally disabled persons
  – Terminally ill persons
  – Prisoners
  – Persons in dependent positions
  – Disadvantaged – economically, educationally

Recruitment of Research Participants

• Conflicts of interest
  – Financial, career
  – Disclose all potential conflicts to the ethics committee
• Equitable
  – Fair procedures and outcomes in the selection of participants
  – Must consider vulnerable populations
• Strategies
  – Random selection?
  – Incentives – volunteers

Incentives
• To encourage people to participate
  – Need to recruit adequate numbers in a timely fashion

• Offered to participants (and recruiters such as doctors)

• Must avoid exploitation and minimize coercion and undue influence

Incentive vs. Undue Influence

• Overlap – a continuum
• Undue influence
  – An offer one cannot refuse
  – Controlling and irresistible
  – Problems:
    • Safety – chance of distorted judgment relating to risks
    • Representativeness of the sample
    • Compromised informed consent process

Money as an Incentive

• A special situation?
  – Compared with access to treatment, care, etc.
• Money
  – Not a form of coercion, as it is not a 'threat of harm to compel an individual to do something', but it can be an undue inducement
  – Is there no risk, or an acceptable level of risk?

• Payment has been around for some time – since the 1820s

Research Ethics Committees / Institutional Review Boards
• Must review the protocol and give permission BEFORE data collection begins
  – Can request changes or refuse to allow the conduct of unacceptable research

• Also have a role in monitoring research to ensure participants' rights are not violated
  – The investigator has a duty to properly record and store all data and supply the committee with monitoring information (e.g. adverse events)

• All research on humans should be submitted for approval
  – Let the committee decide if approval is needed

Process in Securing Permission

• Identify the relevant ethics committee
  – Ministry of Health
  – Not all Caribbean countries have RECs
  – The Chief Medical Officer is ultimately responsible

• Submit the application
  – Follow the guidelines
  – Allow sufficient time for review
  – If necessary, address concerns and resubmit

• Follow all instructions in the approval letter
  – Reports, etc.

Application Document
• Title page
  – Names of applicants, institution, dates (application, start of study, duration)

• Details of the research study
  – Background, objectives, study design, study population, sampling and sample size, data collection procedures, data analysis
  – Ethical issues
    • Informed consent process, risk/benefit analysis, vulnerable groups, conflicts of interest

• Appendices
  – Consent form
  – Data collection instrument

On-line Tutorial
• For NIH-funded grantees
• Human Participant Protections Education for Research Teams
• http://bioethics.od.nih.gov/casestudies.html
• Tutorial and exercise
• Certificate

Regional Initiatives
• Bioethics Society of the English Caribbean
  – www.bioethicscaribe.org.jm

• Caribbean Research Ethics Initiative
  – www.caribbeanethics.com

Session Summary
• All research and evaluations must be ethically conducted
• Ethical research/evaluation:
  – Demonstrates the value of the research/evaluation
  – Is well designed (scientifically valid)
  – Demonstrates fair participant selection
  – Respects persons
  – Has a favourable risk-benefit ratio
  – Acquires informed consent
  – Undergoes independent review

Session Summary
• Ethics approval must be sought before the fieldwork is undertaken
• Recommendations from the ethical review must be incorporated into the research/evaluation design
• The ethics review process differs from country to country
• Ethics review may be done by the Ministry of Health, an Ethics Committee or an Institutional Review Board

12 Challenges to
Conducting
Evaluations

Learning Objectives:

By the end of this session, participants will be able to:

• Discuss some of the strategic and methodological challenges associated with evaluation practice

• Consider strategies to address those challenges



There is often…

An adequate budget but a lack of data
• No comparable baseline data and/or inability to include a comparison group in the evaluation design

A limited budget but plenty of time
• National evaluation teams may not have the resources to bring in foreign expertise or to conduct large-scale sample surveys – but they may have plenty of time to use qualitative methods and small-scale longitudinal studies

An adequate budget but limited time
• This is often the situation when external evaluators are contracted to work under tight deadlines and with limited time in the field

Responsibilities of the Evaluator

• Achieve the maximum possible evaluation rigor within the limitations of a given context

• Identify and control for methodological weaknesses in the evaluation design

• Negotiate with clients the trade-offs between desired rigor and available resources

• When presenting findings, it is important to recognize methodological weaknesses and how they affect generalization

Suggested Solution
• The RealWorld Evaluation approach developed by Jim Rugh and Samuel Bickel (based on joint work with Michael Bamberger and Linda Mabry). African Evaluation Association (AfrEA)

• Developed to assist evaluators to conduct evaluations that are as methodologically sound as possible when operating with budget and time constraints, limitations on the types of data to which they have access, and political pressures to produce RESULTS

What is Special About the RealWorld Evaluation Approach?
• There is a series of steps, each with checklists for identifying constraints and determining how to address them

Steps in the RealWorld Evaluation (RWE) Approach
• Step 1: Planning and scoping the evaluation
• Step 2: Addressing budget constraints
• Step 3: Addressing time constraints
• Step 4: Addressing data constraints
• Step 5: Addressing political constraints
• Step 6: Assessing the strengths and weaknesses of the evaluation design
• Step 7: Addressing the weaknesses and strengthening the evaluation design

Step 1: Planning and scoping the evaluation
• Define client information needs and understand the political context

• Define the programme theory model

• Identify the time, budget, data and political constraints to be addressed by the RWE

• Select the design that best addresses client needs within the RWE constraints

Step 2
Addressing budget constraints
• Modify the evaluation design

• Rationalize data needs

• Look for reliable secondary data

• Revise the sampling strategy

• Use economical data collection methods

Step 3
Addressing time constraints
• All Step 2 tools, plus:

• Commissioning preparatory studies

• Hiring more resource persons

• Revising the format of project records to include critical data for impact analysis

• Modern data collection and analysis technology

Step 4
Addressing data constraints
• Reconstructing baseline data

• Recreating comparison groups

• Working with non-equivalent comparison groups

• Collecting data on sensitive topics or from difficult-to-reach groups

• Multiple methods

Step 5
Addressing political constraints
• Accommodate pressures from funding agencies or clients on the evaluation design

• Address stakeholder methodological preferences

• Recognize the influence of professional research paradigms

Step 6
Assessing the strengths and weaknesses of the evaluation design
• Identify threats to the validity of quasi-experimental designs

• Assess the adequacy of qualitative designs

• Use an integrated checklist for multi-method designs

Step 7: Addressing the identified weaknesses and strengthening the evaluation design

• Objectivity/confirmability

• Replicability/dependability

• Internal validity/credibility/authenticity

• External validity/transferability/fit

• Utilization/application/action orientation

In Conclusion - 1
Evaluators must be prepared to:

• Enter at a late stage in the project cycle;
• Work under budget and time restrictions;
• Not have access to comparative baseline data;
• Not have access to identified comparison groups;
• Work with very few well-qualified evaluation researchers;
• Reconcile different evaluation paradigms and the information needs of different stakeholders

In Conclusion - 2
• Evaluators must be prepared for real-world evaluation challenges
• There is considerable experience to draw on
• A toolkit of rapid and economical "RealWorld" evaluation techniques is available
• Never use time and budget constraints as an excuse for sloppy evaluation methodology
• A "threats to validity" checklist helps keep you honest by identifying potential weaknesses in your evaluation design and analysis

13 Writing an
Evaluation Report

Learning Objectives:

By the end of this session, participants will be able to:

• Know why evaluation reports should be written

• Understand the components of an Evaluation Report

• Generate an outline for an Evaluation Report



Why Write an Evaluation Report?

• Formal record
What you discovered in conducting an evaluation, in terms of both process and evaluation results, may be applicable to future programmes. An evaluation report is assurance that lessons learned are available for future application.

• Your work can help others
Sharing your evaluation report with peers who may be considering the development of similar programmes may help them to design their programmes more effectively.


Why Write an Evaluation Report?

• Setting a foundation for future evaluation efforts
It's much easier to design an evaluation based on former experience than to start "from scratch". A report outlining what you did, and why, as well as what worked and what should be altered in the future, provides a solid base from which to plan a new evaluation.


A Note on Style and Format

• Good evaluation reports strike a balance between depth and length, so as a general rule reports should be kept to less than 40 pages, with additional materials provided in annexes.

• Another trade-off is between the use of sophisticated 'scientific' language and a graphic narrative that captures and holds the interest of the reader. Combine different ways to convey information, such as good quotes, small case studies set in text boxes, and key data displays using clearly labeled data tables and charts.


A Note on Style and Format

• Evaluation reports must be credible and convincing to the immediate stakeholders and client. The evaluation report is the primary instrument by which an audience beyond the client may be reached; thus, it is essential that the report be clear, concise, powerful, and persuasive to the general professional reader.

Components of an Evaluation Report

• Cover Page
• Table of Contents
• List of Figures and/or List of Tables
• Preface or Foreword
• Acknowledgements
• Project Summary
• Glossary
• Acronyms
• Executive Summary
• Introduction
• The Problem
• Purpose of the Evaluation
• Design & Methodology
• Findings
• Conclusion
• Limitations
• Recommendations
• Lessons Learned
• Annex
   
 
 

Components of an Evaluation Report

• Cover Page, Table of Contents, List of Figures and/or List of Tables
• Preface or Foreword
  – The preface or foreword is usually prepared if the report is part of a series or needs special explanation
• Acknowledgements
  – This is where you thank all those persons who provided support during the evaluation process
 

Components of an Evaluation Report

• Project Summary
  – The project summary gives key information on the project, including but not limited to:
    ✓ Project name and Strategic Objectives
    ✓ Project Number
    ✓ Life of the project
    ✓ Implementing Partners and contract numbers
    ✓ Project Funding


Components of an Evaluation Report

• Glossary
  – The glossary is an especially useful section if the report utilizes technical or project-specific terminology that would be unfamiliar to the otherwise generally informed reader.
• Acronyms
  – Write out the proper names for all the acronyms used in the report. Remember also to do the same in the report on first use.

 

Components of an Evaluation Report

• Executive Summary
  – The Executive Summary is an abbreviated version of the most important parts of the report.
  – Everything in the Executive Summary should be based directly on what is in the report. No new information should be presented in the Executive Summary.
  – Generally, an Executive Summary should be between three and five pages (depending on the length of the report).
  – From the Executive Summary, the average busy reader should come away with an understanding of what the project was about, the main evaluation questions, key findings, and major conclusions and recommendations.


Components of an Evaluation Report

• Introduction
  – The introduction should inform the reader about the context in which the intervention took place.
  – This would include a summary of the relevant history, demography, socio-economic status, and basic political arrangements of the community, country, or region.
  – If the evaluated activities take place in sub-areas of the host country, or if there are other contextual spatial aspects that would help the reader to visualize the task environment, include a map of the target area here or in the next section of the evaluation report.


Components of an Evaluation Report

• The Problem
  – Every evaluation needs to place the project it examines in context by describing the problem and the nature of the intervention.
  – Despite its importance, many evaluation reports give only cursory attention to the problem.
  – It represents the "gap" between the reality at the start of the intervention and a more desirable condition the intervention attempted to create.
  – When the problem is not well defined or is wrongly stated, the ensuing intervention may be well done (efficiency), but will have little impact (effectiveness).


Components of an Evaluation Report

• The Problem Statement
  – It should explain what it was that an activity or programme set out to fix or change. It should also describe any theories that were set forth in design documents, or that emerged from other literature the evaluation team reviewed, that attempt to explain why the problem exists.
  – When baseline information is available, it can be helpful to include charts and tables with the quantitative data describing the baseline conditions at the beginning of the project.


Components of an Evaluation Report

• The Theory of the Intervention
  – Now the reader knows what the intervention was about. Laying out the project's design and implementation structure helps the reader to understand your evaluation findings and conclusions.
  – What was the activity or programme about? In this section the evaluator must provide the reader with a concise picture of:
    a. What was the project going to do?
    b. What were the objectives?
    c. How was it to be done?
    d. Where was it to be done?
    e. Who was going to do it?
    f. At what cost?

Components of an Evaluation Report

• Purpose of the Evaluation
  – This section sets out the purpose of the evaluation and describes the intended uses of the evaluation.
  – Stating the purpose of the evaluation helps the reader to understand the management or programme decision that led to the evaluation, and to frame their own critical reading of the report from that perspective.
  – Some evaluations are mandatory; still, it is useful to go beyond "We have to do it" as an answer to the question of "why" the evaluation is being done.
  – This section should also include the questions that the evaluation will attempt to answer.


Components of an Evaluation Report

• Design and Methodology
  – The credibility of an evaluation's conclusions rests on the quality of the evidence that supports them which, in turn, depends on the appropriateness of the design and methodology for data collection and analysis used by the evaluation team.
  – In this section the evaluation design (quasi-experimental, mixed methods, longitudinal data, randomized trials, etc.) and the methods used to collect data should be presented in summary form.
  – The description should include the unit of analysis, selection of samples, data collection instruments, types of data collected, analytic techniques used, who did it, and when it was done.


 
 
 
 
• Design and Methodology continued…
  – Questionnaires, observation checklists, descriptions of sampling procedures, data analysis procedures, and other supporting materials should be included in the Annex.
  – Here, if space permits, a very useful summary chart can be displayed which aligns the evaluation questions with the data type and source used to answer each question.


Components of an Evaluation Report

• Data Analysis
  – Analysis is an integral step in the process by which an evaluation team arrives at its findings. Often there are several steps in an analysis process, including:
    1. Tabulation of raw data and calculation of frequencies and percentages for a survey, set of structured observations, or physical measures.
    2. Comparison of data for a target area or group to non-target areas or groups (controls), comparisons to international norms or standards, and/or comparison to expectations.
    3. Examination of various explanations as to WHY a result or impact has occurred.
  – This is a critical part of the evaluation team's responsibility to explain, rather than just observe. For every finding, the team needs to discuss as many alternative explanations as possible.


Components of an Evaluation Report

• Findings
  – This section must, at a minimum, address the questions posed in the evaluation protocol. If there were multiple questions related to a single macro question, the answers should be bundled in a manner that contributes to answering the macro question.
  – The substance of the Findings section depends very much on the questions that have to be answered. In general, these findings can be presented in two parts:
    a. Process Findings about the Management and Implementation of the programme
    b. Outcome and Results Findings about the Project's Achievements and Consequences


Components of an Evaluation Report

a. Process Findings about the Management and Implementation of the programme
  – In many projects, how the project is implemented may be as important a finding in an evaluation as whether the objectives were achieved. The questions generally answered in a process evaluation are:
    • Were the right people in place?
    • Were the inputs well organized and timely?
    • Were reports and evaluations done and were they used?
    • Did the implementation process interfere with or advance the achievement of stated objectives?
  – The extent to which the evaluation pays detailed attention to management issues is usually a function of the scope of work and whether the evaluation is a mid-term or end-of-term evaluation.
  Note: Process questions are also a principal focus of scale-up projects where the intervention is already known to produce certain results. In these cases, the key evaluation questions examine the degree to which the proper processes are being followed.


Components of an Evaluation Report

b. Outcome and Results Findings about the Project's Achievements and Consequences
  – Results are usually an important evaluation focus. The level of results on which an evaluation reports will be a function of whether the evaluation was carried out at the activity level, intermediate results level or strategic objective level.
  – Some evaluations examine results achieved part-way through the implementation of an activity or programme. Others look at results at the end of a period of time, while other evaluations examine longer-term impact, including the extent to which results were sustained after the end of the project.


Components of an Evaluation Report

• Findings continued…
  – While most evaluation reports include separate sections on findings, conclusions, and recommendations, it is also possible to prepare a report using an alternative structure, e.g. addressing one question at a time and providing findings, conclusions and recommendations on each question separately. This alternative structure works best in situations where the evaluation questions are very different from each other.


Components of an Evaluation Report

• Conclusions
  – Conclusions involve judgments. They emerge when evaluative reasoning is used to interpret evaluation findings.
  – It is in this section that an evaluation team most often sets forth its deductions about why a project succeeded or failed to achieve its intended results. Evaluation findings, in this sense, are similar to a set of medical symptoms, while conclusions are like the diagnosis. Conclusions interpret what findings mean.
  – Conclusions should be the capstone of the evaluation report's narrative thread. They tie together the team's findings and analyses presented in previous sections as well as establish the case for the recommendations in the final section.


Components of an Evaluation Report

• Limitations/Challenges
  – Adding exceptions or limitations to a report allows the reader to understand the situation surrounding the research methods chosen and the results or conclusions reached.
  – In a report, the researcher will acknowledge the limitations of their research method or other aspects of the report in the relevant sections; for example, limitations affecting the methods used will be included in the methodology section of the report.
  – Some report formats specify limitations in another section of the report, such as only in the conclusion, but it is normally the researcher who decides where the limitations should be written.
 
 


Components of an Evaluation Report

• Recommendations
  – This section focuses on the future. It is where you get a chance to say what changes need to be made.
  – Recommendations, in the judgment of the evaluators, may range from minor 'tweaking' of a project design or implementation plan, to major restructuring, or ending the entire project. If the project was a complete success, a team may want to recommend continuation, "scaling up" the intervention to serve more people or a larger area, or an analysis of its replication potential in other regions with similar problems.
  – Prior to preparing a set of recommendations, an evaluation team is encouraged to review the purpose and the uses envisioned when the evaluation was commissioned.


Components of an Evaluation Report

• Good Recommendations:
  a. Follow directly from the evaluation's findings and conclusions;
  b. Are supported by thorough and sound analysis and evaluative reasoning;
  c. Are "actionable," meaning the changes are feasible and can actually be made;
  d. Identify who needs to take the recommended actions, whether the funder, the implementing partner, the private sector or some element of the host government organization.


Components of an Evaluation Report

• Lessons Learned
  – Efforts to derive 'Lessons Learned' are not always appropriate, nor are all clients interested in this. Usually, end-of-project or ex-post impact studies present an opportunity to derive lessons learned, as the project experience is longer and more mature than would be found in a mid-term evaluation.


Components of an Evaluation Report

• The Annex
  – This is a useful place to put important material that doesn't go into the main body of the report, but can be helpful to the reader who wants to know more.
  – Annexes should be written as stand-alone pieces, since readers often turn to them selectively.
  – What usually goes into the annex?
    ✓ The Evaluation Protocol
    ✓ Copies of the data collection instruments
    ✓ A list of persons interviewed
    ✓ Additional information on the project context or design, such as maps
    ✓ A list of evidentiary documents reviewed
    ✓ Details about the methods used, such as sample size calculation, sampling, and data analysis
 


In Closing: A Few Considerations

• Determine who is expected to read the report (specialists/non-specialists) and what use the reader is likely to make of the report before writing it.
• Determine if the report is for internal purposes or public distribution.
• Develop an outline for your report to ensure that all necessary information is included.
• Determine how you will present your findings – by stakeholder group, by themes that emerged, or in some other way.


Some Evaluation Report Writing Resources…

• Independent Evaluation Group of the World Bank website:
  http://www.ieg.worldbankgroup.org
• The ABCs of Evaluation
  http://my.safaribooksonline.com/book/project-management/
• Evaluation Support Guide
  http://www.evaluationsupportscotland.org.uk/downloads/SupportGuide11aug08.pdf
• Monitoring & Evaluation: Tools, Methods and Approaches
  http://siteresources.worldbank.org
• The Art and Architecture of Writing Evaluation Reports
  http://www.tbs-sct.gc.ca/cee/career-carriere/workshops-ateliers/aawer-amrre-eng.pdf

Working Session
• Using the components outlined in this presentation, generate a template for an evaluation report for a programme/project of your choice.
• Please note that a description of what ought to be involved under each component should be included.


Bibliography
American Evaluation Association. Guiding principles for evaluators. http://www.eval.org/publications/
guidingprinciples.asp (accessed September 2nd, 2012).

Bamberger M, Rugh J, Mabry L. 2012. RealWorld Evaluation: Working under budget, time, data, and
political constraints. Sage Publications, Inc.

Becker H. 1970. Problems of inference and proof in participant observation. In H.S. Becker Sociological
work: Method and substance. Chicago: Aldine (Reprinted from American Sociological Review 1958 23:
652 – 660)

Boulmetis J, Dutwin J. 2011. The ABCs of Evaluation: Timeless techniques for program and project
managers. John Wiley & Sons, Inc.

Bradford-Hill A. 1965. The environment and disease: Association or causation? Proceedings of the Royal
Society of Medicine 58: 295–300.

Canadian Evaluation Society. CES guidelines for ethical conduct. www.evaluationcanada.ca/site.


cgi?section=5&ssection=4&_lang=an// (accessed September 2nd, 2012)

Centre of Excellence for Evaluation (Treasury Board of Canada Secretariat). 2004. The art and
architecture of writing evaluation reports. www.tbs-sct.gc.ca/cee/career-carriere/workshops-ateliers/
aawer-amrre-eng.pdf// (accessed September 2nd, 2012)

Cook TD, Reichardt CR. 1979. Qualitative and quantitative methods in evaluation research. Sage
Publications, Inc

Creswell JW. 2002. Research Design: Qualitative, quantitative and mixed methods approaches.
Thousand Oaks, CA: Sage Publications Inc.

Creswell JW, Plano Clark V. 2006. Designing and conducting mixed methods research. Thousand Oaks,
CA: Sage Publications Inc.

Dawkins N. Evaluability Assessments: Achieving better evaluations. www.orau.gov/hsc/


hdspinstitute/2010/session-summaries/wk-21.htm. (Accessed November 11, 2012).

Epi Info Workshop. University of California, Berkeley – Center for Infectious Disease and Emergency
Preparedness [Online]. 2011 [cited 2011 Aug 24].  Available from URL : http://www.idready.org/epi/

Epi Info Tutorials. Centers for Disease Control and Prevention [Online]. 2011 [cited 2011 Aug 24].
Available from URL: http://wwwn.cdc.gov/epiinfo/html/tutorials.htm

Evaluation Support Scotland. Evaluation Support Guide 11: Report writing. www.
evaluationsupportscotland.org.uk/downloads/SupportGuide11aug08.pdf // (accessed September 2nd,
2012)

Fetterman DM, Abraham W. 2004. Empowerment evaluation principles in practice. New York: The
Guilford Press.

Glasgow RE, Lichtenstein E, Marcus AC. 2003. Why don’t we see more translation of health promotion
research to practice? American Journal of Public Health 93 (8):1261-1267.

Groves RM, Fowler FJ, Couper MP, Lepkowski JM, Singer E, Tourangeau R. 2004. Survey Methodology.
New Jersey: John Wiley and Sons.
160 Advanced Monitoring and Evaluation Workshop Manual

Hall D, Hall I. 1996. Practical social research: project work in the community. Hampshire: Palgrave
Macmillan.

Hatry H, van Houten T, Plantz MC, Taylor M. 1996. Measuring programme outcomes: A practical
approach. Alexandria, VA: United Way of America.

Kusek JZ, Rist RC. 2004. Ten Steps to a results based Monitoring and Evaluation system. Washington
DC.: The International Bank for Reconstruction and Development/ The World Bank.

Leviton LC, Collins CB, Laird BL, Kratt PP. 1998. Teaching evaluation using evaluability assessment.
Evaluation 4 (4): 389 -409.

Morgan DL. 1996. Focus Groups as Qualitative Research. Thousand Oaks, CA: Sage Publications Inc.

Morra - Imas LG, Rist RC. 2009. The road to results: Designing and conducting effective development
evaluations. Washington DC.: The International Bank for Reconstruction and Development/ The World
Bank.

Owen J. 2007. Program evaluation: Forms and approaches. Taylor and Francis Group.

Patton MQ. 2002. Qualitative research and evaluation methods. Thousand Oaks, CA: Sage Publications
Inc.

Patton MQ. 2008. Utilization-focused evaluation. Thousand Oaks, CA: Sage Publications Inc.

TCPS 2. 2010. Tri-Council Policy Statement: Ethical conduct for research involving humans. Ottawa.
Interagency Secretariat on Research Ethics.

Posavac EJ, Carey RG. 1992. Program evaluation: methods and case studies. Prentice Hall.

Reichardt CS, Cook TD. 1979. ‘Beyond qualitative versus quantitative methods’. In TD Cook and CS
Reichardt (eds). Qualitative and quantitative methods in evaluation research. Beverly Hills, CA,. Sage.
Pp 7-32.

Scriven M, Roth J. 1990. Needs assessment: concepts and practice. Reprinted in Evaluation Practice
11: 135-44.

Thompson NJ, McClintock HO. 2000. Demonstrating your programme’s worth: A primer on evaluation
for programmes to prevent unintentional injury. Georgia: CDC. www.cdc.gov/ncipc/pub-res/demonstr.htm
// (accessed September 2nd, 2012)

W.K. Kellogg Foundation. 1998. The W.K. Kellogg Foundation evaluation handbook. Battle Creek, MI: WK
Kellogg Foundation. www.wkkf.org/documents/WKKF/EvaluationHandbook/EvalHandbook.pdf //
(accessed September 2nd, 2012)

World Bank. 2004. Monitoring and Evaluation: Some tools methods and approaches. Washington D.C.:
The International Bank for Reconstruction and Development/World Bank.

World Bank. 2011. Writing terms of reference for an evaluation: A how-to guide. Washington D.C.: The
International Bank for Reconstruction and Development/World Bank.

Yarbrough D, Shulha L, Hopson R, Caruthers F. 2011. The program evaluation standards: A guide for
evaluators and evaluation users. Sage Publishers Inc.
© 2012
