WORKSHOP
Definitions, context, objectives and functions
Evaluation Designs Characteristics
Types of Evaluation Models
ABCD Model
CIPP Model
Provus Discrepancy Model
Kerrigan Model
Goal Based Evaluation (GBE)
Goal Free Evaluation (GFE)
Evaluation Designs:
1. Seek to clarify planning alternatives and to improve the program.
2. Loyalty is to a particular program or project; the choice of evaluation topics is determined by the information needs of decision makers.
3. Methodological choices are scientific but non-experimental, and quite often naturalistic.
4. The time frame for the production of results is set by the program.
5. Professional rewards consist in the utilization of findings by decision makers and demonstrated improvement in program implementation.
MODELS of Research Evaluation
Background
The ABCD Model is an evaluation model developed by Jesus Ochave (1994).
It is comparable with classical models of evaluation such as Stufflebeam's CIPP Model and the Provus Discrepancy Model.
It is comprehensive enough to allow the evaluator to evaluate the aspects of the program that bear on its effectiveness.
It is flexible and allows the evaluator to trace the causal factors that explain program effects without necessarily going through a mass of other data.
Four Major Components of the ABCD Model
A - the respondents/subjects/clientele
B - the program/operations
C - the effects
D - the social impact
Each of these components has two dimensions: the intents/plans and the actualities/observations.
STEPS
1. Identify who the respondents are.
2. Provide a program/operation that best suits or answers the needs of the respondents.
3. Check whether the program/operation provided has beneficial effects on the respondents' cognitive or affective capabilities.
4. Verify whether the operation provided makes the respondents useful members of the community.
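The two-dimensional check behind these steps (each ABCD component has an intent/plan and an actuality/observation, and the evaluator compares the two) can be sketched in code. This is a minimal illustration, not part of Ochave's model; the class, field, and function names are assumptions.

```python
# Sketch of the ABCD model's intent-vs-actuality comparison.
# Each component (A: respondents, B: program, C: effects, D: social
# impact) is recorded with what was planned and what was observed.
from dataclasses import dataclass

@dataclass
class Component:
    name: str        # e.g. "A: respondents/subjects/clientele"
    intent: str      # the plan for this component
    actuality: str   # what was actually observed

def matches(c: Component) -> bool:
    """True when the observed actuality matches the stated intent."""
    return c.intent == c.actuality

a = Component("A: respondents", intent="barangay residents",
              actuality="barangay residents")
print(matches(a))  # True
```

In a real evaluation the comparison would of course be qualitative rather than a string equality; the sketch only shows the structure of the check.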
The Model
Spherical structure - symbolizes the flexibility of the components, which depends on the kind of program being evaluated.
Polygon with many sides - represents the many faces or areas of concern of the program.
Vectors/arrows - represent the direction of the effects.
BOTIKA NG BARANGAY
Implementation of the Program/Operation
Effects: Savings; Health-Related Benefits
Clients: Acceptance; Utilization; Satisfaction
Social Impact: Mortality Rate
CIPP MODEL
Daniel Stufflebeam
A decision-focused approach to evaluation
Emphasizes the systematic provision
of information for program
management
Makes evaluation directly relevant to
the needs of decision makers during
the different phases and activities of a
program
Provides useful information to
decision-makers with the overall goal
of program or project improvement
CIPP Model components (Daniel Stufflebeam):
Context: Vision, Mission, Philosophy, Objectives
Input: Faculty, Curricular, Facilities, Library
Process: Monitoring, Instruction, Management, Student Activities, Student Services
Product: Student Performance (Cognitive, Affective, Psychomotor)
Aspects of Evaluation

Aspect of evaluation   Type of decision         Kind of question answered
Context evaluation     Planning decisions       What should we do?
Input evaluation       Structuring decisions    How should we do it?
Process evaluation     Implementing decisions   Are we doing it as planned?
Product evaluation     Recycling decisions      Did it work?
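The aspect-to-decision mapping above is a straight lookup, which can be sketched as a small table in code. The strings come from the slide; the dictionary and variable names are illustrative assumptions.

```python
# CIPP aspects as a lookup table: each evaluation aspect maps to the
# type of decision it informs and the question it answers.
CIPP = {
    "Context": ("Planning decisions", "What should we do?"),
    "Input":   ("Structuring decisions", "How should we do it?"),
    "Process": ("Implementing decisions", "Are we doing it as planned?"),
    "Product": ("Recycling decisions", "Did it work?"),
}

decision, question = CIPP["Process"]
print(decision)  # Implementing decisions
print(question)  # Are we doing it as planned?
```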
Statement of the Problem:
The main purpose of the study is to evaluate the Community District Hospital Nursing Service from January 2007 to 2009.
The findings serve as baseline data for the envisioned hospital accreditation and expansion purposes.
Specifically, it seeks to answer the following questions:
1. What is the existing status of the Community District Hospital Nursing Service in terms of the following variables, as perceived by the Nursing Service personnel?
1.1 Context Variable
1.1.1 Philosophy and Objectives
1.1.2 Organization
1.1.3 Policies
1.2 Input Variables
1.2.1 Resource Management
1.2.2 Material Management
1.2.3 Financial Management
1.3 Process Variables
1.3.1 Patient Care Management
1.3.2 Reporting / Recording
1.3.3 Interdepartmental
Relations
1.4 Product Variable
1.4.1 Quality Nursing Service
4 Choices in Discrepancy:
1. Proceed to the next stage if performance is at an acceptable level.
2. If a discrepancy exists, recycle through the stage after altering either the program's standards or its operations.
3. If #2 cannot be accomplished, redefine the problem or redesign the program, then repeat the performance assessment cycle from Stage 1.
4. If #3 cannot be accomplished, terminate the program.
Effectiveness of the
Provus Discrepancy
Model:
Where the key emphasis of evaluation is
Effectiveness of the
Provus Discrepancy
Model:
Effectiveness of the
Provus Discrepancy
Model:
Provus Discrepancy
Evaluation Model
by MALCOLM PROVUS
5 Stages
S - Standard
P - Program Performance
C - Comparison of S with P
D - Discrepancy Information Resulting from C
T - Terminate
A - Alteration of P or S
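The Provus cycle (compare performance P with standard S, use the discrepancy D to decide whether to proceed, alter P or S, or terminate T) can be sketched as a small decision function. This is an illustrative sketch only; the function name, numeric scale, and tolerance parameter are assumptions, not part of Provus's formulation.

```python
# Minimal sketch of the Provus Discrepancy Model's compare-and-decide
# step: D = S - P, and the discrepancy information drives the decision.
def discrepancy_evaluation(standard, performance, tolerance=0.0):
    """Compare program performance (P) with the standard (S);
    return the decision suggested by the discrepancy (D)."""
    discrepancy = standard - performance  # D = S - P
    if discrepancy <= tolerance:
        return "proceed"         # no meaningful discrepancy: next stage
    return "alter or terminate"  # D found: alter P or S, or terminate

# Example: a standard of 90% attainment vs. observed 75%
decision = discrepancy_evaluation(standard=0.90, performance=0.75)
print(decision)  # alter or terminate
```

In practice the standards and performance measures are qualitative as well as quantitative; the sketch only shows the shape of the comparison loop.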
KERRIGAN
EVALUATION
MODEL
by John Kerrigan
Training Activity (Reaction Criteria): Did the trainees enjoy the training?
Trained Persons (Learning Criteria): What did the trainees learn?
The Job and Organization (Behavioral Criteria): Did the trainees' behavior change on the job?
Results in Job and Organizational Performance (Results Criteria): Did the organization or project improve in performance?
Sample Research
An Evaluation of the Pre-Service
Orientation Program for Newly
Employed Staff Nurses in
Hospitals: a Kerrigan Evaluation
Model
Specifically, it sought to answer the following questions:
1. Reaction Criteria
1.1 Did the trainees enjoy the training?
2. Learning Criteria
2.1 What did the trainees learn? (cognitive, affective, psychomotor)
3. Behavioral Criteria
3.1 Did the trainees' behavior change on the job? (before/after)
4. Results Criteria
4.1 Did the organization/institution improve its performance based on the training?
GOAL- FREE
EVALUATION
PRESENTATION OUTLINE
A. Definition of Goal Free Evaluation (GFE)
B. Purpose and activities of GFE
C. Arguments for GFE
D. Four Reasons for Doing GFE
E. Implementation considerations
F. Framework
GOAL FREE
EVALUATION
An inductive and holistic
strategy aimed at
countering the logical
deductive limitations
inherent in the usual
quantitative goals-based
approach to evaluation
It involves gathering data
on a broad array of actual
effects and evaluating the
importance of these effects
in meeting demonstrated
needs without being
constrained by a narrow
focus on stated goals.
The GF evaluator avoids learning the stated purposes/goals/intended achievements of the program prior to or during the evaluation, and interviews program consumers.
Only the program's actual outcomes and measurable effects are studied, and these are judged on the extent to which they meet demonstrated needs.
This prevents tunnel vision: looking at the program only as it pertains to the intended goals, at the risk of overlooking many positive and/or negative unintended side effects.
The evaluator thus can be open to whatever data emerge from the phenomena of the program.
The GF evaluator asks, "What does the program actually do?" rather than, "What does the program intend to do?"
Merit is determined by relating program effects to the relevant needs of the impacted population (Scriven, 1991, p. 180).
A comprehensive needs assessment is conducted simultaneously with data collection.
The evaluator should provide experiential accounts of program activity so that readers of the report can, through naturalistic generalization, arrive at their own judgments of quality in addition to those the evaluator provides (Stake, 2004, in Alkin, 2004, p. 215).
GFE can be conducted along with goals-based evaluation, but with separate evaluators using each approach, to maximize its strengths and minimize its weaknesses, as proposed by Scriven.
Defining question: What
are the actual effects of
the program on clients
(without regard to what
staff say they want to
accomplish)? To what
extent are real needs
being met?
GFE as a Supplement to GBE
In synthesizing the GBE and GFE, evaluators interpret the data while discussing whether the evaluation methods, results, and conclusions support or contradict each other; and the evaluators (possibly with key stakeholders) weigh the data from both approaches to make an evaluative conclusion.
Goal-Based Evaluation
Goals/Objectives
Implementation
Formative/Summative
Evaluation