Basic Program Evaluation
NTSC Training Materials
Purpose/Objectives

• Increase your knowledge of processes involved in program evaluation
• Provide information and resources to help you design and conduct your own program evaluation
Program Evaluation
Training

This training presentation is organized into 16 modules, encompassing 9 steps in the evaluation process.
Program Evaluation
Training – Modules

• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence
Needed
• Module 6 – Specifying the Design
Program Evaluation
Training – Modules

• Module 7 – Data Collection Plan


• Module 8 – How to Collect Data
• Module 9 – Using Commercial
Instruments
• Module 10 – Using Self-Constructed
Instruments
• Module 11 – Collecting Data
Program Evaluation
Training – Modules

• Module 12 – Analyzing Data


• Module 13 – Drawing Conclusions
and Documenting Findings
• Module 14 – Disseminating Information
• Module 15 – Feedback for Program
Improvement
• Module 16 – Conclusion
Program Evaluation
Training – Modules

• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence
Needed
• Module 6 – Specifying the Design
Module 1 –
Introduction

Module 1 – Introduction
• Why evaluate?
• What is evaluation?
• What does evaluation do?
• Kinds of evaluation
Why Evaluate?

• Determine program outcomes


• Identify program strengths
• Identify and improve weaknesses
• Justify use of resources
• Increased emphasis on
accountability
• Professional responsibility to show
effectiveness of program
What is Program
Evaluation?
• Purposeful, systematic, and careful
collection and analysis of information
used for the purpose of documenting the
effectiveness and impact of programs,
establishing accountability, and
identifying areas needing change and
improvement
What Evaluation
Does

• Looks at the results of your investment of time, expertise, and energy, and compares those results with what you said you wanted to achieve
Kinds of
Evaluation

• Outcome
• Implementation
• Formative
• Summative
Outcome Evaluation

What: Identifies the results or effects of a program
When: You want to measure students’ or clients’ knowledge, attitudes, and behaviors as a result of a program
Examples: Did the program increase achievement, reduce truancy, create better decision-making?
Implementation
Evaluation

What: Documents what the program is and to what extent it has been implemented
When: A new program is being introduced; identifies and defines the program; identifies what you are actually evaluating
Examples: Who receives the program? Where is the program operating? Is it being implemented the same way at each site?
Timing of
Evaluation

• Formative
– conducted as the program is happening, so changes can be made while it is being implemented

• Summative
– conducted at the end of a program, to document results
Module 2 – Overview

• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence
Needed
• Module 6 – Specifying the Design
Overview – The 9-step Process

• Planning
• Development
• Implementation
• Feedback
Module 3 – Defining
the Purpose

• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence
Needed
• Module 6 – Specifying the Design
9-step Evaluation
Process

Step 1: Define
Purpose and Scope
Step 1: Scope/Purpose
of Evaluation

• Why are you doing the evaluation?
– mandatory? program outcomes? program improvement?
• What is the scope? How large will the effort be?
– large/small; broad/narrow
• How complex is the proposed evaluation?
– many variables, many questions?
• What can you realistically accomplish?
Resource
Considerations
• Resources
– $$
– Staff
• who can assist?
• need to bring in expertise?
• do it yourself?
• advisory team?
– Time
• Set priorities
• How you will use the information
Module 4 – Specifying
the Questions

• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence Needed
• Module 6 – Specifying the Design
9-step Evaluation
Process

Step 2: Specify
Evaluation
Questions
Evaluation
Questions

What is it that you want to know about your program?
Operationalize it (make it measurable).

Do not move forward if you cannot answer this question.
Sources of
Questions

• Strategic plans
• Mission statements
• Policies
• Needs assessment
• Goals and objectives
• National standards and guidelines
Broad Questions

• Broad Scope
– Do our students contribute positively to society after
graduation?
– Do students in our new mentoring program have a
more positive self-concept and better decision-making
skills than students without access to the mentoring
program?
– To what extent does the state’s career development
program contribute to student readiness for further
education and training and success in the workforce?
Narrow Questions

• Narrow Scope
– Can our 6th grade students identify appropriate
and inappropriate social behaviors?
– How many of our 10th grade students have
identified their work-related interests using an
interest inventory?
– Have 100% of our 10th grade students
identified at least 3 occupations to explore
further based on their interests, abilities, and
knowledge of education and training
requirements?
Exercise 1 –
Scope (p. 2 of Workbook)
• From the list of questions, identify
those that might be considered broad
and those that might be considered
narrow
• How large will the resources need to be to answer the question?
Exercise 2 – Scope
Write (p. 3 of Workbook)
• List one broad evaluation question
and one narrow evaluation question
Module 5 –
Identifying Evidence
Needed
• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence
Needed
• Module 6 – Specifying the Design
Identifying Evidence
Needed to Answer Your
Questions

• What evidence do you have to answer your question?
Identifying Evidence
Needed to Answer Your
Questions

• Think about what information you need in order to answer your evaluation questions
Example Evidence
– Broad Scope
Do our students contribute positively
to society after graduation?
• Percent of our students who are employed, in education or training programs, in the military, supporting a family by working at home, and/or volunteering for charitable causes 3 years after high school graduation

• Percent of our students who vote in local and national elections 5 years after graduation
Example Evidence
– Narrow Scope
Have 100% of our 10th grade students
identified at least 3 occupations to
further explore that are based on their
interests, abilities, and knowledge of the
education and training requirements?

• Number of 11th and 12th grade students participating in the career class who demonstrated increased career maturity from pre-test to post-test
Exercise 3 – Evidence
(p. 4 of Workbook)

• List the evidence you need to answer the question
Module 6 – Specifying
the Design

• Module 1 – Introduction
• Module 2 – Overview
• Module 3 – Defining the Purpose
• Module 4 – Specifying the Questions
• Module 5 – Identifying Evidence
Needed
• Module 6 – Specifying the Design
9-step Evaluation
Process

Step 3: Specify
Evaluation Design
Types of Designs

Relates to when data should be collected
• Status (here and now; snapshot)
• Comparison (group A vs. group B; program
A vs. program B)
• Change (what happened as a result of a
program; what differences are there between
time A and time B)
• Longitudinal (what happens over extended
time)
Exercise 4 –
Design (p. 5 of Workbook)

• What type of design fits each evaluation question?
– Status
– Comparison
– Change
– Longitudinal
Module 7 – Data
Collection Plan

• Module 7 – Data Collection Plan


• Module 8 – How to Collect Data
• Module 9 – Using Commercial
Instruments
• Module 10 – Using Self-Constructed
Instruments
• Module 11 – Collecting Data
9-step Evaluation
Process

Step 4: Create a
Data Collection
Action Plan
Organize Your Evaluation
With a Data Collection
Action Plan

A Data Collection Action Plan is a table with one row for each evaluation question and a column for each of the following:
• Evaluation Question
• What is Collected
• How Collected/What Technique
• From Whom/Data Sources
• When Collected and By Whom
• How Data are to be Analyzed
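If you would rather keep the plan in software than on paper, the sketch below shows one row of the plan as a small Python record mirroring the six columns above. This is a minimal illustration only; the class name, field names, and example content are hypothetical, not part of the training materials.

```python
from dataclasses import dataclass

@dataclass
class PlanRow:
    """One row of a Data Collection Action Plan (one evaluation question)."""
    evaluation_question: str
    what_is_collected: str
    how_collected: str      # technique or instrument
    data_sources: str       # from whom/what
    when_and_by_whom: str
    how_analyzed: str

# Hypothetical example row
row = PlanRow(
    evaluation_question="Can our 6th grade students identify appropriate social behaviors?",
    what_is_collected="Scores on a behavior-recognition checklist",
    how_collected="Self-constructed questionnaire",
    data_sources="All 6th grade students",
    when_and_by_whom="May; classroom teachers",
    how_analyzed="Percent correct, by item; program staff",
)
print(row.evaluation_question)
```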
Components of a Data
Collection Action Plan

• What Will be Collected?
– based on evidence required
• How Collected? Instrumentation
– surveys? published instrument? focus group? observations?
Components of a Data
Collection Action Plan

• From Whom Collected?
– who or what provides evidence
• When Collected and by Whom?
– specific dates, times, persons
• How Data are to be Analyzed?
Data Sources —
Who and What

• Students
• Parents
• Teachers
• Counselors
• Employers
• Friends
• Documents and other records
Exercise 5 – Data
Sources (p. 6 of Workbook)

• Who/what are the data sources for the following questions?
Module 8 – How to
Collect Data

• Module 7 – Data Collection Plan


• Module 8 – How to Collect Data
• Module 9 – Using Commercial
Instruments
• Module 10 – Using Self-Constructed
Instruments
• Module 11 – Collecting Data
Data Collection
Options

• Commercial instrument
• Survey/questionnaire
• Focus group/interviews
• Observations
• Archived information
Module 9 – Using
Commercial
Instruments

• Module 7 – Data Collection Plan


• Module 8 – How to Collect Data
• Module 9 – Using Commercial
Instruments
• Module 10 – Using Self-Constructed
Instruments
• Module 11 – Collecting Data
Commercial
Instruments

• Sometimes it is best to use published or research instruments
– particularly for tough constructs
– since it’s not made specifically for you, it may not answer your question entirely
Sources of Information
on Instruments

• A Counselor’s Guide to Career Assessment Instruments
• Relevance, the Missing Link – A Guide for Promoting Student Success Through Career Development Education, Training, and Counseling
• The Buros Institute
• ETS Test Collection
• The Association for Assessment in Counseling and Education
Exercise 6: Decision-
Making Checklist (p. 7 of
Workbook)

• This checklist will help you conduct a review of data collection instruments that you are considering using in your evaluation
Module 10 – Using Self-
Constructed
Instruments

• Module 7 – Data Collection Plan


• Module 8 – How to Collect Data
• Module 9 – Using Commercial
Instruments
• Module 10 – Using Self-Constructed
Instruments
• Module 11 – Collecting Data
Self-Constructed
Instruments:
Questionnaires
• Focus on evidence you need
• Use simple language
• Ask only what you need; keep it short
• Don’t use jargon
• Each question should focus on one idea
• Make sure terms are clear
• Make it easy for the person to answer the questions (check rather than write, where possible)
• Use extended response when you want details
Types of Scales (1)

• Specific (yes, no; number; gender)
• Extended (1-3, 1-5, 1-7)
Types of Scales (2)
Anchored Scales
Scales for Younger
Students
Self-Constructed
Instruments: Focus
Groups/Interviews

• Good to use when you want extended and detailed responses
• Craft an agenda and stick to it
• Keep groups small (6-10); time short (1-1.5 hours)
• Specify objectives of the session
• Questions need to be clear; one question at a time
• Encourage everyone to participate
• Use the opportunity to probe deeper on a topic
Observations and
Observational Checklist

• “You can observe a lot just by watching.”
-- Yogi Berra

• Go to pages 8 and 9 of your Workbook and review an example of an observational checklist
Archives and
Documents
• Examine What’s Already Available
• Examples
– Attendance records
– Truancy reports
– Grades
– Bullying incidents
– Report cards
– Portfolios
– Discipline referrals
– Public service hours
– Police reports
Exercise 7 - Data
Collection Action Plan

• Review examples of a completed Data Collection Action Plan on pages 10-12 of the Workbook
Module 11 –
Collecting Data

• Module 7 – Data Collection Plan


• Module 8 – How to Collect Data
• Module 9 – Using Commercial
Instruments
• Module 10 – Using Self-Constructed
Instruments
• Module 11 – Collecting Data
9-step Evaluation
Process

Step 5: Collect
Data
How Much Data Should
You Collect?

• How much data do you need?
– 100% of the target audience is ideal; may be too expensive and time-consuming
– If not 100%, a sample is OK if the group is representative of the group as a whole (the population)
Types of Samples
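One common sample type is the simple random sample, in which every member of the population has an equal chance of being chosen. A minimal Python sketch, assuming you have a roster of the target population; the roster and sample size are hypothetical:

```python
import random

# Hypothetical roster of the full target population (200 students)
roster = [f"student_{i:03d}" for i in range(1, 201)]

# Draw a simple random sample of 50 without replacement;
# a fixed seed makes the draw reproducible.
random.seed(42)
sample = random.sample(roster, k=50)

print(len(sample), sample[:5])
```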
Data Collection
Considerations

• When should you collect the information?
• Who should collect it?
Module 12 –
Analyzing Data

• Module 12 – Analyzing Data


• Module 13 – Drawing Conclusions and
Documenting Findings
• Module 14 – Disseminating Information
• Module 15 – Feedback for Program
Improvement
• Module 16 – Conclusion
9-step Evaluation
Process

Step 6: Analyze
Data
What is Data
Analysis?
• Data collected during program
evaluation are compiled and analyzed
(counting; number crunching)
• Inferences are drawn as to why some
results occurred and others did not
• Can be very complex depending on
your evaluation questions
• We will focus on simple things that can
be done without expert consultants
Types of Data Analysis
Simple Frequency Counts
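A frequency count is simply a tally of how often each response occurs. A minimal Python sketch; the survey responses are hypothetical:

```python
from collections import Counter

# Hypothetical responses to a single yes/no/undecided question
responses = ["yes", "no", "yes", "yes", "no", "yes", "undecided", "yes"]

counts = Counter(responses)
for answer, n in counts.most_common():
    print(f"{answer}: {n}")
# yes: 5, no: 2, undecided: 1
```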
Types of Data Analysis
Sort by Relevant
Categories
Types of Data Analysis
Calculate Percentages –
Exercise 8 (p.13 of Workbook)
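A percentage is the count divided by the total, times 100. A minimal Python sketch reusing the hypothetical tallies from the frequency-count sketch above:

```python
counts = {"yes": 5, "no": 2, "undecided": 1}
total = sum(counts.values())  # 8 respondents

for answer, n in counts.items():
    pct = 100 * n / total
    print(f"{answer}: {pct:.1f}%")
# yes: 62.5%, no: 25.0%, undecided: 12.5%
```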
Types of Data Analysis
Showing Change or
Differences
Types of Data Analysis
Reaching an Objective or
Goal
Types of Data Analysis
Observing Trends
Types of Data Analysis
Graph Results
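A simple bar chart is often the clearest way to present results. A minimal sketch assuming the matplotlib library is available; the pre/post figures are hypothetical:

```python
import matplotlib.pyplot as plt

labels = ["Pre-test", "Post-test"]
pct_meeting_goal = [48, 71]  # hypothetical percentages

plt.bar(labels, pct_meeting_goal)
plt.ylabel("Students meeting goal (%)")
plt.ylim(0, 100)
plt.title("Change in goal attainment")
plt.savefig("results.png")  # or plt.show() for interactive display
```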
Types of Data Analysis
Calculate Averages – Exercise 9
(p. 14 of Workbook)
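An average (mean) is the sum of the values divided by how many values there are. A minimal Python sketch with hypothetical test scores:

```python
from statistics import mean

scores = [72, 85, 90, 64, 78]  # hypothetical test scores
print(mean(scores))            # 77.8 = (72+85+90+64+78) / 5
```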
Types of Data Analysis
Calculate Weighted
Averages
Types of Data Analysis
Rank Order Weighted
Averages
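A weighted average is useful for rating-scale items: multiply each scale point by the number of respondents who chose it, add the products, and divide by the total number of respondents. The sketch below computes a weighted average per item and rank-orders the items from highest to lowest; the items and tallies are hypothetical:

```python
def weighted_average(tallies):
    """tallies maps scale point -> number of respondents choosing it."""
    total = sum(tallies.values())
    return sum(point * n for point, n in tallies.items()) / total

# Hypothetical tallies on a 1-5 scale for three survey items
items = {
    "Program was useful":   {1: 2, 2: 3, 3: 10, 4: 20, 5: 15},
    "Staff were helpful":   {1: 1, 2: 2, 3: 7,  4: 18, 5: 22},
    "Materials were clear": {1: 5, 2: 8, 3: 15, 4: 12, 5: 10},
}

# Rank order items by weighted average, highest first
ranked = sorted(items, key=lambda k: weighted_average(items[k]), reverse=True)
for item in ranked:
    print(f"{item}: {weighted_average(items[item]):.2f}")
# Staff were helpful: 4.16
# Program was useful: 3.86
# Materials were clear: 3.28
```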
Types of Data Analysis
Graph Weighted Averages
Using Focus
Group/Interview
Information

• Qualitative findings from focus groups, extended response items, etc., should be analyzed in a different way
– Code words/frequency
– Identify themes
– Pull quotes
– Summarize and draw conclusions
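A first pass at coding word frequency can be automated. A minimal Python sketch; the transcript snippet and the theme keywords are hypothetical and would normally be chosen by the evaluator:

```python
from collections import Counter
import re

transcript = """Students said the mentoring helped with confidence.
Several parents mentioned confidence too, and better grades.
One student wanted more time with mentors."""

# Hypothetical themes, each tied to a list of keywords
themes = {
    "confidence": ["confidence"],
    "mentoring":  ["mentoring", "mentors"],
    "academics":  ["grades"],
}

# Count every word, then tally keyword hits per theme
words = Counter(re.findall(r"[a-z]+", transcript.lower()))
for theme, keywords in themes.items():
    print(theme, sum(words[k] for k in keywords))
# confidence 2, mentoring 2, academics 1
```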
Module 13 – Drawing
Conclusions and
Documenting Findings

• Module 12 – Analyzing Data


• Module 13 – Drawing Conclusions and
Documenting Findings
• Module 14 – Disseminating Information
• Module 15 – Feedback for Program
Improvement
• Module 16 – Conclusion
9-step Evaluation
Process

Step 7: Drawing
Conclusions and
Documenting
Findings
Drawing
Conclusions
• Examine results carefully and
objectively
• Draw conclusions based on your
data
• What do the results signify about
your program?
Exercise 10 – Interpreting
Results (p. 15-16 of
Workbook)

• Complete the Interpreting Results exercise on pages 15-16 of the Workbook
Unintended
Consequences
• Watch for positive and negative
outcomes that you did not plan on
- For example, if your career development
program focuses on increasing students’
awareness of how to identify their interests and
skills, it may have the unintended consequence of
leaving little time for students to explore
occupations and jobs in their area.
- Or, if your program has overemphasized
the importance of getting a college education,
students may not be considering the positive
benefits of other kinds of postsecondary training.
What to Include in Your
Documentation

• Program description
• Evaluation questions
• Methodology (how and from whom and
when)
• Response rate
• Methods of analysis
• Conclusions listed by evaluation question
• General conclusions and findings
• Action items
• Recommendations for program
improvement and change
Document the
Successes and
Shortfalls

• Highlight and brag about positive outcomes
• Document shortfalls
– provides opportunities to improve the program and to make recommendations that benefit the program
Module 14 –
Disseminating
Information

• Module 12 – Analyzing Data


• Module 13 – Drawing Conclusions and
Documenting Findings
• Module 14 – Disseminating Information
• Module 15 – Feedback for Program
Improvement
• Module 16 – Conclusion
9-step Evaluation
Process

Step 8: Disseminate
Information
Determining
Dissemination Methods

• Inform all your relevant stakeholders of the results
• Dissemination methods should differ by target audience
Potential
Audiences
• Your program staff
• Businesses
– Partners that work with your program
– Employers
• School Level
– School administrators
– Counselors
– Teachers
– Students
– Parents
Potential
Audiences
• Media
– Local newspaper
– TV station
– Radio program
– Community or school newsletter
• Education Researchers
• Members of community or faith-based organizations
– Church members
– Religious leaders
– Rotary club
– Boys or girls club
• Anyone who participated in your evaluation!
Dissemination
Techniques
• Reports
• Journal articles
• Conferences
• Career Newsletter/Tabloids
• Presentations
• Brochures
• TV and newspaper interviews
• Executive summary
• Posting on Web site
Exercise 11 – Disseminating
Information (p. 17 of
Workbook)

• Using the information provided in Exercise 6, describe how you would disseminate the information to:
– Program funders
– Parents
Module 15 – Feedback for
Program Improvement

• Module 12 – Analyzing Data


• Module 13 – Drawing Conclusions and
Documenting Findings
• Module 14 – Disseminating Information
• Module 15 – Feedback for Program
Improvement
• Module 16 – Conclusion
9-step Evaluation
Process

Step 9: Feedback for Program Improvement
Opportunities to
Fix Shortfalls
• Evaluation results may show areas where improvement is necessary
– 25% of 11th graders are unable to complete a skills-based resume
– 85% of our students drop out of college in the first year
– Most employers do not want your students to serve as interns in their companies
Feedback for Program Improvement

• You can use evaluation findings to make program improvements
– Consider adjustments
– Re-examine/revise program strategies
– Change programs or methodologies
– Increase time with the program
• Use your results as a needs assessment for future efforts
Module 16 - Conclusion

• Module 12 – Analyzing Data


• Module 13 – Drawing Conclusions and
Documenting Findings
• Module 14 – Disseminating Information
• Module 15 – Feedback for Program
Improvement
• Module 16 – Conclusion
Conclusion

Evaluation helps you:
• determine the effects of the program on recipients
• know if you have reached your objectives
• improve your program
Conclusion

• The 9-step process works
• A credible evaluation can be done with careful planning and some basic math skills
Exercise 12 – Developing a
Data Collection Action Plan
(page 18 of Workbook)

Using all the information you have gathered from the workbook exercises and the PowerPoint slides, you can develop your own Data Collection Action Plan on page 18 of your Workbook.
