Module 5: Training Evaluation

McGraw-Hill/Irwin
© 2005 The McGraw-Hill Companies, Inc. All rights reserved.

Introduction (1 of 2)
Training effectiveness refers to the benefits that the company and the trainees receive from training
Training outcomes or criteria refer to measures that the trainer and the company use to evaluate training programs


Introduction (2 of 2)
Training evaluation refers to the process of collecting the outcomes needed to determine whether training is effective
Evaluation design refers to from whom, what, when, and how the information needed to determine the effectiveness of the training program will be collected


Reasons for Evaluating Training (1 of 2)


Companies are investing millions of dollars in training programs to help gain a competitive advantage
Training investment is increasing because learning creates knowledge, and that knowledge differentiates the companies and employees that are successful from those that are not


Reasons for Evaluating Training (2 of 2)


Because companies have made large dollar investments in training and education and view training as a strategy to be successful, they expect the outcomes or benefits related to training to be measurable.


Training evaluation provides the data needed to demonstrate that training does provide benefits to the company.


Formative Evaluation
Formative evaluation: evaluation conducted to improve the training process. Helps to ensure that:
the training program is well organized and runs smoothly
trainees learn and are satisfied with the program

Provides information about how to make the program better


Summative Evaluation
Summative evaluation: evaluation conducted to determine the extent to which trainees have changed as a result of participating in the training program
May also measure the return on investment (ROI) that the company receives from the training program


Why Should A Training Program Be Evaluated? (1 of 2)


To identify the program's strengths and weaknesses
To assess whether the content, organization, and administration of the program contribute to learning and the use of training content on the job
To identify which trainees benefited most or least from the program


Why Should A Training Program Be Evaluated? (2 of 2)


To gather data to assist in marketing training programs
To determine the financial benefits and costs of the programs
To compare the costs and benefits of training versus non-training investments
To compare the costs and benefits of different training programs to choose the best program

The Evaluation Process

1. Conduct a Needs Analysis
2. Develop Measurable Learning Outcomes and Analyze Transfer of Training
3. Develop Outcome Measures
4. Choose an Evaluation Strategy
5. Plan and Execute the Evaluation


Training Outcomes: Kirkpatrick's Four-Level Framework of Evaluation Criteria

Level  Criteria   Focus
1      Reactions  Trainee satisfaction
2      Learning   Acquisition of knowledge, skills, attitudes, behavior
3      Behavior   Improvement of behavior on the job
4      Results    Business results achieved by trainees


Outcomes Used in Evaluating Training Programs: (1 of 4)

Cognitive Outcomes
Skill-Based Outcomes
Affective Outcomes
Results
Return on Investment


Outcomes Used in Evaluating Training Programs: (2 of 4)


Cognitive Outcomes
Determine the degree to which trainees are familiar with the principles, facts, techniques, procedures, or processes emphasized in the training program
Measure what knowledge trainees learned in the program

Skill-Based Outcomes
Assess the level of technical or motor skills
Include acquisition or learning of skills and use of skills on the job

Outcomes Used in Evaluating Training Programs: (3 of 4)


Affective Outcomes
Include attitudes and motivation
Trainees' perceptions of the program, including the facilities, trainers, and content

Results
Determine the training program's payoff for the company


Outcomes Used in Evaluating Training Programs: (4 of 4)


Return on Investment (ROI)
Comparing the training's monetary benefits with the cost of the training:
direct costs
indirect costs
benefits


How do you know if your outcomes are good?


Good training outcomes need to be:
Relevant
Reliable
Discriminative
Practical


Good Outcomes: Relevance


Criteria relevance: the extent to which training outcomes are related to the learned capabilities emphasized in the training program
Criterion contamination: the extent to which training outcomes measure inappropriate capabilities or are affected by extraneous conditions
Criterion deficiency: the failure to measure training outcomes that were emphasized in the training objectives

Criterion deficiency, relevance, and contamination:

[Diagram: two overlapping circles. One circle is "Outcomes Measured in Evaluation"; the other is "Outcomes Identified by Needs Assessment and Included in Training Objectives" (outcomes related to training objectives). The overlap represents relevance; the measured-only region represents contamination; the objectives-only region represents deficiency.]

Good Outcomes (continued)


Reliability: the degree to which outcomes can be measured consistently over time
Discrimination: the degree to which trainees' performances on the outcome actually reflect true differences in performance
Practicality: the ease with which the outcome measures can be collected
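Reliability of an outcome measure is often estimated by correlating two administrations of the same test (test-retest reliability). A minimal sketch; all trainee scores below are hypothetical, not from the chapter:

```python
# Hedged sketch: test-retest reliability of a training outcome measure,
# estimated as the Pearson correlation between two administrations of
# the same test. All scores are illustrative.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# The same ten trainees tested twice, several weeks apart (hypothetical)
time1 = [70, 82, 65, 90, 75, 88, 60, 95, 72, 80]
time2 = [72, 80, 68, 91, 73, 85, 63, 93, 74, 79]

reliability = pearson(time1, time2)
print(f"test-retest reliability: {reliability:.2f}")
```

A coefficient near 1.0 indicates that the measure ranks trainees consistently over time; a low coefficient suggests the outcome is unreliable.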


Training Evaluation Practices

[Bar chart: percentage of courses using each outcome]
Reaction   79%
Cognitive  38%
Behavior   15%
Results     9%

Training Program Objectives and Their Implications for Evaluation:

Objective: Learning
Reactions: Did trainees like the program? Did the environment help learning? Was material meaningful?
Cognitive: Pencil-and-paper tests
Skill-Based: Performance on a work sample

Objective: Transfer
Skill-Based: Ratings by peers or managers based on observation of behavior; performance on work equipment
Affective: Trainees' motivation or job attitudes
Results: Did company benefit through sales, quality, productivity, reduced accidents, and complaints?

Evaluation Designs: Threats to Validity


Threats to validity refer to factors that lead one to question either:
The believability of the study results (internal validity), or
The extent to which the evaluation results are generalizable to other groups of trainees and situations (external validity)


Threats to Validity

Threats to Internal Validity:
Company
Persons
Outcome Measures

Threats to External Validity:
Reaction to pretest
Reaction to evaluation
Interaction of selection and training
Interaction of methods


Methods to Control for Threats to Validity

Pre- and Posttests


Use of Comparison Groups Random Assignment

Types of Evaluation Designs

Posttest Only
Pretest / Posttest
Posttest Only with Comparison Group
Pretest / Posttest with Comparison Group
Time Series
Time Series with Comparison Group and Reversal
Solomon Four-Group

Comparison of Evaluation Designs (1 of 2)

Design                                     Groups                   Pre-training  Post-training  Cost    Time    Strength
Posttest Only                              Trainees                 No            Yes            Low     Low     Low
Pretest / Posttest                         Trainees                 Yes           Yes            Low     Low     Medium
Posttest Only with Comparison Group        Trainees and Comparison  No            Yes            Medium  Medium  Medium
Pretest / Posttest with Comparison Group   Trainees and Comparison  Yes           Yes            Medium  Medium  High

Comparison of Evaluation Designs (2 of 2)

Design                                           Groups                   Pre-training  Post-training  Cost    Time    Strength
Time Series                                      Trainees                 Yes           Yes, several   Medium  Medium  Medium
Time Series with Comparison Group and Reversal   Trainees and Comparison  Yes           Yes, several   High    Medium  High
Solomon Four-Group                               Trainees A               Yes           Yes            High    High    High
                                                 Trainees B               No            Yes
                                                 Comparison A             Yes           Yes
                                                 Comparison B             No            Yes

Example of a Pretest / Posttest Comparison Group Design:

Group                      Pre-training  Training  Post-training Time 1  Post-training Time 2
Lecture                    Yes           Yes       Yes                   Yes
Self-Paced                 Yes           Yes       Yes                   Yes
Behavior Modeling          Yes           Yes       Yes                   Yes
No Training (Comparison)   Yes           No        Yes                   Yes
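The value of the comparison group in a design like this can be made concrete with a small calculation: the training effect is the trainees' pre-to-post gain minus the comparison group's gain over the same period. A minimal sketch with hypothetical test scores (not from the chapter):

```python
# Hedged sketch: estimating a training effect from a pretest/posttest
# comparison-group design as a difference in differences. Subtracting
# the comparison group's gain removes changes that would have occurred
# without training. All scores are hypothetical.

def training_effect(trained_pre, trained_post, comparison_pre, comparison_post):
    """Mean pre-to-post gain of trainees minus mean gain of the comparison group."""
    def mean_gain(pre, post):
        return sum(b - a for a, b in zip(pre, post)) / len(pre)
    return mean_gain(trained_pre, trained_post) - mean_gain(comparison_pre, comparison_post)

# Hypothetical scores (0-100) for five people per group
trained_pre  = [60, 65, 55, 70, 62]
trained_post = [78, 80, 72, 85, 77]
comp_pre     = [61, 64, 57, 69, 60]
comp_post    = [65, 66, 60, 72, 63]

effect = training_effect(trained_pre, trained_post, comp_pre, comp_post)
print(f"estimated training effect: {effect:.1f} points")
```

Here the trainees gained 16 points on average while the comparison group gained 3, so about 13 points of improvement can be attributed to the training rather than to practice, time, or other influences.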

Example of a Solomon Four-Group Design:

Group     Pretest  Training     Posttest
Group 1   Yes      IL-based     Yes
Group 2   Yes      Traditional  Yes
Group 3   No       IL-based     Yes
Group 4   No       Traditional  Yes

Factors That Influence the Type of Evaluation Design

Factor                How Factor Influences Type of Evaluation Design
Change potential      Can the program be modified?
Importance            Does ineffective training affect customer service, product development, or relationships between employees?
Scale                 How many trainees are involved?
Purpose of training   Is training conducted for learning, results, or both?
Expertise             Can a complex study be analyzed?
Cost                  Is evaluation too expensive?
Time frame            When do we need the information?
Organization culture  Is demonstrating results part of company norms and expectations?

Conditions for choosing a rigorous evaluation design: (1 of 2)


1. The evaluation results can be used to change the program
2. The training program is ongoing and has the potential to affect many employees (and customers)
3. The training program involves multiple classes and a large number of trainees
4. Cost justification for training is based on numerical indicators

Conditions for choosing a rigorous evaluation design: (2 of 2)


5. You or others have the expertise to design the evaluation and analyze the data collected from the evaluation study
6. The cost of training creates a need to show that it works
7. There is sufficient time for conducting an evaluation
8. There is interest in measuring change from pre-training levels or in comparing two or more different programs

Importance of Training Cost Information


To understand total expenditures for training, including direct and indirect costs
To compare costs of alternative training programs
To evaluate the proportion of money spent on training development, administration, and evaluation, as well as to compare monies spent on training for different groups of employees
To control costs

To calculate return on investment (ROI), follow these steps: (1 of 2)


1. Identify outcome(s) (e.g., quality, accidents)
2. Place a value on the outcome(s)
3. Determine the change in performance after eliminating other potential influences on training results
4. Obtain an annual amount of benefits (operational results) from training by comparing results after training to results before training (in dollars)


To calculate return on investment (ROI), follow these steps: (2 of 2)


5. Determine training costs (direct costs + indirect costs + development costs + overhead costs + compensation for trainees) 6. Calculate the total savings by subtracting the training costs from benefits (operational results) 7. Calculate the ROI by dividing benefits (operational results) by costs

The ROI gives you an estimate of the dollar return expected from each dollar invested in training.
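The steps above can be sketched as a short calculation. All dollar figures below are hypothetical; the formulas follow the slides (net savings = benefits minus costs, ROI = benefits divided by costs):

```python
# Hedged sketch of the ROI calculation steps, with hypothetical figures.
# The five cost categories match the cost-benefit categories in the
# slides: direct, indirect, development, overhead, and trainee compensation.

def training_roi(annual_benefits, direct, indirect, development, overhead, compensation):
    """Return (total_cost, net_savings, roi) for a training program."""
    total_cost = direct + indirect + development + overhead + compensation
    net_savings = annual_benefits - total_cost      # benefits minus costs
    roi = annual_benefits / total_cost              # dollar return per dollar invested
    return total_cost, net_savings, roi

# Hypothetical program with $150,000 in annual operational benefits
cost, savings, roi = training_roi(
    annual_benefits=150_000,
    direct=20_000,        # e.g., trainers, materials, travel
    indirect=5_000,       # e.g., clerical and administrative support
    development=10_000,   # e.g., program design or purchase
    overhead=5_000,       # e.g., general organizational support
    compensation=10_000,  # trainees' salaries while in training
)
print(f"total cost: ${cost:,}  net savings: ${savings:,}  ROI: {roi:.1f}:1")
```

In this hypothetical case the program returns $3.00 for every $1.00 invested, which can then be compared against the ROI of alternative programs or non-training investments.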


Determining Costs for a Cost-Benefit Analysis:

Direct Costs
Indirect Costs
Development Costs
Overhead Costs
Compensation for Trainees

Example of Return on Investment

Industry                          Training Program               ROI
Bottling company                  Workshops on managers' roles   15:1
Large commercial bank             Sales training                 21:1
Electric & gas utility            Behavior modification          5:1
Oil company                       Customer service               4.8:1
Health maintenance organization   Team training                  13.7:1