Training and Development Policy Wiki



Training Evaluation

What is Training Evaluation?

Training evaluation is a continual and systematic process of assessing the value or potential value of a training course, activity, or event. Results of the evaluation are used to guide decision-making around various components of the training (e.g., instructional design, delivery, results) and its overall continuation, modification, or elimination.

To assist agencies in evaluating their training programs, OPM published the Training Evaluation Field Guide in 2011. The guide is designed to help agency training representatives evaluate training effectiveness and demonstrate training value to stakeholders and decision makers; it is described in more detail below. (Please contact us at cheryl.ndunguru@opm.gov if you would like training on how to use the field guide in your agency.)

Reporting Training Data

The law authorizes OPM to require Federal agencies to report training data. An important part of the evaluation process involves consideration of training costs and other elements not directly addressed in the typical evaluation. Agencies should track and report accurate training data on all completed training events to OPM as prescribed by the Final Rule on Training Reporting Requirements, published on May 17, 2006, in the Federal Register, implementing the Federal Workforce Flexibility Act of 2004 (P.L. 108-411).

OPM provides training data reports as an objective summary of data gathered about various aspects of each agency's training events. View this quick video for more information on the Training Evaluation Field Guide and ways agencies can use evaluation and training data to inform decisions related to training investments.

What is Program Evaluation and how is it related to Training Evaluation?

Program evaluation is a systematic study conducted to assess how well a program is working. A program evaluation examines achievement of program objectives in the context of other aspects of program performance or in the context in which it occurs. Program evaluations are often conducted by experts external to the program, either inside or outside the agency. The Government Accountability Office (GAO), which works for Congress and is responsible for investigating how the Federal Government spends tax dollars, regularly conducts program evaluations.

A program evaluation asks questions about every aspect of a program or initiative (training programs and other types of programs), from the inputs (what resources were used to create the program) to the outcomes (program results). While the principles of training evaluation can apply to an overall program, training evaluation is used to assess the training and development activities within the program (e.g., training courses and events). Since programs may have activities in addition to training (e.g., services, meetings), training evaluation data can feed into the overall program evaluation.

Please reference this LOGIC MODEL for further explanation of the differences and similarities. A logic model provides a representation of the "theory of change" (if...then) behind a program or initiative. Logic models can be completed for programs and initiatives as well as for individual courses, events, or activities. For more detailed information on logic models, you can watch this video or take EPA's FREE logic modeling course.
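As a minimal sketch of the if...then chain (the stages are standard logic model components, but the course, inputs, and outcomes below are hypothetical, not drawn from any agency program):

```python
# Hypothetical logic model for a single training course; every entry below
# is illustrative, not an agency example.
logic_model = {
    "inputs":     ["instructor time", "e-learning platform", "travel budget"],
    "activities": ["deliver a two-day supervisory skills workshop"],
    "outputs":    ["40 supervisors trained", "completion certificates issued"],
    "outcomes":   ["improved delegation on the job", "fewer escalated grievances"],
}

# Reading the stages in order expresses the theory of change:
# IF we invest these inputs and run these activities,
# THEN we expect these outputs and, ultimately, these outcomes.
for stage, items in logic_model.items():
    print(f"{stage}: {'; '.join(items)}")
```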

When we evaluate...

  • We examine the assumptions upon which an existing or proposed training course or program is based.
  • We inquire, up front, about the expected results.
  • We create and then study the goals and objectives.
  • We collect information about inputs and outcomes.
  • We compare the results to pre-set standards.
  • We report findings in a manner that facilitates their use.

Why should we evaluate?

Agencies are required to evaluate their training programs annually to determine how well such plans and programs contribute to mission accomplishment and meet organizational performance goals (5 CFR 410.202). In addition, demands to demonstrate training program efficiency, program effectiveness and public accountability are increasing. Use of evaluation data meets these demands in various ways:

Planning

  • To assess needs.
  • To set priorities.
  • To direct allocation of resources.
  • To guide policy.

Analysis of course/program effectiveness or quality

  • To determine achievement of objectives.
  • To identify strengths and weaknesses of a program/course.
  • To determine the cost-effectiveness of a program/course.
  • To assess causes of success or failure.

Direct decision-making

  • To improve effectiveness.
  • To identify and facilitate needed change.
  • To continue, expand, or terminate a program/course.

Maintain accountability

  • To stakeholders.
  • To funding sources.
  • To the general public.

When to Evaluate

There are several basic questions to ask when deciding whether to carry out an evaluation. If the answers to these questions are "Yes", this may be the time to evaluate.

  • Is the program/course important or significant enough to warrant evaluation?
  • Is there a legal requirement to carry out a program evaluation?
  • Will the results of the evaluation influence decision-making about the program/course?
  • Will the evaluation answer questions posed by your stakeholders or those interested in the evaluation?

How To Evaluate

Once you've determined that your program or course warrants evaluation, there are various methods and models agencies can use to evaluate their training courses. Here are a few of the most popular:

Kirkpatrick 4 Levels

The four levels of Kirkpatrick's evaluation model essentially measure:

  • Reaction of trainee - what they thought and felt about the training
  • Learning - the resulting increase in knowledge or capability
  • Behavior - extent of behavior and capability improvement and implementation/application
  • Results - the effects on the business or environment resulting from the trainee's performance

All these levels are recommended for full and meaningful evaluation of learning in organizations.
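As a hedged sketch of how the four levels might be recorded for a single course (the field names, scales, and figures below are illustrative assumptions, not an OPM- or Kirkpatrick-prescribed format):

```python
from dataclasses import dataclass

# Hypothetical record of one course's results at each Kirkpatrick level.
# Field names, the 1-5 scales, and all values are invented for illustration.
@dataclass
class KirkpatrickEvaluation:
    course: str
    reaction: float   # Level 1: average trainee satisfaction (1-5 survey)
    learning: float   # Level 2: average pre- to post-test score gain (points)
    behavior: float   # Level 3: supervisor-rated on-the-job application (1-5)
    results: str      # Level 4: observed organizational effect

evaluation = KirkpatrickEvaluation(
    course="New Supervisor Training",
    reaction=4.2,
    learning=18.5,
    behavior=3.8,
    results="15% drop in processing errors in the two quarters after training",
)
print(evaluation)
```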

Jack Phillips' Five Level ROI Model

Building upon the Kirkpatrick model, Jack Phillips added a fifth level, the Return on Investment (ROI) produced by a training course, using the financial formula:
ROI(%) = (Net Program Benefits/Program Costs) x 100
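To make the arithmetic concrete, here is a minimal worked example of Phillips' formula; the benefit and cost figures are hypothetical:

```python
def roi_percent(net_program_benefits: float, program_costs: float) -> float:
    """Phillips' Level 5 ROI: (Net Program Benefits / Program Costs) x 100."""
    return (net_program_benefits / program_costs) * 100

# Hypothetical figures: a course cost $50,000 (fully loaded) and produced
# $120,000 in monetized benefits, so net benefits are $70,000.
program_costs = 50_000
net_benefits = 120_000 - program_costs

print(f"ROI = {roi_percent(net_benefits, program_costs):.0f}%")  # ROI = 140%
```

A positive ROI means the monetized benefits exceeded the fully loaded costs; in practice the hard part is isolating and monetizing the benefits attributable to the training, not the arithmetic.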

Robert Brinkerhoff's Success Case Method

Brinkerhoff's six-stage model is a comprehensive evaluation model that incorporates the results-oriented aspects of the business and industry models as well as the formative, improvement-oriented aspects of educational models (a systems perspective with an emphasis on return on investment).

A basic assumption of the six-stage model is that the primary reason for evaluation should be to improve the program (systems perspective).

OPM's Training Evaluation Field Guide

To assist agencies in evaluating their training programs, OPM published the Training Evaluation Field Guide in 2011. The Training Evaluation Field Guide is designed to assist agency training representatives in evaluating training effectiveness and in demonstrating training value to stakeholders and decision makers.

Field Guide Development Process

Data and information were gathered from fifteen federal agency representatives who volunteered their time to attend a one-day working meeting, participate in individual interviews and submit samples of their tools and case studies. This Field Guide reflects the input from the working group.

Key Audience and Usage

This Guide is designed for all federal employees who have a role in training evaluation and effectiveness within their agencies.

Specific users for this field guide are:

  • Training managers and supervisors
  • Training liaisons/coordinators
  • Agency evaluators
  • Instructional designers
  • Training facilitators
  • Any others who have a significant role in training effectiveness

Training Evaluators

Both Kirkpatrick and Phillips (and perhaps others) offer "certifications" in training evaluation; however, a "certified" evaluator is not necessary to evaluate the effectiveness of agency training. The Training Evaluation Field Guide (linked above) and books on the various methods should provide enough information to successfully evaluate your agency's training.

In addition, the American Evaluation Association (AEA) is an international professional association of evaluators devoted to the application and exploration of program evaluation, personnel evaluation, technology, and many other forms of evaluation. Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness. AEA has approximately 5500 members representing all 50 states in the US as well as over 60 foreign countries.

Other Evaluation Resources

(Please feel free to add your evaluation methods and tools to this page)

Here are sample spreadsheets designed to track New Employee Orientation classes and Trainer Customer Satisfaction. One possible layout is sketched below.
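As one hedged illustration of what such tracking could look like (the column names and values below are invented, not taken from the linked spreadsheets), class and satisfaction data might be kept in a simple CSV:

```python
import csv

# Hypothetical layout for tracking New Employee Orientation sessions and
# trainer customer-satisfaction scores; columns and data are illustrative.
rows = [
    {"session_date": "2012-10-01", "trainer": "J. Smith",
     "attendees": 25, "avg_satisfaction_1_to_5": 4.4},
    {"session_date": "2012-11-05", "trainer": "R. Lee",
     "attendees": 18, "avg_satisfaction_1_to_5": 4.1},
]

with open("neo_tracking.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
```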

Should you desire to look at more evaluation models, there are many to choose from:

  • Daniel Stufflebeam's CIPP Model (Context, Input, Process, Product)
  • Robert Stake's Responsive Evaluation Model
  • Robert Stake's Congruence-Contingency Model
  • Kaufman's Five Levels of Evaluation
  • CIRO (Context, Input, Reaction, Outcome)
  • PERT (Program Evaluation and Review Technique)
  • Alkin's UCLA Model
  • Michael Scriven's Goal-Free Evaluation Approach
  • Provus's Discrepancy Model
  • Eisner's Connoisseurship Evaluation Models
  • Illuminative Evaluation Model
  • Portraiture Model 