Kirkpatrick's Learning and Training Evaluation Theory - HRD Performance Evaluation Guide
Donald L Kirkpatrick's training evaluation model - the four levels of learning evaluation
Donald Kirkpatrick was president of the American Society for Training and Development
(ASTD) in 1975. Kirkpatrick has written several other significant books about training and
evaluation, more recently with his similarly inclined son James, and has consulted with some of
the world's largest corporations.
Donald Kirkpatrick's 1994 book Evaluating Training Programs defined his originally published
ideas of 1959, thereby further increasing awareness of them, so that his theory has now become
arguably the most widely used and popular model for the evaluation of training and learning.
Kirkpatrick's four-level model is now considered an industry standard across the HR and training
communities.
More recently Don Kirkpatrick formed his own company, Kirkpatrick Partners, whose website
provides information about their services and methods, etc.
Kirkpatrick's four levels of evaluation essentially measure:

1. reaction of student - what they thought and felt about the training
2. learning - the resulting increase in knowledge or capability
3. behaviour - the extent of behaviour and capability improvement and implementation/application
4. results - the effects on the business or environment resulting from the trainee's performance
All these measures are recommended for full and meaningful evaluation of learning in
organizations, although their application broadly increases in complexity, and usually cost,
through the levels from level 1-4.
In summary:

Level 1 - Reaction
Description: Reaction evaluation is how the delegates felt about the training or learning experience.
Tools and methods: 'Happy sheets' and feedback forms; verbal reaction; post-training surveys or questionnaires.
Relevance and practicability: Quick and very easy to obtain; not expensive to gather or to analyse.

Level 2 - Learning
Description: Learning evaluation is the measurement of the increase in knowledge - before and after.
Tools and methods: Typically assessments or tests before and after the training; interview or observation can also be used.
Relevance and practicability: Relatively simple to set up; clear-cut for quantifiable skills; less easy for complex learning.

Level 4 - Results
Description: Results evaluation is the effect on the business or environment produced by the trainee.
Tools and methods: Measures are usually already in place via normal management systems and reporting - the challenge is to relate them to the trainee.
Relevance and practicability: Individually not difficult, unlike evaluation across the whole organisation; the process must attribute clear accountabilities.
Each level in more detail - the evaluation description and characteristics, examples of evaluation tools and methods, and relevance and practicability:

Level 1 - Reaction
Description and characteristics: Reaction evaluation is how the delegates felt, and their personal reactions to the training or learning experience, for example: Did the trainees like and enjoy the training? Did they consider the training relevant? Was it a good use of their time? Did they like the venue, the style, timing, domestics, etc? What was the level of participation?
Evaluation tools and methods: Typically 'happy sheets' - feedback forms based on subjective personal reaction to the training experience. Verbal reaction, which can be noted and analysed. Post-training surveys or questionnaires. Online evaluation or grading by delegates. Subsequent verbal or written reports given by delegates to managers back at their jobs.
Relevance and practicability: Can be done immediately the training ends. Reaction feedback is very easy to obtain and is not expensive to gather or to analyse for groups. It is important to know that people were not upset or disappointed, and that they give a positive impression when relating their experience to others who might be deciding whether to experience the same.

Level 2 - Learning
Description and characteristics: Learning evaluation is the measurement of the increase in knowledge or intellectual capability from before to after the learning experience: Did the trainees learn what was intended to be taught? Did the trainees experience what was intended for them to experience? What is the extent of advancement or change in the trainees after the training, in the direction or area that was intended?
Evaluation tools and methods: Typically assessments or tests before and after the training. Interview or observation can be used before and after, although this is time-consuming and can be inconsistent. Methods of assessment need to be closely related to the aims of the learning. Measurement and analysis is possible and easy on a group scale (a rough scoring sketch follows after this table). Reliable, clear scoring and measurements need to be established, so as to limit the risk of inconsistent assessment.
Relevance and practicability: Relatively simple to set up, but more investment and thought is required than for reaction evaluation. Highly relevant and clear-cut for certain training, such as quantifiable or technical skills. Less easy for more complex learning such as attitudinal development, which is famously difficult to assess. Cost escalates if systems are poorly designed, which increases the work required to measure and analyse.

Level 4 - Results
Description and characteristics: Results evaluation is the effect on the business or environment resulting from the improved performance of the trainee - it is the acid test. Measures would typically be business or organisational key performance indicators, such as: volumes, values, percentages, timescales, return on investment, and other quantifiable aspects of organisational performance, for instance numbers of complaints, staff turnover, attrition, failures, wastage, non-compliance, quality ratings, achievement of standards and accreditations, growth, retention, etc.
Evaluation tools and methods: It is possible that many of these measures are already in place via normal management systems and reporting. The challenge is to identify which measures relate to the trainee's input and influence, and how. It is therefore important to identify and agree accountability and relevance with the trainee at the start of the training, so they understand what is to be measured. This process overlays normal good management practice - it simply needs linking to the training input. Failure to link to the training input, its type and its timing will greatly reduce the ease with which results can be attributed to the training.
Relevance and practicability: Individually, results evaluation is not particularly difficult; across an entire organisation it becomes very much more challenging, not least because of the reliance on line management, and the frequency and scale of changing structures, responsibilities and roles, which complicates the process of attributing clear accountability. Also, external factors greatly affect organisational and business performance, which can cloud the true cause of good or poor results.
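As a rough illustration of the group-scale 'before and after' measurement described for level 2 above, here is a minimal sketch of comparing pre- and post-training assessment scores. The trainee names, scores and field names are invented for the example and would need to reflect whatever scoring scheme is actually agreed for the training.

```python
# A minimal sketch of level 2 (learning) evaluation: comparing assessment
# scores before and after training for a group of trainees.
# All names and figures here are illustrative, not part of Kirkpatrick's model.

pre_scores = {"Ann": 45, "Bill": 60, "Carol": 52}    # pre-training test scores (%)
post_scores = {"Ann": 70, "Bill": 72, "Carol": 80}   # post-training test scores (%)

# Per-trainee improvement
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}

# Group-level summary - straightforward once clear, consistent scoring exists
avg_pre = sum(pre_scores.values()) / len(pre_scores)
avg_post = sum(post_scores.values()) / len(post_scores)

print("Individual gains:", gains)
print(f"Group average: {avg_pre:.1f}% before, {avg_post:.1f}% after "
      f"(+{avg_post - avg_pre:.1f} points)")
```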
Since Kirkpatrick established his original model, other theorists (for example Jack Phillips), and
indeed Kirkpatrick himself, have referred to a possible fifth level, namely ROI (Return On
Investment). In my view ROI can easily be included in Kirkpatrick's original fourth level
'Results'. A separate fifth level is therefore arguably justified only where the assessment of Return On Investment might otherwise be ignored or forgotten when referring simply to the 'Results' level.
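For illustration only, training ROI is commonly expressed (in the style popularised by Jack Phillips) as the net programme benefit divided by the programme cost, as a percentage. The sketch below uses invented figures simply to show the arithmetic.

```python
# A minimal sketch of a training ROI calculation, using the commonly quoted
# formula ROI (%) = (monetary benefits - training costs) / training costs * 100.
# The figures below are purely illustrative.

training_costs = 20_000.0      # e.g. design, delivery, venue, trainee time
monetary_benefits = 35_000.0   # e.g. value of reduced wastage, fewer complaints

net_benefit = monetary_benefits - training_costs
roi_percent = net_benefit / training_costs * 100

print(f"Net benefit: {net_benefit:,.0f}")
print(f"ROI: {roi_percent:.0f}%")   # 75% in this invented example
```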
Learning evaluation is a widely researched area. This is understandable since the subject is
fundamental to the existence and performance of education around the world, not least
universities, which of course contain most of the researchers and writers.
While Kirkpatrick's model is not the only one of its type, for most industrial and commercial
applications it suffices; indeed most organisations would be absolutely thrilled if their training
and learning evaluation, and thereby their ongoing people-development, were planned and
managed according to Kirkpatrick's model.
For reference, should you be keen to look at more ideas, there are many to choose from...
Example evaluation sources for the levels (Level 1 Reaction, Level 2 Learning, Level 4 Results) might include financial reports, quality inspections, and an interview with the sales manager.