Evaluating HRD Programs

Chapter 7
Human Resource Development
Purpose of HRD Evaluation
• Determine whether the program accomplished its objectives
• Identify the program's strengths and weaknesses
• Conduct a cost-benefit analysis
• Determine who should participate and who benefited most
• Determine whether the program was appropriate
• Build a database for decision making
Potential Questions to Be Addressed
in a Process Analysis
(During Training)
Was there a match between the trainer, the training techniques, and the
training/learning objectives?
• Were the lecture portions of the training effective?
  Was involvement encouraged/solicited?
  Were questions used effectively?
• Did the trainer appropriately conduct the various training methodologies
  (case study, role play, etc.)?
  Were they explained well?
  Did the trainer use the allotted time for activities? Was enough time allotted?
  Did trainees follow instructions?
  Was there effective debriefing following the exercises?
• Did the trainer follow the training design and lesson plans?
  Was enough time given for each of the requirements?
  Was time allowed for questions?
Kirkpatrick’s Levels of Criteria
• Reaction – did trainees like the program?
• Learning – demonstration of learning at the end of the program
• Behavior – actual transfer of the training to the job
• Results – impact on the bottom line, including efficiency, productivity, cost, etc.
Reaction to Training – Part 1 of 2
Answer the following questions about the training in Active Listening skills using the
scale below:
1 = Strongly disagree   2 = Disagree   3 = Neither agree nor disagree   4 = Agree   5 = Strongly agree

1. The training met the stated objectives.
   1   2   3   4   5
2. The information provided was enough so that I understood the concepts being taught.
   1   2   3   4   5
3. The practice sessions provided were sufficient to give me an idea of how to perform the skill.
   1   2   3   4   5
4. The feedback provided was useful in helping me understand how to improve.
   1   2   3   4   5
Reaction to Training – Part 2 of 2
5. The training session kept my interest throughout.
   1   2   3   4   5
6. The pace of the Active Listening session was:
   1. Way too fast
   2. A bit fast
   3. Just right
   4. A bit slow
   5. Way too slow
7. What did you like best about this part of the training?
8. What would you have changed?
Additional comments:
Note: A similar scale would be used for each of the other components of training
that were taught.
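As a brief illustration of how reaction (Level 1) data from a form like this might be summarized, the Python sketch below averages each item's ratings across trainees. The item labels and sample ratings are hypothetical and are not part of the original survey.

```python
# Hypothetical tally of Level 1 (reaction) ratings on the 1-5 agreement scale above.
from statistics import mean

# Each inner list holds one trainee's ratings for items 1-5 of the reaction sheet.
responses = [
    [4, 5, 3, 4, 4],
    [5, 4, 4, 5, 3],
    [3, 4, 4, 4, 4],
]

items = [
    "Met stated objectives",
    "Enough information to understand concepts",
    "Sufficient practice sessions",
    "Useful feedback",
    "Kept my interest",
]

# Average rating per item; scores near 4-5 suggest favorable reactions.
for i, label in enumerate(items):
    avg = mean(r[i] for r in responses)
    print(f"{label}: {avg:.1f}")
```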
Paper & Pencil Test for Evaluation of
Learning
There is no specific time limit on this test, but most should be able to finish in
about one hour. Answers to the questions should be written in the booklet provided.
Please read each question carefully, as some of the questions have more than one part.
1. List four types of active listening and provide an example for each.
2. List the steps in the conflict resolution model. After each step, provide a
   relevant example of a phrase that would represent that step.
3. Multiple-choice or fill-in-the-blank questions
Possible Additions to Kirkpatrick’s Model
• Expanding reaction measures to include reactions to the training methods
• Splitting reaction measures to assess perceptions of enjoyment, usefulness, and difficulty
• Adding a fifth level to capture societal contributions
• Adding a fifth level to capture return on investment
Data Collection Methods
• Interview
• Questionnaire
• Direct observation
• Tests and simulations
• Archival performance data
• Types of data – individual, group, system-wide
Research Design
• Internal Validity
  Did a change occur?
  Was the change a result of the training?
• External Validity
  Will the change occur in other situations with different trainees?
Threats to Internal Validity
• History – outside events occurring during the training
• Maturation – natural improvement that comes with development
• Testing – effects of the pretest on measured changes
• Instrumentation – different measures used at different points in time
Threats to Internal Validity
(continued)
• Statistical regression – trainees selected because they measured at the extremes of ability/KSAs tend to regress toward the mean
• Reactive effects of the research situation – motivation changes from being studied (Hawthorne effect)
• Multiple treatment effects – carryover from previous training
Threats to External Validity
• Representativeness of the sample and setting
• Differential selection – the basis for choosing trainees
• Experimental mortality – turnover during the study
Experimental Designs
• Control group – random assignment to treatment and control groups so that trainees have similar characteristics
• Two-group posttest-only design
• Two-group pretest/posttest design (see the sketch below)
• Four-group design – controls for the effects of the pretest and prior knowledge
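To make the two-group pretest/posttest idea concrete, here is a minimal Python sketch (not part of the original material) that compares gain scores between a trained group and a control group using an independent-samples t-test. All scores are hypothetical, and SciPy is assumed to be available.

```python
# Sketch of a two-group pretest/posttest comparison (hypothetical scores).
from scipy import stats

# Pre- and post-training test scores for randomly assigned groups.
trained_pre  = [62, 58, 71, 65, 60, 68]
trained_post = [78, 74, 85, 80, 77, 82]
control_pre  = [63, 61, 69, 64, 59, 70]
control_post = [66, 62, 71, 67, 61, 72]

# Gain scores: change from pretest to posttest for each person.
trained_gain = [post - pre for pre, post in zip(trained_pre, trained_post)]
control_gain = [post - pre for pre, post in zip(control_pre, control_post)]

# Independent-samples t-test on the gains: did the trained group improve more?
t_stat, p_value = stats.ttest_ind(trained_gain, control_gain)
print(f"trained gain mean = {sum(trained_gain)/len(trained_gain):.1f}")
print(f"control gain mean = {sum(control_gain)/len(control_gain):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```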
Non-experimental Designs
• Case study – intensive, descriptive study with after-only measures
• One-group pretest/posttest design
Quasi-experimental Designs
• Nonequivalent control group design – check the groups for equivalence, or use multiple regression to control for demographic factors
• Time series design – establish a baseline, deliver the training, then take a series of measures to determine whether a change has occurred (see the sketch below)
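A time series evaluation can be sketched just as simply. The example below uses hypothetical weekly performance figures (not from the original slides) and compares the baseline level of the series with the post-training level.

```python
# Sketch of a simple time-series check around a training intervention.
from statistics import mean

baseline = [41, 43, 40, 42, 44, 41]   # weekly performance before training (hypothetical)
after    = [47, 49, 48, 50, 49, 51]   # weekly performance after training (hypothetical)

# A shift in the level of the series after training suggests a program effect,
# provided no other event (a history threat) coincided with the training.
print(f"baseline mean = {mean(baseline):.1f}")
print(f"post-training mean = {mean(after):.1f}")
print(f"shift = {mean(after) - mean(baseline):.1f}")
```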
Ethical Issues Concerning
Evaluation Research
• Confidentiality
• Informed consent
• Withholding training – it can be provided later
• Use of deception
• Pressure to produce positive results
Assessing the Impact of HRD
Programs
• Cost-benefit analysis – monetary costs in relation to nonmonetary benefits
• Cost-effectiveness analysis – monetary costs in relation to monetary benefits
• Return on investment = results / training costs (worked example below)
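As a worked illustration of the ROI formula above, the short Python sketch below divides assumed monetary results by assumed training costs; both figures are hypothetical.

```python
# Hypothetical ROI calculation following the slide's formula: ROI = results / costs.
training_costs = 25_000.0       # design, delivery, materials, trainee time (assumed)
monetary_results = 60_000.0     # e.g., value of productivity gains (assumed)

roi = monetary_results / training_costs           # return on investment
net_benefit = monetary_results - training_costs   # value remaining after costs

print(f"ROI = {roi:.2f} (every $1 spent returned ${roi:.2f})")
print(f"Net benefit = ${net_benefit:,.0f}")
```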