TUP-HRD Chapter 11

Human Resource Development:
Managing Learning and Knowledge Capital
Chapter 11
Evaluation
Copyright © 2010 Tilde University Press
Misconceptions about evaluation
• Evaluation is a control system
– Likely pay-off balanced against the cost
– Costs unlikely to surface until some time in the future
– Such risks will eventually translate into facts, and the dues
will then have to be paid
• Most of the evaluation techniques discussed in this
chapter apply to the legitimate system
• Some effort to evaluate development in the shadow
system has to be made
Need for evaluation
• All HRD interventions will be evaluated anyway,
either formally or informally
• Legal imperatives
• Essential to the survival of the HRD function
• The four roles of evaluation:
– Measure what change has occurred
– Improve the other three stages of the HRD system
– See if the change is attributable to the learning episode
– See if the change was worthwhile
Assessment of learning
• Gathering, interpreting and describing information about learner achievement
• Constructive alignment – see Figure 11.1
• Cannot directly observe what learning has occurred:
– So, most assessment procedures ask the learner to perform a
behaviour
– It is this behaviour that is then measured.
• Several potential weaknesses. For example:
– The behaviour is only a sample of the total learning
– The behaviour represents only explicit knowledge
– Measurement vs Indicator
Types of assessment
• From easiest to most difficult:
– Skills testing
– Objective written tests
– Subjective written tests
– Performance tests
– Learning diaries
– Analytical critiques
– Portfolio assessment
Assessment and the HLO
• Constructive alignment
– Relationship between the learning outcomes, the learning
strategies of choice, and the assessment types of choice
• Figure 11.3 provides a logical starting point for deciding
on appropriate assessment types.
• Initial decision may be adjusted
– Can be a range of complexity within each of the assessment
types
– Tend to use a number of assessments in combination
– Authentic assessment
– The characteristics of the learner
– Any issues raised in the HRDNI
Self-assessment
• Self-assessment is the best and richest form
of assessment
• Need to develop the learners to such an extent
that they value their own self-assessment above
all else
• The learner finally becomes an independent
learner
The HR developer’s dilemma
• Skills tests
– Reflect a simple situation
– Easy to show reliability and validity
– However, the observed behaviours of skills are blurred shadows of the quality that is being assessed
• Portfolio assessment
– Depends on the honesty of the learner
– A major strength when the development of the learner is the main objective (i.e. encourages self-assessment)
– When assessment is for other purposes (e.g. monetary gain for the learner), the use of the more complex assessment types becomes more problematical
The meaning of scores
• A raw score is produced when assessment is gauged using a quantitative figure
– What does the raw score mean?
– Criterion-referenced scoring
– Norm-referenced scoring
• Use of scores
– Formative assessment
– Summative assessment
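As a rough illustration of the two scoring approaches, the sketch below (in Python, with invented cohort scores and a hypothetical 70% cut-score) reads the same raw score in both ways: against a fixed criterion and against the cohort.

```python
# Illustrative only: interpreting one raw score two ways.
# The cohort scores and the 70% cut-score are invented for this sketch.
from statistics import mean, pstdev

cohort = [52, 61, 64, 68, 71, 75, 78, 83, 88, 92]  # raw scores out of 100
learner_score = 71
cut_score = 70  # criterion set in the learning objectives

# Criterion-referenced: did the learner reach the pre-set standard?
criterion_result = "competent" if learner_score >= cut_score else "not yet competent"

# Norm-referenced: how does the learner compare with the cohort?
z = (learner_score - mean(cohort)) / pstdev(cohort)
percentile = sum(s <= learner_score for s in cohort) / len(cohort) * 100

print(f"Criterion-referenced: {criterion_result}")
print(f"Norm-referenced: z = {z:.2f}, roughly the {percentile:.0f}th percentile of this cohort")
```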
Use of assessment
• Feedback
– To the learner
– To the HR developer
• Evaluation
– One of the levels of evaluation
Kirkpatrick’s four levels
• Reaction
– Reactions of the learners to the learning episode
– Usually measured with a questionnaire - happy sheets?
• Learning – discussed previously
• Behaviour
– Change in on-the-job behaviour
– Use performance appraisal process
• Results
– Impact of the learning episode on the organisation as a whole
– Tangible indicators (e.g. improved profits)
– Success Case Method
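A minimal sketch of how level-one reaction ('happy sheet') responses might be summarised; the questionnaire items and the 1–5 rating scale are assumptions made for illustration, not part of Kirkpatrick's model.

```python
# Sketch: summarising reaction ('happy sheet') questionnaires.
# Item names and the 1-5 rating scale are assumed for illustration.
from statistics import mean

responses = [
    {"relevance": 4, "facilitator": 5, "materials": 3},
    {"relevance": 5, "facilitator": 4, "materials": 4},
    {"relevance": 3, "facilitator": 4, "materials": 2},
]

for item in responses[0]:
    scores = [r[item] for r in responses]
    print(f"{item}: mean {mean(scores):.1f} on a 1-5 scale (n={len(scores)})")
```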
The presage factors
• Stage I: evaluate needs and goals
• Stage II: evaluate HRD design
• Stage III: evaluate implementation
• Stage IV: evaluate learning – similar to Kirkpatrick
• Stage V: evaluate usage and endurance of learning – similar to Kirkpatrick
• Stage VI: evaluate payoff – similar to Kirkpatrick
• Could also add another – evaluate the HR developer
• See Table 11.1 for a discussion of the eight levels of evaluation
Review of the four roles of evaluation
• The eight levels of evaluation achieve the first two:
– Identify what change has occurred
– Improve the other three stages—investigation,
design and implementation—of the HRD system
• Need to go further to achieve the last two:
– See whether the change is attributable to the
learning episode – scientific model
– See whether the amount of change was
worthwhile – cost benefit analysis
Scientific Model
• Based on experimental methods
• In order from the simplest and least costly to the most complex and most costly:
– Post-test
– Pre-test–post-test
– Time series
– Control group
– Solomon four-group design
• Choosing among these designs is a complex decision – a minimal pre-test–post-test control group comparison is sketched below
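A minimal sketch, with invented scores, of the pre-test–post-test control group design: the change attributable to the learning episode is estimated as the trained group's gain minus the control group's gain. A real evaluation would also test whether the difference is statistically significant.

```python
# Sketch of a pre-test–post-test control group comparison.
# All scores are invented for illustration.
from statistics import mean

trained_pre  = [55, 60, 58, 62, 57]
trained_post = [72, 78, 70, 81, 74]
control_pre  = [56, 59, 61, 58, 60]
control_post = [58, 61, 62, 60, 63]

trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)

# The difference in gains suggests how much change is attributable
# to the learning episode rather than to other factors.
attributable_change = trained_gain - control_gain
print(f"Trained group gain:  {trained_gain:.1f}")
print(f"Control group gain:  {control_gain:.1f}")
print(f"Change attributable to the learning episode: {attributable_change:.1f}")
```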
Cost–benefit analysis
• Identify the costs, in dollar terms
– Relatively straightforward
– Cut-off points? – e.g. apportioning electricity charges
• Identify the benefits accruing in dollar terms
– Sometimes easy – e.g. reduction in accidents
– Often difficult – e.g. value of change at the meta-ability level in
a learner
– Often use a ‘shadow value’ – e.g. promotions the individual achieves
– Cut-off points?
• The ratio of costs/benefits
– Should be in favour of the benefits
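A hedged arithmetic sketch of the cost–benefit comparison described above; every dollar figure and every cost or benefit category is invented for illustration, and the shadow value is an estimate rather than a measurement.

```python
# Sketch of a simple cost-benefit calculation for an HRD intervention.
# All figures are invented for illustration.
costs = {
    "design and materials": 12_000,
    "facilitator fees": 8_000,
    "participant wages while off the job": 15_000,
    "venue and apportioned overheads": 3_000,   # cut-off point decisions apply here
}
benefits = {
    "reduction in accident costs": 25_000,
    "reduced rework": 14_000,
    "shadow value of improved decision making": 10_000,  # estimated, not directly measured
}

total_costs = sum(costs.values())
total_benefits = sum(benefits.values())
ratio = total_benefits / total_costs

print(f"Total costs:    ${total_costs:,}")
print(f"Total benefits: ${total_benefits:,}")
print(f"Benefit:cost ratio = {ratio:.2f} -> {'in favour of the benefits' if ratio > 1 else 'not worthwhile'}")
```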
The evaluation plan
• Starts during the design stage
• Should include
– Develop the assessment of learning first
– Add further assessment of learning as required for the evaluation
– Decide what presage variables will be evaluated and when
– Review the investigating instruments and incorporate them into the plan
– Design daily and overall course (or workshop) reaction sheets
– Plan the scientific pre-test and post-test instruments, if needed
– Identify the methods for the behaviour and results levels
– Decide if a cost–benefit analysis is to be used and plan for it
– Prepare a budget for the evaluation plan
– Send the evaluation plan to staff who are affected
The evaluation report
• Should at least include the following
– Executive summary
– Findings/recommendations section
– Table of contents
– Main body
• reasons for the evaluation
• list of the personnel involved
• discussion of the types of evaluation used and how the data was collected and analysed
• discussion of the findings
• list and discussion of the recommendations
– Appendices