Benchmarking for improvement

Benchmarking for
improvement
Liz Hart
Director of Information
Services
Staffordshire University
Benchmarking for
improvement
Brief overview of purpose and current
situation in the UK
Benchmarking consortia
Examples of benchmarking in practice
Benchmarking in libraries
Growth of importance of quality
management
Accountability
Three “E”s:
Effectiveness
Efficiency
Economy
Purpose?
“Measurement and benchmarking are not
separate sciences or unique themes of
quality management, but rather strategic
approaches to getting the best out of
people, processes, plant and
programmes”
John Oakland (1993)
UK experience
40% of HE sector involved in some way
Variable involvement:
Structured to “unstructured”
SCONUL pilot projects
Manual of methods
Issues from UK experience
Time and effort balanced by willingness to
engage
Ethics, confidentiality and honesty
Staff / user involvement essential
Process mapping / Activity Based Costing
Comparative measures difficult!
Good project management essential
Issues from UK experience
Embedding benchmarking in institutional
or service quality framework
Methodologies from the literature do work
in practice
Methodology
Oakland’s 15 stages: Plan and Action
Plan: 7 stages
Select processes
Identify appropriate benchmarks
Collect environmental data
Identify team
Decide on data collection methodology
Methodology
Project planning and timescales
Useful example: Advice desk project
Critical success factor - satisfied customers
“delightful” process
Help; situation; environment
Balanced against “right” information
7th Stage in Plan: implement data
collection
Methodology
Oakland’s second group, Action:
Compare data
Organise and catalogue for retrieval
Understand enabling processes
Set new objectives and standards
Develop action plans for new standards
Implement and embed changes
Methodology
Monitor and evaluate outcomes and
improvements
Review your measures to ensure usability
Sounds obvious - but it is a discipline
Benchmarking consortia
and other models
Motivation?
Individual
Step change in organisation
Conceptual review
Developmental and experimental
Commonwealth University Management
Benchmarking Club
Consortia
Benchmarking agreements
Framework for operation
Define clarity of purpose:
To produce beneficial cross-University
analysis of process, statistical information and
service outcomes
Produce comparative data
Voluntary grouping
Equal partnership
Benchmarking agreements
Executive Group
Operations Group
To ensure consistency, comparative
outcomes and methodological enhancements
Sub groups led by one partner institution
Clear financial and resource base
Ensures work is evenly allocated and shared
Benchmarking agreements
“Get out” clauses
Non-participation due to internal
developments
3 months' notice
Confidentiality
Critical and key to success
Openly share finance, staffing and process
information and data
Benchmarking agreements
Why is this so important?
Open environment
Facilitates working relationships
Mutual support in process of assessment
Shared objectives
And finally, politically advantageous
Learning from experience
Good project management
External marketing
Users and non-users
Marketing largely passive to date
Changing in 2002/2003 cycle
Tools and techniques
Mystery shopper
Used extensively in commercial sector
Dependent on robust and open relationship
between partner institutions
Essential to agree and clarify criteria for
measurement
Sensitivity regarding outcomes
Tools and techniques: mystery shopper
Mystery shopper for website access
evaluation
Set questions assessing 3 variables (scoring sketch below):
Ease of access
Success in finding information
Time taken
9 out of 10 questions the same
Shoppers were from other institutions
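As an illustration only, per-site results might be aggregated along the lines of the Python sketch below; the record structure, site labels and figures are assumptions for this sketch, not the consortium's actual scoring scheme or data.

from statistics import mean

# One record per shopper per set question: the site visited, whether access
# was judged easy, whether the information was found, and the time taken.
# All values here are illustrative, not project data.
results = [
    {"site": "A", "easy_access": True,  "found": True,  "seconds": 95},
    {"site": "A", "easy_access": True,  "found": False, "seconds": 240},
    {"site": "B", "easy_access": False, "found": True,  "seconds": 310},
    {"site": "B", "easy_access": True,  "found": True,  "seconds": 120},
]

def summarise(records):
    """Aggregate the three agreed variables for each site."""
    by_site = {}
    for record in records:
        by_site.setdefault(record["site"], []).append(record)
    return {
        site: {
            "ease_of_access": mean(1 if r["easy_access"] else 0 for r in recs),
            "success_rate": mean(1 if r["found"] else 0 for r in recs),
            "mean_seconds": mean(r["seconds"] for r in recs),
        }
        for site, recs in by_site.items()
    }

for site, figures in summarise(results).items():
    print(site, figures)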
Tools and techniques: mystery shopper
Shoppers accessed sites in predetermined
order
Outcomes?
Time - much longer than anticipated
1 site much more successful than the other 3
Basic navigation OK
Access via username and password reduced
success by minimum of 25%
Tools and techniques: exit interviews
Undertaken outside library environment
“Neutral” territory
Example from Advice Desk project
Difference between questionnaire approach
and facilitated approach
Facilitators gained more comprehensive and
open responses
Costly but more realistic views obtained
Tools and techniques: behavioural study
Unobtrusive observation used in Advice
Desk project
Rejected as a method following trial
Sensitivity and influence on staff behaviour
Method valuable for “people flows”
Ability to predict demand and physical use
of space/s
Tools and techniques: measuring process times
Shelving project
Length of time to re-shelve
Tidiness and accuracy of items on shelves
Environmental and costing data
Changed timescales between 2000 and
2001 based on experience gained
Tracking slips (timing sketch below)
Improved shelving times by up to 50%
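A minimal sketch, assuming each tracking slip recorded when an item reached the return shelf and when it was re-shelved, of how elapsed times and the year-on-year improvement could be calculated; the slip structure and timestamps are invented for illustration, not measurements from the project.

from datetime import datetime
from statistics import mean

# Each slip: (time item reached the return shelf, time item was re-shelved).
# Timestamps are illustrative only.
slips_2000 = [
    (datetime(2000, 5, 2, 9, 15), datetime(2000, 5, 2, 14, 40)),
    (datetime(2000, 5, 2, 10, 5), datetime(2000, 5, 2, 16, 5)),
]
slips_2001 = [
    (datetime(2001, 5, 1, 9, 20), datetime(2001, 5, 1, 11, 50)),
    (datetime(2001, 5, 1, 13, 0), datetime(2001, 5, 1, 16, 45)),
]

def mean_hours(slips):
    """Average elapsed hours between return and re-shelving."""
    return mean((shelved - returned).total_seconds() / 3600
                for returned, shelved in slips)

before, after = mean_hours(slips_2000), mean_hours(slips_2001)
print(f"2000: {before:.1f} h, 2001: {after:.1f} h, "
      f"improvement: {(before - after) / before:.0%}")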
Tools and techniques: measuring process times
Improvement via Practical Action:
Targeting new staff appointments
Moving stock
Creating core team of shelving staff
More effective use of return shelves
Better trained / longer serving staff
undertaking initial sorting for floors
Tools and techniques: measuring process times
Improvement via Staff motivation
Greater motivation / self starting
Team planning
Willingness to reassign duties - flexibility
Local analysis to pinpoint problems
Shelving is (for now) the base of the pillar
Tools and techniques: measuring process times
Shelf tidiness
Counting samples across classmark ranges (tally sketch below)
Carried out over one block week (2000) then
1 day per week over 5 weeks (2001)
Outcomes?
2 sites with fast shelving had untidy shelves
1 site with quickest shelving had tidy
shelves!
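As a rough illustration of how such a tally could be worked through, the sketch below computes an out-of-place rate per classmark range and overall; the ranges, sample sizes and counts are assumptions, not figures from the consortium's survey.

# Each sample: (classmark range, items checked, items found out of place).
# Figures are illustrative only.
samples = [
    ("000-099", 120, 6),
    ("300-399", 150, 18),
    ("600-699", 90, 3),
]

for classmarks, checked, misplaced in samples:
    print(f"{classmarks}: {misplaced / checked:.1%} out of place")

total_checked = sum(checked for _, checked, _ in samples)
total_misplaced = sum(misplaced for _, _, misplaced in samples)
print(f"Overall: {total_misplaced / total_checked:.1%} out of place "
      f"across {total_checked} sampled items")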
Tools and techniques: measuring process times
Outcomes?
Will repeat as results are not entirely clear
Most significant factor seems to be priority
given to shelving compared with other duties
Staff response emphasised the importance of
speed and accuracy particularly in relation to
other processes (reservations for example)
Outcomes and benefits?
Provision of shared management
information
Establishing best practice
Identifying and implementing positive
change
Evaluating opinion, views and needs of
customers
Beginning to identify trends
Outcomes and benefits?
Networking - “invaluable”
Exchange of ideas and views
Staff development
Staff ownership and flexibility
Perspectives on roles:
Challenging the established or expected
outcome or view
Outcomes and benefits?
Institutional recognition
4 Universities Benchmarking Consortium has
proved benchmarking is:
Achievable
Repeatable
Valuable
In future it will be part of our routine...