Course costing… and reflections on process benchmarking
HESA: Process Benchmarking Seminar – June 2011
Dr Thomas Loya, Director
Planning and Management Information
University of Nottingham
Introduction
Overview
Process benchmarking: evaluating aspects of processes in relation to peer group / sector 'best practice' – but how is that best practice best determined?
1. Various models for identifying good practice
2. Course costing: the University of Nottingham's journey
Process BM: Other models
Independent HE Research Agencies (USA)
• Extensive specialist research capability
• Membership-based, but with wide representation
• Expanding membership into the UK
• Strong basis for identifying good practice
• Comprehensive coverage of HE activities
• Examples:
– Education Advisory Board
– Hanover Research Council
Process BM: Other models
Standards organisations (BSI, ISO, etc.)
• Justifiable claim to define best practice
• Extensive cross-sector engagement
• Strong links to international / global practice
• Negligible engagement by the UK HE sector!
• Strong coverage of business process improvement: risk management, information security, environmental management, business continuity, accessibility, business system documentation, supply chain management, etc.
Course costing
Purposes and issues
• Assess teaching cost, income & profitability
• Focus resources on higher net-revenue areas
• Cultural change: cost & efficiency awareness
• Identify all costs, in Schools and the 'centre'
• Provide a powerful form of management information
• Not to be used in isolation; the context is a review / refresh of the institutional portfolio
Course costing
Two overall approaches
• Bottom-up: activity-based costing
– Aggregate up specific activity costs
– Detailed time, quantity and activity data
– Fine-grained; costly to gather & analyse data
– Successful one-School pilot => scalability issues
• Top-down
– Parcel out high-level costs to modules
– Aggregate up costs to courses
– Broad-brush, use centrally held data
– Set out on this path for first institution-wide exercise
Course costing
The Model: Costs
[Diagram: TRAC teaching costs, by School, are split into fixed costs (teaching and assessment) and other variable costs.]
Course costing
The Model: Shares
• Module Fixed Cost Amount = credits * School share * weight
• Module Variable Cost Amount = credits * School share * weight * students
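Read literally, the two share formulas above are simple products. The sketch below (Python) is a minimal illustration of that literal reading; the function and parameter names are assumptions, and the slide does not say whether 'School share' is a fraction of a School cost pool or a monetary rate per credit.

```python
# Minimal sketch of the two share formulas above.
# Names are hypothetical; the interpretation of "School share" is not
# specified on the slide, so these functions simply reproduce the products.

def module_fixed_cost_amount(credits: float, school_share: float, weight: float) -> float:
    """Module Fixed Cost Amount = credits * School share * weight."""
    return credits * school_share * weight


def module_variable_cost_amount(credits: float, school_share: float,
                                weight: float, students: int) -> float:
    """Module Variable Cost Amount = credits * School share * weight * students."""
    return credits * school_share * weight * students
```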
Course costing
The Model: Modules
[Diagram: each module's fixed cost amount, variable cost amount and apportioned central costs are combined with module income to give total module income and costs.]
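A minimal sketch of how the elements of this diagram might be combined for a single module; the class and field names are assumptions, and the slide does not specify how central costs are apportioned to modules.

```python
from dataclasses import dataclass


@dataclass
class ModulePosition:
    """Income and cost lines for one module (hypothetical structure)."""
    income: float          # module income
    fixed_cost: float      # module fixed cost amount
    variable_cost: float   # module variable cost amount
    central_cost: float    # apportioned share of central costs (method not specified)

    @property
    def total_cost(self) -> float:
        return self.fixed_cost + self.variable_cost + self.central_cost

    @property
    def net(self) -> float:
        """Total module income less total module costs."""
        return self.income - self.total_cost
```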
Course costing
The Model: Courses
[Diagram: module income and costs, together with students by course and module, are rolled up to course income and costs.]
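The diagram shows module income and costs being rolled up to courses via the students-by-course-and-module matrix. Below is a minimal sketch of one way that roll-up could work, assuming pro-rata apportionment by student numbers; the apportionment rule and all names are assumptions rather than the documented model.

```python
from collections import defaultdict


def course_income_and_costs(module_positions, students_by_course_and_module):
    """Roll module income and costs up to courses, pro rata to the number of
    each module's students registered on each course (assumed rule).

    module_positions: {module: {"income": float, "cost": float}}
    students_by_course_and_module: {(course, module): int}
    """
    # Total students per module: the denominator for the pro-rata share.
    students_per_module = defaultdict(int)
    for (course, module), n in students_by_course_and_module.items():
        students_per_module[module] += n

    totals = defaultdict(lambda: {"income": 0.0, "cost": 0.0})
    for (course, module), n in students_by_course_and_module.items():
        share = n / students_per_module[module]
        totals[course]["income"] += module_positions[module]["income"] * share
        totals[course]["cost"] += module_positions[module]["cost"] * share
    return dict(totals)
```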
Course costing
Outputs
• Income, School and central costs => operating and net margins, for every course
• Currently covers UG + PGT, for two years
• Identify cost drivers, areas for review
• Valuable input to academic strategy
development and institutional portfolio review
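One plausible reading of the first bullet is that the operating margin nets course income against School costs only, while the net margin also deducts the course's share of central costs. That reading, and the names below, are assumptions used for illustration.

```python
def course_margins(income: float, school_costs: float, central_costs: float):
    """Assumed reading: operating margin nets off School costs only;
    net margin also deducts the course's share of central costs."""
    operating_margin = income - school_costs
    net_margin = operating_margin - central_costs
    return operating_margin, net_margin
```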
Course costing
Some issues and next steps
• Costs dependent on module classification
• Scepticism about use of TRAC data
• Input data as valuable as the headline figures
• Complex chain of reasoning to management action
• Have moved to a bottom-up approach!
Course costing
Practical considerations
• High data volumes led to use of Cognos EP
for data management and analysis
• Long period to register impact of changes
• To gain full value, needs to be repeatable
• Bottom-up data gathering is costly
• Can appear contrary to academic culture
Course costing
Implications for benchmarking…
• Feasibility dependencies: data availability, data management capability, drivers, scale, scope…
• Costs may reflect mission, subject mix, organisation structure, efficiency of operations
• Powerful internal metric, but…
– Not likely to be a quick win
– Difficult/impossible to include quality
– Limited scope for comparability between HEIs
Finally…
Issues, questions and concerns
• Best practice vs just 'good' practice(s)?
• Efficiency - institution or sector attribute?
• Competition - cuts both ways
• Can be irrational to share real innovation
• Myth of 'HE exceptionalism'
• What do we want from HESA?
Thank you
For questions, please contact:
Thomas.loya@nottingham.ac.uk