Analysis of systemic reasons for lower competitiveness of European universities
Global rankings do not demonstrate higher (or lower) ‘competitiveness’ in terms of what (most) universities are trying to do!
‘Quality assurance as a contributor to rankings or rankings as a driver of quality assurance?’
(what are the real indicators of competitiveness?)
Nick Harris
UK Bologna Expert
(former Director QAA UK)
Rankings certainly do not reflect the interests of all, but .. some ‘traditional dimensions’ of competitiveness:
• money (medicine and ‘big physics’ will always win)
• power and prestige (hire [ex] ‘movers and shakers’)
• research (hire the best you can get – on short contracts)
• knowledge transfer (build a ‘science park’)
• support global / national / local industry (and culture) (choose your partners – carefully)
• social inclusion and mobility (sorry? .. what?)
.... and ... ?
supporting...
TEACHING AND LEARNING!
‘the student experience’ .. an indicator of competitiveness?
• ‘input’ indicators / measures ..
– the staff (experience), the facilities, the curriculum
• ‘output’ indicators / measures ..
– staff performance, student performance
• staff performance assessed by whom? .. the students?
• student achievements assessed by .. what?
– qualifications gained, employability, earnings, learning outcomes
assessing institutions ..
against each other – using KPIs,
or against their own mission – using KPIs?
• the quest for excellence ..
– but not easy for an institution with a broad mission
.. can everyone be excellent at everything?!
• evaluation of units (faculties/depts/programmes)
or whole institutions?
– there is valid interest in both ..
quantitative QA contributes (easily) to rankings
but distorts the ‘academic endeavour’
UK experience showed that the use of quantitative QA indicators at programme level has a major impact!
• more and more transparent QA
• more centralised QA for consistency across the HEI
• faculty staff ‘learn to play the game’
• declining cost-effectiveness after the 1st round
assessment of institutions against standard (national?) KPIs
contributes (easily) to rankings ..
but distorts ‘institutional behaviour’
UK experience showed that the use of quantitative QA indicators at institutional level has a major impact!
• more and more transparent governance
• more centralised QA for consistency across the HEI
• rectors ‘learn to play the game’
• declining cost-effectiveness after the 1st round
assessment of institutions against mission-influenced KPIs
does not contribute so easily to rankings ..
but can encourage differentiation between HEIs
UK experience showed that the use of quantitative QA indicators at institutional level has a major impact!
• more and more transparent QA
• more centralised QA for consistency across the HEI
• rectors ‘learn to play their game’
– but they have to be ‘nimble’
– a change in leadership can ‘perturb’ the institution
so .. where now?
• stick with ‘old’ ideas about competitiveness /
rankings .. and reinforce the stereotypes .. OR
• rethink our notion of competitiveness and its
assessment and
– celebrate diversity ..
– recognise differentiation ..
– and assess against what HEIs claim they are doing
does it do what it says on the tin?
and be smart in how/what we assess ..
• evaluate only what is necessary (risk-based QA)
• recognise when assessment is sufficient
• avoid the ‘meaningless (but easy to measure)’
and be proportionate
University of Geneva