USING WORLD UNIVERSITY RANKING
SYSTEMS TO INFORM AND GUIDE
STRATEGIC POLICY MAKING. A CASE
STUDY OF A CANADIAN RESEARCH
UNIVERSITY
AC21 International FORUM 2010
Shanghai Jiao Tong University
Roland Proulx
Université de Montréal, Canada
PURPOSE OF THE PRESENTATION
The intent of this paper is to illustrate, in the format of a case
study, how the international rankings have initiated and
informed a strategic thinking and planning process at the
University of Montreal.
The methodology used:
• Set the strategic position of the University from a global perspective:
- assessing the "overall scores" of the rankings as a general reference;
- addressing, beyond the "overall scores", the various individual performance indicators;
• Apply, according to the intent of the organization, the various scores and indicators to a formal strategic decision-making process (a minimal data sketch follows below).
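As a minimal illustration of this two-step methodology, the ranking data can be organized so that overall ranks and individual indicator ranks sit side by side. The sketch below is not part of the original presentation; the structure and field names are assumptions, and the sample values are taken from the findings reported later in this paper.

```python
# Minimal sketch (Python): keep the composite ("overall") rank and the
# individual indicator ranks together, so the analysis can move beyond
# the single composite score. Field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class RankingRecord:
    system: str                       # e.g. "ARWU", "THES"
    overall_world_rank: int           # rank by the composite score
    indicator_world_ranks: dict = field(default_factory=dict)

records = [
    RankingRecord("ARWU", 101, {"SCI score": 63, "HiCi": 194, "N&S": 170}),
    RankingRecord("THES", 91, {"Citations/Faculty": 104}),
]

# Step 1: the overall scores as a general reference point.
print([(r.system, r.overall_world_rank) for r in records])

# Step 2: drill down to the individual indicators behind the composites.
for r in records:
    for name, rank in r.indicator_world_ranks.items():
        print(f"{r.system:5s} {name:18s} world rank {rank}")
```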
Outline of the presentation
- The findings
- Ranking of the University of Montreal according to the "overall scores"
- Ranking of the University of Montreal according to selectively chosen research indicators
- Diagnosis and strategies
THE FINDINGS
FINDING # 1
UNIVERSITY'S WORLD POSITION ACCORDING TO THE FINAL SCORES (2008)
RANKINGS                  World     North America  Europe   Canada
TAIWAN                    94 (62)*  63 (45)        24 (12)  5 (4)
THES                      91        38             35       5
ARWU                      101       59             35       6
LEIDEN (crown indicator)  125       87             38       5
WEB (2009)                63        54             9        5
RESEARCH INFOSOURCE       na        na             na       6
The University of Montreal is clearly ranked among the top 100 world universities, though on the borderline in most of the rankings. On the North American scene, the University does not make the top 50, though it does rank among the top 50 European universities. Among the 15 Canadian research universities with medical schools, the rank of the University of Montreal fluctuates between positions 4 and 6.
FINDING # 2
Ranking of the University of Montreal according to selectively chosen research indicators
• For the purpose of a more targeted and more limited analysis of the University of Montreal as a world-class research-intensive university, the University has decided to evaluate and position its performance mostly against the various research indicators, thus considering the indicators per se, without the biases introduced by the selection of criteria and by the calculation and sorting of scores, and adopting a customized process for planning purposes (the CHE model).
• Marginson (2008) has rightly argued that "Each indicator has distinct meanings for measures of value for the k-economy, and for policies of innovation and performance improvement".
The research performance indicators used
[Pie chart] 21 performance indicators in research, grouped in 3 sub-categories: publications (7 indicators, 33%), citations (9 indicators, 43%), impact (5 indicators, 24%).
Université de Montréal: its position according to the ranks of the indicators (World, Canada, North America, Europe)

CRITERIA        SOURCE  INDICATOR                            WEIGHT  WORLD  CANADA  N.AMERICA  EUROPE
Productivity    ARWU    Score on SCI (2007)                  20%     63     5       38         14
                LEIDEN  Number of publications (2003-2007)   N/A     117    5       56         43
                TAIWAN  Number of articles (1997-2007)       10%     84     6       57         36
                TAIWAN  Current articles (2007)              10%     67     5       45         23
Excellence &    THES    Citations/Faculty (2002-2007)        20%     104    4       48         37
intensity       LEIDEN  Citations/publication (2003-2007)    N/A     104    5       69         35
                ARWU    Score on HiCi                        20%     194=   12=     113        62
                TAIWAN  HiCi papers (1997-2007)              15%     92     6       37         29
                TAIWAN  H-index (2006-2007)                  20%     76     4       30         19
                TAIWAN  Hi-impact journal articles (2007)    15%     77     5       24         22
                ARWU    Score on N&S (2003-2007)             20%     170=   6=      87         66
Impact          TAIWAN  11-year citations (1997-2007)        10%     93     6       70         34
                LEIDEN  Number of citations (2003-2007)      N/A     110    5       63         40
                TAIWAN  Number of citations (2006-2007)      10%     76     4       56         21
                TAIWAN  Average # of citations (1997-2007)   10%     79     8       58         64
                LEIDEN  CPP/FCSm (2003-2007)                 N/A     125    5       87         38
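To make this "indicators per se" approach concrete, the following sketch (not from the presentation; the weights and the rank-to-score conversion are illustrative assumptions) combines a hand-picked subset of the world ranks above into a customized composite, in the spirit of the CHE-style "à la carte" use of indicators discussed earlier:

```python
# Sketch: a customized composite built from selected indicator world ranks
# (values from the table above). The weights and the 500-university cap
# are assumptions for illustration, not parameters of any ranking system.
indicator_world_ranks = {
    "ARWU Score on SCI (2007)":           63,
    "THES Citations/Faculty (2002-2007)": 104,
    "TAIWAN H-index (2006-2007)":         76,
    "LEIDEN CPP/FCSm (2003-2007)":        125,
}
custom_weights = {  # chosen to reflect a hypothetical strategic intent
    "ARWU Score on SCI (2007)":           0.30,
    "THES Citations/Faculty (2002-2007)": 0.20,
    "TAIWAN H-index (2006-2007)":         0.25,
    "LEIDEN CPP/FCSm (2003-2007)":        0.25,
}

def rank_to_score(rank: int, universe: int = 500) -> float:
    """Map a world rank onto a 0-100 scale (rank 1 -> 100)."""
    return 100.0 * (universe - rank + 1) / universe

composite = sum(custom_weights[name] * rank_to_score(rank)
                for name, rank in indicator_world_ranks.items())
print(f"Customized composite: {composite:.1f}/100")
```

Changing the weights, or the subset of indicators, reorders institutions; this is precisely why the composite "overall scores" alone are a weak basis for strategic diagnosis.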
Summarizing the positions
- The various indicators related to research productivity and impact tend to give a good mark to the University of Montreal (with the exception of the Leiden Ranking 2008), which is ranked in the top 100.
- But the excellence/intensity indicators do not compete as well…
- On most indicators, the University performs well on the European scene, being among the top 50, but is unable to make the top 50 North American universities or to reach the first positions among the Canadian research universities.
- As a whole, most indicators are borderline…
Diagnosis and Strategies
The reputation of the University of Montreal
• The THES "Peer review" indicator places the University of Montreal at the 58th rank among the top 100 universities worldwide, at the 25th rank among the North American universities and at the 14th rank among the European universities.
Diagnosis
- Doubts may be expressed as to whether the specific indicators reflect and support such a reputation and classification, or at least for how long. The research performance of the University of Montreal remains fragile and precarious and may not substantiate its reputation.
- The productivity indicators are on the borderline of the top 100 worldwide and of the top 50 North American universities.
- The excellence/intensity and impact indicators show serious problems on the world and North American scenes.
- The Institution has not succeeded in improving its position among the Canadian research universities.
- As a whole, we notice a loss of performance when shifting from productivity indicators to intensity and impact indicators.
The University's future strategic direction and plan related to research are at stake.
Formulating Strategic Goals
• The University of Montreal has no choice: it must consolidate and significantly improve its performance in scientific publications and citations if the Institution aims at maintaining its enviable international reputation.
• The University must then take action to position its research performance
- clearly within the top 100 world universities (preferably within the top 75);
- within the top 50 North American universities (preferably within the top 40);
- the University of Montreal must also aim at being clearly ranked among the first 3 Canadian research universities.
Crafting Strategies
To confirm the international stature of the University, two sets of well-targeted strategies must be implemented.
Strategy #1
The first set would aim at recovering the numerous publications and citations not counted or left aside by the major databases: Thomson Reuters, Scopus and Google Scholar.
- Problems of language
- Absence of an institutional signature
- Missing ISI Highly Cited Researchers
Strategy #2
A second set of strategies, beyond the potential recovery that the bibliometric evaluation of the databases should yield, must deal with increasing the number, quality and impact of the publications attributed to the University of Montreal.
Mapping the scenarios
• Three Canadian research universities (Toronto, UBC and McGill) are clearly ranked among the top 100 world universities by all ranking systems, and two others are listed among the top 100 according to some systems.
• Being easily benchmarked with full access to reliable data (publication numbers, citation numbers, citations per publication and field-normalized average impact), these first five universities will serve here as the main reference for the various mapping scenarios.
Scientific publications, citations per faculty and field-normalized impact (2003-2007)
Top six Canadian research universities

UNIVERSITY  PUBLICATIONS  CITATIONS  FT FACULTY  PUB/FAC  CIT/FAC  CIT/PAPER  CPP/FCSm
TORONTO     31 170        251 802    2 400       13.0     105      8.1        1.45
UBC         18 679        139 266    2 190        8.5      64      7.5        1.51
McGILL      17 241        131 902    1 590       10.8      83      7.7        1.41
ALBERTA     14 224         85 414    1 602        8.9      53      6.0        1.18
MONTREAL    12 840         84 304    1 887        6.8      45      6.6        1.26
McMASTER     9 428         66 754    1 194        7.9      56      7.1        1.41

Sources: Thomson Reuters as reported by the University of Toronto, Research Infosource and the Leiden ranking.
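The derived columns of this table are simple quotients of the raw counts; the short sketch below (not part of the presentation) recomputes them and reproduces, for example, Montreal's 6.8 publications per faculty member:

```python
# Recompute the derived benchmarking ratios from the raw 2003-2007 counts
# reported in the table: publications, citations, full-time faculty.
data = {
    "TORONTO":  (31170, 251802, 2400),
    "UBC":      (18679, 139266, 2190),
    "McGILL":   (17241, 131902, 1590),
    "ALBERTA":  (14224,  85414, 1602),
    "MONTREAL": (12840,  84304, 1887),
    "McMASTER": ( 9428,  66754, 1194),
}

for name, (pub, cit, fac) in data.items():
    print(f"{name:9s} PUB/FAC {pub/fac:5.1f}  "
          f"CIT/FAC {cit/fac:5.0f}  CIT/PAPER {cit/pub:4.1f}")
```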
Implementing the scenario
To align the research performance of the University of Montreal from its present world positions to the 3rd and 4th quartiles of the top 100 world universities, and to position itself among the first 3 Canadian research universities, in other words to live up to its reputation (THES) and visibility (WEB):
- the average number of publications and citations should be massively increased;
- accordingly, the average field-normalized impact (CPP/FCSm) of the papers published by the Faculty members, a ratio of 1.26, should be increased: special care should then be given to publishing in high-impact fields and journals.
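Two clarifications may help here. First, CPP/FCSm (the Leiden "crown indicator") is the university's citations per publication divided by the mean citation rate of the fields in which it publishes, so it rises only when papers outperform their field baselines. Second, the scale of the required increase can be sketched from the benchmarking table; the calculation below is an illustrative assumption (matching McGill's publications-per-faculty ratio at constant faculty size), not a target stated in the presentation:

```python
# Illustrative gap estimate from the benchmarking table: how many
# publications would Montreal need for its PUB/FAC ratio (6.8) to match
# McGill's (10.8), holding its 1 887 full-time faculty constant?
montreal_pub, montreal_fac = 12840, 1887
mcgill_ratio = 17241 / 1590            # about 10.8 publications/faculty

target_pub = mcgill_ratio * montreal_fac
increase = target_pub / montreal_pub - 1
print(f"Target: {target_pub:,.0f} publications "
      f"(+{increase:.0%} over the current {montreal_pub:,})")
# -> roughly 20,500 publications, an increase of about 59%: this is the
#    sense in which the number of publications must "massively" increase.
```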
Concluding remarks
• The author of this paper has argued that the world university rankings can reasonably inform an effective decision-making process, provided that such a process goes beyond the overall scores to take full advantage of the individual indicators, selectively chosen according to the strategic intent of the institution and linked to a formal strategic environmental thinking exercise leading to operational changes.
We have then learned that:
- By encapsulating the rankings in a single composite index and focusing mostly on the world's "top" universities, the league tables have somehow hidden and concealed the very broad information used for their production.
- We must restore the intrinsic value of the individual indicators: "Each indicator has distinct meanings for measures of value for the k-economy, and for policies of innovation and performance improvement" (Marginson 2008: 17).
- In this sense, the individual indicators provided by the rankings draw the contours of a genuine worldwide public information system capable of producing a customized ranking "à la carte" of universities (Bourdin Report 2007-2008).