21st International Conference on Higher Education (ICHE)
"University Values – University Integrity"
August 26-28, 2010, Trier, Germany

Integrity in Higher Education: Quality Improvement versus Ranking Improvement
Nachum Finger
Ben-Gurion University of the Negev and the Council for Higher Education, Israel

Table of Contents
The Role of the University
Rankings of HEIs
Ranking examples
Methodology issues
Integrity issues
The Israeli HE System
Habermas' essay
Implications for quality
Quality Evaluation
Pressure to Rank
Summary

The Role of the University…
In his essay "The University in a Democracy – Democratization of the University," Habermas refers to the following news item, as it appeared in the Frankfurter Allgemeine Zeitung of January 11, 1967, to discuss the role of the university.
From: Toward a Rational Society, Jürgen Habermas, Heinemann Educational Books, London, 1971.

The News Item…
"In the vicinity of Sde Boker in the Negev, Israel's large desert, Ben-Gurion wants to found a university town to serve the exploitation of this desert area. The new town is being planned for ten thousand students and the corresponding number of faculty and is to bring Israeli youth into contact with the development of the desert through the acquisition of the necessary knowledge of the natural sciences and technology. It is intended primarily to develop the trained personnel who will be necessary for future industry in the desert. In particular, the development of such industry will involve enterprises that require much scientific knowledge and little raw material."
From: Toward a Rational Society, Jürgen Habermas, Heinemann Educational Books, London, 1971.

The Role of the University…
"First, the university has the responsibility of ensuring that its graduates are equipped, no matter how indirectly, with a minimum of qualifications in the area of extrafunctional abilities…"
"Second, it belongs to the task of the university to transmit, interpret, and develop the cultural tradition of the society..."
"Third, the university has always fulfilled a task that is not easy to define; today we would say that it forms the political consciousness of its students…"
From: Toward a Rational Society, Jürgen Habermas, Heinemann Educational Books, London, 1971.

Our perception of the role of the university relates directly to the importance we attach to the idea of ranking, and especially to quality improvement vs. ranking improvement, as reflected in the following examples and citations.

Methodology: SJTU Academic Ranking of World Universities
Nobel laureates (staff) – 20%
Nobel laureates (alumni) – 10%
Highly cited researchers – 20%
Articles published – 20%
Articles cited – 20%
Size – 10%
From: A Faustian Contract: Institutional Responses to National and International Rankings. Peter W. A. West OBE. IMHE Conference, September 8-10, 2008, Paris.

Methodology: THE-QS World University Ranking
Student-staff ratio – 20%
Recruiter survey – 10%
Peer survey – 40%
International staff – 5%
International students – 5%
Articles cited – 20%
From: A Faustian Contract: Institutional Responses to National and International Rankings. Peter W. A. West OBE. IMHE Conference, September 8-10, 2008, Paris.

Methodology: U.S. News and World Report College Rankings
Peer assessment – 25%
Percentage of classes with fewer than 20 students – 6%
Percentage of classes with more than 50 students – 2%
Average faculty salary – 7%
Percentage of professors with highest degree in field – 3%
Student/faculty ratio – 1%
Percentage of professors who are full-time – 1%
Spending per student – 10%
Percentage of students in top 10% of high school class – 6%
Student SAT scores – 7.5%
Acceptance rate – 1.5%
Graduation rate – 16%
Retention rate – 4%
Alumni giving rate – 5%
Graduation rate performance (predicted versus actual) – 5%
From: America's Best Colleges: 2007 Edition, U.S. News and World Report LP, 2006.
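To make the mechanics concrete, here is a minimal, hypothetical sketch of how such a weighted composite score might be computed. The weights mirror the THE-QS table above, but the two institutions and their indicator scores are invented, and the real rankings involve normalization steps (z-scores, scaling) that are omitted here. The sketch also anticipates the "arbitrariness of weights" concern raised below: shifting weight from one indicator to another can reverse the order of two institutions without any change in the underlying data.

```python
# Hypothetical sketch of a weighted composite ranking score.
# Weights mirror the THE-QS table above; institution scores are invented
# and assumed to be pre-normalized to a 0-100 scale.

WEIGHTS = {
    "peer_survey": 0.40,
    "recruiter_survey": 0.10,
    "student_staff_ratio": 0.20,
    "citations": 0.20,
    "international_staff": 0.05,
    "international_students": 0.05,
}

def composite(scores, weights):
    """Weighted sum of indicator scores."""
    return sum(weights[k] * scores[k] for k in weights)

uni_a = {"peer_survey": 90, "recruiter_survey": 60, "student_staff_ratio": 70,
         "citations": 65, "international_staff": 80, "international_students": 80}
uni_b = {"peer_survey": 70, "recruiter_survey": 80, "student_staff_ratio": 85,
         "citations": 90, "international_staff": 40, "international_students": 40}

print(composite(uni_a, WEIGHTS))  # 77.0 -- reputation-heavy profile leads
print(composite(uni_b, WEIGHTS))  # 75.0

# Shift 20 percentage points of weight from the peer survey to citations
# and the order flips: the same data, a different "best" university.
ALT_WEIGHTS = dict(WEIGHTS, peer_survey=0.20, citations=0.40)
print(composite(uni_a, ALT_WEIGHTS))  # 72.0
print(composite(uni_b, ALT_WEIGHTS))  # 79.0
```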
Some Questions about Ranking Methodologies
1. Choice of indicators
2. Arbitrariness of weights
3. Formula changes
4. Reliance on polls – a) Who is polled? b) What weight?
5. Statistical validity
6. Use of quality assessment as input
7. Inconsistencies between ranking methodologies
Source: Ranking of Higher Education Institutions. Anthony Stella & David Woodhouse. AUQA, 2006.

Questions continued…
1. Bias against the humanities and social sciences
2. Inconsistent classification of institutions
3. Inappropriate measures of teaching quality
Source: Richard Holmes, Asian Journal of University Education 1(1), 2006, 1-14.

"In 2004 the oldest public university in Malaysia, the University of Malaya, was ranked by the Times Higher Education Supplement at No. 89 in the world. The vice-chancellor ordered huge banners declaring "UM a world's top 100 university" placed around the city. But last year the THES changed the definition of Chinese and Indian students at UM from international to national and the university's position in the reputational surveys that comprise 50 percent of the THES index also declined. The result was that UM dropped from 89 to 169. The university's reputation abroad and at home was in free fall. When the VC's position came up for renewal by the Government last March, he was replaced."
Source: Simon Marginson, "Rankings Ripe for Misleading." The Australian, December 6, 2006.

"Many of our American colleagues say that they would like to resist the rankings, but fear it can't be done, especially if only a few institutions act. A growing number of Canadian institutions began to raise the same alarm, ultimately resulting in 25 of our 90+ institutions – including many of our leading universities – banding together to take just such a stand against the fall rankings issue of Maclean's, our Canadian equivalent."
Source: Indira Samarasekera, "Rising Up Against Rankings." Inside Higher Ed, April 2, 2007.

"However, global comparisons are possible only in relation to one model of institution, that of the comprehensive research-intensive university, and for the most part are tailored to science-strong and English-speaking universities. Neither the Shanghai nor the Times rankings provide guidance on the quality of teaching."
Source: Simon Marginson & Marijk van der Wende. (2007). "To Rank or To Be Ranked: The Impact of Global Rankings in Higher Education." Journal of Studies in International Education.

"Performance indicators currently used by higher education institutions are generally chosen because they are readily quantifiable and available, and not because they accurately assess the quality of teaching (Burmons, Brouwer, Veld and Marthens, 1987). Therefore over-interpreting performance indicators is even more dangerous (Chalmers, 2007)."
Source: Fabrice Henard, "Learning Our Lesson: Review of Quality Teaching in Higher Education," OECD, 2010, p. 81.
On February 17, 2010, ABA President Carolyn Lamm commissioned an examination of the ranking of law schools. Among the adverse effects of the U.S. News ranking, three are of greatest concern:
The current methodology tends to increase the costs of legal education for students.
The current methodology tends to discourage the award of financial aid based upon need.
The current methodology tends to reduce incentives to enhance the diversity of the legal profession.
Source: Report of the Special Committee on the U.S. News and World Report Rankings, Section of Legal Education and Admissions to the Bar.

Just recently…
"Like other observers of global university rankings, I've been intrigued by the trash talk in recent months between the British publication Times Higher Education and the higher ed research-consulting group QS (short for Quacquarelli Symonds). For six years, beginning in 2004, the two organizations worked together to produce the influential and oft-condemned World University Rankings, the chief rival to the almost-as-controversial Academic Rankings of World Universities inaugurated the previous year by Shanghai Jiao Tong University. But last October, Times Higher abruptly announced that it was parting ways with QS and would instead completely revamp its ranking in partnership with Thomson Reuters, the global information firm."
Source: Wildavsky, B. "Global Rankings Smackdown!" The Chronicle of Higher Education, July 15, 2010.

Contribution of Rankings
"…While HE leaders are concerned about the impact of rankings, they are also increasingly responsive and reactive to them. In addition, key stakeholders use rankings to influence their decisions: students use rankings to 'shortlist' university choice, and others make decisions about funding, sponsorship and employee recruitment. Rankings are also used as a 'policy instrument' to underpin and quicken the pace of HE reform."
Source: Hazelkorn, Ellen. (2008). "Learning to Live with League Tables and Ranking: The Experience of Institutional Leaders." Higher Education Policy, 21, 193.

Contribution continued…
1. The majority of students use rankings in their decision-making.
2. Employers look at rankings.
3. Leaders in HE take action based on rankings.
4. Governments/sponsors sometimes make strategic decisions based on rankings.
Source: Ellen Hazelkorn. "What Have We Learned About and From Rankings." CHEA Annual Conference, January 2010.

Contribution continued…
"Rankings are significant drivers of a school's reputation. Good performance can double inquiries and applications and allow schools to charge prestige premiums. Financial Times top-decile MBA programmes charge, on average, just below $80,000 for an MBA. Bottom-decile schools charge only $37,000… This paper finds that it is impossible to challenge the criteria set out by a variety of rankings organisations and it is ill-advised to boycott rankings. Schools are advised to consider which criteria reflect areas needing improvement and to continue 'playing the game.'"
Source: Peters, Kai. (2007). "Business School Rankings: Content and Context." Journal of Management Development.
Contribution continued…
"We found that moving onto the front page of the U.S. News rankings provides a substantial boost in the following year's admissions indicators for all institutions. In addition, the effect of moving up or down within the top tier has a strong impact on institutions ranked in the top 25, especially among national universities. In contrast, the admissions outcomes of liberal arts colleges – particularly those in the lower half of the top tier – were strongly influenced by institutional prices."
Source: Bowman, N., & Bastedo, M. (2009). "Getting on the Front Page: Organizational Reputation, Status Signals, and the Impact of U.S. News and World Report on Student Decisions." Research in Higher Education.

Integrity: Playing the Rankings Game
"For ten years Reed has declined to fill out the annual peer evaluations and statistical surveys that U.S. News uses to compile its rankings. It has three primary reasons for doing so. First, one-size-fits-all rankings schemes undermine the institutional diversity that characterizes American higher education. Second, the rankings reinforce a view of education as strictly instrumental to extrinsic goals such as prestige or wealth. Third, rankings create powerful incentives to manipulate data and distort institutional behavior for the sole or primary purpose of inflating one's score. Because the rankings depend heavily on unaudited, self-reported data, there is no way to ensure either the accuracy of the information or the reliability of the resulting rankings."
Source: Diver, Colin. "Is There Life After Rankings?" The Atlantic, November 2005.

Integrity continued…
Freedom from temptation to game the ratings formula…
"Since the mid-1990s numerous stories in the popular press have documented how various schools distort their standard operating procedures, creatively interpret survey instructions, or boldly misreport information in order to raise their rankings."
Source: Colin Diver (President, Reed College), "Is There Life After Rankings?" The Atlantic, November 2005.

Integrity continued…
1. Failure to report low SAT scores from foreign students, athletes, and other special admissions
2. Exaggerate per capita instructional expenditure
3. Artificially drive up the number of applicants
4. Inflate the yield rate by rejecting or wait-listing the highest achievers
5. Inflate the graduate employment rate (law)
6. Raise student selectivity by admitting fewer first-year students (law)
7. "Dumping" closest peers
8. Self-promotion
Source: Colin Diver (President, Reed College), "Is There Life After Rankings?" The Atlantic, November 2005.

Integrity continued…
Focus more on grades and less on undergraduate institutions at admission; focus on the LSAT.
Other admission changes:
1. Admit fewer first-year students and more transfers.
2. Reject some students with high LSAT scores.
3. Reject students with limited prospects of employment.
4. Focus scholarships on applicants with LSAT scores just above the median.
5. Start a part-time program.
Focus the curriculum on what is needed for Bar passage.
6. Spend money on glossy, colorful advertising.
7. Raise tuition for all; increase scholarships to those with the numbers (LSAT, UGPA).
8. Pay your own utilities.
9. Encourage everyone and her sister to apply.
10. Hire your own.
11. Make it difficult for faculty members to leave in the fall.
12. Increase the number of "books" in the "library".
13. Decrease funding of the library and other units with an abnormally high proportion of positive externalities.
Source: The Interplay between Law School Rankings, Reputations, and Resource Allocation: Ways Rankings Mislead. Jeffrey Evans Stake, Indiana Law Journal, vol. 81, p. 229 (2006).

Integrity continued…
"The problem is that the U.S. News college rankings are far from reliable. Turns out that some of their numbers are made up. I know that first hand…. I was recently informed by the director of data research at U.S. News, the person at the magazine who has a lot to say about how the rankings are computed, that absent students' SAT scores, the magazine will calculate the college's ranking by assuming an arbitrary average SAT score of one standard deviation (roughly 200 points) below the average score of our peer group… The message is clear. Unless we are willing to be badly misrepresented, we had better send the information the magazine wants."
Source: Michele Tolela Myers, President, Sarah Lawrence College. "The Cost of Bucking College Rankings." The Washington Post, March 11, 2007.
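As a purely illustrative sketch of the imputation rule Myers describes (all figures below are hypothetical, and this is only the rule as she reports it, not a documented U.S. News formula): a college that withholds its SAT data is treated as if its average were roughly 200 points, about one standard deviation, below its peer group's average, and that imputed figure then carries the SAT component (7.5% in the U.S. News table shown earlier) into the composite score.

```python
# Illustrative only: the imputation rule described in the Myers quote above.
# All numbers are hypothetical.

PEER_GROUP_AVG_SAT = 1350   # hypothetical average SAT over the peer group
ONE_SD_PENALTY = 200        # "one standard deviation (roughly 200 points)"

def sat_used_in_ranking(reported_avg=None):
    """Use the reported average if the college supplies it; otherwise impute."""
    if reported_avg is not None:
        return reported_avg
    return PEER_GROUP_AVG_SAT - ONE_SD_PENALTY

print(sat_used_in_ranking(1320))  # 1320: a reporting college is taken at its word
print(sat_used_in_ranking())      # 1150: a non-reporting college is assumed far weaker
```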
Integrity continued…
"We conducted interviews with top management team members from the top 50 business schools in the USA to assess the effects of business school rankings on the conduct of business education. These informants characterized the rankings process predominantly as a game where the players face a field that is not always level and where the rules are not only ill-specified but also subtly changing."
Source: Kevin Corley & Dennis Gioia. (2000). "The Rankings Game: Managing Business School Reputation." Corporate Reputation Review, 3(4).

The Israeli Higher Education System and Quality Assessment

Israel: Some Basic Data – 2009
Area – 22,145 sq. km
Population – 7.5 million
GDP – ~780 billion NIS
State Budget – 316.5 billion NIS
Education Budget* – 30.3 billion NIS
Higher Ed. Budget – 6.5 billion NIS
* Not including the Higher Ed. Budget

Higher Education in Israel: Facts & Figures 2009/10
Institutions – 66
Universities – 7
Open University – 1
Art Academies – 2
Comprehensive Colleges – 12
Engineering Colleges – 7
Teachers' Colleges – 24
Non-Budgeted Colleges – 13

Higher Education in Israel: Facts & Figures
Students – 280,000
  Bachelor – 221,420
  Master – 47,300
  Ph.D. – 10,300
  Other (Dip.) – 980
Faculty – ~13,000
Tech. & Admin. – ~12,000
Budget (total) – ~$2 billion

Higher Education in Israel: Governance
Some 60%-70% of the higher education budget comes from the Government.
It is usually based on a five-year plan, arrived at through negotiations between the Finance Ministry and the Planning and Budgeting Committee (PBC) of the Council for Higher Education (CHE).

Breakdown of the Income of the Institutions of Higher Education
PBC Allocations – 65%
Other – 14%
Tuition Fees – 21%

The Council for Higher Education: The Law
The framework of the system of higher education in Israel is defined in the Council for Higher Education Law of 1958, with 11 amendments enacted over a period of 40 years. This law established the Council for Higher Education and the procedures for the accreditation of institutions of higher education.

Academic Freedom
Article 15 of the Law guarantees that the institutions of higher education are autonomous in the conduct of their academic and administrative affairs, within the framework of their budgets and their terms of accreditation.

The Council's Responsibilities
Accreditation:
To grant a permit for the opening and maintenance of an institution of higher education;
To accredit an institution as an institution of higher education;
To revoke the accreditation of an accredited institution.

Approval of New Degrees & Programs:
To authorize an accredited institution to confer an academic degree;
To approve new programs of study in existing institutions.

Licensing Foreign Institutions:
To license the branches and extensions of foreign institutions of higher education which operate in Israel.
The Planning and Budgeting Committee
The Council delegated its responsibilities for planning and budgeting to the Planning and Budgeting Committee (PBC). The PBC is therefore the executive arm of the Council.

The PBC as a Buffer
To be an independent intermediary body between the Government and the institutions of higher education in all matters relating to allocations for higher education;
To negotiate with the Ministry of Finance the share of higher education in the state budget.

Allocation of Funds
To exclusively allocate the budget to institutions of higher education, taking into account the needs of society and the state, while safeguarding academic freedom and assuring the advancement of research and teaching.

Accountability
To ensure that institutional budgets are balanced and executed according to plan.

Planning and Coordination
To draw up plans for the coordinated and efficient development of higher education on the national level.

Recommendations to the Council
To submit its recommendations to the Council for Higher Education concerning requests to open new institutions or new units in existing institutions, after examination from the planning and budgetary points of view.

The CHE Mandate: Summary
Terms mentioned:
Securing funds
Planning
Licensing
Accreditation
Allocation of funds
Accountability
Terms absent:
Review
Re-accreditation
Quality assurance
Evaluation
Assessment

CHANGE…
June 2003 – The CHE adopts the recommendation of a national committee to institute quality assessment and assurance throughout the entire higher education system.
2003/04 – The CHE establishes a QA unit, and the first two disciplines are chosen for a pilot evaluation.
2004/05 – The process is underway…

What prompted this change? Some major reasons:
Transition to mass higher education
Internationalization of higher education
Economic/budgetary pressure
Pressure from stakeholders
A conducive/ripe environment
Perhaps… a realization by the CHE that, as part of the expanded accreditation, some control may have been lost and another look may be beneficial.

Transition to Mass Higher Education
Institutions              1990/91    2005/06
Universities                    7          7
Open University                 1          1
Art Academies                   2          2
Comprehensive Colleges          0          8
Engineering Colleges            2          8
Teachers' Colleges              7         27
Non-Budgeted Colleges           2          8
Total                          21         61
Students*                  89,000   ~250,000
* Not including branches/operations of foreign institutions

Economic/Budgetary Pressure
Government budgetary cuts
Higher education institutions find themselves in the red.
Some blame:
Lack of managerialism
Lack of prioritization
Lack of control and accountability

Pressure from Stakeholders
Government/politicians
Boards of Trustees
International academic advisory committees
Students
Donors
International environment – general and academic
Industry ("clients")

A Conducive/Ripe Environment
The 1980s and 1990s brought:
"In Search of Excellence"
Deming et al.
"Quality is Free"
TQM
All sectors – industry, public, defense – became heavily involved with quality.
…Finally, higher education joins in!

CHE – Realization
Accreditation
Re-accreditation

Main Purposes of the QA Activity
To create a culture of continuous quality improvement.
To bring about the continuous improvement of the various academic fields.
To be an active participant in global HE quality evaluation and improvement endeavors.

Issues Evaluated
Mission and goals
Study programs – all degrees
Faculty – achievements, promotion criteria, etc.
Students – admissions, grading, services, etc.
Organization – committees; decision processes
Infrastructure – labs, library, IT, etc.
Community involvement and cooperation
Main Features of the Adopted QA Process
In developing our quality assessment process we "borrowed" from the experience of many countries – Europe, the USA, Canada, Australia…:
All institutions are evaluated every 8 years (not yet implemented).
All programs are evaluated every 6 years.
External review committee (top experts in the discipline), appointed by and reporting to the CHE.
On-site visits by the committee.
Self-evaluation process as the basis for the review.

Unique Features
All programs within a discipline are reviewed at the same time by the same committee.
The committee is asked to assess "fitness for purpose." No comparisons. No ranking.
The committee is asked to provide:
Individual reports for each program, to serve as guidelines for improvement;
A general overview of the discipline, to serve as a guideline for CHE and PBC policy decisions;
A set of standards.

Trapped…
The decision to evaluate all programs in a given discipline ended up being an "invitation" for pressure to rank the evaluated programs.
The CHE is, for the time being, withstanding this pressure, arguing that ranking would inhibit the improvement process through all the possible "ranking games," which contradict the "fitness for purpose" approach.

Conclusions
Rankings of higher education institutions, be they global or national, are probably here to stay. One cannot conceive of a situation that would cause U.S. News & World Report, the Times Higher Education Supplement, Business Week, etc. to give up such a marketing bonanza.
The question is to what extent academia should be a willing, contributing partner in this enterprising endeavor.

Conclusions continued…
If you don't have a national ranking system, try to delay one.
If there is a ranking system, try to live with it without losing sight of your own mission!
Or, in the words of Colin Diver: "The rankings are merely intolerable; unilateral disarmament is suicide."
Source: Colin Diver (President, Reed College), "Is There Life After Rankings?" The Atlantic, November 2005.

Back to Habermas…
Rankings seem to be detrimental to the social and cultural roles of the university, and possibly to its overall mission.

THANK YOU FOR LISTENING