Times Higher Education World University Rankings

The Times Higher Education World University Rankings have been published since 2004. They are one of the two most popular international ranking systems, judged by frequency of appearance in web searches;[1] the other is the Academic Ranking of World Universities compiled by Shanghai Jiao Tong University. Between 2004 and 2009 the rankings were compiled in partnership with Quacquarelli Symonds and were known as the Times Higher Education-QS World University Rankings. From 2010 onwards they will be known as the Times Higher Education World University Rankings and will be compiled in partnership with Thomson Reuters, whose citation database is described as the largest available.[2]

Times Higher Education World University Rankings
Times Higher Education World University Rankings logo
Editor Phil Baty
Categories Higher education
Frequency Annual
Publisher TSL Education Ltd
Country United Kingdom
Language English
Website http://www.timeshighereducation.co.uk/

World Rankings between 2004 and 2009

For the main article on the rankings between 2004 and 2009, see Times Higher Education-QS World University Rankings.

Times Higher Education World University Rankings -- the decision to split from QS

After the 2009 rankings, Times Higher Education took the decision to break from QS and instead signed an agreement with Thomson Reuters to provide the data for its annual World University Rankings. Times Higher Education will develop a new rankings methodology in the coming months, in consultation with its readers, its editorial board and Thomson Reuters. Thomson Reuters will collect and analyse the data used to produce the rankings on behalf of Times Higher Education. The results will be published annually from autumn 2010.[3]

Times Higher Education is currently inviting readers to comment on how the methodology for compiling the rankings can be improved.[4]

Commenting on Times Higher Education’s decision to split from QS, editor Ann Mroz said: “universities deserve a rigorous, robust and transparent set of rankings - a serious tool for the sector, not just an annual curiosity.” She went on to explain the decision to continue producing rankings without QS’ involvement: “The responsibility weighs heavy on our shoulders ... we feel we have a duty to improve how we compile them.”[5]

Criticism of Times Higher Education-QS World University Rankings

Several universities in the UK and the Asia-Pacific region have commented positively on the rankings. Vice-Chancellor of Massey University, Professor Judith Kinnear, says the Times Higher Education-QS ranking is a “wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability.” She says the rankings are a true measure of a university’s ability to fly high internationally: “The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand’s Performance Based Research Fund (PBRF) measure or the Shanghai rankings.”[6]

However, the Times Higher Education-QS World University Rankings have been criticised by many others[7] for placing too much emphasis on peer review, which accounts for 40 per cent of the overall score. Some have expressed concern about the manner in which the peer review was carried out.[8] In a report,[9] Peter Wills from the University of Auckland, New Zealand, wrote of the Times Higher Education-QS World University Rankings:

“But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.”

Errors have also been reported in the faculty-student ratios used in the ranking. At the 16th Annual New Zealand International Education Conference, held in Christchurch, New Zealand in August 2007, Simon Marginson presented a paper[10] outlining fundamental flaws underlying the Times Higher Education-QS World University Rankings. A similar article[11] by the same author appeared in The Australian newspaper in December 2006. The points raised include:

“Half of the THES index is comprised by existing reputation: 40 per cent by a reputational survey of academics (‘peer review’), and another 10 per cent determined by a survey of 'global employers'. The THES index is too easily open to manipulation as it is not specified who is surveyed or what questions are asked. By changing the recipients of the surveys, or the way the survey results are factored in, the results can be shifted markedly.

  1. The pool of responses is heavily weighted in favour of academic 'peers' from nations where The Times (sic) is well-known, such as the UK, Australia, New Zealand, Malaysia and so on.
  2. It’s good when people say nice things about you, but it is better when those things are true. It is hard to resist the temptation to use the THES rankings in institutional marketing, but it would be a serious strategic error to assume that they are soundly based.
  3. Results have been highly volatile. There have been many sharp rises and falls, especially in the second half of the THES top 200 where small differences in metrics can generate large rankings effects. Fudan in China has oscillated between 72 and 195, RMIT in Australia between 55 and 146. In the US, Emory has risen from 173 to 56 and Purdue fell from 59 to 127.”
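The arithmetic behind Marginson’s first point can be made concrete. Below is a minimal Python sketch using only the weights named in the quote (40 per cent academic peer review, 10 per cent employer survey), with the remaining 50 per cent lumped together and all indicator scores invented purely for illustration:

```python
# Weights from the quoted critique: 40% academic peer review and 10%
# employer survey make up half the index. The remaining 50% is lumped
# here as "other" purely for illustration; the real THES-QS index
# split it across citations, faculty-student ratio and other metrics.
WEIGHTS = {"peer_review": 0.40, "employer_survey": 0.10, "other": 0.50}

def composite(scores):
    """Weighted sum of per-indicator scores, each on a 0-100 scale."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical university: identical "hard" metrics, but the two
# reputational surveys shift by 20 points between years.
year_one = {"peer_review": 60.0, "employer_survey": 60.0, "other": 70.0}
year_two = {"peer_review": 80.0, "employer_survey": 80.0, "other": 70.0}

print(composite(year_one))  # 65.0
print(composite(year_two))  # 75.0
```

A 20-point change confined to the two surveys moves the composite by 10 points, half the size of the survey swing, which is why changing who is surveyed, or how responses are factored in, can shift the published results markedly.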

Although THES-QS introduced several changes in methodology in 2007 aimed at addressing some of these criticisms,[12] the ranking has continued to attract criticism. In an article[13] in the peer-reviewed journal BMC Medicine, several scientists from the US and Greece pointed out:

“If properly performed, most scientists would consider peer review to have very good construct validity; many may even consider it the gold standard for appraising excellence. However, even peers need some standardized input data to peer review. The Times (sic) simply asks each expert to list the 30 universities they regard as top institutions of their area without offering input data on any performance indicators. Research products may occasionally be more visible to outsiders, but it is unlikely that any expert possesses a global view of the inner workings of teaching at institutions worldwide. Moreover, the expert selection process of The Times (sic) is entirely unclear. The survey response rate among the selected experts was only <1% in 2006 (1,600 of 190,000 contacted). In the absence of any guarantee for protection from selection biases, measurement validity can be very problematic.”

Alex Usher, vice president of the Educational Policy Institute in the US, commented:

“Most people in the rankings business think that the main problem with The Times (sic) is the opaque way it constructs its sample for its reputational rankings - a not-unimportant question given that reputation makes up 50% of the sample. Moreover, this year's switch from using raw reputation scores to using normalized Z-scores has really shaken things up at the top-end of the rankings by reducing the advantage held by really top universities - University of British Columbia (UBC) for instance, is now functionally equivalent to Harvard in the Peer Review score, which, no disrespect to UBC, is ludicrous. I'll be honest and say that at the moment the THES Rankings are an inferior product to the Shanghai Jiao Tong’s Academic Ranking of World Universities.”
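The compression Usher describes follows from how Z-scores behave. The sketch below uses entirely hypothetical citation-style scores (not actual THES or QS data): one institution leads the raw scores by an order of magnitude, but once scores are standardised its Z-score is bounded by roughly the square root of the cohort size, pulling trailing institutions functionally closer to the leader:

```python
# Hypothetical indicator scores (not real data): one runaway leader,
# then a tightly bunched pack, mimicking a heavy-tailed raw metric.
raw = {"Univ A": 1000.0, "Univ B": 100.0, "Univ C": 90.0,
       "Univ D": 80.0, "Univ E": 70.0}

def z_scores(scores):
    """Standardise each score to (x - mean) / stddev, using the
    population standard deviation."""
    values = list(scores.values())
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return {k: (v - mean) / std for k, v in scores.items()}

z = z_scores(raw)
# Univ A's raw score is 10x Univ B's, but its Z-score is only about
# 2.0: a single outlier's Z-score can never exceed sqrt(n - 1), so
# standardisation caps the advantage of a dominant institution.
```

On these numbers Univ A’s Z-score comes out just under 2.0 while Univ B sits around -0.46, so a tenfold raw lead shrinks to about 2.5 standard deviations; under a simple raw-score scaling where the top score maps to 100, Univ B would instead have scored 10.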

Academics have also criticised the use of the citation database, arguing that it undervalues institutions that excel in the social sciences. Ian Diamond, chief executive of the Economic and Social Research Council, wrote to Times Higher Education in 2007, saying:[14]

“The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.”

The latest criticism of the Times Higher Education-QS league tables came from Andrew Oswald, professor of economics at the University of Warwick:[15]

“This put Oxford and Cambridge at equal second in the world. Lower down, at around the bottom of the world top-10, came University College London, above MIT. A university with the name of Stanford appeared at number 19 in the world. The University of California at Berkeley was equal to Edinburgh at 22 in the world. Such claims do us a disservice. The organisations who promote such ideas should be unhappy themselves, and so should any supine UK universities who endorse results they view as untruthful. Using these league table results on your websites, universities, if in private you deride the quality of the findings, is unprincipled and will ultimately be destructive of yourselves, because if you are not in the truth business what business are you in, exactly? Worse, this kind of material incorrectly reassures the UK government that our universities are international powerhouses. Let us instead, a bit more coolly, do what people in universities are paid to do. Let us use reliable data to try to discern the truth. In the last 20 years, Oxford has won no Nobel Prizes. (Nor has Warwick.) Cambridge has done only slightly better. Stanford University in the United States, purportedly number 19 in the world, garnered three times as many Nobel Prizes over the past two decades as the universities of Oxford and Cambridge did combined.”

See also

Times Higher Education
