Times Higher Education-QS World University Rankings

Times Higher Education-QS World University Rankings was an annual publication that ranked the "Top 200 World Universities", published by Times Higher Education and Quacquarelli Symonds (QS) between 2004 and 2009. The full listings, broken down by subject and region, featured on the Times Higher Education website, while the full 600 ranked universities, interactive rankings tables and detailed methodology were published on the QS website. The best-known college and university rankings in the United States, compiled by US News & World Report, base their "World's Best Universities" rankings on data from the Times Higher Education-QS World University Rankings.[1]

The ranking weights were as follows (an illustrative calculation of the composite score appears after the list):

  • Peer Review Score (40%)
  • Recruiter Review (10%)
  • International Faculty Score (5%)
  • International Students Score (5%)
  • Faculty/Student Score (20%)
  • Citations/Faculty Score (20%).
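These six weights combine the indicator scores into a single overall score by a weighted sum. The sketch below is purely illustrative, assuming each indicator has already been normalised to a 0-100 scale; the function and variable names are hypothetical and are not taken from QS's actual implementation.

```python
# Illustrative sketch of combining the six THE-QS indicators into one score.
# Assumes each indicator has already been normalised to a 0-100 scale;
# the names are hypothetical, only the weights come from the list above.
WEIGHTS = {
    "peer_review": 0.40,
    "recruiter_review": 0.10,
    "international_faculty": 0.05,
    "international_students": 0.05,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of the six indicator scores (each on a 0-100 scale)."""
    return sum(weight * indicators[name] for name, weight in WEIGHTS.items())

# Example: strong peer review (100) but middling scores (50) everywhere else.
example = {
    "peer_review": 100,
    "recruiter_review": 50,
    "international_faculty": 50,
    "international_students": 50,
    "faculty_student_ratio": 50,
    "citations_per_faculty": 50,
}
print(composite_score(example))  # 70.0
```

In this hypothetical example the overall score of 70 sits well above the 50 earned on most indicators because of the single 100 on peer review, reflecting its 40% weight.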

Changes to the World University Rankings Partnership

For the full article, see Times Higher Education World University Rankings.

After the 2009 rankings, Times Higher Education ended its relationship with QS and signed an agreement with Thomson Reuters to supply the data for its annual World University Rankings. Times Higher Education said it would develop a new rankings methodology over the following months, in consultation with its readers, its editorial board and Thomson Reuters, which would collect and analyse the data used to produce the rankings on its behalf. The results were to be published annually from autumn 2010.[2][3]

From November 2010, QS Quacquarelli Symonds, which had bought the exclusive rights to the World University Rankings domain name, would continue to produce the rankings independently of Times Higher Education, using data collected and analysed over the preceding six years by QS and Elsevier's Scopus database.

2009 Rankings (full data)

The full table of the 2009 top 200 universities, along with the accompanying analysis and methodology, was published on the Times Higher Education website at one minute past midnight on 8 October 2009.[4] The full list of 600 ranked universities, school profiles and detailed methodology were published on the QS website, TopUniversities.com,[5] on 9 October 2009.

Times Higher Education - QS World University Rankings (Top 20)

| 2009 rank[6] | 2008 rank[7] | 2007 rank[8] | 2006 rank[9] | 2005 rank[10] | 2004 rank[11] | University | Country | Average rank |
| 1 | 1 | 1 | 1 | 1 | 1 | Harvard University | United States | 1 |
| 2 | 3 | 2= | 2 | 3 | 6 | University of Cambridge | United Kingdom | 3 |
| 3 | 2 | 2= | 4= | 7 | 8 | Yale University | United States | 4 |
| 4 | 7 | 9 | 25 | 28 | 34 | University College London | United Kingdom | 18 |
| 5= | 6 | 5 | 9 | 13 | 14 | Imperial College London | United Kingdom | 9 |
| 5= | 4 | 2= | 3 | 4 | 5 | University of Oxford | United Kingdom | 4 |
| 7 | 8 | 7= | 11 | 17 | 13 | University of Chicago | United States | 11 |
| 8 | 12 | 6 | 10 | 9 | 9 | Princeton University | United States | 9 |
| 9 | 9 | 10 | 4= | 2 | 3 | Massachusetts Institute of Technology | United States | 6 |
| 10 | 5 | 7= | 7 | 8 | 4 | California Institute of Technology | United States | 7 |
| 11 | 10 | 11 | 12 | 20 | 19 | Columbia University | United States | 14 |
| 12 | 11 | 14 | 26 | 32 | 28 | University of Pennsylvania | United States | 21 |
| 13 | 13= | 15 | 23 | 27 | 25 | Johns Hopkins University | United States | 19 |
| 14 | 13= | 13 | 13 | 11 | 52 | Duke University | United States | 19 |
| 15 | 15 | 20= | 15 | 14 | 23 | Cornell University | United States | 17 |
| 16 | 17 | 19 | 6 | 5 | 7 | Stanford University | United States | 12 |
| 17 | 16 | 16 | 16 | 23 | 16 | Australian National University | Australia | 17 |
| 18 | 20 | 12 | 21 | 24 | 21 | McGill University | Canada | 19 |
| 19 | 18 | 38= | 29 | 36 | 31 | University of Michigan | United States | 29 |
| 20= | 23 | 23 | 33= | 30 | 48 | University of Edinburgh | United Kingdom | 30 |
| 20= | 24 | 42 | 24 | 21 | 10 | ETH Zurich (Swiss Federal Institute of Technology) | Switzerland | 24 |

(= indicates a tied position.)
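The final column appears to be the arithmetic mean of each university's yearly positions, rounded to the nearest whole place; this interpretation is an inference from the data rather than a documented formula. A minimal sketch, using the positions copied from the University of Cambridge row:

```python
# Illustrative only: the "Average rank" column looks like the rounded mean
# of a university's positions across the six editions (2009 back to 2004).
cambridge_ranks = [2, 3, 2, 2, 3, 6]
average_rank = round(sum(cambridge_ranks) / len(cambridge_ranks))
print(average_rank)  # 3, matching the value shown in the table
```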

Commentary

Positive

Several universities in the UK and the Asia-Pacific region have commented on the rankings. The Vice-Chancellor of Massey University, Professor Judith Kinnear, said the Times Higher Education-QS ranking is a "wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability." She said the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[12]

Ian Leslie, the pro-vice-chancellor for research at the University of Cambridge, said: "It is very reassuring that the collegiate systems of Cambridge and Oxford continue to be valued by and respected by peers, and that the excellence of teaching and of research at both institutions is reflected in these rankings."

The vice-chancellor of Oxford University, Dr. John Hood, said: "The exceptional talents of Oxford's students and staff are on display daily. This last year has seen many faculty members gaining national and international plaudits for their teaching, scholarship and research, and our motivated students continue to achieve in a number of fields, not just academically. Our place amongst the handful of truly world-class universities, despite the financial challenges we face, is testament to the quality and the drive of the members of this university's environment."

The Vice-Chancellor of the University of Wollongong in Australia, Professor Gerard Sutton, said the ranking was a testament to a university's standing in the international community, identifying "an elite group of world-class universities".[13]

Critical

The rankings have been criticised for placing too much emphasis on peer review, which receives 40% of the overall score, and some have expressed concern about the manner in which the peer review has been carried out.[14] It has also been criticised, by a member of the University of Auckland, New Zealand, for the volatility of its results, with positions sometimes "shifting markedly" year on year.[15] Others have criticised the "opaque way it constructs its samples" for peer review.[16] Andrew Oswald has questioned the rankings on the grounds that the league-table positions of the universities do not, at least in certain cases, correspond to the number of Nobel Prizes they have recently won, arguing that "Stanford University in the United States, purportedly number 19 in the world, garnered three times as many Nobel Prizes over the past two decades as the universities of Oxford and Cambridge did combined."[17]

Several changes in methodology introduced in 2007 were aimed at addressing these criticisms,[18] but it has since been argued, in at least one paper, that the method of peer review remains insufficiently standardised, lacking "input data on any performance indicators".[19]

Quacquarelli Symonds has been faulted for a number of data-collection errors. For instance, between 2006 and 2007 Washington University in St. Louis fell from 48th to 161st because QS mistakenly substituted data for the University of Washington in Seattle.[20] QS made a similar error when collecting data for Forbes magazine, confusing the University of North Carolina's Kenan-Flagler Business School with the business school of North Carolina Central University.

Commenting on Times Higher Education's decision to split from QS, editor Ann Mroz said: "universities deserve a rigorous, robust and transparent set of rankings - a serious tool for the sector, not just an annual curiosity." She went on to explain the reason behind the decision to continue to produce rankings without QS' involvement, saying that: "The responsibility weighs heavy on our shoulders...we feel we have a duty to improve how we compile them."[21]

References

  1. ^ http://www.usnews.com/blogs/college-rankings-blog/2009/10/22/check-out-the-new-list-of-the-worlds-best-universities.html
  2. ^ http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=408881&c=2
  3. ^ http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=408908&navcode=105
  4. ^ "Times Higher Education-QS World University Rankings 2009". http://www.timeshighereducation.co.uk/Rankings2009-Top200.html. 
  5. ^ "THE-QS World University Rankings 2009". http://www.topuniversities.com/world-university-rankings. 
  6. ^ "THE-QS World University Rankings 2009". http://www.timeshighereducation.co.uk/hybrid.asp?typeCode=431&pubCode=1&navcode=148. 
  7. ^ "THE-QS World University Rankings 2008". http://www.timeshighereducation.co.uk/hybrid.asp?typeCode=416&pubCode=1&navcode=137. 
  8. ^ "THES-QS World University Rankings 2007". http://www.timeshighereducation.co.uk/hybrid.asp?typeCode=142&pubCode=1&navcode=118. 
  9. ^ "THES World University Rankings 2006". http://www.timeshighereducation.co.uk/hybrid.asp?typeCode=160&pubCode=1&navcode=119. 
  10. ^ "THES-QS World University Rankings 2005". http://www.timeshighereducation.co.uk/hybrid.asp?typeCode=174&pubCode=1&navcode=120. 
  11. ^ "THES World University Rankings 2004". http://www.timeshighereducation.co.uk/hybrid.asp?typeCode=194&pubCode=1&navcode=120. 
  12. ^ Flying high internationally
  13. ^ "UOW listed in Top 200 World University Rankings"
  14. ^ Rankings: Marketing Mana or Menace? by Simon Marginson
  15. ^ Rankings Ripe for Misleading by Simon Marginson
  16. ^ The Times Higher Education Rankings and the Dawn of Global Higher Education Data Standards by Alex Usher
  17. ^ There's nothing Nobel in deceiving ourselves by Andrew Oswald, The Independent on Sunday
  18. ^ Sowter, Ben (1 November 2007). "THES-QS World University Rankings 2007: basic explanation of key enhancements in methodology for 2007".
  19. ^ International ranking systems for universities and institutions: a critical appraisal by John Ioannidis et al.
  20. ^ http://rankingwatch.blogspot.com/2007/11/another-kenan-flagler-case-of.html
  21. ^ http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=408968&c=1
