QS World University Rankings

The QS World University Rankings is an annual ranking of the world’s top 700 universities, published by Quacquarelli Symonds (QS) since 2004.

The QS rankings were originally published in partnership with Times Higher Education from 2004 to 2009 as the Times Higher Education-QS World University Rankings. In 2010, Times Higher Education and QS ended their collaboration. QS assumed sole publication of the rankings using the existing methodology, while Times Higher Education adopted a new methodology, published as the Times Higher Education World University Rankings.

History

The need for an international ranking of universities was highlighted in December 2003 in Richard Lambert’s review of university-industry collaboration in Britain for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations were world university rankings, which Lambert said would help the UK to gauge the global standing of its universities.

The idea for the rankings was credited in Ben Wildavsky's book, The Great Brain Race: How Global Universities are Reshaping the World,[1] to then-editor of Times Higher Education, John O'Leary. Times Higher Education chose to partner with the educational and careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin Ince, formerly deputy editor and later a contractor to THE, to manage the project.

Between 2004 and 2009, Quacquarelli Symonds (QS) produced the rankings in partnership with Times Higher Education (THE). In 2009, THE announced that it would produce its own rankings, the Times Higher Education World University Rankings, in partnership with Thomson Reuters. THE cited a weakness in the methodology of the original rankings,[2] as well as a perceived favoritism in the existing methodology towards science over the humanities,[3] among its key reasons for the decision to split with QS.

QS retained the intellectual property in the Rankings and the methodology used to compile them[citation needed] and continues to produce the rankings, now called the QS World University Rankings.[4] THE created a new methodology, first published independently as the Times Higher Education World University Rankings in September 2010 and revised for the edition published on October 6, 2011.

QS publishes the results of the original methodology through key media around the world, including US News & World Report in the United States and the Chosun Ilbo in Korea. The first rankings produced by QS independently of THE, using the original methodology, were released on September 8, 2010, with the second edition appearing on September 6, 2011.

Method

QS designed its rankings to reflect a broad range of university activity, using the six indicators described below.
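As a rough illustration of how a weighted combination of these indicators could work, the sketch below applies the weights described in the following subsections to a set of normalised indicator scores. The names and function are illustrative assumptions, not QS's actual implementation.

    # Illustrative sketch (Python) of the weighted aggregation described in
    # the subsections below. The weights come from this article; the field
    # names and function are hypothetical, not QS's actual code.

    WEIGHTS = {
        "academic_peer_review": 0.40,
        "recruiter_review": 0.10,
        "faculty_student_ratio": 0.20,
        "citations_per_faculty": 0.20,
        "international_students": 0.05,
        "international_staff": 0.05,
    }

    def overall_score(indicator_scores: dict[str, float]) -> float:
        """Combine normalised indicator scores (each on a 0-100 scale)
        into a single weighted overall score."""
        return sum(WEIGHTS[name] * indicator_scores[name] for name in WEIGHTS)

    # A university scoring 90 on every indicator gets an overall score of 90
    # (up to floating-point rounding).
    print(overall_score({name: 90.0 for name in WEIGHTS}))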

Academic peer review (40%)

The most controversial part of the QS World University Rankings is their use of an opinion survey referred to as the Academic Peer Review. Drawing on purchased mailing lists as well as sign-ups and suggestions, this survey asks active academics across the world to name the top universities in the fields they know best. QS has published the job titles and geographical distribution of the participants.

The 2011 rankings made use of responses from 33,744 people in over 140 nations for the Academic Peer Review, with votes from the previous two years rolled forward where no more recent response was available from the same individual. Participants can nominate up to 30 universities but may not vote for their own. With a median of about 20 nominations per respondent, the survey yields well over 500,000 data points.
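As a rough sketch of how nominations under these rules might be tallied, the snippet below caps each respondent at 30 nominations and discards self-votes; the function and data are hypothetical, not QS's actual survey-processing code.

    # Hypothetical tally of peer-review nominations following the rules
    # described above: at most 30 nominations per respondent, self-votes excluded.
    from collections import Counter

    def tally(responses):
        """responses: list of (respondent's own institution, list of nominations)."""
        votes = Counter()
        for own, nominations in responses:
            valid = [u for u in nominations if u != own][:30]  # drop self-votes, cap at 30
            votes.update(valid)
        return votes

    example = [
        ("Uni A", ["Uni B", "Uni C", "Uni A"]),  # the self-vote for Uni A is dropped
        ("Uni B", ["Uni C", "Uni D"]),
    ]
    print(tally(example))  # Uni C: 2, Uni B: 1, Uni D: 1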

In 2004, when the rankings first appeared, academic peer review accounted for half of a university's possible score. In 2005, its share was cut to 40 per cent because of the introduction of the Recruiter Review.

Recruiter review (10%)

This part of the ranking is obtained by a similar method to the Academic Peer Review, except that it samples recruiters who hire graduates on a global or significant national scale. The numbers are smaller – 16,875 responses from over 130 countries in the 2011 Rankings – and are used to produce 10 per cent of any university’s possible score. This survey was introduced in 2005 in the belief that employers track graduate quality, making this a barometer of teaching quality, a famously problematic thing to measure. University standing here is of especial interest to potential students.

Faculty student ratio (20%)

This indicator accounts for 20 per cent of a university’s possible score in the rankings. It is a classic measure used in various ranking systems as a surrogate for teaching commitment, but QS has admitted that it is less than satisfactory.

Citations per faculty (20%)

Citations of published research are among the most widely used inputs to national and global university rankings. The QS World University Rankings used citations data from Thomson (now Thomson Reuters) from 2004 to 2007, and has since used data from Scopus, part of Elsevier. The total number of citations over a five-year period is divided by the number of academic staff in a university to yield the score for this measure, which accounts for 20 per cent of a university’s possible score in the Rankings.
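The calculation itself is straightforward; a minimal sketch, using illustrative figures rather than real QS data, is:

    # Citations per faculty as described above: total citations over a
    # five-year window divided by the number of academic staff.
    def citations_per_faculty(total_citations_5yr: int, academic_staff: int) -> float:
        if academic_staff <= 0:
            raise ValueError("academic staff count must be positive")
        return total_citations_5yr / academic_staff

    # Hypothetical example: 150,000 citations over five years, 2,500 academic staff.
    print(citations_per_faculty(150_000, 2_500))  # 60.0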

QS has explained that it uses this approach, rather than the citations per paper preferred for other systems, because it reduces the effect of biomedical science on the overall picture – biomedicine has a ferocious “publish or perish” culture. Instead QS attempts to measure the density of research-active staff at each institution. But issues remain about the use of citations in ranking systems, especially the fact that the arts and humanities generate comparatively few citations.

QS has conceded the presence of some data collection errors regarding citations per faculty in previous years' rankings.[5]

One interesting issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems capture more or less the same publications and citations. For less mainstream institutions, Scopus has more non-English language and smaller-circulation journals in its database. But as the papers there are less heavily cited, this can also mean fewer citations per paper for the universities that publish in them.

International orientation (10%)

The final ten per cent of a university’s possible score is derived from measures intended to capture its internationalism: five per cent from its percentage of international students, and another five per cent from its percentage of international staff.

This is of interest partly because it shows whether a university is putting effort into being global, but also because it tells us whether it is taken seriously enough by students and academics around the world for them to want to be there.

Data sources

The information used to compile the World University Ranking comes partly from the online surveys carried out by QS, partly from Scopus, and partly from an annual information-gathering exercise carried out by QS itself. QS collects data from universities directly and from their web sites and publications, and from national bodies such as education ministries and the National Center for Education Statistics in the US and the Higher Education Statistics Agency in the UK.

Aggregation

The data for each indicator are converted into Z scores, a measure of how far any institution is from the average. Between 2004 and 2007 a different system was used, whereby the top university on any measure was scaled to 100 and the others received scores reflecting their comparative performance. According to QS, this method was dropped because it gave too much weight to exceptional outliers, such as the very high faculty/student ratio of the California Institute of Technology. In 2006, the last year before the Z score system was introduced, Caltech topped the citations per faculty measure, receiving 100 on this indicator because of its strongly research- and science-oriented approach. The next two institutions on this measure, Harvard and Stanford, each scored 55. In other words, 45 per cent of the possible difference between the world's universities lay between the top university and the next one (in fact two) on the list, leaving every other university to share the remaining 55 per cent.

Likewise in 2005, Harvard was the top university and MIT second with 86.9, so that 13 per cent of the total difference between all the world's universities lay between first and second place. In 2011, the University of Cambridge was top and the second institution, Harvard, scored 99.34. The Z score system therefore allows the full range of available difference to be used in a more informative way.
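A small sketch of the two normalisation schemes, using hypothetical raw scores, shows why the Z-score method spreads institutions more informatively when one outlier dominates an indicator:

    # Contrast of the two schemes described above: scaling the top scorer to 100
    # (2004-2007) versus Z scores (since 2007). The raw values are hypothetical.
    from statistics import mean, stdev

    raw = {"A": 9.8, "B": 5.4, "C": 5.3, "D": 4.9}  # e.g. citations per faculty

    # Pre-2007 method: top institution scaled to 100, others in proportion.
    top = max(raw.values())
    scaled = {u: 100 * v / top for u, v in raw.items()}

    # Z-score method: distance from the mean in standard deviations.
    mu, sigma = mean(raw.values()), stdev(raw.values())
    z = {u: (v - mu) / sigma for u, v in raw.items()}

    print(scaled)  # the outlier "A" leaves B, C and D bunched far below 100
    print(z)       # B, C and D are spread relative to one another, reducing A's dominance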

Classifications

In 2009, a column of classifications was introduced to provide additional context to the rankings tables. Universities are classified by size, defined by the size of the student body; comprehensive or specialist status, defined by the range of faculty areas in which programs are offered; and research activity, defined by the number of papers published in a five-year period.
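A minimal sketch of these three classification dimensions follows; the thresholds are hypothetical, since the article does not give QS's actual cut-offs.

    # Hypothetical classification along the three dimensions described above.
    from dataclasses import dataclass

    @dataclass
    class UniversityProfile:
        students: int        # size of the student body
        faculty_areas: int   # how many of the five faculty areas it teaches
        papers_5yr: int      # papers published over a five-year period

    def classify(p: UniversityProfile) -> dict[str, str]:
        size = "large" if p.students >= 12_000 else "medium" if p.students >= 5_000 else "small"
        focus = "comprehensive" if p.faculty_areas >= 4 else "specialist"
        research = "very high" if p.papers_5yr >= 10_000 else "high" if p.papers_5yr >= 3_000 else "moderate"
        return {"size": size, "focus": focus, "research_activity": research}

    print(classify(UniversityProfile(students=20_000, faculty_areas=5, papers_5yr=15_000)))
    # {'size': 'large', 'focus': 'comprehensive', 'research_activity': 'very high'}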

Fees

In 2011, QS began publishing average fees data for the universities it ranks. These are not used as an indicator in the rankings, but they are of wide interest and reveal much about a university's self-image and market position.

QS publishes domestic and international fees for undergraduate and postgraduate study.

Results

The institutions that dominate the QS rankings are either large, general research institutions or large science and technology specialists such as MIT or Imperial College. To be included in the QS World University Rankings, institutions must teach in at least two of the five main areas of academic life (the social sciences, the arts and humanities, biomedicine, engineering and the physical sciences), and must teach undergraduates.

Faculty-level analysis

QS publishes a simple analysis of the top 100 institutions in each of the five faculty-level areas mentioned above: natural sciences, technology, biology and medicine, social sciences and the arts and humanities. These five tables list universities in order of their Academic Peer Review score. They also give the citations per paper for each institution, but the two data sets are not aggregated.

QS uses citations per paper rather than per person partly because it does not hold details of the academic staff in each subject area, and partly because the number of citations per paper should be a consistent indicator of impact within a specific field with a defined publishing culture.

Effects

Web traffic services such as Alexa (www.alexa.com) indicate that the QS Rankings attract millions of page views. As the main audience is 18-24 year olds (according to Alexa in October 2011), it is reasonable to suppose that students and potential students are their main users. Some universities, especially in Asia, have a target of being well placed in the rankings. In October 2011, Queen’s University Belfast (UK) was advertising with the slogan “Destination Global Top 100.”

QS has formed an international advisory board for the Rankings,[6] convened by Martin Ince, with members in Europe, Asia, Africa and North and South America.

2011 rankings

The 2011 QS World University Rankings ranked the University of Cambridge first in the world for the second year in succession. Harvard University remained in second place, ahead of its neighbour MIT in third place. Yale remained in fourth place, ahead of the University of Oxford in fifth.

The rankings covered 712 institutions out of 2,919 on which QS held data. There were universities from 32 nations in the top 200 and from 50 in the top 500. The average age of the top 200 universities was 187.3 years. The newest top 10 university, Chicago, was founded in 1890.

The 2011 rankings list the top 400 universities in rank order, followed by institutions 401-600 in bands of 50 and a further 112 grouped as "601+."

Top 20 in the QS World University Rankings

Columns: 2011 rank,[7] 2010 rank,[8] 2009 rank,[9] 2008 rank,[10] 2007 rank,[11] 2006 rank,[12] 2005 rank,[13] 2004 rank,[14] mean rank, mean rank position, university, country.
1 1 2 3 2 2 3 6 2.5 2 University of Cambridge UK
2 2 1 1 1 1 1 1 1.3 1 Harvard University US
3 5 9 9 10 4 2 3 5.6 5 Massachusetts Institute of Technology US
4 3 3 2 2 4 7 8 4.1 3 Yale University US
5 6 5 4 2 3 4 5 4.3 4 University of Oxford UK
6 7 5 6 5 9 13 14 8.1 7 Imperial College London UK
7 4 4 7 9 25 28 34 14.8 12 University College London UK
8 8 7 8 7 11 17 13 9.9 9 University of Chicago US
9 12 12 11 14 26 32 28 18.0 14 University of Pennsylvania US
10 11 11 10 11 12 20 19 13.0 11 Columbia University US
11 13 16 17 19 6 5 7 11.8 10 Stanford University US
12 9 10 5 7 7 8 4 7.8 6 California Institute of Technology US
13 10 8 12 6 10 9 9 9.6 8 Princeton University US
14 15 19 18 38 29 36 31 25.0 19 University of Michigan US
15 16 15 15 20 15 14 23 16.6 13 Cornell University US
16 17 13 13 15 23 27 25 18.6 15 Johns Hopkins University US
17 19 18 20 12 21 24 21 19.0 17 McGill University Canada
18 18 20 24 42 24 21 10 22.1 18 ETH Zurich (Swiss Federal Institute of Technology) Switzerland
19 14 14 13 13 13 11 52 18.6 15 Duke University US
20 22 20 23 23 33 33 48 27.8 20 University of Edinburgh UK

Commentary

Several universities in the UK and the Asia-Pacific region have commented positively on the rankings. Vice-Chancellor of New Zealand's Massey University, Professor Judith Kinnear, says that the Times Higher Education-QS ranking is a "wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability." She says the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[15]

Martin Ince,[16] chair of the Advisory Board for the Rankings, points out that their volatility has been reduced since 2007 by the introduction of the Z-score calculation method and that over time, the quality of QS's data gathering has improved to reduce anomalies. In addition, the academic review is now so big that even modestly ranked universities receive a statistically valid number of votes.

Criticism

The THE-QS World University Rankings have been criticised by many for placing too much emphasis on peer review, which receives 40 percent of the overall score. Some people have expressed concern about the manner in which the peer review has been carried out.[17] In a report,[18] Peter Wills from the University of Auckland, New Zealand wrote of the Times Higher Education-QS World University Rankings:

But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.

Quacquarelli Symonds has been faulted for some data collection errors. Between 2006 and 2007, Washington University in St. Louis fell from 48th to 161st because QS confused it with the University of Washington in Seattle.[19] QS committed a similar error when collecting data for Fortune Magazine, confusing the University of North Carolina's Kenan-Flagler business school with one from North Carolina Central University.

Some errors have also been reported in the faculty-student ratio used in the ranking. At the 16th Annual New Zealand International Education Conference, held at Christchurch, New Zealand in August 2007, Simon Marginson presented a paper[20] outlining fundamental flaws underlying the Times Higher Education-QS World University Rankings. A similar article by the same author[21] appeared in The Australian newspaper in December 2006. Some of the points mentioned include:

Half of the THES index is comprised by existing reputation: 40 per cent by a reputational survey of academics ('peer review'), and another 10 per cent determined by a survey of 'global employers'. The THES index is too easily open to manipulation as it is not specified who is surveyed or what questions are asked. By changing the recipients of the surveys, or the way the survey results are factored in, the results can be shifted markedly.

  1. The pool of responses is heavily weighted in favour of academic 'peers' from nations where The Times is well-known, such as the UK, Australia, New Zealand, Malaysia and so on.
  2. It's good when people say nice things about you, but it is better when those things are true. It is hard to resist the temptation to use the THES rankings in institutional marketing, but it would be a serious strategic error to assume that they are soundly based.
  3. Results have been highly volatile. There have been many sharp rises and falls, especially in the second half of the THES top 200 where small differences in metrics can generate large rankings effects. Fudan in China has oscillated between 72 and 195, RMIT in Australia between 55 and 146. In the US, Emory has risen from 173 to 56 and Purdue fell from 59 to 127.

Although THES-QS introduced several changes in methodology in 2007 aimed at addressing some of the above criticisms,[22] the ranking has continued to attract criticism. In an article[23] in the peer-reviewed journal BMC Medicine, authored by several scientists from the US and Greece, it was pointed out:

If properly performed, most scientists would consider peer review to have very good construct validity; many may even consider it the gold standard for appraising excellence. However, even peers need some standardized input data to peer review. The Times simply asks each expert to list the 30 universities they regard as top institutions of their area without offering input data on any performance indicators. Research products may occasionally be more visible to outsiders, but it is unlikely that any expert possesses a global view of the inner workings of teaching at institutions worldwide. Moreover, the expert selection process of The Times is entirely unclear. The survey response rate among the selected experts was only <1% in 2006 (1,600 of 190,000 contacted). In the absence of any guarantee for protection from selection biases, measurement validity can be very problematic.

Alex Usher, vice president of Higher Education Strategy Associates in Canada, commented:

Most people in the rankings business think that the main problem with The Times is the opaque way it constructs its sample for its reputational rankings - a not-unimportant question given that reputation makes up 50% of the sample. Moreover, this year's switch from using raw reputation scores to using normalized Z-scores has really shaken things up at the top-end of the rankings by reducing the advantage held by really top universities - University of British Columbia (UBC) for instance, is now functionally equivalent to Harvard in the Peer Review score, which, no disrespect to UBC, is ludicrous. I'll be honest and say that at the moment the THES Rankings are an inferior product to the Shanghai Jiao Tong’s Academic Ranking of World Universities.

Academics have also been critical of the use of the citation database, arguing that it undervalues institutions that excel in the social sciences. Ian Diamond, former chief executive of the Economic and Social Research Council and now vice-chancellor of the University of Aberdeen, wrote to Times Higher Education in 2007, saying:[24]

The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.

Criticism of the Times Higher Education-QS league tables also came from Andrew Oswald, professor of economics at the University of Warwick:[25]

This put Oxford and Cambridge at equal second in the world. Lower down, at around the bottom of the world top-10, came University College London, above MIT. A university with the name of Stanford appeared at number 19 in the world. The University of California at Berkeley was equal to Edinburgh at 22 in the world. Such claims do us a disservice. The organisations who promote such ideas should be unhappy themselves, and so should any supine UK universities who endorse results they view as untruthful. Using these league table results on your websites, universities, if in private you deride the quality of the findings, is unprincipled and will ultimately be destructive of yourselves, because if you are not in the truth business what business are you in, exactly? Worse, this kind of material incorrectly reassures the UK government that our universities are international powerhouses. Let us instead, a bit more coolly, do what people in universities are paid to do. Let us use reliable data to try to discern the truth. In the last 20 years, Oxford has won no Nobel Prizes. (Nor has Warwick.) Cambridge has done only slightly better. Stanford University in the United States, purportedly number 19 in the world, garnered three times as many Nobel Prizes over the past two decades as the universities of Oxford and Cambridge did combined.

The most recent criticism of the old system came from Fred L. Bookstein, Horst Seidler, Martin Fieder and Georg Winckler, who faulted the unreliability of QS's methods in the journal Scientometrics:

Several individual indicators from the Times Higher Education Survey (THES) data base – the overall score, the reported staff-to-student ratio, and the peer ratings – demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the “top 200” universities would be apparent purely for reason of this obvious statistical instability, regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.[26]

QS Asian University Rankings

In 2009, Quacquarelli Symonds (QS) launched the QS Asian University Rankings in partnership with The Chosun Ilbo newspaper in Korea. They rank the top 200 Asian universities and have now appeared three times. The University of Hong Kong was top in 2010 and The Hong Kong University of Science & Technology was top in 2011. These rankings use some of the same criteria as the World University Rankings, as well as other measures such as incoming and outgoing exchange students.

QS Latin American University Rankings

The QS Latin American University Rankings were launched in October 2011. They use academic opinion (30 per cent), employer opinion (20 per cent), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures. These criteria were developed in consultation with experts in Latin America, and the web visibility data come from Webometrics. This ranking showed that the University of São Paulo in Brazil is the region's top institution.

QS World University Rankings by Subject

In 2011, QS began ranking universities around the world by subject. The rankings are based on citations, academic peer review and recruiter review, with the weighting of each dependent upon the culture and practice of the subject concerned. They are published in five clusters: engineering; biomedicine; the natural sciences; the social sciences; and the arts and humanities.

References

  1. ^ Princeton University Press, 2010
  2. ^ Mroz, Ann. "Leader: Only the best for the best". Times Higher Education. http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=408968&c=1. Retrieved 2010-09-16. 
  3. ^ Baty, Phil (2010-09-10). "Views: Ranking Confession". Inside Higher Ed. http://www.insidehighered.com/views/2010/03/15/baty. Retrieved 2010-09-16. 
  4. ^ Labi, Aisha (2010-09-15). "Times Higher Education Releases New Rankings, but Will They Appease Skeptics?". The Chronicle of Higher Education (London, UK). http://chronicle.com/article/Times-Higher-Education/124455/?sid=at&utm_source=at&utm_medium=en. Retrieved 2010-09-16. 
  5. ^ http://rankingwatch.blogspot.com/2007/11/another-kenan-flagler-case-of.html
  6. ^ "QS World University Rankings Advisory Board". Top Universities. 2009-11-12. http://www.topuniversities.com/university-rankings/world-university-rankings/advisory-board. Retrieved 2010-09-16. 
  7. ^ "THE-QS World University Rankings 2011". http://www.topuniversities.com/university-rankings/world-university-rankings/2011. 
  8. ^ "THE-QS World University Rankings 2010". http://www.topuniversities.com/university-rankings/world-university-rankings/2010/results. 
  9. ^ "THE-QS World University Rankings 2009". http://www.topuniversities.com/university-rankings/world-university-rankings/2009/results. 
  10. ^ "THE-QS World University Rankings 2008". http://www.topuniversities.com/university-rankings/world-university-rankings/2008/results. 
  11. ^ "THE-QS World University Rankings 2007". http://www.topuniversities.com/university-rankings/world-university-rankings/2007/results. 
  12. ^ "THE-QS World University Rankings 2006". http://www.topuniversities.com/university-rankings/world-university-rankings/2006/results. 
  13. ^ "THE-QS World University Rankings 2005". http://www.topuniversities.com/university-rankings/world-university-rankings/2005/results. 
  14. ^ "THE-QS World University Rankings 2004". http://www.topuniversities.com/university-rankings. 
  15. ^ Flying high internationally[dead link]
  16. ^ http://www.martinince.eu
  17. ^ Holmes, Richard (2006-09-05). "So That's how They Did It". Rankingwatch.blogspot.com. http://rankingwatch.blogspot.com/2006/09/so-thats-how-they-did-it-for-some-time.html. Retrieved 2010-09-16. 
  18. ^ Response to Review of Strategic Plan by Peter Wills
  19. ^ "University Ranking Watch". Rankingwatch.blogspot.com. 2007-11-11. http://rankingwatch.blogspot.com/2007/11/another-kenan-flagler-case-of.html. Retrieved 2010-09-16. 
  20. ^ "Rankings: Marketing Mana or Menace? by Simon Marginson" (PDF). http://www.cshe.unimelb.edu.au/people/staff_pages/Marginson/KeyMar8-10Aug07.pdf. Retrieved 2010-09-16. 
  21. ^ Marginson, Simon (2006-12-06). "Rankings Ripe for Misleading". Theaustralian.news.com.au. http://www.theaustralian.news.com.au/story/0,20867,20877456-12332,00.html. Retrieved 2010-09-16. 
  22. ^ Sowter, Ben (1 November 2007). "THES – QS World University Rankings 2007 – Basic explanation of key enhancements in methodology for 2007".
  23. ^ "1741-7015-5-30.fm" (PDF). http://www.biomedcentral.com/content/pdf/1741-7015-5-30.pdf. Retrieved 2010-09-16. 
  24. ^ "Social sciences lose 1". Timeshighereducation.co.uk. 2007-11-16. http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=311132. Retrieved 2010-09-16. 
  25. ^ "There's nothing Nobel in deceiving ourselves by Andrew Oswald, The Independent on Sunday". News.independent.co.uk. 2007-12-13. http://news.independent.co.uk/education/higher/article3245492.ece. Retrieved 2010-09-16. 
  26. ^ "Scientometrics, Volume 85, Number 1". SpringerLink. http://www.springerlink.com/content/d02771828173244w/. Retrieved 2010-09-16. 
