QS World University Rankings

Editor: Danny Byrne
Categories: Higher education
Frequency: Annual
Publisher: QS Quacquarelli Symonds Limited
Country: United Kingdom
Language: English
Website: QS World University Rankings

The QS World University Rankings are annual university rankings published by Quacquarelli Symonds (QS), comprising an overall ranking as well as rankings for individual subjects. QS also publishes additional regional rankings, the QS Asian University Rankings, the QS Latin American University Rankings, and the QS BRICS University Rankings, all of which are independent of, and different from, the major world rankings because of differences in the criteria and weightings used to generate them.[1]

The publisher originally released its rankings in partnership with Times Higher Education from 2004 to 2009 as the Times Higher Education-QS World University Rankings, but the two ended their collaboration in 2010. QS retained the pre-existing methodology and continued to publish under it alone, while Times Higher Education created a new methodology with Thomson Reuters, published as the Times Higher Education World University Rankings.

The QS World University Rankings are regarded as one of the three most influential and widely observed international university rankings, along with the Times Higher Education World University Rankings and the Academic Ranking of World Universities.[2][3][4]

History

The need for an international ranking of universities was highlighted in December 2003 in Richard Lambert’s review of university-industry collaboration in Britain[5] for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations were world university rankings, which Lambert said would help the UK to gauge the global standing of its universities.

The idea for the rankings was credited in Ben Wildavsky's book, The Great Brain Race: How Global Universities are Reshaping the World,[6] to then-editor of Times Higher Education (THE), John O'Leary. THE chose to partner with educational and careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin Ince,[7] formerly deputy editor and later a contractor to THE, to manage the project.

Between 2004 and 2009, QS produced the rankings in partnership with THE. In 2009, THE announced it would produce its own rankings, the Times Higher Education World University Rankings, in partnership with Thomson Reuters. THE cited a weakness in the methodology of the original rankings,[8] as well as a perceived favoritism toward science over the humanities in the existing methodology,[9] as key reasons for the decision to split with QS.

QS retained the intellectual property in the Rankings and the methodology used to compile them[citation needed] and continues to produce the rankings, now called the QS World University Rankings.[10] THE created a new methodology with Thomson Reuters, published as the Times Higher Education World University Rankings in September 2010.

Methodology of the major rankings

QS publishes the ranking results in key media around the world, including US News & World Report in the United States and the Chosun Ilbo in South Korea. The first rankings produced by QS independently of THE, using the original methodology unchanged, were released on September 8, 2010, with the second edition appearing on September 6, 2011.

QS designed its rankings to assess a broad range of university activity.

Academic peer review (40%)

The most controversial part of the QS World University Rankings is the use of an opinion survey referred to as the Academic Peer Review. Drawing on a combination of purchased mailing lists, applications, and suggestions, this survey asks active academics across the world to identify the leading universities in the fields they know best. QS has published the job titles and geographical distribution of the participants.

The 2011 rankings made use of responses from 33,744 people from over 140 nations in its Academic Peer Review, including votes from the previous two years rolled forward provided there was no more recent information available from the same individual. Participants can nominate up to 30 universities but are not able to vote for their own. They tend to nominate a median of about 20, which means that this survey includes over 500,000 data points.[11]

In 2004, when the rankings first appeared, academic peer review accounted for half of a university's possible score. In 2005, its share was cut to 40 per cent because of the introduction of the Recruiter Review.

Faculty/student ratio (20%)

This indicator accounts for 20 per cent of a university’s possible score in the rankings. It is a classic measure used in various ranking systems as a surrogate for teaching commitment, but QS has admitted that it is less than satisfactory.[12]

Citations per faculty (20%)

Citations of published research are among the most widely used inputs to national and global university rankings. The QS World University Rankings used citations data from Thomson (now Thomson Reuters) from 2004 to 2007, and has since used data from Scopus, part of Elsevier. The total number of citations over a five-year period is divided by the number of academic staff in a university to yield the score for this measure, which accounts for 20 per cent of a university's possible score in the Rankings.
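
As a rough illustration of the arithmetic just described, here is a minimal sketch; the function name and figures are hypothetical, not QS data:

```python
def citations_per_faculty(total_citations_5yr: float, faculty_count: int) -> float:
    # Raw input for the 20% citations indicator: total citations over
    # a five-year window divided by the number of academic staff.
    return total_citations_5yr / faculty_count

# Hypothetical figures: 150,000 citations over five years, 2,500 academic staff
print(citations_per_faculty(150_000, 2_500))  # 60.0 citations per faculty member
```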

QS has explained that it uses this approach, rather than the citations-per-paper measure preferred in other systems, because it reduces the effect of biomedical science on the overall picture – biomedicine has a ferocious “publish or perish” culture. Instead, QS attempts to measure the density of research-active staff at each institution. But issues still remain about the use of citations in ranking systems, especially the fact that the arts and humanities generate comparatively few citations.[13]

QS has conceded the presence of some data collection errors regarding citations per faculty in previous years' rankings.[14]

One interesting issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems capture more or less the same publications and citations. For less mainstream institutions, Scopus includes more non-English-language and smaller-circulation journals in its database. Because papers in those journals are less heavily cited, this can also mean fewer citations per paper for the universities that publish in them.[13] This area has been criticized for disadvantaging universities that do not use English as their primary language,[15] since publications in languages other than English are harder to find and are cited less often, English being the dominant international language of scholarship.

Recruiter review (10%)

This part of the ranking is obtained by a similar method to the Academic Peer Review, except that it samples recruiters who hire graduates on a global or significant national scale. The numbers are smaller – 16,875 responses from over 130 countries in the 2011 Rankings – and are used to produce 10 per cent of any university’s possible score. This survey was introduced in 2005 in the belief that employers track graduate quality, making this a barometer of teaching quality, a famously problematic thing to measure. University standing here is of especial interest to potential students.[16]

International orientation (10%)

The final ten per cent of a university's possible score is derived from measures intended to capture its internationalism: five per cent from the percentage of international students, and another five per cent from the percentage of international staff. This is of interest partly because it shows whether a university is putting effort into being global, but also because it tells us whether it is taken seriously enough by students and academics around the world for them to want to be there.[17]
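
Taken together, the five indicators described above combine into a single overall score. The sketch below illustrates the weighting scheme under the assumption that each indicator score has already been normalized to a 0-100 scale; all names and figures are illustrative, not QS data:

```python
# Indicator weights as described in this section (they sum to 1.0)
WEIGHTS = {
    "academic_peer_review": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "recruiter_review": 0.10,
    "international_orientation": 0.10,  # 5% students + 5% staff
}

def overall_score(indicator_scores: dict[str, float]) -> float:
    # Weighted sum of indicator scores, each assumed normalized to 0-100.
    return sum(WEIGHTS[name] * score for name, score in indicator_scores.items())

# A hypothetical university's normalized indicator scores
print(overall_score({
    "academic_peer_review": 92.0,
    "faculty_student_ratio": 75.0,
    "citations_per_faculty": 60.0,
    "recruiter_review": 88.0,
    "international_orientation": 70.0,
}))  # 79.6
```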

Data sources

The information used to compile the World University Ranking comes partly from the online surveys carried out by QS, partly from Scopus, and partly from an annual information-gathering exercise carried out by QS itself. QS collects data from universities directly, from their web sites and publications, and from national bodies such as education ministries and the National Center for Education Statistics in the US and the Higher Education Statistics Agency in the UK.

Aggregation

The data are aggregated according to their Z scores, an indicator of how far any institution's value lies from the average. Between 2004 and 2007 a different system was used whereby the top university on any measure was scaled as 100 and the others received a score reflecting their comparative performance. According to QS, this method was dropped because it gives too much weight to exceptional outliers, such as the very high faculty/student ratio of the California Institute of Technology. In 2006, the last year before the Z-score system was introduced, Caltech topped the citations per faculty measure, receiving 100 on this indicator because of its highly research- and science-oriented approach. The next two institutions on this measure, Harvard and Stanford, each scored 55. In other words, 45 per cent of the possible difference between all the world's universities lay between the top university and the next one (in fact two) on the list, leaving every other university on Earth to fight over the remaining 55 per cent.

Likewise in 2005, Harvard was the top university and MIT was second with 86.9, so that 13 per cent of the total difference between all the world's universities lay between first and second place. In 2011, the University of Cambridge was top and the second institution, Harvard, scored 99.34. The Z-score system thus allows the full range of available difference to be used in a more informative way.
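
The contrast between the two normalization methods can be sketched as follows; the raw values are invented to echo the Caltech outlier example above, not actual QS inputs:

```python
import statistics

def scale_to_top(values: list[float]) -> list[float]:
    # Pre-2007 method: the top performer is scaled to 100 and every
    # other institution receives a proportional share.
    top = max(values)
    return [round(100 * v / top, 1) for v in values]

def z_scores(values: list[float]) -> list[float]:
    # Post-2007 method: each value is expressed as its distance from
    # the mean in standard deviations.
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [round((v - mean) / sd, 2) for v in values]

# Hypothetical citations-per-faculty values with one extreme outlier
raw = [300, 165, 160, 120, 100]
print(scale_to_top(raw))  # [100.0, 55.0, 53.3, 40.0, 33.3] - outlier crushes the rest
print(z_scores(raw))      # [1.68, -0.05, -0.12, -0.63, -0.88] - differences stay visible
```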

Classifications

In 2009, a column of classifications was introduced to provide additional context to the rankings tables. Universities are classified by size, defined by the size of the student body; comprehensive or specialist status, defined by the range of faculty areas in which programs are offered; and research activity, defined by the number of papers published in a five-year period.

Fees

In 2011, QS began publishing average fees data for the universities it ranks. These, however, are not used as an indicator in the rankings.

QS publishes domestic and international fees for undergraduate and postgraduate study.

Faculty-level analysis

QS publishes a simple analysis of the top 400 institutions in each of five broad faculty-level areas: the natural sciences, technology, biology and medicine, the social sciences, and the arts and humanities. These five tables list universities in order of their Academic Peer Review score. They also give the citations per paper for each institution, but the two data sets are not aggregated.

QS uses citations per paper rather than per person partly because it does not hold details of the academic staff in each subject area, and partly because the number of citations per paper should be a consistent indicator of impact within a specific field with a defined publishing culture.

Major rankings

Overall rankings

QS publishes a ranking of just over 700 of the top universities around the world, the first 50 of which are listed below (according to the latest result):

QS World University Rankings — Top 50
2013/14[18]  2012/13[19]  2011/12[20]  2010/11[21]  Institution  Region
1  1  3  5  Massachusetts Institute of Technology  United States
2  3  2  2  Harvard University  United States
3  2  1  1  University of Cambridge  United Kingdom
4  4  7  4  University College London  United Kingdom
5  6  6  7  Imperial College London  United Kingdom
6  5  5  6  University of Oxford  United Kingdom
7  15  11  13  Stanford University  United States
8  7  4  3  Yale University  United States
9  8  8  8  University of Chicago  United States
10  10  12  9  California Institute of Technology  United States
10  9  13  10  Princeton University  United States
12  13  18  18  Swiss Federal Institute of Technology in Zurich (ETH Zurich)  Switzerland
13  12  9  12  University of Pennsylvania  United States
14  11  10  11  Columbia University  United States
15  14  15  16  Cornell University  United States
16  16  16  17  Johns Hopkins University  United States
17  21  20  22  University of Edinburgh  United Kingdom
17  19  23  29  University of Toronto  Canada
19  29  35  32  Swiss Federal Institute of Technology in Lausanne (EPFL)  Switzerland
19  26  27  21  King's College London  United Kingdom
21  18  17  19  McGill University  Canada
22  17  14  15  University of Michigan  United States
23  20  19  14  Duke University  United States
24  25  28  31  National University of Singapore  Singapore
25  22  21  28  University of California, Berkeley  United States
26  23  22  23  The University of Hong Kong  Hong Kong
27  24  26  20  Australian National University  Australia
28  34  33  33  Ecole Normale Supérieure  France
29  27  24  26  Northwestern University  United States
30  28  30  27  University of Bristol  United Kingdom
31  36  31  38  The University of Melbourne  Australia
32  30  25  24  The University of Tokyo  Japan
33  32  29  30  The University of Manchester  United Kingdom
34  33  40  40  The Hong Kong University of Science and Technology  Hong Kong
35  35  32  25  Kyoto University  Japan
35  37  42  50  Seoul National University  South Korea
37  38  41  48  University of Wisconsin-Madison  United States
38  39  38  37  The University of Sydney  Australia
39  40  37  42  The Chinese University of Hong Kong  Hong Kong
40  31  34  35  University of California, Los Angeles  United States
41  41  36  36  Ecole Polytechnique  France
41  47  58  74  Nanyang Technological University  Singapore
43  46  48  43  The University of Queensland  Australia
44  43  44  41  New York University  United States
45  51  52  45  University of Copenhagen  Denmark
46  44  46  47  Peking University  China
47  42  39  39  Brown University  United States
48  48  47  54  Tsinghua University  China
49  45  51  44  University of British Columbia  Canada
50  55  53  51  Ruprecht-Karls-Universität Heidelberg  Germany
  • For the full list, see the official website.
  • For the rankings before 2010, see the articles about results of the THE-QS World University Rankings:
THE–QS World University Rankings, 2004
THE–QS World University Rankings, 2005
THE–QS World University Rankings, 2006
THE–QS World University Rankings, 2007
THE–QS World University Rankings, 2008
THE–QS World University Rankings, 2009

Rankings by subject

QS also ranks universities by subject, with 30 such rankings appearing in 2013. These rankings are drawn up on the basis of academic opinion, recruiter opinion and citations. The individual disciplines are sorted into five broad categories: Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management; each subject ranking shows the top 200 universities in that particular field.

QS University Subject Rankings' categories[22]
  • Arts & Humanities: Philosophy; Modern Languages; Geography; History; Linguistics; English Language & Literature
  • Engineering & Technology: Computer Science & Information Systems; Chemical Engineering; Civil & Structural Engineering; Electrical & Electronic Engineering; Mechanical, Aeronautical & Manufacturing Engineering
  • Life Sciences & Medicine: Medicine; Biological Sciences; Psychology; Pharmacy & Pharmacology; Agriculture & Forestry
  • Natural Sciences: Physics & Astronomy; Mathematics; Environmental Sciences; Earth & Marine Sciences; Chemistry; Materials Sciences
  • Social Sciences & Management: Statistics & Operational Research; Sociology; Politics & International Studies; Law; Economics & Econometrics; Accounting & Finance; Communication & Media Studies; Education

QS Top 50 under 50

QS releases the QS Top 50 under 50 annually to rank universities which have been established for no more than 50 years. This league table is based on their positions in the QS World University Rankings of the previous year.[23]

Regional rankings

QS Asian University Rankings

In 2009, QS launched the QS Asian University Rankings (also called QS University Rankings: Asia) in partnership with the Chosun Ilbo newspaper in South Korea to rank universities in Asia separately from the world rankings.

These rankings use some of the same criteria as the world rankings, but with changed weightings and new criteria, one addition being incoming and outgoing exchange students. Accordingly, the QS World University Rankings and the QS Asian University Rankings released in the same academic year are different and independent of each other.[1] For example, the University of Hong Kong, ranked 22nd and 23rd worldwide in 2011 and 2012 respectively, was regarded as the best tertiary institution in Asia by the QS World University Rankings,[19][20] while the Hong Kong University of Science and Technology topped the QS Asian University Rankings in both 2011 and 2012.[24][25]

The QS Asian University Rankings rank the top few hundred tertiary institutions in Asia, the first 50 of which are as follows:

QS Asian University Rankings — Top 50
2013[26]  2012[24]  2011[25]  2010[27]  2009[28]  Institution  Region
1  1  1  2  4  The Hong Kong University of Science and Technology  Hong Kong
2  3  2  1  1  The University of Hong Kong  Hong Kong
2  2  3  3  10  National University of Singapore  Singapore
4  4  6  6  8  Seoul National University  South Korea
5  6  13  12  10  Peking University  China
6  7  11  13  7  Korea Advanced Institute of Science and Technology  South Korea
7  5  5  4  2  The Chinese University of Hong Kong  Hong Kong
7  9  12  14  17  Pohang University of Science and Technology  South Korea
9  8  4  5  3  The University of Tokyo  Japan
10  10  7  8  5  Kyoto University  Japan
10  17  17  18  14  Nanyang Technological University  Singapore
12  12  15  15  18  City University of Hong Kong  Hong Kong
13  13  9  11  9  Tokyo Institute of Technology  Japan
14  15  16  16  15  Tsinghua University  China
15  11  8  7  6  Osaka University  Japan
16  16  18  19  25  Yonsei University  South Korea
17  14  9  9  13  Tohoku University  Japan
18  18  14  10  12  Nagoya University  Japan
19  21  26  29  33  Korea University  South Korea
20  22  18  17  15  Kyushu University  Japan
21  24  27  43  44  Sungkyunkwan University  South Korea
22  20  21  21  22  National Taiwan University  Taiwan
23  19  21  24  26  Fudan University  China
24  23  20  22  20  Hokkaido University  Japan
25  26  30  30  38  The Hong Kong Polytechnic University  Hong Kong
26  27  24  25  24  University of Science and Technology of China  China
27  29  33  34  29  Shanghai Jiao Tong University  China
28  28  27  32  32  Zhejiang University  China
29  28  29  27  27  Nanjing University  China
30  49  52  71  74  National Chiao Tung University  Taiwan
31  31  31  34  40  National Tsing Hua University  Taiwan
32  30  24  23  20  Keio University  Japan
33  35  39  42  39  Universiti Malaya  Malaysia
34  32  23  20  19  University of Tsukuba  Japan
35  41  42  62  57  Kyung Hee University  South Korea
36  33  44  49  46  Hanyang University  South Korea
37  37  32  31  43  National Cheng Kung University  Taiwan
38  36  37  39  36  Indian Institute of Technology Delhi  India
39  34  38  36  30  Indian Institute of Technology Bombay  India
40  40  45  48  42  Ewha Womans University  South Korea
41  39  35  26  23  Kobe University  Japan
42  38  34  28  30  Mahidol University  Thailand
43  48  49  45  73  Hong Kong Baptist University  Hong Kong
44  42  46  39  37  Waseda University  Japan
45  50  40  41  47  National Yang Ming University  Taiwan
46  45  64  89  110  Beijing Normal University  China
47  44  41  38  28  Hiroshima University  Japan
48  43  47  44  35  Chulalongkorn University  Thailand
49  45  43  53  49  Indian Institute of Technology Madras  India
50  64  89  n/a  n/a  Taipei Medical University  Taiwan
50  60  80  n/a  49  University of Santo Tomas  Philippines

QS Latin American University Rankings

The QS Latin American University Rankings or QS University Rankings: Latin America[29] were launched in 2011. They use academic opinion (30 per cent), employer opinion (20 per cent), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures. These criteria were developed in consultation with experts in Latin America, and the web visibility data come from Webometrics. The ranking has placed the University of São Paulo in Brazil as the region's top institution; it was first in both the first and second editions, in 2011 and 2012 respectively.

QS University Rankings: BRICS

QS collaborated with a Russian news agency to launch its third regional ranking, covering the BRICS countries (Brazil, Russia, India, China and South Africa), known as the QS University Rankings: BRICS. This ranking adopts eight indicators, derived from those of the world rankings but with different weightings, to select the top 100 higher learning institutions in these countries. The BRICS ranking takes only mainland China's universities into account, excluding those elsewhere in Greater China, such as Hong Kong and Taiwan.[30]

QS University Rankings: BRICS — Top 50[30]
2013  Institution  Region
1 Tsinghua University  China
2 Peking University  China
3 Lomonosov Moscow State University  Russia
4 Fudan University  China
5 Nanjing University  China
6 University of Science and Technology of China  China
6 Shanghai Jiao Tong University  China
8 Universidade de São Paulo  Brazil
9 Zhejiang University  China
10 Universidade Estadual de Campinas  Brazil
11 University of Cape Town  South Africa
12 Beijing Normal University  China
13 Indian Institute of Technology Delhi  India
14 Saint-Petersburg State University  Russia
15 Indian Institute of Technology Bombay  India
16 Indian Institute of Technology Madras  India
17 Indian Institute of Technology Kanpur  India
18 Indian Institute of Technology Kharagpur  India
19 Universidade Federal do Rio de Janeiro  Brazil
20 Sun Yat-sen University  China
21 Xi'an Jiaotong University  China
22 Novosibirsk State University  Russia
23 Harbin Institute of Technology  China
23 Nankai University  China
25 Universidade Estadual Paulista "Júlio de Mesquita Filho"  Brazil
26 Wuhan University  China
27 Tongji University  China
28 Shanghai University  China
29 Universidade Federal de São Paulo  Brazil
30 Stellenbosch University  South Africa
31 University of The Witwatersrand  South Africa
32 Beihang University  China
33 Bauman Moscow State Technical University  Russia
34 Indian Institute of Technology Roorkee  India
35 Universidade Federal de Minas Gerais  Brazil
36 Pontificia Universidade Católica de São Paulo  Brazil
37 Moscow State Institute of International Relations  Russia
38 Universidade Federal do Rio Grande Do Sul  Brazil
39 Beijing Institute of Technology  China
40 Xiamen University  China
41 Pontificia Universidade Católica do Rio de Janeiro  Brazil
42 Renmin University of China  China
43 University of Pretoria  South Africa
43 Tianjin University  China
43 Beijing Jiaotong University  China
43 Universidade Federal de São Carlos  Brazil
47 Saint Petersburg State Polytechnical University  Russia
48 Universidade de Brasilia  Brazil
48 Huazhong University of Science and Technology  China
50 National Research University – Higher School of Economics  Russia

QS Stars

QS also offers universities a way of seeing their own strengths and weaknesses in depth. Called QS Stars, this service is separate from the QS World University Rankings. It involves a detailed look at a range of functions which mark out a modern university. Universities can get from one star to five, or Five Star Plus for the truly exceptional.

QS Stars ratings are derived from scores on eight criteria:
  • Research quality
  • Teaching quality
  • Graduate employability
  • University infrastructure
  • Internationalisation
  • Innovation and knowledge transfer
  • Third mission activity, measuring areas of social and civic engagement
  • Special criteria for specific subjects

Stars is an evaluation system, not a ranking. About 100 institutions had opted for the Stars evaluation as of early 2013. In 2012, fees to participate in this program were $9,850 for the initial audit, plus an annual license fee of $6,850.[31]

Commentary

Several universities in the UK and the Asia-Pacific region have commented on the rankings positively. Vice-Chancellor of New Zealand's Massey University, Professor Judith Kinnear, called the Times Higher Education-QS ranking a "wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability." She said the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[32] In September 2012 the British newspaper The Independent described the QS World University Rankings as "widely recognised throughout higher education as the most trusted international tables".[33]

Martin Ince,[34] chair of the Advisory Board for the Rankings, points out that their volatility has been reduced since 2007 by the introduction of the Z-score calculation method, and that over time the quality of QS's data gathering has improved, reducing anomalies. In addition, the academic and employer reviews are now so large that even modestly ranked universities receive a statistically valid number of votes. QS has published extensive data[35] on who the respondents are, where they are, and the subjects and industries to which the academics and employers respectively belong.

General criticisms

Much of the criticism concerns the use, or potential misuse, of survey data.

Since the split from Times Higher Education, further concerns about the methodology QS uses for its rankings have been raised by several experts. Simon Marginson, professor of higher education at the University of Melbourne and a member of the THE editorial board, wrote in the article "Improving Latin American universities' global ranking" for University World News on 10 June 2012: "I will not discuss the QS ranking because the methodology is not sufficiently robust to provide data valid as social science."[36]

In an article for the New Statesman entitled "The QS World University Rankings are a load of old baloney", David Blanchflower, a leading labour economist, said: "This ranking is complete rubbish and nobody should place any credence in it. The results are based on an entirely flawed methodology that underweights the quality of research and overweights fluff... The QS is a flawed index and should be ignored."[37]

In an article titled "The Globalisation of College and University Rankings", appearing in the January/February 2012 issue of Change magazine, Philip Altbach, professor of higher education at Boston College and also a member of the THE editorial board, said: "The QS World University Rankings are the most problematical. From the beginning, the QS has relied on reputational indicators for half of its analysis … it probably accounts for the significant variability in the QS rankings over the years. In addition, QS queries employers, introducing even more variability and unreliability into the mix. Whether the QS rankings should be taken seriously by the higher education community is questionable."[38]

The QS World University Rankings have been criticised by many for placing too much emphasis on peer review, which receives 40 percent of the overall score. Some people have expressed concern about the manner in which the peer review has been carried out.[39] In a report,[40] Peter Wills from the University of Auckland, New Zealand wrote of the Times Higher Education-QS World University Rankings:

But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.

QS points out that no survey participant, academic or employer, has been offered a financial incentive to respond, and that academics cannot vote for their own institution.

Although THES-QS introduced several changes in methodology in 2007 aimed at addressing these criticisms,[41] the ranking has continued to attract criticism. In an article[42] in the peer-reviewed journal BMC Medicine, authored by several scientists from the US and Greece, it was pointed out:

If properly performed, most scientists would consider peer review to have very good construct validity; many may even consider it the gold standard for appraising excellence. However, even peers need some standardized input data to peer review. The Times simply asks each expert to list the 30 universities they regard as top institutions of their area without offering input data on any performance indicators. Research products may occasionally be more visible to outsiders, but it is unlikely that any expert possesses a global view of the inner workings of teaching at institutions worldwide. Moreover, the expert selection process of The Times is entirely unclear. The survey response rate among the selected experts was only <1% in 2006 (1,600 of 190,000 contacted). In the absence of any guarantee for protection from selection biases, measurement validity can be very problematic.

Alex Usher, vice president of Higher Education Strategy Associates in Canada, commented:

Most people in the rankings business think that the main problem with The Times is the opaque way it constructs its sample for its reputational rankings - a not-unimportant question given that reputation makes up 50% of the sample. Moreover, this year's switch from using raw reputation scores to using normalized Z-scores has really shaken things up at the top-end of the rankings by reducing the advantage held by really top universities - University of British Columbia (UBC) for instance, is now functionally equivalent to Harvard in the Peer Review score, which, no disrespect to UBC, is ludicrous. I'll be honest and say that at the moment the THES Rankings are an inferior product to the Shanghai Jiao Tong’s Academic Ranking of World Universities.

Academics have also been critical of the use of the citation database, arguing that it undervalues institutions which excel in the social sciences. Ian Diamond, former chief executive of the Economic and Social Research Council, now vice-chancellor of the University of Aberdeen and a member of the THE editorial board, wrote to Times Higher Education in 2007, saying:[43]

The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.

The most recent criticism of the old system came from Fred L. Bookstein, Horst Seidler, Martin Fieder and Georg Winckler, who wrote in the journal Scientometrics about the unreliability of QS's methods:

Several individual indicators from the Times Higher Education Survey (THES) data base – the overall score, the reported staff-to-student ratio, and the peer ratings – demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the “top 200” universities would be apparent purely for reason of this obvious statistical instability regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.[44]

Subject rankings reliability

The QS subject rankings have been dismissed as unreliable by some critics, including most notably Brian Leiter, who points out that programmes which are known to be high quality, and which rank highly in the Blackwell rankings (e.g., the University of Pittsburgh) fare poorly in the QS ranking for reasons that are not at all clear.[45][46]

In other areas, QS has highly ranked programmes which do not exist,[47] as in Geography, where 5 of the top 10 did not actually have graduate programmes in geography. In Linguistics, the QS rankings are entirely out of step with the most recent NRC rankings; the NRC ranks the doctoral programmes of the University of Massachusetts Amherst and the University of Maryland at College Park among the very best in the U.S.A. (tied for #3 in S-Rank), while QS ranks them 29th and 49th in the world, respectively.[48]

Notes and references

  1. "Asian University Rankings - QS Asian University Rankings vs. QS World University Rankings™". "The methodology differs somewhat from that used for the QS World University Rankings..."
  2. Ariel Zirulnick. "New world university ranking puts Harvard back on top". The Christian Science Monitor.
  3. "We're fighting above our weight when it comes to uni rankings". The Australian.
  4. Indira Samarasekera and Carl Amrhein. "Top schools don't always get top marks". The Edmonton Journal. Archived from the original on 2010-10-03.
  5. Lambert Review of Business-University Collaboration.
  6. Princeton University Press, 2010.
  7. Martin Ince Communications.
  8. Mroz, Ann. "Leader: Only the best for the best". Times Higher Education. Retrieved 2010-09-16.
  9. Baty, Phil (2010-09-10). "Views: Ranking Confession". Inside Higher Ed. Retrieved 2010-09-16.
  10. Labi, Aisha (2010-09-15). "Times Higher Education Releases New Rankings, but Will They Appease Skeptics?". The Chronicle of Higher Education (London, UK). Retrieved 2010-09-16.
  11. "2011 Academic Survey Responses". Retrieved 12 September 2013.
  12. QS Intelligence Unit | Faculty Student Ratio. Iu.qs.com. Retrieved 2013-08-12.
  13. QS Intelligence Unit | Citations per Faculty. Iu.qs.com. Retrieved 2013-08-12.
  14. University Ranking Watch.
  15. "Global university rankings and their impact". European University Association. Retrieved 3 September 2012.
  16. QS Intelligence Unit | Employer Reputation. Iu.qs.com. Retrieved 2013-08-12.
  17. QS Intelligence Unit | International Indicators. Iu.qs.com. Retrieved 2013-08-12.
  18. "QS World University Rankings (2013)".
  19. "QS World University Rankings (2012)".
  20. "QS World University Rankings (2011)".
  21. "QS World University Rankings (2010)".
  22. "QS University Subject Rankings".
  23. "QS Top 50 under 50". Quacquarelli Symonds. Retrieved 2013-07-07.
  24. "QS Asian University Rankings (2012)".
  25. "QS Asian University Rankings (2011)".
  26. "QS Asian University Rankings (2013)".
  27. "QS Asian University Rankings (2010)".
  28. "QS Asian University Rankings (2009)".
  29. QS Latin American University Rankings - 2011. Top Universities (2012-12-19). Retrieved 2013-08-12.
  30. "QS University Rankings: BRICS". Quacquarelli Symonds. 2013-12-17. Retrieved 2013-12-17.
  31. "Ratings at a Price for Smaller Universities". The New York Times. 30 December 2012. Retrieved 10 September 2013.
  32. "Flying high internationally".
  33. "Cambridge loses top spot to Massachusetts Institute of Technology". The Independent. 11 September 2012. Retrieved 11 September 2012.
  34. Martin Ince Communications Limited.
  35. QS World University Rankings | QS Intelligence Unit.
  36. "Improving Latin American universities' global ranking". University World News.
  37. "The QS World University Rankings are a load of old baloney". New Statesman.
  38. Change Magazine, January-February 2012.
  39. Holmes, Richard (2006-09-05). "So That's how They Did It". Rankingwatch.blogspot.com. Retrieved 2010-09-16.
  40. Response to Review of Strategic Plan by Peter Wills.
  41. Sowter, Ben (1 November 2007). "THES - QS World University Rankings 2007 - Basic explanation of key enhancements in methodology for 2007".
  42. "1741-7015-5-30.fm" (PDF). Retrieved 2010-09-16.
  43. "Social sciences lose 1". Timeshighereducation.co.uk. 2007-11-16. Retrieved 2010-09-16.
  44. "Scientometrics, Volume 85, Number 1". SpringerLink. Retrieved 2010-09-16.
  45. Leiter Reports: A Philosophy Blog: "Guardian and 'QS Rankings' Definitively Prove the Existence of the 'Halo Effect'". Leiterreports.typepad.com (2011-06-05). Retrieved 2013-08-12.
  46. Leiter Reports: A Philosophy Blog: "The QS Subject Rankings are Complete Garbage". Leiterreports.typepad.com (2012-07-30). Retrieved 2013-08-12.
  47. Sedghi, Ami (2011-06-03). "The world's top 100 universities ranked for arts and humanities disciplines". theguardian.com. Retrieved 2013-08-12.
  48. NRC Rankings Overview: Linguistics. The Chronicle of Higher Education. Chronicle.com (2010-09-30). Retrieved 2013-08-12.
