Times Higher Education-QS World University Rankings | Wikipedia audio article

August 22, 2019, by Stanley Isaacs


QS World University Rankings is an annual
publication of university rankings by Quacquarelli Symonds (QS). Previously known as the Times Higher Education–QS
World University Rankings, the publication was produced in collaboration with Times Higher Education (THE)
magazine from 2004 to 2009, before the two publishers began
to announce their own versions. QS then chose to continue using the pre-existing
methodology, while Times Higher Education adopted a new methodology to create its rankings. The QS system now comprises the global overall
and subject rankings (which name the world’s top universities for the study of 48 different
subjects and five composite faculty areas), alongside five independent regional tables
(Asia, Latin America, Emerging Europe and Central Asia, the Arab Region, and BRICS).

Being
the only international ranking to have received International Ranking Expert Group (IREG)
approval, the QS ranking is viewed as one of the three most widely read university rankings
in the world, along with Academic Ranking of World Universities and Times Higher Education
World University Rankings. However, it has been criticized for its overreliance
on subjective indicators and reputation surveys, which tend to fluctuate over the years. Concern also exists regarding the global consistency
and integrity of the data used to generate QS ranking results.

==History==
A perceived need for an international ranking of universities for UK purposes was highlighted
in December 2003 in Richard Lambert’s review of university-industry collaboration in Britain
for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations was the creation of world university
rankings, which Lambert said would help the UK to gauge the global standing of its universities. The idea for the rankings was credited in
Ben Wildavsky’s book, The Great Brain Race: How Global Universities are Reshaping the
World, to then-editor of Times Higher Education (THE), John O’Leary. THE chose to partner with educational and
careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin
Ince, formerly deputy editor and later a contractor to THE, to manage the project. Between 2004 and 2009, QS produced the rankings
in partnership with THE. In 2009, THE announced they would produce
their own rankings, the Times Higher Education World University Rankings, in partnership
with Thomson Reuters. THE cited an asserted weakness in the methodology
of the original rankings, as well as a perceived favoritism in the existing methodology for
science over the humanities, as two of the key reasons for the decision to split with
QS. QS retained intellectual property in the prior
rankings and the methodology used to compile them and continues to produce rankings based
on that methodology, which are now called the QS World University Rankings. THE created
a new methodology with Thomson Reuters, and published the first Times Higher Education
World University Rankings in September 2010.

==Global rankings==

===Overall===

====Methodology====
QS publishes the rankings results in the world’s media and has entered into partnerships with
a number of outlets, including The Guardian in the United Kingdom, and Chosun Ilbo in
Korea. The first rankings produced by QS independently
of THE, and using QS’s consistent and original methodology, were released on September 8,
2010, with the second appearing on September 6, 2011. QS designed its rankings in order to assess
performance according to what it believes to be key aspects of a university’s mission:
teaching, research, nurturing employability, and internationalisation.

Academic peer review
This is the most controversial part of the methodology. Using a combination of purchased mailing lists
and applications and suggestions, this survey asks active academics across the world
about the top universities in their specialist fields. QS has published the job titles and geographical
distribution of the participants. The 2017/18 rankings made use of responses from 75,015
people from over 140 nations for its Academic Reputation indicator, including votes from
the previous five years rolled forward provided there was no more recent information available
from the same individual. Participants can nominate up to 30 universities
but are not able to vote for their own. They tend to nominate a median of about 20,
which means that this survey includes over 500,000 data points. The average respondent possesses 20.4 years
of academic experience, while 81% of respondents have over a decade of experience in the academic
world. In 2004, when the rankings first appeared, academic peer review accounted for half of
a university’s possible score. In 2005, its share was cut to 40 per cent
because of the introduction of the Employer Reputation Survey.

Faculty student ratio
This indicator accounts for 20 per cent of a university’s possible score in the rankings. It is a classic measure used in various ranking
systems as a proxy for teaching commitment, but QS has admitted that it is less than satisfactory.Citations
per faculty Citations of published research are among
the most widely used inputs to national and global university rankings. The QS World University Rankings used citations
data from Thomson (now Thomson Reuters) from 2004 to 2007, and since then has used data
from Scopus, part of Elsevier. The total number of citations for a five-year
period is divided by the number of academics in a university to yield the score for this
measure, which accounts for 20 per cent of a university’s possible score in the Rankings. QS has explained that it uses this approach,
rather than the citations per paper preferred for other systems, because it reduces the
effect of biomedical science on the overall picture – bio-medicine has a ferocious “publish
or perish” culture. Instead QS attempts to measure the density
of research-active staff at each institution. But issues still remain about the use of citations
in ranking systems, especially the fact that the arts and humanities generate comparatively
few citations. However, since 2015, QS has made methodological enhancements designed
to remove the advantage institutions specializing in the Natural Sciences or Medicine previously
received. This enhancement is termed faculty area normalization,
and ensures that an institution’s citations count in each of QS’s five key Faculty Areas
is weighted to account for 20% of the final citations score. QS has conceded the presence
of some data collection errors regarding citations per faculty in previous years’ rankings. One
interesting issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems
capture more or less the same publications and citations. For less mainstream institutions, Scopus has
more non-English language and smaller-circulation journals in its database. But as the papers there are less heavily cited,
this can also mean fewer citations per paper for the universities that publish in them. This area has been criticized for undermining
universities which do not use English as their primary language. Citations of, and publications in, languages other
than English are harder to come across, and English, as the most internationalised
language, is also the most frequently cited.

Employer review
This part of the ranking is obtained by a similar method to the Academic Peer Review,
except that it samples recruiters who hire graduates on a global or significant national
scale. The numbers are smaller – 40,455 responses
from over 130 countries in the 2016 Rankings – and are used to produce 10 per cent of
any university’s possible score. This survey was introduced in 2005 in the
belief that employers track graduate quality, making this a barometer of teaching quality,
a famously problematic thing to measure. University standing here is of special interest
to potential students, and acknowledging this was the impetus behind the inaugural QS Graduate
Employability Rankings, published in November 2015.

International orientation
The final ten per cent of a university’s possible score is derived from measures intended to
capture their internationalism: five percent from their percentage of international students,
and another five percent from their percentage of international staff. This is of interest partly because it shows
whether a university is putting effort into being global, but also because it tells us
whether it is taken seriously enough by students and academics around the world for them to
want to be there.

====Reception====
In September 2015, both The Guardian and The Daily Mail referred to the QS World University
Rankings as “the most authoritative of their kind”. In 2016, Ben Sowter, Head of Research at the
QS Intelligence Unit, was ranked in 40th position in Wonkhe’s 2016 ‘Higher Education Power List’. The list enumerated what the organisation
believed to be the 50 most influential figures in UK higher education. Several universities
in the UK and the Asia-Pacific region have commented on the rankings positively. Vice-Chancellor of New Zealand’s Massey University,
Professor Judith Kinnear, says that the Times Higher Education-QS ranking is a “wonderful
external acknowledgement of several university attributes, including the quality of its research,
research training, teaching and employability.” She said the rankings are a true measure of
a university’s ability to fly high internationally: “The Times Higher Education ranking provides
a rather more sophisticated, robust and well rounded measure of international
and national ranking than either New Zealand’s Performance Based Research Fund (PBRF) measure
or the Shanghai rankings.” In September 2012 the British newspaper The
Independent described the QS World University Rankings as being “widely recognised throughout
higher education as the most trusted international tables”.Angel Calderon, Principal Advisor
for Planning and Research at RMIT University and member of the QS Advisory Board, spoke
positively of the QS University Rankings for Latin America, saying that the “QS Latin American
University Rankings has become the annual international benchmark universities use to
ascertain their relative standing in the region”. He further stated that the 2016/17 edition
of this ranking demonstrated improved stability.

====Criticisms====
Certain commentators have expressed concern about the use or misuse of survey data. However, QS’s Intelligence Unit, responsible
for compiling the rankings, states that the size of the sample used for its surveys
means that they are now “almost impossible to manipulate and very difficult for institutions
to ‘game’”. They also state that “over 62,000 academic
respondents contributed to our 2013 academic results, four times more than in 2010. Independent academic reviews have confirmed
these results to be more than 99% reliable”. Furthermore, since 2013, the number of respondents
to QS’s Academic Reputation Survey has increased again. Their survey now makes use of nearly 75,000
academic peer reviews, making it “to date, the world’s largest aggregation of feeling
in this [the global academic] community.” The QS World University Rankings have been
criticised by many for placing too much emphasis on peer review, which receives 40 percent
of the overall score. Some people have expressed concern about the
manner in which the peer review has been carried out. In a report, Peter Wills from the University
of Auckland wrote of the Times Higher Education-QS World University Rankings: But we note also
that this survey establishes its rankings by appealing to university staff, even offering
financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest
interest to rank their own institution more highly than others. This means the results of the survey and any
apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic
value in any case. We are vehemently opposed to the evaluation
of the University according to the outcome of such PR competitions. However, QS state that no survey participant,
academic or employer, is offered a financial incentive to respond, while no academic is
able to vote for their own institution. This renders this particular criticism invalid,
as it is based on two incorrect premises: (1) that academics are currently financially
incentivized to participate, and (2) that conflicts of interest are created by academics
being able to vote for their own institution. Academics have previously criticized the
use of the citation database, arguing that it undervalues institutions which excel in
the social sciences. Ian Diamond, former chief executive of the
Economic and Social Research Council and now vice-chancellor of the University of Aberdeen
and a member of the THE editorial board, wrote to Times Higher Education in 2007, saying:
The use of a citation database must have an impact because such databases do not have
as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School
of Economics, caused primarily by its citations score, is a result not of the output of an
outstanding institution but the database and the fact that the LSE does not have the counterweight
of a large natural science base. However, in 2015, QS’s introduction of faculty
area normalization ensured that QS’s rankings no longer conferred an undue advantage or
disadvantage upon any institution based on their particular subject specialisms. Correspondingly, the London School of Economics
rose from 71st in 2014 to 35th in 2015 and 37th in 2016.Since the split from Times Higher
Education in 2009, further concerns about the methodology QS uses for its rankings have
been brought up by several experts. In October 2010, criticism of the old system
came from Fred L. Bookstein, Horst Seidler, Martin Fieder and Georg Winckler in the journal
Scientometrics for the unreliability of QS’s methods: Several individual indicators from
the Times Higher Education Survey (THES) database (the overall score, the reported staff-to-student
ratio, and the peer ratings) demonstrate unacceptably high fluctuation from year to
year. The inappropriateness of the summary tabulations
for assessing the majority of the “top 200” universities would be apparent purely for
reason of this obvious statistical instability regardless of other grounds of criticism. There are far too many anomalies in the change
scores of the various indices for them to be of use in the course of university management. In an article for the New Statesman entitled
“The QS World University Rankings are a load of old baloney”, David Blanchflower, a leading
labour economist, said: “This ranking is complete rubbish and nobody should place any credence
in it. The results are based on an entirely flawed
methodology that underweights the quality of research and overweights fluff… The QS is a flawed index and should be ignored.” However, Martin Ince, chair of the Advisory
Board for the Rankings, points out that their volatility has been reduced since 2007 by
the introduction of the Z-score calculation method and that over time, the quality of
QS’s data gathering has improved to reduce anomalies. In addition, the academic and employer review
are now so big that even modestly ranked universities receive a statistically valid number of votes. QS has published extensive data on who the
respondents are, where they are, and the subjects and industries to which the academics and
employers respectively belong. The QS Subject Rankings have been dismissed
as unreliable by Brian Leiter, who points out that programmes which are known to be
high quality, and which rank highly in the Blackwell rankings (e.g., the University of
Pittsburgh) fare poorly in the QS ranking for reasons that are not at all clear. However, the University of Pittsburgh was
ranked in the number one position for Philosophy in the 2016 QS World University Rankings by
Subject, while Rutgers University – another university that Leiter argued was given a
strangely low ranking – was ranked number three in the world in the same ranking. An institution’s score for each of QS’s metrics
can be found on the relevant ranking page, allowing those wishing to examine why an institution
has finished in its final position to gain access to the scores that contributed to the
overall rank. In an article titled The Globalisation of College and University Rankings, appearing
in the January/February 2012 issue of Change magazine, Philip Altbach, professor of higher
education at Boston College and also a member of the THE editorial board, said: “The QS
World University Rankings are the most problematical. From the beginning, the QS has relied on reputational
indicators for half of its analysis … it probably accounts for the significant variability
in the QS rankings over the years. In addition, QS queries employers, introducing
even more variability and unreliability into the mix. Whether the QS rankings should be taken seriously
by the higher education community is questionable.” Simon Marginson, professor of higher education at
University of Melbourne and a member of the THE editorial board, in the article “Improving
Latin American universities’ global ranking” for University World News on 10 June 2012,
said: “I will not discuss the QS ranking because the methodology is not sufficiently robust
to provide data valid as social science”. QS’s Intelligence Unit counters these criticisms
by stating that “Independent academic reviews have confirmed these results to be more than
99% reliable”.

====Results====
The 2019 QS World University Rankings, published on June 6, 2018, was the fifteenth edition
of the overall ranking. It confirmed Massachusetts Institute of Technology
as the world’s highest-ranked university for a seventh successive year. In doing so, MIT broke the record for consecutive
number-one positions. For the rankings before 2010, see THE-QS World
University Rankings.

===Young Universities===
QS also releases the QS Top 50 under 50 Ranking annually to rank universities which have been
established for under 50 years. These institutions are judged based on their
positions on the overall table of the previous year. From 2015, QS’s “Top 50 Under 50” ranking
was expanded to include the world’s top 100 institutions under 50 years of age, while
in 2017 it was again expanded to include the world’s top 150 universities in this cohort. In 2017, the table was topped by Nanyang Technological
University of Singapore for the fourth consecutive year. The table is dominated by universities from
the Asia-Pacific region, with the top six places taken by Asian institutions.===Faculties and subjects===
QS also ranks universities by academic discipline, organized into five faculties, namely Arts & Humanities,
Engineering & Technology, Life Sciences & Medicine, Natural Sciences and Social Sciences & Management. The methodology is based on surveying expert
academics and global employers, and measuring research performance using data sourced from
Elsevier’s Scopus database. In the 2018 QS World University Rankings by
Subject the world’s best universities for the study of 48 different subjects are named. The two new subject tables added in the most
recent edition are: Classics & Ancient History and Library & Information Management. The world’s leading institution in 2018’s
tables in terms of most world-leading positions is Harvard University, which is number one
for 14 subjects. Its longtime rankings rival, Massachusetts
Institute of Technology, is number one for twelve subjects.

==Regional rankings and other tables==

===QS Graduate Employability Rankings===
In 2015, in an attempt to meet student demand for comparative data about the employment
prospects offered by prospective or current universities, QS launched the QS Graduate
Employability Rankings. The most recent instalment, released for the
2017/18 academic year, ranks 500 universities worldwide. It is led by Stanford University, and features
five universities from the United States in the top 10. The unique methodology consists of five indicators,
with three that do not feature in any other ranking.

===Asia===
In 2009, QS launched the QS Asian University Rankings or QS University Rankings: Asia in
partnership with The Chosun Ilbo newspaper in Korea to rank universities in Asia independently. The ninth instalment, released for the 2017/18
academic year, ranks the 350 best universities in Asia, and is led by Nanyang Technological
University, Singapore. These rankings use some of the same criteria as the world rankings,
but there are changed weightings and new criteria. One addition is the criterion of incoming
and outgoing exchange students. Accordingly, the performance of Asian institutions
in the QS World University Rankings and the QS Asian University Rankings released in the
same academic year are different from each other.

===Latin America===
The QS Latin American University Rankings or QS University Rankings: Latin America were
launched in 2011. They use academic opinion (30%), employer
opinion (20%), publications per faculty member, citations per paper, academic staff with a
PhD, faculty/student ratio and web visibility (10 per cent each) as measures. The 2016/17
edition of the QS World University Rankings: Latin America ranks the top 300 universities
in the region. The Universidade de São Paulo retained its
status as the region’s best university.

===Africa===
The number of universities in Africa increased by 115 percent from 2000 to 2010, and enrollment
more than doubled from 2.3 million to 5.2 million students, according to UNESCO. However, only one African university was among
the world’s 100 best, according to the 2016 world university rankings.

===BRICS===
This set of rankings adopts 8 indicators to select the top 100 higher learning institutions
in BRICS countries. Institutions in Hong Kong, Macau and Taiwan
are not ranked here.

===QS Best Student Cities Ranking===
In 2012, QS launched the QS Best Student Cities ranking – a table designed to evaluate which
cities were most likely to provide students with a high-quality student experience. Five editions of the ranking have been published
thus far, with Paris taking the number-one position in four of them. The 2017 edition was also the first one to
see the introduction of student opinion as a contributory indicator. The most recent edition of the ranking was
released on May 9, 2018. It saw London take the number-one spot from
Montreal.

==Events==
QS Quacquarelli Symonds organises a range of international student recruitment events
throughout the year. These are generally oriented towards introducing
prospective students to university admissions staff, while also facilitating access to admissions
advice and scholarships. In 2018, over 300 events were hosted, attended
by 220,000 candidates, in 100 cities across 50 countries. Separated into ‘tours’, QS’ event offerings
typically comprise a series of university and business school fairs.

===World MBA Tour===
The QS World MBA Tour is the world’s largest series of international business school fairs,
attended by more than 60,000 candidates in 100 cities across 50 countries.

===World MBA Tour Premium===
QS World MBA Premium also focuses on MBA student recruitment, but invites only business schools
ranked in the top 200 internationally, according to the QS World University Rankings. The event aims to provide a more holistic
overview of an MBA degree, with enhanced focus on pre- and post-study processes and insights.

===World Grad School Tour===
The QS World Grad School Tour focuses on international postgraduate programs, particularly specialised
Master’s degrees and PhDs in FAME (Finance, Accounting, Management and Economics) and
STEM disciplines.

===World University Tour===
The QS World University Tour has an emphasis on undergraduate student recruitment, inviting
undergraduate programs only.

===Connect Events===
QS Connect MBA and QS Connect Masters differ from other event series in that an open
fair format is not followed. Instead, candidates take part in pre-arranged
1-to-1 interviews with admissions staff, based on pre-submitted CVs and academic profiles.

==QS Stars==
QS also offers universities an auditing service that provides in-depth information about institutional
strengths and weaknesses. Called QS Stars, this service is separate
from the QS World University Rankings. It involves a detailed look at a range of
functions which mark out a modern, global university. The minimum result that a university can receive
is zero Stars, while truly exceptional, world-leading universities can receive ‘5*+’, or ‘Five Star
Plus’, status. The QS Stars audit process evaluates universities
according to about 50 different indicators. By 2018, about 20 different universities worldwide
had been awarded the maximum possible Five Star Plus rating. QS Stars ratings are derived
from scores in eight of 12 categories. Four categories are mandatory, while institutions
must choose the remaining four from the optional categories. The 12 categories are:

Teaching
Employability
Research
Internationalization
Facilities
Online/Distance Learning
Arts & Culture
Innovation
Inclusiveness
Social Responsibility
Subject Ranking
Program Strength

Stars is an evaluation system, not a ranking. About 400 institutions had opted for the Stars
evaluation as of early 2018. In 2012, fees to participate in this program
were $9850 for the initial audit and an annual license fee of $6850.