Times Higher Education World University Ranking | Wikipedia audio article

September 20, 2019, by Stanley Isaacs


Times Higher Education World University Rankings
is an annual publication of university rankings by Times Higher Education (THE) magazine. The publisher had collaborated with Quacquarelli
Symonds (QS) to publish the joint THE–QS World University Rankings from 2004 to 2009
before it turned to Thomson Reuters for a new ranking system. The publication now comprises the world’s
overall, subject, and reputation rankings, alongside three regional league tables (Asia,
Latin America, and BRICS & Emerging Economies), which are generated with different weightings. THE's rankings are often considered among
the most widely observed university rankings, together with the Academic Ranking of World Universities
and the QS World University Rankings. The publication has been praised for its new, improved ranking
methodology since 2010; however, its undervaluing of non-science and non-English-instructing
institutions and its reliance on a subjective reputation survey are among the criticisms and concerns.

==History==
The creation of the original Times Higher Education–QS World University Rankings was
credited in Ben Wildavsky’s book, The Great Brain Race: How Global Universities are Reshaping
the World, to then-editor of Times Higher Education, John O’Leary. Times Higher Education chose to partner with
educational and careers advice company QS to supply the data. After the 2009 rankings, Times Higher Education
took the decision to break from QS and signed an agreement with Thomson Reuters to provide
the data for its annual World University Rankings from 2010 onwards. The publication developed a new rankings methodology
in consultation with its readers, its editorial board and Thomson Reuters, which would collect and analyse the
data used to produce the rankings on behalf of Times Higher Education. The first ranking was published in September
2010.

Commenting on Times Higher Education's decision to split from QS, former editor Ann
Mroz said: “universities deserve a rigorous, robust and transparent set of rankings – a
serious tool for the sector, not just an annual curiosity.” She went on to explain the reason behind the
decision to continue to produce rankings without QS’ involvement, saying that: “The responsibility
weighs heavy on our shoulders…we feel we have a duty to improve how we compile them."

Phil
Baty, editor of the new Times Higher Education World University Rankings, admitted in Inside
Higher Ed: “The rankings of the world’s top universities that my magazine has been publishing
for the past six years, and which have attracted enormous global attention, are not good enough. In fact, the surveys of reputation, which
made up 40 percent of scores and which Times Higher Education until recently defended,
had serious weaknesses. And it’s clear that our research measures
favored the sciences over the humanities."

He went on to describe previous attempts at peer
review as “embarrassing” in The Australian: “The sample was simply too small, and the
weighting too high, to be taken seriously.” THE published its first rankings using its
new methodology on 16 September 2010, a month earlier than in previous years.

The Times Higher
Education World University Rankings, along with the QS World University Rankings and
the Academic Ranking of World Universities are described as the three most influential
international university rankings. The Globe and Mail in 2010 described the Times
Higher Education World University Rankings as "arguably the most influential."

In 2014
Times Higher Education announced a series of important changes to its flagship THE World
University Rankings and its suite of global university performance analyses, following
a strategic review by THE's parent company, TES Global.

==Methodology==
===Criteria and weighting===
The inaugural 2010–2011 methodology contained
13 separate indicators grouped under five categories: teaching (30 percent of the final
score), research (30 percent), citations/research impact (32.5 percent), international
mix (5 percent), and industry income (2.5 percent). This was up from the six indicators
used in the Times–QS rankings published between 2004 and 2009.

A draft of the inaugural
methodology was released on 3 June 2010. The draft stated that 13 indicators would
first be used and that this could rise to 16 in future rankings, and laid out the categories
of indicators as “research indicators” (55 percent), “institutional indicators” (25 percent),
"economic activity/innovation" (10 percent), and "international diversity" (10 percent). The names of the categories and the weighting
of each were modified in the final methodology, released on 16 September 2010. The final methodology also included the weighting
assigned to each of the 13 indicators.
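The scoring scheme this section describes (converting each indicator to a Z-score, then combining the standardized values in a weighted sum using the five category weights above) can be sketched in Python. The institution names and raw figures below are invented purely for illustration; this is a minimal reconstruction of the general approach, not THE's actual calculation:

```python
# Sketch of the described scoring scheme: Z-score standardization per
# category, then a weighted sum using the stated category weights
# (teaching 30%, research 30%, citations 32.5%, international mix 5%,
# industry income 2.5%). All institution data is hypothetical.
from statistics import mean, pstdev

WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.325,
    "international_mix": 0.05,
    "industry_income": 0.025,
}

# Hypothetical raw category scores for three invented institutions.
RAW = {
    "Alpha U": {"teaching": 72, "research": 80, "citations": 90,
                "international_mix": 40, "industry_income": 55},
    "Beta U":  {"teaching": 65, "research": 70, "citations": 95,
                "international_mix": 60, "industry_income": 45},
    "Gamma U": {"teaching": 80, "research": 60, "citations": 70,
                "international_mix": 50, "industry_income": 65},
}

def z_scores(values):
    """Standardize a list of values to mean 0 and population std-dev 1."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma if sigma else 0.0 for v in values]

def overall_scores(raw, weights):
    """Weighted sum of per-category Z-scores for each institution."""
    names = list(raw)
    totals = {name: 0.0 for name in names}
    for cat, w in weights.items():
        for name, z in zip(names, z_scores([raw[n][cat] for n in names])):
            totals[name] += w * z
    return totals

for name, score in sorted(overall_scores(RAW, WEIGHTS).items(),
                          key=lambda kv: -kv[1]):
    print(f"{name}: {score:+.3f}")
```

Because each category's Z-scores sum to zero across institutions, the weighted totals also sum to roughly zero; only the relative order matters, which is why rankings built this way are sensitive to the choice of weights.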
The Times Higher Education billed the methodology as “robust, transparent and sophisticated,”
stating that the final methodology was selected after considering 10 months of “detailed consultation
with leading experts in global higher education,” 250 pages of feedback from “50 senior figures
across every continent" and 300 postings on its website. The overall ranking score was
calculated by converting all datasets to Z-scores, standardizing different data types
on a common scale to allow comparison.

The reputational component of the rankings (34.5 percent of
the overall score – 15 percent for teaching and 19.5 percent for research) came from an
Academic Reputation Survey conducted by Thomson Reuters in spring 2010. The survey gathered
13,388 responses from scholars "statistically representative of global higher education's
geographical and subject mix."

The magazine's category for "industry income – innovation" came from a single indicator:
an institution's research income from industry scaled against the number of academic staff.
The magazine stated that it used this data
as “proxy for high-quality knowledge transfer” and planned to add more indicators for the
category in future years.

Data for citation impact (measured as a normalized average citation
per paper), comprising 32.5 percent of the overall score, came from 12,000 academic journals
indexed by Thomson Reuters’ large Web of Science database over the five years from 2004 to
2008. The Times stated that articles published in
2009–2010 had not yet fully accumulated in the database. The normalization of the data differed from
the previous rankings system and is intended to “reflect variations in citation volume
between different subject areas,” so that institutions with high levels of research
activity in the life sciences and other areas with high citation counts will not have an
unfair advantage over institutions with high levels of research activity in the social
sciences, which tend to use fewer citations on average.

The magazine announced on 5 September
2011 that its 2011–2012 World University Rankings would be published on 6 October 2011. At the same time, the magazine revealed changes
to the ranking formula that would be introduced with the new rankings. The methodology would continue to use 13 indicators
across five broad categories and keep its "fundamental foundations," but with some
changes. Teaching and research would each remain 30
percent of the overall score, and industry income would remain at 2.5 percent. However, a new "international outlook – staff,
students and research" category would be introduced, making up 7.5 percent of the final
score. This category would include the proportion
of international staff and students at each institution (included in the 2010–2011 ranking
under the category of "international diversity"), but would also add the proportion of research
papers published by each institution that were co-authored with at least one international
partner. One 2010–2011 indicator, the institution's
public research income, would be dropped.

On 13 September 2011, the Times Higher Education
announced that its 2011–2012 list would rank only the top 200 institutions. Phil Baty wrote that this was in the "interests
of fairness,” because “the lower down the tables you go, the more the data bunch up
and the less meaningful the differentials between institutions become.” However, Baty wrote that the rankings would
include the 200 institutions that fell immediately outside the official top 200 according to
its data and methodology, but this “best of the rest” list from 201 to 400 would be unranked
and listed alphabetically. Baty wrote that the magazine intentionally
only ranks around 1 percent of the world’s universities in a recognition that “not every
university should aspire to be one of the global research elite." However, the 2015/16 edition of the Times
Higher Education World University Rankings ranked 800 universities, and Phil Baty announced
that the 2016/17 edition, released on 21 September 2016, would rank "980 universities
from 79 countries".

The methodology of the rankings was changed during the 2011–12 rankings
process. Phil Baty, the rankings editor, has said that
the THE World University Rankings are the only global university rankings to examine
a university’s teaching environment, as others focus purely on research. Baty has also written that the THE World University
Rankings are the only rankings to put arts and humanities and social sciences research
on an equal footing to the sciences. However, this claim is no longer true. In 2015, QS introduced faculty area normalization
to their QS World University Rankings, ensuring that citations data was weighted in a way
that prevented universities specializing in the Life Sciences and Engineering from receiving
undue advantage.

In November 2014, the magazine announced further reforms to the methodology
after a review by parent company TES Global. The major change was that all institutional data
collection would be brought in-house, severing the connection with Thomson Reuters. In addition, research publication data would
now be sourced from Elsevier's Scopus database.

===Reception===
The reception to the methodology was varied. Ross Williams of the Melbourne Institute,
commenting on the 2010–2011 draft, stated that the proposed methodology would favour
more focused “science-based institutions with relatively few undergraduates” at the expense
of institutions with more comprehensive programmes and undergraduates, but also stated that the
indicators were “academically robust” overall and that the use of scaled measures would
reward productivity rather than overall influence. Steve Smith, president of Universities UK,
praised the new methodology as being “less heavily weighted towards subjective assessments
of reputation and uses more robust citation measures,” which “bolsters confidence in the
evaluation method.” David Willetts, British Minister of State
for Universities and Science, praised the rankings, noting that "reputation counts for less this
time, and the weight accorded to quality in teaching and learning is greater.” In 2014, David Willetts became chair of the
TES Global Advisory Board, responsible for providing strategic advice to Times Higher
Education.

===Criticism===
Times Higher Education places high importance on citations in generating its rankings. Citations as a metric of effective education
are problematic in many ways, placing universities that do not use English as their primary language
at a disadvantage. Because English has been adopted as the international
language of most academic societies and journals, publications in, and citations of, other languages
are harder to come by. Such a methodology is therefore criticized as
inappropriate and insufficiently comprehensive. A second important disadvantage for universities
of non-English tradition is that within the disciplines of social sciences and humanities
the main tool for publication is books, which are rarely, if ever, covered by digital
citation records.

Times Higher Education has also been criticized for its strong bias towards
institutions that taught ‘hard science’ and had high quality output of research in these
fields, often to the disadvantage of institutions focused on other subjects like the social
sciences and humanities. For instance, in the former THE–QS World University
Rankings, LSE was ranked 11th in the world in 2004 and 2005, but dropped to 66th and
67th in the 2008 and 2009 editions. In January 2010, THE concluded that the method
employed by Quacquarelli Symonds, who conducted the survey on their behalf, was flawed in
such a way that bias was introduced against certain institutions, including LSE.

A representative
of Thomson Reuters, THE’s new partner, commented on the controversy: “LSE stood at only 67th
in the last Times Higher Education-QS World University Rankings – some mistake surely? Yes, and quite a big one.” Nonetheless, after the change of data provider
to Thomson Reuters the following year, LSE fell to 86th place, with the ranking described
by a representative of Thomson Reuters as ‘a fair reflection of their status as a world
class university'. LSE, despite continuously ranking near
the top in national rankings, has been placed below other British universities in
the Times Higher Education World Rankings in recent years, and other institutions such as
Sciences Po have suffered from the same inherent methodological bias. Trinity College Dublin's ranking in 2015 and
2016 was lowered by a basic mistake in data it had submitted; education administrator
Bahram Bekhradnia said the fact this went unnoticed evinced a “very limited checking
of data” “on the part of those who carry out such rankings”. Bekhradnia also opined “while Trinity College
was a respected university which could be relied upon to provide honest data, unfortunately
that was not the case with all universities worldwide."

In general, it is not clear for whom
the rankings are made. Many students, especially undergraduates,
are not interested in the scientific work of a higher education institution. The price of the education also has no effect
on the ranking, which means that private universities in
North America are compared with European universities, even though many European countries, such as France, Sweden
and Germany, have a long tradition of offering free higher
education.

==World Education rankings==
===Young Universities===
In addition, THE provides the 150 Under 50 Universities ranking, which uses different weightings of
indicators to reflect the growth of institutions that are under 50 years old. In particular, the ranking attaches less weight
to reputation indicators. For instance, the University of Canberra, Australia,
established in 1990, was ranked 50th among the 150 Under 50 Universities.

===Subject===
Various academic disciplines are sorted into six categories in THE’s subject rankings:
“Arts & Humanities”; “Clinical, Pre-clinical & Health”; “Engineering & Technology”; “Life
Sciences"; "Physical Sciences"; and "Social Sciences".

==World Reputation Rankings==
THE's World Reputation Rankings serve as a
subsidiary of the overall league tables and rank universities independently in accordance
with their scores in prestige.

Scott Jaschik of Inside Higher Ed said of the new rankings:
“…Most outfits that do rankings get criticised for the relative weight given to reputation
as opposed to objective measures. While Times Higher Education does overall
rankings that combine various factors, it is today releasing rankings that can’t be
criticised for being unclear about the impact of reputation – as they are strictly of
reputation."

==Regional rankings==
===Asia===
From 2013 to 2015, the outcomes of the Times
Higher Education Asia University Rankings were the same as the Asian universities’ position
on its World University Rankings. In 2016, the Asia University Rankings were
revamped; they "use the same 13 performance indicators as the THE World University Rankings,
but have been recalibrated to reflect the attributes of Asia's institutions."

===Emerging Economies===
The Times Higher Education Emerging Economies Rankings (formerly known as BRICS & Emerging
Economies Rankings) only includes universities in countries classified as “emerging economies”
by FTSE Group, including the “BRICS” nations of Brazil, Russia, India, China and South
Africa. Hong Kong institutions are not included in
this ranking.

==Notes==