
HE Insight Paper 11 - Ranking the rankings: University league tables pros and cons

University rankings are proliferating in the UK and around the world.  Despite continuing criticism, they enjoy a growing following and impact. Their methodologies continue to develop, providing a wide range of ways of comparing and evaluating universities. Far from going away, university league tables are likely to loom larger in the years ahead. HEi-know examines the current rankings, and their pros and cons.

 

Higher education institutions have a love/hate relationship with rankings. This often depends on how they perform in them, but there is no question that rankings have grown in importance as globalisation has taken hold and higher education is increasingly regarded as an economic engine. The many league tables are calculated on a wide range of indicators. There are rankings based on subjects, geography and economic grouping, institutions’ ages, environmental performance and even students' sex lives.

It is the international rather than the domestic league tables that have made the most dramatic impact, according to Professor Ellen Hazelkorn, director of research and enterprise at the Dublin Institute of Technology, who has written a book on the subject (Rankings and the Reshaping of Higher Education, published by Palgrave).

“Today, politicians across the political spectrum regularly refer to rankings as a measure of economic strength and ambition, students use them to help inform their choice, and universities use them to help set and define targets or brand and advertise themselves,” she says.

Governments and regional bodies such as the European Commission are almost obsessed with the rankings, which they see as the outward manifestation of the success of their higher education systems, having put the development of competitive higher education and research systems at the heart of their national economic strategies. In Russia, for example, President Vladimir Putin has made it a key policy objective to have five Russian universities in the top 100 of the Times Higher Education World University Rankings by 2020. Japan’s prime minister wants to see 10 Japanese universities in the world top 100 by 2023.

At the same time, however, many universities and vice-chancellors are publicly quite dismissive of them – and for some good reasons – while being very interested in them privately. Criticism of rankings is intense in academic journals, where their value is questioned, yet universities increasingly accept them and acknowledge their usefulness.

All three of the major global university rankings – QS, the Times Higher Education World University Rankings and Shanghai Jiao Tong University’s Academic Ranking of World Universities – are criticised frequently. The criticisms fall broadly into two categories: doubts about the very idea of these kinds of league tables, and doubts about the methodologies they employ.

Peter Scott, professor of higher education studies at UCL Institute of Education, points out that all universities are being judged against the same standard. Yet universities are different and are meant to be doing different jobs.

Professor Hazelkorn underlines the dangers of distorted comparisons. “There are 18,000 higher-education institutions internationally, and we are obsessing over less than 100 of them, which is 0.5 per cent of institutions in the world,” she says. “We are obsessing about a very, very small number of students.”

The concentration on elite universities is a major worry to some policy-makers and academics. That is because the methodologies of many of the best-known rankings, including the Shanghai Jiao Tong Academic Ranking of World Universities and the rankings produced by QS and the Times Higher, rely heavily on publication and citation data and academic reputation surveys – measures that inevitably favour large, research-intensive institutions.

The methodology of rankings also comes in for a good deal of criticism from academics like Hazelkorn and Scott who specialise in higher education. They argue that much of the data is self-reported by universities and can be manipulated: the number of students achieving good degrees, spending in different areas and the employment prospects of graduates are all figures supplied by the universities themselves.

Partly because the league tables have become a fact of university life, most criticism is now directed at their methodologies. The QS ranking has come in for particular criticism owing to its greater reliance on reputational surveys than other tables. One of its fiercest critics is Simon Marginson, professor of international higher education at UCL Institute of Education, who divides rankings into three main categories: those that rely wholly on bibliometric or other objective research metrics; multi-indicator rankings, such as those produced by the Times Higher and US News & World Report, which weight a mix of objective and subjective indicators; and a category occupied by QS alone, in which reputational measures account for half of a university’s score.

Phil Baty, editor of the Times Higher World University Rankings, admits that the biggest criticism that can be made of its rankings is the heavy weighting they place on opinion surveys: the survey of research reputation carries a weighting of 18 per cent, and the survey on teaching 15 per cent. Research quality is not necessarily captured well by traditional measures such as citation metrics, he says. “We try really hard to get the opinion surveys right.”
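The weighted-composite approach behind such multi-indicator rankings can be illustrated with a minimal sketch. The indicator names, weights and scores below are purely illustrative (loosely echoing the 18 and 15 per cent survey weightings mentioned above), not the actual THE or QS methodology, which also normalises each indicator before weighting:

```python
# Minimal sketch of a weighted-composite ranking score.
# Indicator names, weights and scores are illustrative only;
# real rankings normalise indicators (e.g. via z-scores) first.

def composite_score(indicators: dict, weights: dict) -> float:
    """Weighted sum of indicator scores, each assumed on a 0-100 scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * indicators[k] for k in weights)

weights = {
    "research_reputation": 0.18,  # opinion survey (subjective)
    "teaching_reputation": 0.15,  # opinion survey (subjective)
    "citations": 0.30,            # bibliometric (objective)
    "other_indicators": 0.37,     # remaining objective measures
}

example_university = {
    "research_reputation": 80.0,
    "teaching_reputation": 70.0,
    "citations": 90.0,
    "other_indicators": 60.0,
}

print(round(composite_score(example_university, weights), 1))  # 74.1
```

The sketch makes the point of Marginson's categorisation concrete: shifting weight between the survey-based and bibliometric lines changes an institution's score, and hence its position, without any change in the underlying data.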

The EU became so exercised about rankings that it established its own well-funded U-Multirank to break the mould of traditional world university league tables. Rather than ranking a “top 400” or “top 800” it developed a web-based tool to encourage like-with-like comparisons. Unsurprisingly, European universities did well and there were some star performers from Africa. However, many British universities have boycotted U-Multirank and few take it seriously.

Paul Greatrix, registrar at the University of Nottingham, says that league tables do not necessarily reflect what they purport to measure – the quality of education or the quality of the institutions. “A lot of the measures they use are things that happen to be measured and can be turned into a ranking rather than something reflecting the undergraduate experience or the research quality of the institution,” he says.

Administrators agree with the academics that rankings can be unfair and unhelpful, because an institution that performs outstandingly in one subject but averagely in others is never going to be placed at the top of an overall table. The tables could therefore be deterring students from applying to a university offering a course that might be perfect for them.

Other drawbacks are that they neglect the arts, humanities and social sciences, focusing heavily on science research, and that teaching is given little weight.

In recent years, however, attitudes towards university rankings have become more accepting and even favourable.  From the point of view of the rankers, at least, criticism of their work has subsided. Phil Baty argues that universities are far more engaged with them than they used to be. “When I first became involved there was a fair bit of hostility, certainly in the UK,” he says. “But in the past five years attitudes have changed dramatically. It’s not as though we are embraced with love and affection but we are more tolerated now.”

Evidence for this change can be found among university press officers – who are in the front line of dealing with the good and bad publicity league tables can create – and academics. The QS, Times Higher and Shanghai Jiao Tong rankings are all taken seriously. The domestic rankings most respected by the University of Leicester’s press office, for example, are The Guardian’s, the Complete University Guide and the combined Times/Sunday Times rankings, according to Ather Mirza, director of the university’s press office.

Ally Mogg, public relations director at Sheffield Hallam, agrees. His university was pleased to be named “top modern university in the North of England” by the Times/Sunday Times last year. “That provided us with the opportunity to sell the university to students, particularly those looking to come to a modern university offering a year in industry,” he says.

As a modern university, Sheffield Hallam is less interested in international league tables that put a lot of emphasis on research and citations such as Shanghai Jiao but it does take notice of i-graduate’s International Student Barometer, in which it performs well. It also pays attention to the People and Planet ranking of environmental and ethical performance.

Professor Edward Peck, Vice-Chancellor of Nottingham Trent, may be speaking for many vice-chancellors when he says: “I think one of the things that all universities would say is that they tend to value the rankings where their universities do well. They tend to take these the most seriously. Sometimes that is a function of their position and sometimes a function of the methodology.”

Thus Nottingham Trent takes The Guardian’s rankings seriously because it places more emphasis on teaching whereas the Times/Sunday Times emphasises research excellence more. “I suspect the Russell Group would give a different interpretation,” he says. “That’s completely understandable.” Nottingham Trent also pays attention to the Complete University Guide.

Professor Peck also likes the QS World University Rankings. That is partly because QS lists the top 800 or so universities in the world, and Nottingham Trent comes in just over 700th place, whereas it does not appear in other global rankings that cast their net less wide.

What the rankers say

Not surprisingly, producers of league tables defend their work and highlight the benefits it brings. John O’Leary, who sits on QS’s executive board, says that the proportion of opinion relied on in the QS rankings is not much more than in the Times Higher’s, when the Times Higher’s teaching survey is added in.

As editor of the Times/Sunday Times joint rankings, he says that league tables are a help to people who don’t know a lot about universities and did not receive good careers advice at school.  He admits, however: “The trouble is that they seem to be used most by people who do have access to decent advice and come from better-off families both here and in the USA.” He also believes that they keep universities on their toes.

Judy Friedberg, Guardian universities editor, says: “I think it is beneficial for universities to have a transparent and fair set of measures against which they know they are being rated. Our process is rigorously and regularly monitored by a review group of statisticians and planners drawn from a range of universities around the UK.”

So, international league tables are seen to be driving a reshaping of higher education systems internationally. Similarly, at institutional level, rankings are affecting policy and strategy. They have a big effect on academic careers, on the allocation of resources and on establishing a hierarchy of universities within and between countries.

Universities make extra effort to pursue graduates of subjects where employment prospects are good, and less effort where they are not. They also do all they can to achieve a good score in the National Student Survey (NSS). “Whether this is in the best interests of students is another matter,” says Professor Scott. “They’re not interested in that. Institutional behaviour is driven by what scores they get rather than by what’s right.”

While university administrators acknowledge such criticisms – and share many of them – they deny that league table rankings can be manipulated. Chasing rankings success can backfire if a university decides, for example, to increase its entry grades and suffers a drop in intake. And if it opts to increase the number of First class degrees it awards it may be found out – with adverse consequences.

“If you are chasing rankings, you will fail,” says Paul Greatrix. Instead universities need to concentrate on the fundamentals of teaching and research. These really matter but the outcome of any improvements will take time to show, longer than most vice-chancellors and governing bodies are prepared for.

Whatever the pitfalls of rankings, they now seem firmly established commercially. The rankers acknowledge that the rankings’ marketplace has expanded hugely. “It has gone through the roof since 2001”, according to Bernard Kingston, founder and chairman of the Complete University Guide – who admits that they also make money from advertising by universities on their site.

“We are making money now,” says Kingston. “But we plough lots of it back into the business.”

The Times Higher, which is owned by a private equity company, receives 60 million page impressions a year for its world rankings, according to Baty. “We do make money from advertising,” he adds.

The Complete University Guide has more than 1 million users a month, one third from overseas; the Guardian University Guide is read by 3 million unique users a year, according to Judy Friedberg, Guardian universities editor; and QS has had 41,937,606 page views for its rankings in the past year. In the month after the world rankings were published the figure was 3,648,356.

The only titles to say they do not make money from their rankings are the Times and Sunday Times. “The rankings certainly don’t make money,” says O’Leary. “They are provided as a service to readers because they think their readers like them.”

Apart from the producers of league tables, it is clear that many other people – students, parents, politicians, the media and employers – like rankings and find them useful. As Richard Taylor, chief operating officer of Loughborough University, says: “People will want to rank and it’s inevitable that people will want to look at these rankings. In the case of higher education, the cost of tuition and maintenance is so high that people will seek out reassurance that their investment is going to be beneficial.

“If one of these benefits is enhanced employability, then they’re going to look to the reputation and rankings as a proxy for that.”

Academics such as Scott are less sanguine. Pointing out that the media make money out of rankings and that they help to sell newspapers, Scott says that as higher education becomes more of a market, emphasising public relations and branding, universities relish being able to say they are in the top 10 or the top 20.

Nevertheless, most universities have made their peace with rankings by becoming more involved in helping with the methodology, realising that league tables can help them to recruit students and burnish their reputations. And governments view them as a necessary tool in their desire to benchmark their institutions. “Rankings are a necessary evil,” says Loughborough’s Richard Taylor.


HEi-know University Rankings Table

This article was first published by HEi-know on 1 July 2015

People and Planet league table (published January)
Pros: Claims to be the only independent and comprehensive ranking of universities by environmental and ethical performance.
Cons: In 2015, 69 of 151 universities boycotted the table, citing the time involved in collating information and criticism of the methodology.

Stonewall University Guide (published January)
Pros: Aimed at lesbian, gay and bisexual prospective students. Scores universities on a 10-point checklist of provisions Stonewall thinks they should have in place.
Cons: Specialised for its target groups.

Push (published March)
Pros: A ruthlessly independent guide to universities aimed at students and concentrating on student life, eg the price of beer and the number of good-looking men and women. Not a league table as such, though there is a tool to create your own.
Cons: Because it "tells it like it is", it can annoy universities.

U-Multirank (published March)
Pros: EU-funded interactive "user-driven" league tables that allow users to create personalised rankings.
Cons: Concerns about methodology have led some UK universities to boycott it.

Complete University Guide (published April)
Pros: Free to access, comprehensive and stable, covering research excellence, student satisfaction and other measures. Includes rankings for 67 subjects.
Cons: Because it emphasises research, it tends to favour established universities at the expense of the new universities.

Times Higher Education 100 Under 50 Rankings (published April)
Pros: Ranks the top 100 universities under 50 years old, showing the rising stars, rated according to the same 13 indicators as the THE's world rankings.
Cons: Older universities are not assessed, and institutions drop out as they pass the 50-year threshold.

What Uni? (published April)
Pros: Part of Hotcourses. Based on surveys of 20,000 students, who are asked what they think of accommodation, teaching and so on at their universities.
Cons: Criticised for being based on the opinions of only about 70 students per university.

Brite Green's carbon emissions league table (published May)
Pros: Ranks the best and worst performers among Hefce-funded universities and colleges, based on the percentage change in emissions from 2005 to 2013.
Cons: Universities can complain that, like People and Planet, the sustainability consultancy is using a league table to promote itself and its ideas.

Guardian University League Table (published May)
Pros: Strong on student satisfaction and takes progress in HE into account with a "value added" score. Covers rankings of 53 subjects.
Cons: Does not measure research excellence and, according to critics, suffers from volatility: Glyndwr rose 44 places last year and dropped 39 this year, for example.

Universitas 21 (published May)
Pros: A comparative ranking of 50 national higher education systems by Universitas 21, an international university network, showing which countries have quality HE systems.
Cons: The European University Association says its methodology could benefit from further refinement because of its reliance on "elitist" indicators.

Crime in Student Cities and Towns (published June/July)
Pros: Produced by the Complete University Guide, it is the only table to cover crime.
Cons: Official data on crime against students is not available, so it covers burglary, robbery and violent crime in cities and towns containing students.

i-graduate International Student Barometer (published August)
Pros: Measures international student satisfaction.
Cons: Covers only the UK, US and Australia. It is subjective and doesn't necessarily measure the quality of courses.

Shanghai Jiao Tong Academic Ranking of World Universities (published August)
Pros: Funded by the Chinese government, it is the granddaddy of international rankings. Highly respected and stable.
Cons: Criteria include the number of citations and Nobel Prize winners; critics say it is biased towards the natural sciences and English-language journals.

Google Most Searched For universities (published September)
Pros: The tech giant ranks the 20 universities most regularly searched for, presenting a very different picture to the global league tables.
Cons: It makes no attempt to judge universities on quality; it is simply about the number of hits.

QS Top 50 under 50 (published September)
Pros: Highlights the world's top 50 universities established in the past 50 years.
Cons: Older universities are not assessed, and institutions drop out as they pass the 50-year threshold.

QS World University Rankings (published September)
Pros: The largest, longest established and most widely read of the international rankings, covering more than 800 universities worldwide. A consistent methodology means it is stable.
Cons: Academics and other critics are unhappy that 50 per cent of the score rests on opinion surveys of academics and employers.

LinkedIn University Rankings (published October)
Pros: Based on the career outcomes of students in the US, Canada and the UK, the social networking site shows which degrees get its members the best jobs in certain sectors.
Cons: In the UK it covers just five sectors: accounting, finance, investment banking, marketing and media.

Times Higher Education World University Rankings (published October)
Pros: Claims to be the most sophisticated, balanced and comprehensive of the global tables, with 13 indicators against QS's and Shanghai's six.
Cons: The biggest criticism is its heavy weighting for opinion surveys: 18 per cent on research reputation and 15 per cent on teaching.

Times/Sunday Times (published October)
Pros: A merger of two highly respected rankings with a big following. It is stable and covers rankings of 66 subjects.
Cons: Behind a paywall, so access may be limited.

US News & World Report Best Global Universities Rankings (published October)
Pros: The first American publisher to enter the global rankings, judging global research reputation, publications and the number of highly cited papers.
Cons: Criticised for its emphasis on opinion surveys.
