Elizabeth (Lizzie) Gadd is Head of Research Culture & Assessment at Loughborough University.

James Wilsdon is professor of research policy at University College London and executive director of the Research on Research Institute (RoRI).

Stephen Curry is Assistant Provost (Equality, Diversity & Inclusion) at Imperial College London.

Whichever team lifts the FIFA Men’s World Cup in ecstatic victory later this month, no-one – bar the most ardent losing fans outraged by a controversial penalty decision – will quibble with the result.

The beautiful game has long been metricised: quality is quantified without incident, through goals scored and points accrued.

However, more complex human activities, such as research and impact, resist simple quantification and demand close attention to how quantitative and qualitative information can be combined to assess and incentivise the outputs and outcomes we want to see.

A winter tide

To that end, today we publish Harnessing the Metric Tide, a short, sharp second look at the role of metrics in UK research assessment, updating and reflecting on progress since The Metric Tide review was published in 2015. Our headline is a set of ten recommendations on how to improve the handling and use of quantitative and qualitative information in the REF, and how best to boost the roll-out of responsible research assessment across the higher education and research sector.

Informing our reflections were three roundtables at which community stakeholders shared their hopes and fears for the future of responsible research assessment. A recurring theme at these events was the pernicious influence of data providers that do not adhere to the principles of responsible metrics, in particular the compilers of global university rankings.

Scientometricians described for us the many and varied ways in which such providers fail to meet basic standards of responsible assessment practice: opaque indicators that are poor proxies for the dimensions being assessed; surveys with weak methodologies; weightings applied without justification. University leaders bemoaned the negative impact on their own efforts to evaluate responsibly: how do you move away from citation-based assessment when citation-based assessment is how such providers assess you? And university mission groups described these data providers’ role in perpetuating inequalities, whereby the same large, old and wealthy subset of a diverse and multi-faceted HE sector continues to scoop up the recognition and the lion’s share of any resulting funds.

These problems arise externally but have become deeply internalised. They must now be addressed.

Discussions in new FoRRA

Our first proposals are for an expanded remit for the UK Forum for Responsible Research Metrics, set up by UUK and the main UK funders in 2015 as a consequence of The Metric Tide. This should be rebranded and revitalised as the UK Forum for Responsible Research Assessment (UK-FoRRA). With this renewed mandate, we recommend that the UK-FoRRA completes work started in 2015 to establish a set of principles for responsible research information management, to which all evaluators – including data providers – should adhere. We recommend that the UK-FoRRA takes a proactive role in supplying evidence-informed advice to the sector, and, critically, acts as a focal point for engagement with third-party data providers that may not be living up to community expectations of best practice.

We also recommend fresh emphasis on the co-design and co-interpretation of data and indicators by the communities under evaluation, in the hope of ‘designing out’ some of the more problematic forms of assessment. And we propose broadening the data sources used to represent our research activity: moving away from narrowly focussed bibliometric datasets towards those that surface the things we as a community really value and care about (for example, gender pay gaps and leadership survey data). We call this “data for good”.

The league game

Tucked away at the bottom of our list of recommendations is a call to the sector to think more critically about how university league tables based on weak methodologies hinder our collective efforts to evaluate more responsibly.

For too long, university rankings have been treated as something ‘out there’, separate from the sector’s commitments to responsible practices. Institutions ostensibly dedicated to EDI and research culture will, without a flicker of irony, promote a microscopic and statistically irrelevant rise in their ranking position.

So we call on more institutions to at least concede the limitations of league tables and to adopt a more critical mode of public engagement with, and promotion of, the rankings. This could include signing up to the INORMS More Than Our Rank initiative as a way of acknowledging the shortcomings of rankings whilst celebrating, in narrative form, the broader diversity of contributions their institutions make.

Getting rankings out of policy

But even this, we feel, is not enough. The UK government has legitimised the place of global university rankings by offering High Potential Individual visas to people simply by virtue of having graduated from an institution that appears in at least two of three international league tables. We therefore think it is time for this issue to be scrutinised by the House of Commons Science & Technology Committee.

The outsourcing of institutional missions and values to unaccountable ranking agencies is harming a thriving, forward-looking and agile university sector. We believe this should be investigated by MPs as a matter of urgency.

We are not alone in recognising the negative impacts of university rankings on a healthy research ecosystem. The recent Agreement on Reforming Research Assessment, now owned by the Coalition for Advancing Research Assessment (CoARA), includes among its four core commitments the need for universities to “avoid the use of rankings in researcher assessment”. We call on more UK institutions to give serious thought to engaging with the Coalition, not least because over half of its commitments concern the implementation of responsible practice – something our first recommendation recognises must now be a high priority.

Looking back on the recommendations of the original Metric Tide review, we can all agree that the UK HE sector has made significant progress towards more responsible uses of quantitative and qualitative research measures. Harnessing the Metric Tide maps out a series of next steps to build on this momentum, with a view to delivering research assessments that support a healthy and diverse research ecosystem.

To succeed in this and secure the prize of a thriving, productive and diverse UK research system, we need all the players to take to the field and give it their best.

Harnessing The Metric Tide is published today as a contribution to the Future of Research Assessment Programme (FRAP) and can be downloaded here. Alongside other new FRAP-commissioned evidence, the review will be launched and debated at a workshop later today (Monday 12 December), with speakers including Jessica Corner, Executive Chair of Research England. The event will be held online via Zoom and is open to all; you can register here.
