An important paper out this week in the journal “PLOS Biology” distils into a set of six principles much of what we have learnt in recent years about how to assess researchers and their work.
As the authors remind us: “How we evaluate scientists reflects what we value most – and don’t – in the scientific enterprise, and powerfully influences scientists’ behaviour.”
In the UK, the way we think, talk and argue about research assessment is intimately bound up with the Research Excellence Framework (REF). While still owned by the four UK funding bodies, management of the REF transferred on 1 April to its new home in Research England, part of UK Research and Innovation (UKRI). This brings the two strands of the dual-support funding system under one roof for the first time.
How RSE/RAE/REF has changed
All eyes are now on REF 2021, the eighth cycle of assessment since the process was introduced in 1986. Over time, the design and delivery of the exercise have become more complex, as have the multiple purposes to which it is directed.
The research assessment timeline
| Date | Exercise | Coordinating body | Key design changes |
|---|---|---|---|
| 1986 | Research Selectivity Exercise (RSE) | University Grants Committee | 37 cost centres; 4-part questionnaire on research income, expenditure, planning priorities and output |
| 1989 | Research Selectivity Exercise (RSE) | Universities Funding Council | 152 units of assessment; 70 peer review panels; 2 outputs per member of staff |
| 1992 | Research Assessment Exercise (RAE) | HEFCE | HEIs select which staff to submit; 5-point scale; 2,800 submissions to 72 units of assessment; introduction of census date |
| 1996 | Research Assessment Exercise (RAE) | HEFCE | Up to 4 outputs per researcher; 69 units of assessment |
| 2001 | Research Assessment Exercise (RAE) | HEFCE | 2,600 submissions to 69 units of assessment; 5 umbrella groups of panel chairs for consistency |
| 2008 | Research Assessment Exercise (RAE) | HEFCE | 67 sub-panels under 15 main panels; results presented as quality profiles |
| 2014 | Research Excellence Framework (REF) | HEFCE | 4 main panels; 36 sub-panels; introduction of impact element weighted at 20% |
| 2021 | Second Research Excellence Framework (REF) | UKRI (Research England and the devolved funding councils) | All staff with 'significant responsibility for research' submitted; impact weighting raised to 25% |
Summary of RSE/RAE/REF exercises (table compiled by James Wilsdon)
After a suite of evaluations of REF 2014, and a more fundamental review by Lord Stern, this evaluation machine (to borrow Peter Dahler-Larsen’s term) is once more cranking into gear. We now know the broad shape and rules of the 2021 framework, and who will sit on the various panels. More detailed guidance on the submission process and panel criteria will be with us soon.
To get us to this point, there has been a huge amount of debate about the pros, cons, burdens, and benefits of the REF. That debate continues, and will no doubt ebb and flow through to 2021 and beyond.
Real-time evaluation
This time, Research England is also trying something different: the Real-Time REF Review pilot. Rather than wait for REF 2021 to conclude before looking afresh at how it is working, the pilot will test the feasibility of evaluating perceptions and experiences of the REF in real time, among researchers at all career stages and across a wide range of disciplines and universities.
Over the next six months, a research team from the universities of Cardiff and Sheffield will work with Research England to trial this more open and formative approach. We want to better understand the changing effects of the REF on research cultures, institutions, and individuals – and particularly whether recent changes prompted by the Stern Review succeed or fail in their goal of reducing burdens on researchers and institutions.
In its pilot phase, the Real-Time REF Review will focus on four universities – Cardiff, Sheffield, Sussex, and Nottingham Trent. We will survey the attitudes, perceptions, and initial experiences of REF 2021 among a cohort of researchers, across eight units of assessment, two under each of the main REF panels. If the results prove valuable, we hope to expand the pilot into a larger-scale, longitudinal study of how the REF plays out over the next four years.
Understanding experiences
Formal consultations are often dominated by institutional voices. The Real-Time REF Review will tap into a more diverse and distributed set of perspectives and experiences than ever before. It is one example of a more creative, strategic, and long-term approach to evaluation, which Research England is keen to support – and which should be made easier by the unified structures and enhanced analytical capabilities of UKRI.
From an academic and policy perspective, it also reflects a shift now underway in several countries towards more sophisticated forms of “research on research” – or meta-research. This is an area where the UK needs to strengthen capacity if it is to make sensible use of the extra investment now being promised through the government’s R&D intensity target of 2.4% of GDP.
The prospect of another REF review may cause some to roll their eyes or gnash their teeth. We won’t pretend it will change REF 2021 – the rules for the current cycle are now pretty fixed. But it does give us a head start, and a much richer evidence base, on which to draw in thinking about what comes next.
It is important to understand with greater rigour and granularity how different researchers experience the REF.
Should REF 2027 bring more of the same, or can we do better? Where and how is it being implemented well, and what could be improved? Should we rip it up and start again? Share how it is for you – and help us, collectively, to decide.