How exactly could REF 2028 evidence research culture?

REF 2028 will place increased importance on research culture – but how’s this going to work in practice? Mark Whelan digs into the nuts and bolts of assessment

Mark Whelan is Research Culture Manager at Queen Mary University of London

The Future Research Assessment Programme’s initial decision to raise the weighting of the research environment element of the forthcoming REF in 2028 from 15 per cent to 25 per cent has been welcomed by some and received more cautiously by others.

When it comes to assessing this now broader category (which will also be renamed People, Culture and Environment), the fact remains that the written statements presenting an institution’s research culture will probably only constitute around 20 per cent of the subscore for the 25 per cent weighting. This means most of the assessment will, as in 2021, be based upon indicators drawn from metrics, institutional awards and statistical data, collected in the form of a questionnaire yet to be devised.

While using a questionnaire-style template to collect evidence could simplify the assessment exercise for universities and REF assessment panels, it comes with drawbacks. Larger institutions that can employ more staff to collect and crunch data will have an advantage over smaller institutions that cannot, while the nagging observation remains that picking a few indicators to evidence something as complex as research culture does not do it justice.

In fact, a reliance on metrics might hinder the development of a positive research culture. There is an impression among some researchers that the sector’s overreliance on metrics is itself one of the things wrong with research culture right now. In a survey conducted in 2020, for example, the Wellcome Trust found that only 14 per cent of researchers surveyed believed current metrics improved research culture, while a further 43 per cent believed that metrics were treated as more important than research quality itself.

The appearance of the Metric Tide 2 report in December 2022, a follow-up to the influential Metric Tide report of 2015 that advocated for the more responsible use of metrics in research assessment, has moved the debate forward. Metric Tide 2 contains some general suggestions on how to use “data for good” (see p. 33 of the report) when it comes to assessing the health of research ecosystems and culture, but further discussion is needed to work out what specific metrics can be used in practice and how.

So how can the future REF move forward from its 2021 assessment of Research Environment to assess People, Culture and Environment?

What was measured in REF 2021?

Although it isn’t the most exciting way to spend an afternoon, looking at the indicators used to determine the Research Environment scores for REF 2021 is a starting point for thinking about how the next REF can measure the quality of the People, Culture and Environment submission in 2028.

At first sight, the indicators for something as broad as Research Environment were not what you would expect. Signing up to different concordats was important. Has your institution, for example, signed up to the Concordat on Open Research Data and the Concordat to Support Research Integrity? A simple yes or no sufficed here. It is, of course, easier to sign up to concordats than to commit to their principles and policies in practice, but we’ll leave this issue to one side for now.

The awards offered by Advance HE, such as Athena Swan or the Race Equality Charter, also appeared as indicators. These awards offer firmer ground than concordats when it comes to assessing an institution’s research culture and environment, as each award framework has its own assessment panel that gives out gold, silver or bronze badges (and sometimes no badge at all) depending on the quality of the application and the evidence submitted within it.

And how diverse are your sources of research income? The more varied the better, with fourteen different columns of funder type separating (among others) the royal academies, UK-based charities, and government and industrial sources. Although this indicator in 2021 probably told us more about who funded the research than about the environment in which it was produced, the situation might well be different in 2027. Funders such as the Wellcome Trust and UKRI are demonstrating an increasing interest in research culture. Grants of research funding in the future will probably be contingent upon the recipient institution demonstrating a positive research culture, or committing to improve it.

Other indicators collected in 2021 perhaps get us closer to research culture, while some might not feature in the next REF at all. The institution’s gender pay gap and the make-up of its community of research staff, including its ethnicity profile and contractual status (the higher the percentage on permanent rather than fixed-term contracts, the better), are understandable statistics to ask for when considering the environment in which research is undertaken.

In terms of more obscure indicators, even seasoned university administrators might never have heard of the Human Resources Excellence in Research Award (HREiR) administered by Vitae, and might have to resort to Google to work out whether their institution holds it. Whether a university’s place in the Stonewall Workplace Equality Index will remain an indicator is unclear, given the withdrawal of several universities in England and Scotland from the league table.

More of the same for 2028?

What other indicators can the sector provide? REF planners are to consult with the sector on identifying suitable metrics and indicators for the People, Culture and Environment element and will use the definition of research culture outlined by the Royal Society as a starting point. The latter is useful given the competing definitions of research culture out there, so at least university leaders have a definition to work with, even if it is a broad one.

The indicators REF planners are interested in collecting (we’re now onto point 44 on page 9 of this document) sound a lot like what was asked for in 2021, with their mention of open research practices, EDI data, and information on staff career paths and progression. What else can REF planners ask for in an easily answerable questionnaire-type format?

What else could be measured?

As mentioned above, concordats and sector-wide codes of conduct feature heavily in the REF 2021 indicators for Research Environment. But signing up to a concordat is not the same as committing to its principles in practice, and here lies an avenue for collecting data that can help evidence a university’s commitment to investing in its people, culture and environment.

Universities across the UK, for example, have signed the Researcher Development Concordat of 2019 (RDC), which includes a commitment to guarantee every researcher no fewer than 10 days of protected development time per year (or pro rata if part time) to advance their career. Signatories to the RDC report every other year to Universities UK on their progress in delivering on the RDC’s principles, and therefore collect data on research staff engagement with training and professional development. A future REF could ask: “What percentage of your research staff make use of their 10 days of protected development time each year?” Posing such questions would help identify universities that seek to meet their concordat commitments in practice, and asks for data that universities are generally collecting already.

Similar questions can be posed about other concordats. The Concordat to Support Research Integrity, to take another example, commits a signatory to providing suitable training opportunities in research ethics and integrity for researchers. “What percentage of your research staff have completed training in research ethics and integrity?” could well be another question in the future REF questionnaire. One point to consider here is how such a metric could be accompanied by a sense of how the institution or its people have changed as a result of the training, as attending training is not in itself evidence that the university and its staff have changed their practices and behaviour.

A further development in the research environment since the last REF is the continued rise of the Résumé for Research and Innovation. More popularly known as the Narrative CV, it is an application format designed to move assessment of researchers away from metrics and outputs to a broader consideration of an applicant’s skills and achievements.

UKRI has made the Narrative CV compulsory for applicants to any of its funding programmes where evidence of a researcher’s track record is required, so despite criticisms and concerns this initiative is not going away. An item querying the adoption of this CV format at an institutional or departmental level could be placed in a questionnaire. Given the central role that the Narrative CV will play in fostering (in UKRI’s words) “a more inclusive and supportive research and innovation culture,” its inclusion in REF 2028 is to be expected.

What will REF 2028 bring?

REF planners are in a bit of a bind. Asking for more metrics and indicators might give them a sharper view of research culture in the UK, but the administrative burden on both the universities they wish to assess and their own assessment panels would rise too. If they don’t revise the metrics and indicators for a future REF, they risk re-running the same exercise with probably similar results.

If the REF is to make good on its commitment to foreground research culture, then a set of indicators decided upon in the late 2010s will surely have to be revised. We will learn more about the nitty-gritty underlying the assessment of People, Culture and Environment in early 2024, after a sector-wide consultation.