Jonathan Grant is director of Different Angles and a contributing editor of Wonkhe


Martin Szomszor is Chief Data Scientist at Electric Data Solutions

New research into the REF2021 impact case studies shows that impact is often “exported” within the UK, levelling up the benefits of research between regions. But we also see that only a handful of universities have a “hyperlocal” impact within 25km of their institution.

Much has been written about “levelling up” the higher education and research activity of the UK – just check out the Wonkhe archives to get a sense of how this Johnsonian soundbite has taken hold.

But the idea of regional inequities is not new – especially in the allocation of research funds with concerns about the “golden triangle” of Oxford, Cambridge and London being at least a generation old. The long history of analysing regional research inequities has been dominated by mapping the inputs to research – that is, money – with limited analysis on the outputs (i.e. publications), and none that we are aware of on beneficiaries (impact).

Research across regions

To look at how impact is distributed across the UK we analysed the REF 2021 impact case studies (ICS), geotagging place names to capture the longitude and latitude of where the research occurred (that is, the submitting university) and, importantly, of where the impact occurred.
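For readers interested in the mechanics, the sketch below illustrates the kind of place-name lookup involved. It uses a toy gazetteer and a hypothetical geotag helper of our own devising; it is not the study's actual pipeline.

```python
# Illustrative sketch only: a toy gazetteer stands in for the full
# place-name geotagging used to locate impact in the ICS texts.
GAZETTEER = {
    "Manchester": (53.4808, -2.2426),
    "Cardiff": (51.4816, -3.1791),
    "Aberdeen": (57.1497, -2.0943),
}

def geotag(text):
    """Return (place name, (latitude, longitude)) pairs for places mentioned in the text."""
    return [(place, coords) for place, coords in GAZETTEER.items() if place in text]

impact_narrative = "The findings shaped transport policy in Manchester and Cardiff."
print(geotag(impact_narrative))
# [('Manchester', (53.4808, -2.2426)), ('Cardiff', (51.4816, -3.1791))]
```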

This showed that impact is spread across the UK and occurs locally but is also exported between regions. For example, we find that 60 per cent of research impact is exported from the region in which the research took place, with the biggest exporter being southeast England, which exports 69 per cent of its impact. Scotland is the lowest exporter of impact, but even there just under half (46 per cent) of impact takes place in other parts of the UK beyond Scotland.

There are also interesting differences by subject area. For example, we find that REF Panel D (arts and humanities) case studies report considerably more impact within the region of the submitting institution, at 52 per cent, when compared to Panel A (medicine, health and life sciences, 33 per cent), Panel B (physical sciences, engineering and mathematics, 37 per cent), and Panel C (social sciences, 30 per cent).

The units of assessment with the lowest rates of regional impact are law (21 per cent), agriculture, food and veterinary sciences (19 per cent), and economics and econometrics (11 per cent). Those with the highest rates of regional impact are English language and literature (64 per cent), archaeology (59 per cent), and classics (56 per cent).

We can also use this analysis to look at impact across the regions and compare it to funding. We found that 19 per cent of all ICS impact occurred in London (1,222 out of 6,361), whereas London receives 24 per cent of the funding. Looking across clusters of regions, we can see that 35 per cent of impact occurs in the golden triangle (London, the Southeast, and the East of England), compared to 49 per cent of research funding.

Right on your doorstep

Using these data we can also investigate “hyperlocal impact” – that is, impact that is co-located with the institution from which the case study was submitted. This seems an important indicator in the context of the civic university agenda and understanding whether institutions are “walking the talk” in their local commitments.

For this, we defined hyperlocal impact as occurring within 25km of the higher education institution. Based on this definition, only 19 institutions had more than half of their case studies demonstrating hyperlocal impact, and notably many of these institutions are specialist arts institutions, where the total number of case studies is typically relatively small.
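To make the 25km definition concrete, a minimal sketch of the distance test might look like the following (the haversine formula, coordinates, and function names are our illustration, not the authors' published code).

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (latitude, longitude) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius of roughly 6,371 km

def is_hyperlocal(institution, impact, threshold_km=25):
    """True if the impact location falls within the threshold distance of the submitting institution."""
    return haversine_km(*institution, *impact) <= threshold_km

# Example: an institution in Sunderland and an impact site in Newcastle (roughly 17km apart)
print(is_hyperlocal((54.9046, -1.3817), (54.9783, -1.6178)))  # True
```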

For those larger institutions (defined as 10 or more ICS), 80 per cent of impact was hyperlocal for the University of the Arts London and Manchester Metropolitan University, followed by the University of Bolton (64 per cent) and the University of Sunderland (56 per cent). It should be noted that 143 of the 155 submitting institutions had at least one case study with hyperlocal impact.

Hyperlocal disappointment

So what to make of these data? There are two key observations for us. First, the debate on levelling up in the context of research and innovation is more nuanced than crude analysis of research investments would suggest. This is not to suggest that regional inequities are not an issue, but to highlight the need to look across a range of indicators.

Related to this, given much talk about the civic responsibilities of universities, we find the number of hyperlocal impacts disappointingly low.

Whether this is an artefact of the selection of case studies by institutions we don’t know, but one way universities can contribute to their locality is through research impact focused on local issues. This, we would suggest, could be incentivised in future iterations of the REF – by, for example, double weighting hyperlocal impact case studies.

7 responses to “Should the UK research system aspire to hyperlocal impact?”

  1. How well do the case studies that universities reported in the 2021 research excellence framework reflect universities’ total research impact? If I were selecting research excellence framework case studies I would choose the ones I thought would most impress the assessors, which would not be the ones with the most local impact.

  2. I think part of the story about local impact is that many institutions were fearful of submitting local case studies of impact lest they be deemed too small-scale to get a high score for overall significance. I was party to discussions in one institution where there was debate about whether a case study impacting the whole of Wales was ‘big enough’!

  3. As two comments and the final para of the piece suggest, we cannot take the REF case studies as necessarily typical, and certainly not as a comprehensive record, of a university’s output. That said, the question raised is an interesting one and merits further attention – provided that we also avoid the trap of thinking that local necessarily excludes wider impact.

  4. I agree with these comments and for me this raises the question as to how we (RE and the taxpayers that fund their activity) measure excellence in the context of funded research. Maybe, as this article suggests, a greater emphasis should be placed on the weight associated with the generation of hyperlocal impacts. If this happened I am sure you would see this reflected in the case studies submitted.

  5. Although I find this analysis interesting, we have to be aware of its limitations. Regional location of impact was not data supplied by the submitting institution (unlike eg country of impact), nor was the regional nature of the impact an explicit part of the assessment. Searching for mentions of geographical locations within section 4 of impact case studies can provide false positives (eg where a location is part of the narrative of the pathway to impact, and not the location of the beneficiaries) and false negatives (eg where specific regional beneficiaries are not mentioned in a narrative about national impact).

  6. Notwithstanding the very good point by Anne above, I wonder about the contradiction between ‘internationally recognised’ research publications, tied to particular journal listings, and the idea of hyperlocal impact, and how this can be managed. I can speak anecdotally to suggest the latter is too often sacrificed for the former.

  7. Many research impacts are national in scale and won’t mention every region affected. For example, at my university we had REF2021 case studies on NICE guidelines, landlord and tenant regulation, UK energy markets, UK energy supply etc. In my mind these benefit the whole UK and are both local and national.
    An emphasis on hyperlocal impacts could miss the opportunity and incentive to scale these up to other parts of the country.
