
The hidden bang-for-buck heroes of UK research

So, the results are in, and most of the sector is looking at the winners and losers: in league tables, power ratings, grade point averages, and the rest – but Graeme Wise on the data blog is following the money.

Graeme Wise is a Contributing Editor to Wonkhe.

Over the last two days the sector has gorged itself on REF2014. Higher education institutions got their own data first, and then a little later saw everyone else’s, along with most of the media. As a result, at least two separate and quite different league tables have emerged, and the ‘winners’ have been declared (though we all knew who they would be well in advance). All down the table, those who played ‘the game’ well are charging their glasses, while those who didn’t are licking their wounds.

But in Whitehall, and especially the Treasury, officials will be soberly scanning their data sets for ‘best value’. The REF isn’t just a measure of research performance in the abstract – when combined with other data it can also serve as an indicator of value for money. Data from HESA tells us that over the period covered by the REF, the public invested approximately £7.5bn in university research (through the full range of funding and research councils). Officials will be asking where this money has been applied to good effect: which institutions achieved the best quality research for the funding they received over those five years?

We’ve done some work on this, and the early findings are pretty interesting. Our approach was to take, for each HEI, the total receipts from public research grants from 2008 to 2013 and divide them by the FTE number of staff submitted to REF2014, giving a proxy for ‘spending per active researcher’ across the period. We then cross-referenced that with the proportion of research from each institution rated 4* or 3*, as a measure of performance. Dividing the per-FTE spend (in £000s) by the combined 4*/3* percentage yields the notional ‘VFM ratio’ shown in the tables below – the lower the figure, the better the apparent value.
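To make the arithmetic concrete, here is a minimal sketch of the metric in Python. The function name and input shape are our own illustration rather than the actual pipeline used on the HESA returns; the University of Bath figures come from the first table below.

```python
def vfm_ratio(total_public_income_gbp: float,
              fte_submitted: float,
              quality_4star_3star_pct: float) -> tuple[float, float]:
    """Return (spend per FTE in £000s, notional VFM ratio).

    The ratio divides per-FTE public research income (in £000s)
    by the percentage of output rated 4* or 3*; a lower figure
    means better apparent value for money.
    """
    spend_per_fte_k = total_public_income_gbp / fte_submitted / 1_000
    return spend_per_fte_k, spend_per_fte_k / quality_4star_3star_pct


# Check against the University of Bath row in the first table:
# 462 FTE at roughly £154,000 per head implies ~£71.1m total income.
spend, ratio = vfm_ratio(total_public_income_gbp=462 * 154_000,
                         fte_submitted=462,
                         quality_4star_3star_pct=86.9)
print(f"£{spend:.0f}k per FTE, VFM ratio {ratio:.2f}")
# -> £154k per FTE, VFM ratio 1.77
```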

There were some obvious outliers. Imperial College spent over £400,000 per head submitted over those five years and achieved superb results. The LSE spent only £50,000 per head in the same period and was similarly outstanding on performance. This difference is of course a function of subject profile for two quite specialised institutions, and there is no doubt the money was well spent at the former just as much as at the latter, despite the difference in cost.

The real interest comes when we use this approach to pull out some multi-faculty institutions with a reasonably broad subject mix that on the face of it have achieved some very different results with the funds committed. Here, for instance, are five apparently solid bang-for-buck performers:

| Institution name | FTE research staff submitted | Public research income 2008-13 per FTE submitted (£) | 4*+3* quality (%) | Notional ‘VFM ratio’ (income per FTE, £000s ÷ 4*+3* quality %) |
| --- | --- | --- | --- | --- |
| University of Bath | 462 | 154,000 | 86.9 | 1.77 |
| University of York | 643 | 133,000 | 83.3 | 1.60 |
| King's College London | 1,369 | 131,000 | 85.2 | 1.54 |
| Swansea University | 370 | 122,000 | 80.2 | 1.52 |
| Lancaster University | 580 | 120,000 | 82.3 | 1.46 |
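(To read the ratio: York’s £133,000 per FTE against an 83.3% quality score gives 133 ÷ 83.3 ≈ 1.60 – in effect, thousands of pounds of public funding per point of quality.)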

These institutions stand out for having decent levels of submission, with all scoring above 80% for 4*+3* quality despite their economy with precious research money. They won’t ‘win’ on an index of research power, but they could be viewed as the ‘hidden heroes’ on value. They also catch the eye for being a relatively mixed group; sure, none are post-92s, but this isn’t just a case of the ‘usual suspects’ that the major league tables might suggest.

For comparison, here are some less positive examples:

| Institution name | FTE research staff submitted | Public research income 2008-13 per FTE submitted (£) | 4*+3* quality (%) | Notional ‘VFM ratio’ (income per FTE, £000s ÷ 4*+3* quality %) |
| --- | --- | --- | --- | --- |
| University of Essex | 339 | 249,000 | 77.4 | 3.22 |
| University of Dundee | 396 | 243,000 | 75.5 | 3.22 |
| University of Liverpool | 760 | 195,000 | 80.7 | 2.42 |
| University of Strathclyde | 558 | 179,000 | 78.9 | 2.27 |
| Aberystwyth University | 317 | 148,000 | 67.4 | 2.20 |

There are questions to be asked here, but it’s important to recognise there may also be good answers. On the face of it, these places don’t seem to suffer from a higher regional cost base than the first five cases, though there may be a stronger claim for some measure of subject skew (towards medicine, for example, in two cases). It is eminently possible that any of them experienced exceptional costs that we can’t see from the high-level figures (a peril of using even moderately ‘big data’). They may be bringing in large sums of private research income that these figures don’t show. They may have – admirably – taken some bigger risks on ‘blue skies’ research that didn’t pay off. And it may be that some of our ‘hidden heroes’ are actually ‘villains’ – achieving apparently great value by cross-subsidising from other income sources, while others prioritise teaching and the student experience. It’s also possible that more general questions over ‘results inflation’ in this REF will cast a shadow over the performance of some institutions, and thereby their efficiency. Nevertheless, the differences we can see here are quite sharp and should be looked into further.

The method we’ve used here is certainly experimental, and we invite and welcome more analysis from others – what we’re mainly interested in is getting past the default to ‘standard league tables’ that really tell us so little. We don’t yet know how policy makers will respond to REF2014 in their financial settlement, but inquiries into how well the funds allocated on the back of RAE2008 have actually been used will be significant in the rooms where that gets decided – or at least they ought to be. For the time being, as it’s nearly Christmas, let’s give the benefit of our data-integrity doubts and raise a minor toast to the efficiency champion of REF2014: City University pulled out a 75.7% 4*+3* result from a 377 FTE submission, after receiving just £37,000 per FTE submitted in public research funding over the five-year period. Don’t they at least get a mince pie for that?
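(On the same notional measure, City’s ratio works out at 37 ÷ 75.7 ≈ 0.49 – well clear of even the ‘hidden heroes’ above, with all the usual caveats about subject mix and unseen income.)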

3 responses to “The hidden bang-for-buck heroes of UK research”

  1. Fascinating stuff, but I really think the more revealing analysis down this line will be by panel. That way you control (partially!) for discipline composition. And there is much argument – well founded IMO – about comparability between panels.

    Funders rightly don’t choose to fund The University of Poppleton, they choose to fund the Department of Pork Studies at the U of P, or a specific proposal from Prof Bacon.

  2. Could you give a bit more detail on how you’ve derived the total public research income for the five years, please? I’m having a little difficulty replicating the figures with Heidi data.
