This article is more than 2 years old

Who paid the price for provider survival during the pandemic?

There's a new report on how universities' finances were regulated during the pandemic. David Kernohan and Jim Dickinson find a struggle to define and secure value

David Kernohan is Deputy Editor of Wonkhe

Jim is an Associate Editor at Wonkhe

The Office for Students (OfS) mars a “good start” to analysing the financial risks associated with higher education in “difficult times” by not doing enough to “get providers on board with its approach”.

That’s one key criticism of the regulator from the National Audit Office (NAO) report – Regulating the financial sustainability of higher education providers in England – as summarised by Public Accounts Committee chair Meg Hillier MP.

The other is perhaps more existential – OfS does not have (and, some would suggest, will never have) a calculus for the value for money accorded to students. Hillier luxuriates in the HEPI/AdvanceHE survey data showing a decline in perceptions of value among students during the pandemic restrictions.

She says that “students deserve better” – but the NAO report contains no recommendation (short of the prospect of an all years National Student Survey) that would allow students to get what they apparently deserve.

OfS’ school report

An NAO “Value for Money” report does pretty much what you would expect – it analyses the safeguards and monitoring in place across a given area of government spending, makes an assessment of whether value for money has been achieved, and makes recommendations to improve regulation and delivery.

This particular report covers the 2020-21 academic year – with risks and tensions associated with the pandemic providing a set of complex stress tests for a newish regulator. The early predictions of institutional failures in the low teens did not come to pass, but how much of that was due to the actions of the regulator is not clear.

Instead, the report gets deep into the processes (and, yes, principles) that OfS uses to monitor financial health and value – and offers a great deal of insight to the general reader in the process.

Key facts and key findings

The report begins with “key facts” about the higher education sector – and we immediately learn that 10 providers are facing “enhanced monitoring” from the OfS on the grounds of financial stability (now the only issue on which OfS carries out enhanced monitoring).

With 64 providers out of the 247 then in scope forecasting that they would fall below 30 days’ net liquidity (a standard test of financial health) at some point in the next two years, these universities and colleges enjoying enhanced monitoring must have been in some very serious financial straits. Another thirteen were undergoing other forms of engagement so the regulator could better understand the levels of risk.

You can kind of follow these twenty or so providers in financial difficulty throughout the documentation. We learn that eighteen registered providers applied to the Department for Education’s “restructuring regime”, and that three subsequently entered it. Thirty-three out of 247 providers had in-year forecast liquidity below 30 days. Mind you, 98 out of 245 providers underwent an OfS “detailed review” of their financial viability and sustainability.

Of course, no names are named. We are told that – much like the wider sector – this group of providers facing difficulties was diverse: small specialist providers, high tariff established providers, low tariff providers, and new providers all featured in the group.

For those that followed the speculation, and the tortuous analysis of HESA finance data, that characterised 2020 and 2021, these numbers ring true. These (plus many others) were the providers that would have carried unsustainable levels of risk had the government made the much-advanced recommendation that fees should be repaid by providers to students – a move that, without substantial government underwriting, would have destabilised providers and harmed the student experience they could offer.

As such, what the NAO is summarising here is what we already knew – that enrolments (and so income) largely held up, and despite offering something substantially different to that which was normal and/or promised, the lack of any lever causing that income to fall through rebates or refunds kept providers afloat, and in some cases enabled them to return really quite healthy results.


The NAO couldn’t really ignore the A level debacles that manifested themselves throughout the period under review. It notes the “market stability” number controls proposal from mid-2020, introduced originally “to ensure a fair, structured distribution of students across providers” – it then reminds us that that proposal was scrapped at the last minute, and laments that we ended up with anything but a fair, structured distribution of students across providers, summarising the different impacts as follows:

For higher-tariff universities, this would mean revising assumptions about staff numbers, accommodation and teaching spaces, with the result that it would be challenging for institutions to protect the overall experience of each student. For universities with lower entry requirements normally expecting to recruit large numbers of students through clearing – those who had not achieved the grades for their first choice of university – there was a risk that it could make their financial position much less secure.

What’s odd is that while the NAO says that all of this raised questions about the ability of providers to offer the student experience they promised (in ways that weren’t directly about pandemic restrictions), it doesn’t really link the issues caused by that oversubscription or undersubscription back to student perceptions. To be fair, it’s a report about OfS regulation rather than the pandemic in general, but you get the sense that the NAO is much more comfortable analysing OfS’ efforts to look at finance directors’ spreadsheets than it is understanding the impact of OfS’ approach to regulation on students and therefore their views on value for money.

Students in the office

OfS has always expressed a concern about the plight of students at risk from what it terms a provider’s “disorderly exit” from higher education. The NAO report makes it clear that the stepping up – from student protection plans to something with a bit more heft – was a choice made in the face of greatly increased financial risk.

There’s less support for language around “adequate” online learning and support. As is rightly pointed out, it was never clear what students could expect when Michelle Donelan said they could expect no refunds if the provision was “adequate” – both the Office of the Independent Adjudicator (OIAHE) and the Office for Students were lined up to receive complaints and “notifications” from students without ever spelling out what it was reasonable for a student to expect during the pandemic.

The NAO attributes the dip in student perceptions of value for money (in the HEPI/AdvanceHE survey) to the unexpected experience of online learning – but never really evaluates the decisions OfS made to make resultant perceptions from students better or worse as a result.

The use of a survey here speaks to the problems inherent in developing a value for money measure. The NAO notes the OfS caveat that value “may mean different things to different people or may change over time” – as such, OfS measures it via an external survey but has no targets to improve it. With the (DfE-sparked) direction of travel of the NSS away from customer-style measures like satisfaction, it is unlikely that we will see one emerge anytime soon.

Regulators under scrutiny

We’ve recently noted the oncoming post-legislative review of the Higher Education and Research Act 2017 – a chance for the Education Committee to take a detailed view of the successes and challenges faced by the regulator and the framework it has inhabited and defined since inception. The NAO notes that the Office for Students is also due a “tailored review” from DfE in 2023 – another chance to take a hard look at regulatory practice in English higher education.

It does seem to be the season to scrutinise the central scrutinisers – with the rest of DfE tertiary policy and communications returning to the mothership, and some odd consultation questions about responsibilities, we are beginning to see question marks forming over just how special higher education regulation is and needs to be.

This NAO review isn’t entirely part of that, of course. As a newish player in the regulatory space, OfS is ticking many of the PAC boxes – though the visible policy of working on rather than with the sector has been spotted and noted. And (for a regulator keen to regulate with data and dashboards) the data on the OfS dashboard isn’t quite enough to convince us that it is self-regulating – the suggestion that it works to gather more sector feedback should produce some interesting new points of light in these blank spaces.

What’s next?

At least some of the material here is interesting because it concerns a continued theme that we’ve been highlighting here on the site – the in-principle lack of clarity over the respective roles of DfE and OfS, and the in-practice tensions that have rarely been far from the surface. The NAO’s report, for example, recommends that DfE makes clear what tolerance the government has for provider failure – although it’s unlikely that Michelle Donelan is going to say out loud that a couple of private providers on the edge of London can go, but don’t you dare let a modern university in the midlands or north go to the (red) wall.

Similarly, when NAO says that DfE and OfS should jointly assess how the redistribution of student numbers between providers as a result of higher A-level grades awarded in 2020 and 2021 has affected students’ experiences and providers’ finances to understand consequences for this year, you get the sense that that’s because it’s not entirely clear to the NAO who ought to do it, let alone how they might go about it while escaping respective culpability for letting it all happen in the first place.

More straightforwardly, the NAO says that OfS should communicate more effectively with the sector to build trust in its approach as a regulator; improve providers’ understanding of its attitude to risk and how it defines risk-based, proportionate regulation; be more ready to share sector insights to improve efficiency and competitiveness in the sector; and set out how it will secure provider and stakeholder views of its work. There will be no arguments from the sector on those – although many are growing weary of promises in this space that never seem to be progressed.

Back on the student interest, the NAO says that OfS should review, improve where necessary and then reauthorise student protection plans for all providers to ensure they remain adequate and can respond to new risks, prioritise finalising its key performance indicator on how it assesses the value for money that students see in their education, and set out how its work will reverse students’ declining satisfaction rates. That’s interesting – because while one impact of chaos in the market might be to end up collapsing financially, another might be to deliver something substantially different to that which students were promised or expecting.

With too many students in some providers and not enough in others, inflation through the roof, domestic undergraduate fees frozen and providers being egged on to address their portfolio of provision via B3 and the new TEF, it seems likely that a “new risk” that student protection arrangements ought to apply to is courses not being delivered as advertised even if they don’t formally close – with resultant student dissatisfaction. Providers will argue that much of that is outside of their control, is better than going to the wall, and is really DfE’s fault. Whether OfS punches up to DfE or down to providers in the face of all that will likely reveal the truth about whether it’s really on the side of students.
