What OfS has learned from the first round of quality assessment

Regulatory action is still pending, but OfS has set out what can be learned from the eleven investigation reports that have been published

David Kernohan is Deputy Editor of Wonkhe

The Office for Students’ (OfS) first round of quality assessments, which began in 2022 (with the majority of “boots on the ground” investigation taking place in the latter part of that year and early 2023), has now concluded.

We’ve now had eight reports on the quality of business studies provision (most recently regarding the investigation at Regent College), and three on computing courses – a total of eleven investigations concluded. Each report has dug into specific courses within an area of provision at a provider – four did not generate sufficient evidence of a need for regulatory action, while we await an official response to the remaining seven.

What we have today is not those regulatory responses – rather it is OfS’ own reflections on the overall findings, in the form of an insight brief. These need, of course, to be seen in the light of the Fit for the Future report on OfS conducted by David Behan which, somewhat controversially for those who would prefer to see a sector-led approach to quality with student involvement, urged the regulator to focus more of its attention on quality and standards. Indeed, the whole investigatory approach is up for redesign as a result.

The publication does not, of course, “constitute legal or regulatory advice” – nor does it attempt to explain OfS’ regulatory decisions. The first chunk of the brief does something that OfS should ideally have done at the start of the process – set out the way in which these quality assessments work. There’s also useful context: the focus of the first round was on larger providers, both in terms of the subject population and the overall student population. The decision around subject was made first, followed by a decision to look at the bottom quartile of performance (based on the traditional B3 progression measures, plus NSS results), then size, then the existence of any third-party notifications.

Assessments were “designed to identify particular risks”, so do not constitute a comprehensive test of compliance. Assessment teams were commendably keen to involve students, but not in the formalised way that could be said to comply with international quality assurance standards.

The actual meat of this insight brief will be very familiar to those who have read one or more of the investigation reports – with common findings presented around four themes of:

  • Delivery of courses and resources
  • Academic support and student engagement with courses
  • Assessment of learning
  • Academic leadership and oversight

It’s clear that OfS intends the summarised failings to be a spur for quality enhancement activity – but it appears to perceive each instance of a suboptimal experience as something that a provider has simply decided to do.

There’s no consideration here of the wider issues that may have brought about a low quality student experience, so we are left with perhaps unhelpfully generic suggestions that staff should probably be up to speed with disciplinary norms in teaching practice and suitable content, that teaching should probably be engaging, that things on slides should be explained to students (but not just read out), that delivery timetables should suit student needs, and that your provider should employ the right staff in the right numbers. And so on.

To be honest, any provider that needs to be told stuff like this probably doesn’t deserve to be a part of the sector. The larger and more established providers reviewed certainly know that these are the required standards – and if they are missing them, they are adapting at a subject level to a change in institutional (and indeed sector-level) financial circumstances.

That’s not to excuse providers – there’s enough in a few of these reports to give anyone pause – but fundamentally a student having a bad experience probably doesn’t care whether it’s because that’s the best experience a provider is able to give them because of financial pressures or some other institutional or departmental failure to perform adequately. The end result is the same either way.

And when OfS concludes:

We know that universities and colleges work assiduously to ensure that they provide a good education to their students, and we hope that the information in this brief, and the much greater detail given in the reports themselves, will be useful in fulfilling these essential aspects of that work. We expect to publish further information about any regulatory interventions that draw on the findings in these assessment reports.

it doesn’t help those students (the majority of whom were on these courses in 2022 and have since graduated or left) either.

The Behan Review asked OfS to focus on quality issues, but it also asked it to have regard for the financial health of the sector. To take the conversation about teaching quality beyond generalities, it feels to me as if the regulator needs to ask the difficult questions around the way teaching quality, the student experience, and financial stability interact.
