Jim is an Associate Editor at Wonkhe

When the fourth single from the Office for Students’ new quality album dropped the other day, I happened to be in a room of SU officers and managers.

They’d never been fans of OfS’ previous work – they’d regarded it as fairly distant from the tastes of their own students, not commercial enough, too focussed on impressing (music) journos – but this time, it felt like #newmusictuesday had delivered the goods.

This was a room of people that had spent years making carefully constructed arguments about the efficacy of and model for personal tutoring, which had tended to feel more like an ambition than a policy. Seeing the regulator pick that up elsewhere was pleasing.

They’d spent some years – more intensely in the cost of living crisis – making careful arguments about timetabling, “opt in” lecture capture and the need to balance immersive, in-person contact with the complexity of students’ lives. The news that OfS was concerned about that was helpful.

And while watching the investigation team politely frame incredulity at an apparent lack of understanding of why outcomes metrics were the way they were was annoying (“why didn’t they ask students”), it cheered up a representative organisation that knows that understanding students is just as important in the “quality” conversation as understanding “provision”.

This, like so many others, was an SU whose officers sit in on appeals – where the heartbreaking stories of poorly delivered support end up individually dismissed as a student’s fault to protect the integrity of academic judgement – with more to do on capturing those stories at a strategic level so that failures stop being written off as a student’s fault.

As such, this was easily the most interested and engaged I’ve seen an SU in OfS’ output since its inception. I may win few sector friends for saying this – but with several caveats, for me the reports are slowly building out to be a triumph – and have the potential to become a major turning point for higher education quality in the UK.

Greetings, music lovers

For all the (often justified) moaning and wailing about the slowly constructed definition of minimum quality, the process of selecting providers and subjects for focus, and the process of going in and getting these inspections done, I think four things are true.

The first is that I can’t remember reading – at least not in recent history – reports that appear to actually get meaningfully at a chunk (or “pocket” in OfS parlance) of provision. So many of the external quality processes that float around higher education – and I include the TEF in this – operate at such an extrapolated level as to feel hopelessly generalist and meaningless. This is, in fact, the only process to have reviewed quality at subject level since subject review. In all four, you get a real sense that a team of experts had a good poke about, asked some decent questions, and reached some meaningful conclusions.

The second is that I doubt there’s a university in the country that isn’t reading the Bolton and Beds reports in particular and feeling a little shiver down the spine. The tendency to “externalise” student failure, the sclerotic bureaucracy, the inadequate reach of many academic student support “systems” and the contemporary challenges of delivering what are still on paper full-time courses to what in reality are part-time students all feel common, and all deserve an airing – and in all four reports so far, they do get it. OfS CEO Susan Lapworth’s promise of using regulation to send wider signals has finally begun.

The third is the way in which we might hope that the process starts to drive conversations about the things that matter. I am still coming across providers where it’s apparent that the descriptions of quality laid out by OfS in the B Conditions are not so much disagreed with as never discussed at all, let alone with students. Maybe there’s a line or two in there that some would baulk at, or a phrase that many would rephrase, but I have yet to present them to a group of students who critiqued them – rather, they’re thrilled that someone is finally talking about them – and framing them as a universal right rather than a luxury for the future is a useful distinction.

But the fourth is that this surely is what people want a regulator to be doing. Speaking at the Chartered Association of Business Schools’ annual shindig the other day, Susan Lapworth argued that if a regulator can’t tackle questions of quality, there’s not much point in having the regulator. There will be many who will argue against that inside the sector (on the basis that it’s OfS doing it) – and almost nobody outside of it.

My own qualification would be that once you zoom out a little, a co-regulation style assurance agency focussed on enhancement that works with providers above a quality baseline ought not to be an uncomfortable bedfellow with more direct regulation of provision where it falls below it – it appears to be personalities and market dogma that have stopped those being integrated sensibly for the time being. But the idea that we’ll go back to pretty much trusting huge providers via a 10-page report every seven years on a provider’s propensity to create reports is surely for the birds.

The only chart that counts

All of that said, it’s hard to believe that the first four singles represent the best of what’s possible here. As such I’ve got ten suggestions for how I think OfS could build on this initial string of hits to establish itself as a stadium draw rather than a forgotten act at your town’s Party in the Park.

  1. I perhaps would say this, but in all four reports the cursory mentions of student engagement – focussed on “things happening” rather than impact – are deeply unsatisfying. In particular, where the teams have found concerns, the first question I would ask is whether student reps – both at subject and provider level – have raised those issues. If not, why not? And if they have, why haven’t they resulted in action? Did students even know that the nostrums in the B Conditions are things they can universally expect? Are reps invited to make judgements, or just “raise problems”? There’s a huge gap here.
  2. Related to that is a process point about the inspection visits. My understanding is that each time, the team has spoken to provider staff and managers, and students in the subject, but not the SU. The OIA to its credit never misses this trick – and has a clutch of tales about the way in which that sort of engagement reveals things that would otherwise be missed. Not only should OfS correct the nonsense that students can’t bring expertise to the inspection teams, it should ensure those teams talk to SU officers, staff, and caseworkers to get both unvarnished views and constructive context. OfS might also ask itself – if the SUs or the reps themselves saw the problems reflected in the “concerns” conclusions, why weren’t they using OfS’ arguably flawed notifications process?
  3. The selection issue is one that continues to trouble. There was a time – lost deep in OfS set up history – where the plan was to randomly sample. The “risk based” approach sounds fine, but it risks missing all sorts of pockets of provision as a result. In particular, while in theory indicators on experience and outcomes play a role, it’s always going to be the latter that attract the attention – not least because the NSS doesn’t actually ask about large swathes of the B Conditions. And a reliance on poor outcomes flagging poor experience will always miss those whose experience is poor but whose students have the financial, educational and cultural capital to succeed anyway. That is, by definition, unfair.
  4. That the reports get at actual provision (rather than many of the QAA reports of old that felt like reviews of internal reviews) is refreshing. But one of the holes in all four reports is real reflection on why QA processes in the four providers were, or were not, avoiding problems or driving improvement. In the two “concern” reports, not only are there questions about consumer protection, student protection (given the cessation of a partnership) and the efficacy of complaints schemes, there are also real questions about whether the academic governance was working. It is, after all, a key aspect of having Degree Awarding Powers to start with.
  5. The whole question of partnership provision does need a good going over. In the Beds report, it’s clear that substantial provision with the London School of Commerce was “in scope” – but not at all clear whether the “on-site” visits ever got to one of its centres and spoke to its students given the agreement seemed to switch to “cease and teach out” mid-way through. That the franchise agreement was apparently canned three days before the first site visit and just over a month after the first data request is a conjunction of events that we might argue is at best unfortunate. Any involvement of any partner – be it one that supplies a placement, a year abroad or an entire degree with a foundation year – increases risk. We ought to expect to see more focus in future reports as a result.
  6. Maybe this will come – but when I’ve been thinking about how to convert these reports and OfS’ blended learning review into lessons for SUs and their reps, I’ve had a struggle. OfS would do well to find ways to make it easy both for students and universities to disseminate what it’s learning here, putting meat on the framework bones for those making decisions on a day to day basis.
  7. If it must persist with its “risk based” approach for providers, it really does need to consider the risk from a student point of view rather than a provider one. At Guild HE’s conference the other day, OfS’ new Director of Regulation Phillipa Pickford argued that the selection of business providers in this recent round focused on those “with larger student populations” so its interventions would have “an impact on a significant number of students”. Quite why students at large providers should enjoy enhanced protection while those at smaller ones might exist in poor provision unnoticed is not clear.
  8. I’d also like to see some reflection on the themes being picked up – the “blended” challenges for busy students is one, and I expect another that will emerge will be the interactions between B4 and providers’ ongoing academic integrity challenges. It’s fine to have a fixed framework – it’s also necessary to keep it under open review on the basis of both external change and emerging findings.
  9. It’s also the case, I think, that the “subject level” focus is about right – and does raise huge questions about the averaging involved in OfS’ other big medal exercise on quality enhancement, and whether it tells anyone anything meaningful about quality in a large multi subject university. If nothing else, the idea that providers where large subject areas have “concerns” can still be flying silver or bronze TEF banners feels less than optimal.
  10. And again, I would say this – but being in listening mode over the experience of managers of providers who’ve been through the process is fine, but OfS really needs to get into listening mode from the student reps and SUs who’ve been on the periphery of these visits. How was it for them? Did the reports reflect their concerns? Did they feel engaged in the process? It’s for them, after all. They’re partners – not lab rats and not mere recipients.

Direct from Gallup headquarters

More broadly, there are those, I suspect, that will look at the reports – especially those where “concerns” were found – and wonder whether it’s all possible. Whether, with the unit of resource where it is and the sector in the state that it is and student support being where it is, that it is in fact possible to reach the minimum level of “quality” that the conditions imply – especially given some of Pickford’s wider concerns about financial sustainability.

Given what I do and who I speak to, I am perhaps inclined to be more cynical about whether it’s all do-able than many, especially where international recruitment didn’t hit targets this summer. But if I’m right, we need to know. If we have shifted from sensible cross subsidies to painting the corridors the King will walk along on his visit while the rest of the campus falls to bits, evidencing that will be painful but very helpful.

There are others who will accept that, but will look at the process and worry (partly because of the need to keep the show on the road) that bad leaders will react to these kinds of interventions and pile more pressure onto already stretched staff. OfS should be sensitive to that, more interested in how its regulation “lands” inside providers than it often appears to be, and much clearer about what happens (including decision making over imposing conditions or penalties) once these reports are produced.

But as I say, if it’s the case that hitting the minimums in the B Conditions is a fiction, pretending otherwise – piling everything into a cat and mouse game of pretence – won’t help a sector starved of resource. And if it isn’t, a hundred-odd inspection reports to point to that show a sector working well would help too.

OfS has more to do here and more work is needed to get it right. But for the time being, the four reports so far are easily the best work I’ve seen OfS produce since its inception. Roll on the second album.

One response to “OfS’ quality reports are music to students’ ears”

  1. I agree with you, the reports really are a good read and informative and get to the point.

    They show that self assessment has its limitations and outside involvement is essential.

    Half-time students on full-time courses are a big issue in some cases. This model is unworkable.

    Also unreasonable is allowing so many students with no previous experience of learning or any qualifications to start a course they are very likely to fail. The extra support required of tutors to get students to the next stage is not fair, and setting up both to fail is undesirable.

    The lack of data on student attendance I found unbelievable. It was as if senior management did not want to know what the data they didn’t have might show.
