Jim is an Associate Editor at Wonkhe

For the duration of its short life, the Office for Students (OfS) has had a problem with monitoring.

Ofsted inspectors appear with a clipboard – amid debates in the school, college and skills sectors about how much notice, if any, providers should get of such visits. Back in the day (and to this day in the devolved nations), QAA would facilitate peer review teams – who would turn up, review the carefully prepared self-assessment, and ask a series of difficult questions.

But OfS was set up to be… different.

An avowedly outcomes-focussed regulator was always going to depend principally on lag indicators – prior “performance”, signalled through metrics generated by satisfaction surveys, or employability and continuation rates. This is not a blog about the general suitability or effectiveness of such a regime – there’s plenty of that material elsewhere on the site and on the wider interweb. Nor is this a blog about how useless that regime starts to look – both in principle and in practice – during and after a pandemic. DK and I have both been there before.

What it is is a blog about covering cuts to regulation and offering false assurance to students, politicians and the wider public by wilfully misunderstanding the relationship between students, student representatives, students’ unions and their universities – leaving them damned if they do, and damned if they don’t.

Let us know

So what’s going on here then?

OfS has published a fancy new PDF guide aimed at students, student reps and students’ unions on what it calls “notifications” – the process of telling OfS about issues within universities or colleges that relate to the regulator’s responsibilities. It’s basically a handy guide to grassing up your uni to OfS, and is both beautifully designed and a pain to read on mobiles.

There’s a summary of what this is all about and supposed to achieve from OfS’ student panel chair Martha Longden over on Wonkhe SUs.

The press release is a delight. The guide has been released as part of OfS’s wider student engagement strategy – designed to “facilitate and encourage” students and their representatives to “highlight concerns” and to ensure the system is “straightforward to understand and use”, with “practical advice” on how and when students can make use of the scheme.

The intro makes clear that OfS has no role in dealing with individual complaints or with disputes between students and their university or college. Later sections describe the relationship between OfS and students, explain what a “notification” is and how to submit one, and list some examples of what students and their reps might raise as part of a notification.

What the guide doesn’t do is address in any meaningful way why a student, student representative or students’ union would bother. It doesn’t address what doing so might achieve, or what the ideal would be internally before getting this far. Nor does it reveal what OfS’ emphasis on the scheme hides – that the regulator doesn’t have much of a clue about what’s going on in universities.

The panopti-con

You probably weren’t concentrating during the third week of December, but back before Christmas OfS press released a grab bag of regulatory documents (including its approach to publishing information about providers, its consultation on fines and a consultation on reportable events) with quite an interesting headline:

Regulator sets out how students can register concerns.

This was an exciting moment. As I said at the time, along with rent rebates, education that doesn’t harm their mental health, action on climate change, tuition fee refunds and a dialling down of the marketisation of their universities, top of many a student activist’s Christmas list was… a clearer process for regulatory concern registration.

What emerged then was a fairly dry and unenlightening guidance note for providers on notifications, with a promise that what would follow later would be a version of said note designed specifically with students, student reps and students’ unions in mind.

And that’s what’s appeared now.

But what also appeared in that pre-Christmas release was a rewrite of Regulatory advice 15: Monitoring and intervention. It was a pretty comprehensive rewrite of a document of the same name from 2019 – and the main change was the removal of “random sampling” as an approach to monitoring, a change signalled in that Department for Education (DfE) “Reducing bureaucratic burden” policy paper a while back.

The some-seeing eye

In OfS’ regulatory framework, various types of monitoring activity enable OfS to respond proportionately to regulatory risks and to identify changes to risk levels:

  • Lead indicators are constructed from data and information flows, in what OfS used to hope would be “as near real time” as possible – to help it identify trends and anticipate future events. Metrics, in other words.
  • Reportable events are a requirement on providers to notify OfS of material decisions/issues/changes. A sort of self-declaration of a risk that a regulatory requirement won’t be met, with a threat if you don’t declare.
  • Notifications are information from students, staff members and other people and organisations that tell OfS something material about a provider’s compliance with the ongoing conditions of registration. They might include complaints, whistleblowing, general allegations or just concerns that people want to report.

If that collection of three things sounded a little weak (especially during a pandemic), the good news is that there used to be a fourth item. Like a kind of OfS version of the Ofsted panopticon clipboard.

Out of a hat

In its regulatory framework, OfS said it would operate a process to reassess providers’ compliance with their ongoing conditions of registration for a “random sample” of providers each year.

This was, in context, a good idea.

It was, for example, going to provide assurance about the effectiveness of ongoing monitoring approaches – by comparing findings from random sampling against findings from other types of general monitoring. OfS said doing so would enable it to better understand the effectiveness of its overall approach and decide whether changes to its approach might be required – something we’d all be keen to see.

It was also going to act as an incentive process. The idea was that by moving from scheduled cyclical reviews to a random sampling approach, providers would be incentivised to ensure that they satisfy their conditions of registration on an ongoing basis. Again, in context, highly sensible stuff.

It was also going to mean that the team in Nicholson House would gain a better understanding of sectoral practice – by reviewing in detail how individual providers meet their conditions, OfS was going to be able to identify and recognise good practice. Lovely jubbly.

There was even a bit of game theory in there, and I love me a bit of game theory:

To maintain proportionality, no provider will be subject to further selection by random sampling if it has been sampled during the previous three years. The OfS will begin by sampling five per cent of all registered providers each year. The probability of being assessed will increase incrementally for each year in which a provider is not sampled. The systemic benefits of uncertainty are intended to promote the desired provider behaviours, while creating a more proportionate system overall.
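
To make those incentive mechanics concrete, here’s a minimal sketch in Python of how one provider’s odds might evolve under that scheme – assuming, purely for illustration, a five percentage point uplift per unsampled year, since OfS never specified the size of the “incremental” increase:

```python
import random

BASE_RATE = 0.05       # five per cent of all registered providers in year one
INCREMENT = 0.05       # assumed uplift per unsampled year (OfS never specified it)
EXCLUSION_YEARS = 3    # no re-selection within three years of being sampled

def simulate_provider(years=10, seed=2021):
    """Simulate one provider's selection odds under the (now abandoned) scheme."""
    random.seed(seed)
    years_unsampled = 0   # consecutive years without being sampled
    cooldown = 0          # years left in the post-sampling exclusion window
    for year in range(1, years + 1):
        if cooldown > 0:
            cooldown -= 1
            print(f"Year {year}: excluded (sampled within the last three years)")
            continue
        p = min(BASE_RATE + INCREMENT * years_unsampled, 1.0)
        if random.random() < p:
            print(f"Year {year}: sampled (p was {p:.2f})")
            cooldown = EXCLUSION_YEARS
            years_unsampled = 0
        else:
            print(f"Year {year}: not sampled (p was {p:.2f})")
            years_unsampled += 1

simulate_provider()
```

The point of the design shows up in the output – the longer a provider goes unsampled, the more likely a visit becomes, so uncertainty does the work that a fixed review cycle used to.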

Budgeting and burden

The problem is that this approach has now gone. To save some money on its growing costs budget, late last year OfS offered up killing off random sampling to DfE – and DfE dutifully agreed, nodding it through in the name of regulatory burden reduction while allowing OfS to rewrite history:

This process was not designed primarily as a mechanism to reassess risk for an individual provider, but would nevertheless involve significant assessment activity in relation to a provider’s continuing compliance with its conditions of registration.

So as I say, during a pandemic in particular that leaves you in a bit of a monitoring hole – so to the rescue comes a beefing up of notifications via students, student reps and students’ unions.

We’re not sampling. But we are “inviting” students and their reps to tell us stuff.

In practice, it is difficult to explain quite how flawed this monitoring sticking plaster really is. But I’m going to try.

Start with why

The first big problem here is that, as ever, OfS makes clear how someone might raise an issue, and even talks about the sorts of issue they might raise – without ever making clear what may or may not be regarded as an actual issue.

Imagine, for example, that a book you need is only available as an e-book, and the software you need to read it won’t run properly on your ageing laptop. OfS says you might raise a notification if you are:

Not being given adequate support – for example, where IT systems do not support effective online or blended learning and the university or college is not taking appropriate steps to address this.

I’ve asked five different people this afternoon whether they think the quote matches the issue, and got five different answers. If we overpromise the ability of the system to fix things, students will be furious. If we underpromise, they won’t use it.

Then again, the guide says you might raise a concern about:

The quality of teaching, the availability of resources, or the fairness of assessment.

Why define what “quality” teaching is, what “availability” means, or how “fairness” might be judged, when you can just imply it all?

Perhaps you could discuss:

Academic support not being available in the way students had expected – for example, a university or college’s personal tutoring system not working effectively and in the way set out in the course handbook.

…but how far off does it have to be? How many students need to be affected? How long must this have been going on?

The pandemic examples are great:

A university or college… refusing to consider refunds for students who wish to leave their course as a result of changes that have been made because of the pandemic.

Does this mean that students who wish to leave their course as a result of changes that have been made because of the pandemic are entitled to refunds? Including for rent and their other costs? Really?

Self-isolating students in university- or college-owned accommodation not being provided with the things they need to learn effectively – for example, adequate internet connectivity, access to food, or wellbeing support.

Where in the regulatory framework does it mention student accommodation at all – let alone differentiate between people in provider-run halls and those in private halls, HMOs or their family home?

A university or college stopping face-to-face teaching and providing significantly reduced course content or contact hours through online teaching, without a clear plan to make this up later in the year.

Oh – is that the standard now? Students have the right to equivalent “live” contact hours? And reduced from what – last year? What they were promised? What they were notified about?

Again, for those at the back. If we overpromise the ability of the system to fix things, students will be furious. If we underpromise, they won’t use it.

Use it then lose it

Then there’s who’s likely to make use of this system. Back here in the real world, ordinary students are unlikely to be clamouring to use a process for telling OfS things when there’s a big sticker on the system that says “this is not a complaints system and we may not do anything about your notification”.

So outside of SUs, in reality it’s only highly organised and articulate groups of students that are likely to find and then use the scheme. In this paper on Public Service Reform, outgoing OfS Chair Michael Barber warned about badly designed voice and engagement initiatives that:

Favour the better off… reliance on bottom up pressure from citizens may worsen equity as the articulate, confident, better off middle classes profit at the expense of the less capable poor.

Maybe students’ unions will use the scheme – but why? Who are the students’ unions that are going to “grass up” their university to the regulator without any sense of what it might achieve or who it might benefit?

It’s also hugely dangerous to use the volume or nature of notifications as an indicator of the scale or breadth of problems in a provider. It’s much more likely to tell you lots about the funding of the SU or the social class of its students. I could supply you with data on both of those now.

Here’s some game theory of my own. OfS rarely talks about its regulation of individual providers – there are no examples here of notifications leading to change – and so if an SU used the scheme, and the process fixed anything other than something dramatic and scandalous, you could then either:

  1. Reveal to your members that you’d notified OfS, destroying your relationship with the university in the process.
  2. Hide from your members that you’d notified OfS, destroying your relationship with your members in the process.

Readers predisposed to view SUs as too close to their universities, and who are surprised at that dilemma, are reminded that SUs have no labour to meaningfully withdraw, depend almost entirely on grant income from their universities, and necessarily have to engage in more assertive, “behind the scenes” influencing than others would be comfortable with.

Who’s got the power

Ultimately, what the guide does is suggest that the way students and their reps should engage in discussions about quality and provision in their institution isn’t by raising issues in meetings, or running campaigns, or training reps, or empowering students by helping them understand their rights – but by grassing their university up to the regulator.

There’s nothing in here on ensuring that students, reps and unions understand the regulatory framework, or have helpful data with which to hold decision makers to account locally, or have the skills and confidence to contribute as assertive partners to improving the student experience internally.

It’s OfS yet again doing what it always does – not really being interested in giving students power but in hoarding it for itself – not concerned with student representation or student engagement per se, but interested in it only in the way it might prop up OfS’ inconsistent monitoring, on the cheap, and in a hopelessly benevolent-paternalistic-Sauron-of-HE kind of way.

It doesn’t even raise the prospect of gathering from providers the volume and nature of actual complaints that don’t get as far as the OIA.

Thankfully, there is an alternative.

Swiss cheese

Readers that have been around for a while will recognise the OfS notifications process as a rebrand of an old idea – what the Quality Assurance Agency used to call its “concerns scheme”, and then HEFCE used to refer to as its “Unsatisfactory Quality” scheme.

Neither was ever heavily used, and certainly neither was reliable enough to use as a key plank of any regulatory or quality strategy. In 2015/16, for example, QAA received only 28 concerns about providers in England, followed up on only 21, investigated only seven of them, and only five resulted in the provider doing anything. I’m not sure a PDF and some round tables would improve on those scores.

An add on, “release valve” kind of thing – yes. A way of systematically and reliably monitoring providers? Don’t be silly.

But those readers around for a while (and those around now in the nations) will also be thinking about a different kind of contact between students, reps, students’ unions, universities and the Quality Assurance Agency.

The old QAA “Student Written Submission” into institutional review – now an annually established process in Wales – enabled (but never forced) SUs to submit a report on the university’s provision judged against the Quality Code. SUs were supported to develop their evidence gathering and student rep schemes, officers and staff were enabled to understand the Quality Code, and the threat of public criticism would gently and assertively nudge a university into fixing issues without the raising of those issues being seen as a threat.

Even providers without an SU were encouraged to form a group of student reps and support them to write a report.

That’s what we could have now. OfS could formally suggest that one of the things universities and students’ unions work together on to demonstrate a commitment to student engagement (and therefore compliance with the student engagement expectation) would be an annual Student Written Submission to the governing body – copied to OfS as part of the notifications process.

OfS could hold events training up SU officers and staff on what’s in the Quality Code, and we would proudly and periodically see the results of that kind of engaged, assertive partnership recorded in reports like this.

Just as “random sampling” was supposed to do, that’s a solution that would provide assurance about the effectiveness of ongoing monitoring approaches, act as an incentive to maintain compliance, and mean that the team in Nicholson House would gain a better understanding of sectoral practice – but unlike random sampling, would have students, student leadership and student engagement at the heart of the process as co-producers of their education.

Instead, OfS has a guide out on grassing up your vice chancellor to the regulator – with all of the signals on trust and partnership that that implies. It should not be surprised if few take it up on the offer.
