On the day that we found out that student perceptions of value for money had fallen to a record low – driven in part by falling contact hours, missing practical components, slower feedback and a lack of interaction (online or otherwise) with others – the boss of the body that was supposed to be keeping an eye on it all had some interesting things to say.
During her keynote to the Higher Education Policy Institute’s annual conference, Office for Students (OfS) CEO Nicola Dandridge said that the regulator had received over 400 “notifications” of issues related to students’ experiences since March 2020, and added that they were consistently “raising issues about teaching quality and assessment”.
She said that OfS “followed up on these” and that its interventions were “important in ensuring that students…received a better experience than they otherwise would have done”.
In other words, the message to students was – look, I know it’s been bad for you all. But imagine how much worse it would have been without us!
She said there were examples of universities “delivering significantly reduced course content to what was promised”, along with cases of “self-isolating students not being provided with the right equipment or resources or the support to enable them to continue studying”.
And she also said that students had not always been given clear information last year about what might happen if predictable restrictions meant delivery would have to move online.
The all seeing eye
We do have to do a bit of decoding and unpacking to work out what’s going on here. As we know, whenever anyone has raised these sorts of things with universities minister Michelle Donelan all year, they’ve been reassured that universities should maintain “quality and quantity”, that steps have been taken to ensure that “all students, regardless of their background, have the resources to study remotely”, and that OfS has been “taking very seriously” the potential impacts of the pandemic on teaching and learning and “actively monitoring providers” to ensure that they “maintain the quality” of their provision.
The problem is that OfS set itself up as an outcomes-based regulator. Beyond a baseline of “performance” on hard-to-directly-attribute outcomes (a baseline that has been in flux, in court and unenforceable for over a year), providers are pretty much free to do as they wish. So when a pandemic renders that sort of long-term assessment unhelpful and meaningless, you have an obvious problem. And I don’t mean one that has only become obvious in hindsight.
What you end up having to do is a few things. First you default to the bit of your framework that refers to compliance with guidance on consumer protection law, so you can stress how important it is that students aren’t misled or oversold, that promises to them are kept, and that if they can’t be, students can obtain appropriate redress. You also stress the things in your description of universal quality that relate to access, reinterpreting them for a Covid context.
Next you have to have a way of determining whether these things are actually happening, because with this stuff you can’t rely on dashboards of metrics collected after the education happens. You could randomly sample the providers you have on your register – but you’d already agreed with your sponsor government department to drop doing that when you were having a tough conversation about your budget.
So instead you have to puff up an obscure process called “notifications” – a kind of complaints process that isn’t a complaints process, consisting of people telling you about things that are happening on the ground.
And then eventually, you’re asked to appear at the annual conference of the Higher Education Policy Institute, at which you’re told that, en masse, the students that appear in the title above the door don’t believe that the things you put in place for the pandemic worked. So you go for “well – could have been worse”.
Notify me this
The thing about OfS’ notifications process is that it sounds OK, but we have absolutely no way of judging the effectiveness of the regime.
We don’t have any breakdown, by issue, of the 400 notifications. We don’t know how many providers that covered. We don’t know how many of the 400 were followed up. We can’t see which ones were and which ones were not, and why.
We don’t know about the supposed action beyond being assured it was taken. We don’t see any record of what action was taken, against which providers, or why. We don’t even know if the notifiers know, and we can guess their SU doesn’t know. We don’t know if other students know.
Where a notification was also a suitable issue for redress, we can’t see the examples so that both providers and students can draw lessons for the future. And we don’t know if those notifiers were told they could (and should) also formally complain.
Did any of them get partial refunds or compensation? If many had promises broken, and many didn’t give consent to the changes in the provision that ensued because they weren’t sufficiently informed, students were almost certainly entitled to refunds or compensation. But we don’t know.
We don’t know how many providers told OfS to take a running jump. We don’t know how many were borderline cases, and what in the end amounted to a breach of any of the OfS conditions and what wasn’t.
We don’t know how many providers are still being monitored or why. Because beyond formal conditions of registration, OfS never comments on the registration of individual providers.
We don’t know if any OIA casework is manifesting as “notifications” under the agreement that OfS and OIA have. We don’t know what sort of view OfS took on notions of “equivalence” that many providers have claimed, or on what OfS has regarded as “reasonable” or “unreasonable” during the pandemic.
We don’t know the make-up of the 400 notifiers. We don’t know who knows about this mysterious process and who doesn’t. It might be that only those with the sharpest elbows use levers like this, which would be a problem. We don’t even know if OfS knows.
We don’t know the character of the 400 notifications. What were they about? When did they come in? And do we think it’s OK that OfS only heard from 400 people when HEPI’s figures suggest tens of thousands feel the experience was worse than they were led to expect?
We have no case studies, no actual examples, no stats, no evidence or basically anything other than “our interventions were important”.
Oh you’ll learn
Look at almost every other regulator, and you regularly get formal reports of the actions it is taking and who it’s taking them over. Every week for our work with SUs I see the bodies that the Information Commissioner’s Office is upset with, or the charities that the Charity Commission is investigating.
It’s important – that transparency ensures that users and their representative bodies are able to draw learning from the interventions. Good regulation intervenes when it has to, and empowers others to raise issues in ways that are rooted in real life examples of their rights being upheld.
But as it stands, OfS is like some sort of protection racket / mafia boss, and as I say regularly, seems completely uninterested in empowering the students in its name in favour of a kind of regulatory swagger that always seems to be more about its own power.
Of course, this is all about England, and it’s hard to believe that the problems Dandridge describes haven’t also arisen in Scotland, Wales or NI. The powers are different, but we’ve barely heard a peep from HEFCW or the SFC. And when we asked Diane Dodds’ office for a copy of the communique she told the press she’d sent to providers about their responsibilities towards students, the NI government decided to treat it as an FOI request!
And anyway. If it’s the case that there were widespread problems with broken promises, over-promising and under-delivery against legal contractual obligations, has the UK-wide Competition and Markets Authority been told? What action is it taking? It is, after all, the actual consumer protection law enforcer.
But just to return to OfS. As I say, when it comes to the pandemic, we have no case studies, no actual examples, no stats, no evidence or basically anything other than “our interventions were important”.
I’m sorry if this sounds grumpy, but I just don’t think that’s good enough. It’s important that we get to evaluate the effectiveness of its regulatory approach. If nothing else, the Office for Students collected over £28m in registration fees from universities last year, and spent over £24m on wages.
Is it really too much to ask to get a bit more detail than “well, could have been worse”?