Notifications might (eventually) lead to regulation

What happens when you notify OfS of a concern? As David Kernohan finds out, not very much – for quite a long while.

David Kernohan is Deputy Editor of Wonkhe

We’re all in favour of regulatory transparency.

Yesterday’s release of four sets of “operational measures” from the Office for Students should – therefore – be a cause for celebration. It’s a welcome extension to the kind of thing we used to grab from the Chief Executive’s update in the board papers – and the first step in the revamp of key performance measures recommended by the National Audit Office.

At last – we get details on how long it takes for everything from reportable events to new degree awarding powers to get resolved – and how many instances of each the regulator deals with. Good, sensible news.

And yet somehow OfS has contrived to shoot itself in the foot again.

Performance management

There’s nothing the matter – per se – with the data on the number of reportable events, notifications, registration applications, and DAPs applications. Though, if forced to quibble, I’d ideally have liked to see a longer time series on reportable events (I know the definition changed, but not really by that much).

And it’s great to see the contextual data on reportable events (we learn, for instance, that just one report per quarter was on quality and standards!) – I’d have loved to see that detail for notifications too, but I’m sure that will emerge in time.

Think of a number

My issue is with the calculation of the maximum resolution time for reportable events and notifications. OfS calculates a median resolution period, and then – entirely reasonably – uses a value three standard deviations above the mean to offer a “maximum response time”. But it arbitrarily sets the period for each calculation based on “breaks in the pattern” of results.
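For the statistically minded, here’s a minimal sketch of that calculation in Python – the durations are invented for illustration, not drawn from the OfS dashboard:

```python
import numpy as np

# Invented resolution times (in days) for one reporting period -
# purely illustrative, not OfS's published figures
resolution_days = np.array([12, 30, 45, 60, 90, 120, 200, 410])

# The headline figure: the median resolution period
median_resolution = np.median(resolution_days)

# The "maximum response time": three standard deviations above the mean
# (assuming a population standard deviation; OfS doesn't say which it uses)
max_response = resolution_days.mean() + 3 * resolution_days.std()

print(f"median: {median_resolution:.1f} days")
print(f"'maximum' response time: {max_response:.1f} days")
```

Note that this “maximum” is a statistical construct – with a skewed distribution it can comfortably exceed every actual case, or fall short of the worst one.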

So we see that notifications, for example, had a maximum response time of 410 days (yes, that is more than a year!) for the period between the start of 2021 and 27 April 2021. The value then fell to 310.1 days (between 28 April and 4 August 2021), 228.9 days (between 6 August 2021 and 19 January 2022), and finally 95.6 days (21 January to 25 April, where published records end).

The maximum response time here – just to be clear – isn’t the notification that waited the longest for resolution in that period. That, for completeness, is a notification that went in on 12 February 2021 and saw a decision on 3 July 2022 – some 506 days later. The figure is simply three standard deviations above the mean, as set out by OfS. And “response” doesn’t mean the issue is sorted – it means that OfS has made a decision on whether or not to do any regulating on that issue.

Fellow data nerds will have spotted that OfS could use these arbitrary break points to construct a time series in which the prevailing maximum time can be seen to be falling. Other ways of splitting the data would give other results – means and standard deviations vary entertainingly depending on what you feed into them.

As an example, I’ve used calendar months as my unit of analysis. Here, for February 2021 the calculated maximum response time is a jaw-dropping 599.9 days.
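A rough sketch of that recalculation – again with invented dates and durations rather than the published records – shows how the same mean-plus-three-standard-deviations figure moves when you regroup by calendar month, and how it compares with the genuine longest wait:

```python
import pandas as pd

# Hypothetical notifications log - the dates and durations are made up
df = pd.DataFrame({
    "received": pd.to_datetime([
        "2021-02-03", "2021-02-12", "2021-02-20",
        "2021-03-01", "2021-03-15", "2021-03-20",
    ]),
    "days_to_decision": [150, 506, 90, 120, 75, 60],
})

# Group by calendar month and compute both the OfS-style "maximum"
# and the simple longest wait for each month
monthly = df.groupby(df["received"].dt.to_period("M"))["days_to_decision"]
summary = monthly.agg(
    ofs_style_max=lambda s: s.mean() + 3 * s.std(ddof=0),  # mean + 3 SD
    longest_wait="max",                                     # worst actual case
)
print(summary)
```

Pick different windows and both columns move – but the longest wait at least means something concrete, a point I’ll come back to below.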

As OfS says:

It is important that we consider the information in incoming notifications quickly so that we have up-to-date risk assessments. If our approach for notifications is efficient, then we would expect short resolution times and few unresolved cases.

Indeed.

Lockdown hurricane

We were in lockdown in February 2021, with some students slowly returning to a “blend” of face-to-face and online teaching. Michelle Donelan had written one of her famous “letters to students” to explain the return to campus, and the conditions under which this would happen. Had students not received the return that the minister promised (perhaps their course was in a subject where the rules were not clear), they would have had recourse to alert OfS – who were “monitoring” the situation.

And nearly two years later, the Office for Students may find time to decide whether it was a regulatory issue or not. Before it begins to take action.

I know that this is a silly distortion of data, but the way OfS has presented this analysis is also a silly distortion of data. We should let performance speak for itself – any attempt to shape presentation to tell a particular story undermines the very transparency and trust that performance measures like this are supposed to engender.

And it is pointless – if you just took the longest wait in each month as a maximum, the trend is very clearly (and convincingly) downward. OfS is getting better (or, at least, quicker) at dealing with notifications.

Both the National Audit Office and the Public Accounts Committee have called – as we’ve been over on the site before – for the Office for Students to get its act together as a regulator. The sector needs a strong, independent regulator to take the serious business of research and higher education out of an increasingly toxic and childish political environment. This kind of stuff doesn’t help anyone.

2 responses to “Notifications might (eventually) lead to regulation”

  1. The other element that is damaging here is that if they cannot even publish their own shortcomings without choosing to distort the figures, why would any of us trust them not to do the same with the outcome of a visit to our institution (for example)?

    Such silliness erodes trust even further (not that I am sure that there was much to begin with).

    1. That’s it, in a nutshell, isn’t it? Providers are – reasonably and rightly – supposed to be transparent, open and honest, admit their mistakes, and rectify them (see Student Protection Plans, for one). That’s how it should be. The OfS, on the other hand, is pretty much unique among public bodies for its obfuscation and self-aggrandisement. It would be great to see them, just once, admit they got something wrong or weren’t doing well enough…
