How do we know if OfS is any good?

OfS, inevitably, has thoughts. And data.

David Kernohan is Deputy Editor of Wonkhe

If you are running a regulator, you want to have evidence that the approach to regulation you are taking is the right one.

Your key performance measures will be carefully chosen to back up your strategic priorities – and a lot depends on the data that underpins them.

The Office for Students has eleven key performance measures, and we’ve just seen the release of data to underpin five of them – the remainder (1, 2, 3, 4, 7 and 9) will arrive later this spring. So how is OfS doing, does it reckon?

KPM 5: The proportion of students who are aware of the OfS

This curious measure sits under OfS’ target to be collaborative, and it turns out that just 21 per cent of undergraduate students (and 39 per cent of postgraduate students) know that OfS is a thing. The eventual intention is to measure the proportion of students reporting trust and confidence, but on those awareness numbers that would currently look pretty embarrassing.

What is odd is that we get this data from a much wider Savanta student pulse survey – something that is going to build into a resource of huge regulatory use. Each wave covers around 1,300 students, with the sample made representative using quotas based on age, ethnicity, and sex at undergraduate and postgraduate level; the total sample is then made representative between full and part time students, and between students with the free school meals marker and those without.

From the most recent wave of the wider survey we learn that 80 per cent of the sample agreed that the higher education experience they are receiving is the one they were promised. We don’t get splits (or even a full list of questions) in what OfS is releasing, so if it wants to develop this into an official statistic there’s a bit of work to do there.

From what we do get, the most startling finding is that more than a third of current students did not feel they had enough money to cover regular living expenses, and more than four in ten did not have enough money to participate fully in their course. I was particularly struck by this because these are not issues that OfS really has the power to do much about – it is a curious choice of question for a survey meant to inform regulatory activity, and short of feeding the findings up the line to the minister it is difficult to imagine what use it will serve. We are hardly short of evidence that students are struggling financially.

More germane to conditions of registration are student assessments of the support they are offered: 84 per cent felt that academic support had been good, just under three quarters felt that personal and wellbeing support had been good (strong growth wave on wave), and 72 per cent were happy with “administrative support”.

KPM 6: Attendance at OfS events, number of non-regulatory visits

Again under collaboration, the existence of this indicator is testament to the desire of OfS staff to improve communication with the sector, something that has also brought about the provider panel. Attendance at OfS events is at record levels, but as we don’t get a count of the number of events in each quarter (and these are only large OfS-owned events, so there is no measure of the valuable work the regulator does in getting to sector professional conferences) it is difficult to read success into this.

Non-regulatory visits are another such measure – it is nice to see OfS staff getting out to meet staff and students, but surely the number of these depends as much on staff availability and provider capacity to host as on the overall quality of sector communications.

KPM 8: Timeliness of key processes

OfS expects to carry out a registration assessment – right through from application to decision – in an average of 175 days, with 436 days as the top end limit. We are told that these have “remained constant” between 1 April 2023 and the enforced break, but there is no time series. Things are only marginally better for DAPs assessments – nine were completed in 2024 compared with two in 2023, but this is against a backdrop of more assessments being conducted, and in 2024 seven decisions were made more than two weeks late.

There’s more detail on the access and participation plan assessments, but the news is less good – the most recent (June to September 2025) tranche of submissions had a mean resolution time of 138 days, although things were easier for variations (just 30 days). You’d really need to know more about the workload the team were facing during that period to make a judgement about organisational performance – and I can’t help but think the data produced on these kinds of measures under the previous KPM regime was more detailed and more useful.

KPM 10: Engagement with non-regulatory OfS materials, and KPM 11: Media citations of OfS analysis and insight

In content production or marketing terms, these are measurements of reach and impact. KPM 10 is basically web hits on non-regulatory pages, and serves mainly to demonstrate how good the OfS comms team is at publishing compelling content. KPM 11 is one for the press and public affairs team, and counts the number of times OfS work (as against, I assume, mentions of OfS as a body that probably needs to do some regulating) is mentioned in the media.

These are, to be clear, reasonable measures of comms and public affairs performance that could be useful in steering work in those areas (although KPM 4 – the proportion of accountable officers reporting trust and confidence in OfS communications – is the one I’m really waiting for). But quite how these rose to the level of published organisational KPMs is not clear to me.

For an avowedly responsive organisation like OfS, survey responses (students, accountable officers) should really constitute key measures on communications – volume is no substitute for quality here. And elsewhere I would be looking at student satisfaction via NSS, continuation and completion rates, perhaps even employer satisfaction. Perhaps it would even be worth tracking OfS staff morale – something that we know has been a problem in the past.

Generally KPMs need to be strategic, and I don’t get the sense that what is currently on offer really challenges organisational leaders on strategic progress.
