We’re still waiting for proper OfS performance measures

A new arrangement of Office for Students performance measures doesn't convince David Kernohan that England's regulator is engaging with criticism

David Kernohan is Acting Editor of Wonkhe

Sometimes I really don’t understand the Office for Students.

In the teeth of an oncoming financial squeeze, with a new government looking for efficiency savings and with a documented interest in getting rid of arm's-length bodies, the performance of our regulators is under intense scrutiny.

Problem state

The National Audit Office and the Public Accounts Committee have already laid into the state of OfS performance measures this year, noting that:

Out of 26 indicators, eight are still in development or have incomplete performance information, and a further 11 indicators do not yet have associated targets

and, importantly:

The OfS does not routinely ask providers and sector stakeholders for feedback on its own performance as a regulator

There’s no strong measure on value for money because they don’t define it – the whole thing is such a mess that the NAO recommended that DfE should:

review, improve and agree with the OfS the key performance measures and other indicators it uses to hold the OfS to account, to include measures of the impact of the regulatory regime, rather than measures outside the OfS’s control

Solutions focus

So today sees the launch of a new set of OfS KPMs. Obviously these take all of the recommendations into account, and are presented as a complete set of easy-to-understand regulatory information that would allow Kit Malthouse to glance over a dashboard and click away satisfied.

Alas, no.

Of eleven new KPMs, seven are not present – with due dates ranging from October to “later this academic year”. A whole section on operational measures is due later this month.

And this isn’t just missing data – we don’t even get a sense of what will be presented outside of broad categories like “access to higher education” or “extent of student outcomes”. Is there anything on provider feedback (the old, never used, KPM21)? There is not.

Compare and contrast

As it happens, I had a list of the old KPMs from January (as usually happens, OfS has blitzed all documentation on the old approach). Then there were 26 KPMs – of which eight were listed as being in development and five had defined targets linked to them. Just six have survived the cull as things stand:

  • Scary-sounding new KPM1 on poor quality provision looks similar to the old and never-used KPM13, though we don’t have the detail on that yet.
  • Gaps between degree outcomes by ethnicity (old KPM4 – though tweaked to look at just firsts rather than firsts and 2:1s)
  • Students responding positively on NSS (Old KPM10, though now a basket of scales – teaching on my course, assessment and feedback, learning resources, academic support, and student voice for new KPM4, just the first three for KPM9B)
  • The proportion of full-time undergraduate students achieving first class degrees (old KPM18)
  • That strange student survey on value for money (old KPM19)
  • And the efficient regulation measures – number of data collections, enhanced monitoring, fee levels – shift from old KPM26 to new KPM11.

Clearly more will move across in some form as decisions are made at OfS, but it is an odd decision not to make this clear at this stage.

Those KPMs in full

Here are the new ones:

| Category | KPM | Title | Detail | Target | Status | Old KPM |
| --- | --- | --- | --- | --- | --- | --- |
| Quality and standards | KPM1 | Extent of poor student outcomes | | No | Due October 2022 | KPM13 |
| Quality and standards | KPM2 | Student outcomes for all registered providers | | No | Due October 2022 | |
| Quality and standards | KPM3 | Assessment and awards | Proportion of full-time undergraduate students achieving first class degrees | No | | KPM18 |
| Quality and standards | KPM4 | Students' views on aspects of quality | Percentage of undergraduate students responding positively to National Student Survey (NSS) questions about aspects of quality | No | | KPM10 |
| Equality of opportunity | KPM5 | Access to higher education | | No | Due October 2022 | |
| Equality of opportunity | KPM6 | Success and progression | | No | Due later in 2022-23 | |
| Equality of opportunity | KPM7 | Degree attainment by ethnicity | Difference between proportion of students within ethnic groups achieving first class degrees and the overall proportion for all students | No | | KPM4 |
| Equality of opportunity | KPM8 | Student choice | | No | Due October 2022 | |
| Enabling regulation | KPM9 | Value for money | 9A: Percentage of undergraduate students who say that university offers good value for money | No | | KPM19 |
| Enabling regulation | KPM9 | Value for money | 9B: Percentage of undergraduate students responding positively to National Student Survey (NSS) questions about aspects of quality | No | | KPM10 |
| Enabling regulation | KPM9 | Value for money | 9C-9E: Proportion of students at providers with student outcomes indicators above our numerical thresholds | No | Due October 2022 | |
| Enabling regulation | KPM10 | Student protection | | No | Due later in 2022-23 | |
| Enabling regulation | KPM11 | Efficient regulation | 11A: Minimum and maximum number of OfS data and information returns for providers | No | | KPM26 |
| Enabling regulation | KPM11 | Efficient regulation | 11B: Average number of OfS conditions of registration subject to enhanced monitoring per registered provider | No | | KPM26 |
| Enabling regulation | KPM11 | Efficient regulation | 11C: Average amount of regulatory fees paid by providers per student | No | | KPM26 |
| Operational measures | OM | Core regulatory activity | | No | Due September 2022 | |

And here are the old ones:

| Category | KPM | Title | Target | Status Jan 2022 |
| --- | --- | --- | --- | --- |
| Participation | KPM1 | Gap in participation between most and least represented groups | No | |
| Participation | KPM2 | Gap in participation at higher-tariff providers between the most and least represented groups | Yes | |
| Participation | KPM3 | Gap in non-continuation between most and least represented groups | Yes | |
| Participation | KPM4 | Gap in degree outcomes (1sts or 2:1s) between white students and black students | Yes | |
| Participation | KPM5 | Gap in degree outcomes (1sts or 2:1s) between disabled students and non-disabled students | Yes | |
| Participation | KPM6 | The proportion of access and participation plans that contain robust evaluation methods, focused on impact and leading to improved practice | No | In development |
| Participation | KPM7 | Comparison of outcomes achieved through access to money spent on access | No | |
| Experience | KPM8 | Diversity of provider choice within subject | No | |
| Experience | KPM9 | Diversity of subject choice by region of domicile | No | |
| Experience | KPM10 | Students responding positively to the NSS question on overall satisfaction | No | |
| Experience | KPM11 | Postgraduate measure of student satisfaction | No | In development |
| Experience | KPM12 | The extent to which providers effectively demonstrate the learning gain of their students | No | In development |
| Experience | KPM13 | The extent and impact of poor learning and teaching | No | In development |
| Experience | KPM14 | The impact on students of course, campus or provider closure | No | |
| Outcomes | KPM15 | Graduates in highly skilled or professional roles | No | In development |
| Outcomes | KPM16 | Employers think that graduates are equipped with the right skills and knowledge | No | |
| Outcomes | KPM17 | Graduate wellbeing | No | |
| Outcomes | KPM18 | Students achieving 1sts | No | |
| Value for money | KPM19 | Students who believe university provides good value for money | No | |
| Efficiency and effectiveness | KPM20 | Key Performance Targets met | No | In development |
| Efficiency and effectiveness | KPM21 | External survey of perceptions of the OfS | No | In development |
| Efficiency and effectiveness | KPM22 | The extent to which staff understand and feel they contribute towards the OfS’s strategic objectives | No | |
| Efficiency and effectiveness | KPM23 | OfS staff survey results on staff satisfaction | No | |
| Efficiency and effectiveness | KPM24 | Ratio of performance against targets to annual spend | No | In development |
| Efficiency and effectiveness | KPM25 | The number of internal support function standards that are met | Yes | |
| Efficiency and effectiveness | KPM26 | Regulatory burden | No | |

It’s clear that many of these changes reflect the new OfS strategy.

What to make of it

Quis regit regulatores, as a former Prime Minister might have put it. There are standards for regulation in public life – as well as scrutiny from the NAO and the PAC we have a Regulators’ Code, which is managed for some reason by the Office for Product Safety and Standards.

There is also a UK regulators’ network – a membership organisation with most of the big names as members. This works as a self-policing uber-regulator, benchmarking, setting and maintaining standards, and using scorecards for self-assessment. Notably, the Office for Students is not a member.

But even outside of all this, there is a sense that a regulator should be accountable to those it regulates and those it regulates on behalf of, to the taxpayer, to sponsors and to government officials. I was hoping this release was the OfS finally getting its house in order. I was mistaken.

The OfS has a habit of holding the sector to account via the analysis of data. But the increasing vigour of its investigations and tough talk does not simply represent a decline in sector standards. The quality rules were set at the regulator’s establishment in 2018 – investigations and concerns now represent a failure of regulation as much as they do deficiencies in the sector.

2 responses to “We’re still waiting for proper OfS performance measures”

  1. Interesting point about the current investigations representing failures of regulation as much as they do deficiencies of providers. Between the new model for QA that HEFCE introduced in 2016 (not fully implemented in a number of important ways), and OfS’s Regulatory Framework from 2018, it’s now six years since we last had a round of quality audits of existing providers in England (as done previously under contract by QAA). Essentially in England we’ve missed a full round of what gets termed ‘cyclical audit’, a process that is still operating and fit for purpose in the other UK nations. Could be that we’ve lost something of value.

  2. In 2018, Nicola Dandridge wrote a feature for Wonkhe that concluded as follows:

    “I hope these key performance measures will focus these debates on what really matters, and enable students, citizens, and the sector to judge us by our record.”

    https://wonkhe.com/blogs/how-ofs-will-measure-its-own-performance/

    The comment made then has stood the test of time:

    ‘Click the link provided and then pinch yourself: OfS has no fewer than 26 KPIs – on average more than five for each one of its strategic aims.

    With so many metrics, is this a pious dream?’

    We now know the answer.
