Sometimes I really don’t understand the Office for Students.
In the teeth of an oncoming financial squeeze, with a new government looking for efficiency savings and with a documented interest in getting rid of arm's length bodies, the performance of our regulators is under intense scrutiny.
Problem state
The National Audit Office and the Public Accounts Committee have already laid into the state of OfS performance measures this year, noting that:
Out of 26 indicators, eight are still in development or have incomplete performance information, and a further 11 indicators do not yet have associated targets
and, importantly:
The OfS does not routinely ask providers and sector stakeholders for feedback on its own performance as a regulator
There’s no strong measure on value for money because the OfS doesn’t define it – the whole thing is such a mess that the NAO recommended that DfE should:
review, improve and agree with the OfS the key performance measures and other indicators it uses to hold the OfS to account, to include measures of the impact of the regulatory regime, rather than measures outside the OfS’s control
Solutions focus
So today sees the launch of a new set of OfS KPMs. Obviously these take all of those recommendations into account, and the new KPMs are presented as a complete set of easy-to-understand regulatory information that would allow Kit Malthouse to glance over a dashboard and click away satisfied.
Alas, no.
Of eleven new KPMs, seven are not present – with due dates ranging from October to “later this academic year”. A whole section on operational measures is due later this month.
And this isn’t just missing data – we don’t even get a sense of what will be presented outside of broad categories like “access to higher education” or “extent of student outcomes”. Is there anything on provider feedback (the old, never used, KPM21)? There is not.
Compare and contrast
As it happens, I had a list of the old KPMs from January (as usually happens, OfS has blitzed all documentation on the old approach). Then there were 26 KPMs – of which eight were listed as being in development and three had defined targets linked to them. Just six have survived the cull as things stand:
- Scary-sounding new KPM1 on poor quality provision looks similar to the old and never used KPM13, though we don’t have the detail on that yet.
- Gaps between degree outcomes by ethnicity (old KPM4 – though tweaked to look at just firsts rather than firsts and 2:1s)
- Students responding positively on the NSS (old KPM10, though now a basket of scales – teaching on my course, assessment and feedback, learning resources, academic support, and student voice for new KPM4; just the first three for KPM9B)
- The proportion of full-time undergraduate students achieving first class degrees (old KPM18)
- That strange student survey on value for money (old KPM19)
- And the efficient regulation measures – number of data collections, enhanced monitoring, fee levels – shift from old KPM26 to new KPM11.
More will clearly move across in some form as decisions are made at OfS, but it is odd not to make this explicit at this stage.
Those KPMs in full
Here are the new ones:
Category | KPM | Title | Detail | Target | Status | Old KPM |
---|---|---|---|---|---|---|
Quality and standards | KPM1 | Extent of poor student outcomes | | No | Due October 2022 | KPM13 |
Quality and standards | KPM2 | Student outcomes for all registered providers | | No | Due October 2022 | |
Quality and standards | KPM3 | Assessment and awards | Proportion of full-time undergraduate students achieving first class degrees | No | | KPM18 |
Quality and standards | KPM4 | Students' views on aspects of quality | Percentage of undergraduate students responding positively to National Student Survey (NSS) questions about aspects of quality | No | | KPM10 |
Equality of opportunity | KPM5 | Access to higher education | | No | Due October 2022 | |
Equality of opportunity | KPM6 | Success and progression | | No | Due later in 2022-23 | |
Equality of opportunity | KPM7 | Degree attainment by ethnicity | Difference between proportion of students within ethnic groups achieving first class degrees and the overall proportion for all students | No | | KPM4 |
Equality of opportunity | KPM8 | Student choice | | No | Due October 2022 | |
Enabling regulation | KPM9 | Value for money | 9A: Percentage of undergraduate students who say that university offers good value for money | No | | KPM19 |
Enabling regulation | KPM9 | Value for money | 9B: Percentage of undergraduate students responding positively to National Student Survey (NSS) questions about aspects of quality | No | | KPM10 |
Enabling regulation | KPM9 | Value for money | 9C-9E: Proportion of students at providers with student outcomes indicators above our numerical thresholds | No | Due October 2022 | |
Enabling regulation | KPM10 | Student protection | | No | Due later in 2022-23 | |
Enabling regulation | KPM11 | Efficient regulation | 11A: Minimum and maximum number of OfS data and information returns for providers | No | | KPM26 |
Enabling regulation | KPM11 | Efficient regulation | 11B: Average number of OfS conditions of registration subject to enhanced monitoring per registered provider | No | | KPM26 |
Enabling regulation | KPM11 | Efficient regulation | 11C: Average amount of regulatory fees paid by providers per student | No | | KPM26 |
Operational measures | OM | Core regulatory activity | | No | Due September 2022 | |
And here are the old ones:
Category | KPM | Title | Target | Status (Jan 2022) |
---|---|---|---|---|
Participation | KPM1 | Gap in participation between most and least represented groups | No | |
Participation | KPM2 | Gap in participation at higher-tariff providers between the most and least represented groups | Yes | |
Participation | KPM3 | Gap in non-continuation between most and least represented groups | Yes | |
Participation | KPM4 | Gap in degree outcomes (1sts or 2:1s) between white students and black students | Yes | |
Participation | KPM5 | Gap in degree outcomes (1sts or 2:1s) between disabled students and non-disabled students | Yes | |
Participation | KPM6 | The proportion of access and participation plans that contain robust evaluation methods, focused on impact and leading to improved practice | No | In development |
Participation | KPM7 | Comparison of outcomes achieved through access to money spent on access | No | |
Experience | KPM8 | Diversity of provider choice within subject | No | |
Experience | KPM9 | Diversity of subject choice by region of domicile | No | |
Experience | KPM10 | Students responding positively to the NSS question on overall satisfaction | No | |
Experience | KPM11 | Postgraduate measure of student satisfaction | No | In development |
Experience | KPM12 | The extent to which providers effectively demonstrate the learning gain of their students | No | In development |
Experience | KPM13 | The extent and impact of poor learning and teaching | No | In development |
Experience | KPM14 | The impact on students of course, campus or provider closure | No | |
Outcomes | KPM15 | Graduates in highly skilled or professional roles | No | In development |
Outcomes | KPM16 | Employers think that graduates are equipped with the right skills and knowledge | No | |
Outcomes | KPM17 | Graduate wellbeing | No | |
Outcomes | KPM18 | Students achieving 1sts | No | |
Value for money | KPM19 | Students who believe university provides good value for money | No | |
Efficiency and effectiveness | KPM20 | Key Performance Targets met | No | In development |
Efficiency and effectiveness | KPM21 | External survey of perceptions of the OfS | No | In development |
Efficiency and effectiveness | KPM22 | The extent to which staff understand and feel they contribute towards the OfS’s strategic objectives | No | |
Efficiency and effectiveness | KPM23 | OfS staff survey results on staff satisfaction | No | |
Efficiency and effectiveness | KPM24 | Ratio of performance against targets to annual spend | No | In development |
Efficiency and effectiveness | KPM25 | The number of internal support function standards that are met | Yes | |
Efficiency and effectiveness | KPM26 | Regulatory burden | No |
It’s clear that many of the changes reflect the new OfS strategy.
What to make of it
Quis regit regulatores, as a former Prime Minister might have put it. There are standards for regulation in public life – as well as scrutiny from the NAO and PAC, we have a Regulators’ Code, which is managed for some reason by the Office for Product Safety and Standards.
There is also a UK Regulators’ Network – a membership organisation that counts most of the big names among its members. It works as a self-policing uber-regulator: benchmarking, setting and maintaining standards, and using scorecards for self-assessment. Notably, the Office for Students is not a member.
But even outside of all this, there is a sense that a regulator should be accountable to those it regulates and those on whose behalf it regulates, to the taxpayer, to sponsors, and to government officials. I was hoping this release was the OfS finally getting its house in order. I was mistaken.
The OfS is in the habit of holding the sector to account via the analysis of data. But the increasing vigour of investigations and tough talk does not simply represent a decline in sector standards. The quality rules were set at the regulator’s establishment in 2018 – so investigations and concerns now represent a failure of regulation as much as they do deficiencies in the sector.
On that point: between the new model for quality assurance that HEFCE introduced in 2016 (never fully implemented in a number of important ways) and OfS’s Regulatory Framework from 2018, it is now six years since England last had a round of quality audits of existing providers (as previously carried out under contract by QAA). Essentially, England has missed a full round of what gets termed “cyclical audit” – a process that is still operating, and fit for purpose, in the other UK nations. It could be that we’ve lost something of value.
In 2018, Nicola Dandridge wrote a feature for Wonkhe (https://wonkhe.com/blogs/how-ofs-will-measure-its-own-performance/) that concluded as follows:
“I hope these key performance measures will focus these debates on what really matters, and enable students, citizens, and the sector to judge us by our record.”
A comment made at the time has stood the test of time:
“Click the link provided and then pinch yourself: OfS has no fewer than 26 KPIs – on average more than five for each one of its strategic aims. With so many metrics, is this a pious dream?”
We now know the answer.