How bureaucratic is the Office for Students?
I don’t know what everyone else’s favourite OfS Key Performance Measure (KPM) is but mine is absolutely KPM26 – both for the data made available and the insight into the regulator’s corporate personality it offers.
Last year, it was my sacred duty to inform you that OfS had written near enough the Lord of the Rings in regulation: an epic 420,721 words. This year brings glad tidings: the previous record has been smashed, with 2020-21 yielding a remarkable 596,852 words of regulation – getting Frodo There and Back Again, and leaving plenty of room for all six appendices including “The Languages and Peoples of the Third Age”.
It has clearly been a busy (arguably exceptionally so) year at Nicholson House, but even with this triumph our regulator is underselling itself. The totals cited above do not include:
- Press releases, news items or blog posts that highlight and refer to regulatory documents that are already included in the measure;
- Summaries of regulatory documents that are already included in the measure;
- Public communications about the establishment and results of pilot studies of regulatory processes;
- Publications about OfS strategy, plans, finances or performance;
- Public research, evaluation, and data analysis reports (except where these also announce changes in regulation);
- Public guidance to providers on effective practice that does not constitute a regulatory requirement;
- Equality impact assessments of regulatory requirements;
- Regulatory documents specifically for providers outside England (as is the case with some NSS publications) or non-registered providers (as is the case with some Prevent publications); and
- Documents about our regulation that are aimed at students and other stakeholders, rather than providers.
I’d wager registered providers very much need to be on top of the detail of most of these documents too (and, wonderfully, the total does not include the many words on methodology and practice within the spreadsheets – some of the OfS’ very best work, and the source of the otherwise unpublished list above). It is – and please take this from a dedicated reader of higher education regulatory documentation – a lot of stuff.
What else do we learn?
Providers still have to submit the same number of data returns as they did last year – a maximum of 16 items and a minimum of 4, depending on registration. The removal of last year’s interim financial return drops the maximum number of returns an Approved provider makes by one – and also prompts the question of whether the multiple return dates through Data Futures will count separately.
The average number of Enhanced Monitoring conditions placed on a registered provider has dropped to one (from 1.2 last year). In real terms this is eighty fewer conditions – and no, we don’t know which providers have them.
Only 4.8 per cent of registered (Approved (fee cap)) providers had to agree a new access and participation plan this year – this is because they all had to agree five-year plans last year.
The cost of paying the sector’s required subscriptions to the OfS and two mandatory subscriptions to the designated bodies was £37,325,519, or £19.98 per student. We don’t know how this compares to last year, but we do know (from the annual report) that £26.3m of this was to the Office for Students, which is a lot – but less than OfS spent on administration.
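For the curious, the quoted figures can be sanity-checked in a few lines of Python. This is a back-of-envelope sketch, assuming the per-student figure is simply the sector total divided by a student headcount – the OfS does not publish the divisor it used:

```python
# Back-of-envelope check of the subscription figures quoted above.
# Assumption (not stated by OfS): per-student cost = sector total / headcount,
# so we can recover the implied headcount and the OfS-only per-student share.

total_cost = 37_325_519   # sector subscriptions to OfS + designated bodies (£)
per_student = 19.98       # quoted cost per student (£)
ofs_share = 26_300_000    # portion paid to the Office for Students (£)

implied_students = total_cost / per_student
ofs_per_student = ofs_share / implied_students

print(f"Implied student headcount: {implied_students:,.0f}")
print(f"OfS share per student: £{ofs_per_student:.2f}")
```

On those assumptions the £19.98 figure implies a headcount of roughly 1.87 million students, with around £14 of each student’s share going to the OfS itself.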
One response to “How bureaucratic is the Office for Students?”
I also found this results publication particularly irksome – how can a body that purports to be able to tell so much about quality from the data it reviews about providers, publish such dross about itself?
It’s all a bit navel-gazing too.
“We consider our documents to be readable – using the Flesch Reading Ease formula” – move over Deep Heat.
Not a single qualitative element, or anything involving the regulated in the mix – does this suggest that the OfS will be able to replace the NSS with a mechanism that similarly avoids any subjective student input?