The rumour goes that Michelle Donelan was flabbergasted that regulating universities didn’t actually mean visiting them.
As concerns about “poor quality online provision” swirled in May and June, her questions to the Office for Students assumed a deep, observation-based knowledge of practice within providers. Of course, we no longer visit universities in England to monitor them, other than in very specific and concerning cases.
It was with this apocryphal story in mind that I read today’s DfE publication on “Reducing bureaucratic burdens on research, innovation, and higher education” – complete with a foreword written by three separate ministers with distinct and overlapping responsibilities. Based on the contents of this briefing, I don’t think the number of university regulatory visits in England will be going up any time soon.
The approach involves both:
Outlining where we intend to remove or reduce reporting requirements and address unnecessary bureaucratic processes immediately”
And, longer term:
Setting out areas that OfS, UKRI and NIHR plan to review over the next few months with a view to reducing reporting requirements and administrative burdens as much as possible”
The big news is on the National Student Survey. The rich history of the tool has been increasingly coloured by the perception that it drives providers to – well, behave according to market forces. We’ve all read and heard about gameplaying and incentives, and the wider concern that “giving students what they want” is akin to dumbing down. The document is particularly brutal on this latter point:
Since its inception in 2005, the NSS has exerted a downwards pressure on standards within our higher education system, and there have been consistent calls for it to be reformed. There is valid concern from some in the sector that good scores can more easily be achieved through dumbing down and spoon-feeding students, rather than pursuing high standards and embedding the subject knowledge and intellectual skills needed to succeed in the modern workplace.”
There is an admission that the use of the NSS in rankings and league tables is a downside that is beginning to outweigh the upside of information for providers and regulators. It seems the four applicants who used “Discover Uni” to make their choice of course and provider may have been tempted to choose a higher education experience that is “easy and entertaining, rather than robust and rigorous”.
Yes, but does it correlate?
In a clear hint as to what is to come, DfE note that the NSS results do not correlate with “other, more robust measures” such as drop-out rates and progressing to skilled employment.
That much is true (I’ve plotted this at pseudo-course level using Unistats data, the best available):
However, it is also the case that the more robust measures don’t correlate with each other either – indeed, it looks rather worse to my eye.
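For anyone wanting to reproduce this kind of check, the comparison boils down to pairwise correlation coefficients across course-level measures. Here is a minimal sketch, with entirely made-up illustrative figures rather than real Unistats data (the variable names and numbers are my assumptions, not anything published by DfE):

```python
# Sketch of the pairwise correlation check described above.
# The figures below are invented for illustration only - they are
# NOT real Unistats/NSS data.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical pseudo-course level figures (percentages):
nss_satisfaction = [88, 92, 75, 81, 90, 68, 84]
continuation     = [94, 90, 88, 96, 91, 85, 93]
skilled_work     = [72, 60, 55, 80, 65, 50, 74]

# The DfE argument rests on the first two coefficients being weak;
# the point above is that the third can be weak too.
print(round(pearson(nss_satisfaction, continuation), 2))
print(round(pearson(nss_satisfaction, skilled_work), 2))
print(round(pearson(continuation, skilled_work), 2))
```

On real data you would run the same function over every pair of the “more robust” measures as well, which is where the second observation comes from.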
Still, I’m sure that DfE have their reasons. In the meantime let’s hope we don’t lose the Unistats dataset in all this chaos.
All of these ideas will feature in a wide-ranging OfS review of the NSS, which will aim to:
- reduce the bureaucratic burden it places on providers
- ensure it does not drive the lowering of standards or grade inflation
- provide reliable data on the student perspective at an appropriate level, without depending on a universal annual sample
- examine the extent to which data from the NSS should be made public
- ensure the OfS has the data it needs to regulate quality effectively
- ensure the NSS will stand the test of time and can be adapted and refined periodically to prevent gaming
All this will be done by the end of 2020.
It was the future, once
The document also takes aim at huge swathes of the regulatory framework.
Notable by its absence is the Teaching Excellence and Student Outcomes Framework (TEF). The TEF is rumoured to have fallen from ministerial favour in recent times, and we are still waiting for the publication of the statutory Pearce Review (now more than a year late). With the NSS effectively out of the picture, and the “more robust” highly skilled employment and continuation metrics praised, we basically have a picture of a possible TEF that focuses only on outcomes. Given that these measures correlate closely with the schooling and background of graduates, it is perhaps ironic that TEF may become TOF (the Teaching Outcomes Framework).
In practical terms, the OfS’s widespread use of “enhanced monitoring” – all those regular updates the registration team likes to brag to the board about having requested (from at least 261 providers) – will end, with the minor concerns that underpin them either ignored or escalated to the (publishable) “specific condition” level. Both providers and the regulator have been keen to keep these issues out of the public eye, but without “enhanced monitoring” those scary “formal communications” (sent to at least 306 providers!) will need to get a whole lot scarier – and it is instructive to note that of the 464 EMs in place as of October 2019, 77 referred to access and participation, and 77 to student outcomes. The OfS will report on progress after three months.
There have also been concerns about Data Futures, a phrase that I feel like I’ve typed every month for the past three years. The original idea was termly data collection points to power Michael Barber’s dashboard (I gloss over the actual HEDIIP backstory here), but this is now under review. The random sampling element of the regulatory framework (under which 5 per cent of all providers would undergo the equivalent of the registration process each year) is being abandoned.
In a raft of less prominent measures, there will be no further regulatory action on student transfer, the death knell for HESA Estates and non-academic staff records is sounded (again), and the venerable TRAC(T) approach to the cost of teaching is under review, with TRAC(T) 2019-20 returns cancelled. And there is a chance of a less onerous Transparency Condition, with the amount of information required on offers and acceptances to be examined.
Cutting the cost
It’s long been a theme on Wonkhe, but the surprisingly high cost of regulation is under review too. The aim of an internal OfS efficiency review is to reduce registration fees by 10 per cent over two years – to help it on the way, the statutory review of registration fees for 2021-22 will move to this autumn. OfS, with typical charity, has suggested that the designated Data and Quality Bodies reduce their (comparatively minimal) registration fees by a similar amount.
In parallel, providers are expected to examine their own internal processes:
Government is clear that providers must also play their own part in this: by reducing their own unnecessary bureaucracy, administrative tasks and requirements placed on academics that do not demonstrably add value”
What kind of thing? Well, first in the firing line are “voluntary membership awards or other forms of recognition to support or validate an organisation’s performance in particular areas” – stuff like Athena Swan, for example. Regulators have been asked to ensure that they don’t place weight on the presence or absence of such markers or scheme membership in regulatory or funding activities. And OfS will be clear that voluntary codes and guidance do not constitute regulatory requirements.
A streamlined, two stage, application process for UKRI is the centrepiece of wider moves to standardise everything from CV formats to criteria presentation. Reporting requirements will be harmonised with other funders, and outcomes monitoring will increasingly happen via ORCID and other data integration – which could mean an end to the time researchers spend with everyone’s least favourite marine animal, ResearchFish.
There’s a review of TRAC running alongside the planned review of TRAC(T), and there’s a move towards:
Streamlining the 200+ research and innovation grant schemes run by UKRI e.g. moving to single institutional “Impact Acceleration Accounts” for all future funding rounds and maximising the standardisation of Terms and Conditions.”
NIHR research processes will be moving in a similar direction.
What does this all mean?
We’ve got the complete common room checklist of gripes addressed, and many academics of a certain persuasion will be utterly delighted with some of the more crowd-pleasing elements. But there are some underlying themes to concern us.
We are still seemingly committed to a data driven regulatory framework, but with fewer checks and balances. Regulation now sits further from the English sector than it has since the 1980s, but has greater powers to challenge and chastise providers than ever before. Barber’s old concept of a “responsive regulator” appears to be greatly diminished – a leaner and less interactive OfS will make decisions from afar, under a growing amount of ministerial direction.
As regards actual quality assurance – looking at what providers are doing, rather than just at outputs – we similarly find ourselves at a very low ebb. If Michelle Donelan is concerned about the quality of online learning during the next pandemic, there will be less immediate insight than ever.