The rumour goes that Michelle Donelan was flabbergasted that regulating universities didn’t actually mean visiting them.
As concerns about “poor quality online provision” swirled in May and June, her questions to the Office for Students assumed a deep, observation-based knowledge of practice within providers. Of course, we don’t visit universities in England to monitor them any more, other than in very specific and concerning cases.
It was with this apocryphal story in mind that I read today’s DfE publication on “Reducing bureaucratic burdens on research, innovation, and higher education” – complete with a foreword written by three separate ministers with distinct and overlapping responsibilities. Based on the contents of this briefing, I don’t think the number of university regulatory visits in England will be going up any time soon.
The approach involves both:
“Outlining where we intend to remove or reduce reporting requirements and address unnecessary bureaucratic processes immediately”
And, longer term:
“Setting out areas that OfS, UKRI and NIHR plan to review over the next few months with a view to reducing reporting requirements and administrative burdens as much as possible”
Getting radical
The big news is on the National Student Survey. The rich history of the tool has been increasingly coloured by the perception that it drives providers to – well, behave according to market forces. We’ve all read and heard about gameplaying and incentives, and the wider concern that “giving students what they want” is akin to dumbing down. The document is particularly brutal on this latter point:
“Since its inception in 2005, the NSS has exerted a downwards pressure on standards within our higher education system, and there have been consistent calls for it to be reformed. There is valid concern from some in the sector that good scores can more easily be achieved through dumbing down and spoon-feeding students, rather than pursuing high standards and embedding the subject knowledge and intellectual skills needed to succeed in the modern workplace.”
There’s an admission that the use of the NSS in rankings and league tables is a downside beginning to outweigh the upside of information for providers and regulators. It seems the four applicants who used “Discover Uni” to make their choice of course and provider may have been tempted to choose a higher education experience that is “easy and entertaining, rather than robust and rigorous”.
Yes, but does it correlate?
In a clear hint as to what is to come, DfE note that the NSS results do not correlate with “other, more robust measures” such as drop-out rates and progressing to skilled employment.
That much is true (I’ve plotted this at pseudo-course level based on Unistats data – the best data available):
However, it is also the case that the more robust measures don’t correlate with each other either – indeed, it looks rather worse to my eye.
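This sort of check is straightforward to reproduce. Here’s a minimal sketch – assuming a course-level CSV extracted from the Unistats open data, with hypothetical column names standing in for the three measures – of how the pairwise correlations might be computed:

```python
import pandas as pd

# Hypothetical file and column names - the real Unistats (KIS) extract uses
# its own field codes, so rename these after inspecting the actual download.
df = pd.read_csv("unistats_pseudo_course_level.csv")

measures = [
    "nss_overall_agreement",      # % agreeing with the NSS summary question
    "continuation_rate",          # % continuing into the second year
    "highly_skilled_employment",  # % in highly skilled work or further study
]

# Pairwise Pearson correlations across pseudo-courses. Weak off-diagonal
# values would support the point above: the "more robust" measures barely
# track the NSS - or, indeed, each other.
print(df[measures].corr())
```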
Still, I’m sure that DfE have their reasons. In the meantime let’s hope we don’t lose the Unistats dataset in all this chaos.
All of these ideas will feature in a wide-ranging OfS review of the NSS, which will aim to:
- reduce the bureaucratic burden it places on providers
- ensure it does not drive the lowering of standards or grade inflation
- provide reliable data on the student perspective at an appropriate level, without depending on a universal annual sample
- examine the extent to which data from the NSS should be made public
- ensure the OfS has the data it needs to regulate quality effectively
- ensure the NSS will stand the test of time and can be adapted and refined periodically to prevent gaming
All this will be done by the end of 2020.
It was the future, once
The document also takes aim at huge swathes of the regulatory framework.
Notable by its absence is the Teaching Excellence and Student Outcomes Framework (TEF). Rumoured to have fallen from ministerial favour in recent times, the TEF still awaits the publication of the statutory Pearce Review (now more than a year late). With NSS effectively out of the picture, and the “more robust” highly skilled employment and continuation metrics praised, we basically have a picture of a possible TEF that focuses only on outcomes. Given that these measures are closely correlated with the schooling and background of graduates, it is perhaps ironic that TEF may become TOF (the Teaching Outcomes Framework).
In practical terms, the OfS’s widespread use of “enhanced monitoring” – all those regular updates the registration team likes to brag to the board about requesting (from at least 261 providers) – will end, with the minor concerns that underpin them either ignored or escalated to the (publishable) “specific condition” level. Both providers and the regulator have been keen to keep these issues out of the public eye, but with “enhanced monitoring” gone, those scary “formal communications” (sent to at least 306 providers!) will need to get a whole lot scarier. It is instructive to note that of the 464 EMs in place as of October 2019, 77 referred to access and participation and 77 to student outcomes. The OfS will report on progress after three months.
There have also been concerns about Data Futures, a phrase that I feel like I’ve typed every month for the past three years. The original idea was termly data collection points to power Michael Barber’s dashboard (I gloss over the actual HEDIIP backstory here), but this is now under review. The Random Sampling element of the regulatory framework (under which five per cent of all providers would undergo the equivalent of the registration process each year) is being abandoned.
In a raft of less prominent measures, there will be no further regulatory action on student transfer, the death knell for HESA Estates and non-academic staff records is sounded (again), and the venerable TRAC(T) approach to the cost of teaching is under review, with the TRAC(T) 2019-20 returns cancelled. And there is a chance of a less onerous Transparency Condition, with the volume of information required on offers made and places accepted to be examined.
Cutting the cost
It’s long been a theme on Wonkhe, but the surprisingly high cost of regulation is under review too. The aim of an internal OfS efficiency review is to reduce registration fees by 10 per cent over two years – to help it on the way, the statutory review of registration fees for 2021-22 will move to this autumn. OfS, with typical charitability, has suggested that the designated Data and Quality Bodies reduce their (comparatively minimal) registration fees by a similar amount.
In parallel, providers are expected to examine their own internal processes:
“Government is clear that providers must also play their own part in this: by reducing their own unnecessary bureaucracy, administrative tasks and requirements placed on academics that do not demonstrably add value”
What kind of thing? Well, first in the firing line are “voluntary membership awards or other forms of recognition to support or validate an organisation’s performance in particular areas” – stuff like Athena Swan, for example. Regulators have been asked to ensure that they don’t place weight on the presence or absence of such markers or scheme membership in regulatory or funding activities. And OfS will be clear that voluntary codes and guidance do not constitute regulatory requirements.
Research funding
A streamlined, two-stage application process for UKRI is the centrepiece of wider moves to standardise everything from CV formats to criteria presentation. Reporting requirements will be harmonised with other funders, and outcomes monitoring will increasingly happen via ORCID and other data integration – which could mean an end to the time researchers spend with everyone’s least favourite marine animal, ResearchFish.
There’s a review of TRAC running alongside the planned review of TRAC(T), and there’s a move towards:
“Streamlining the 200+ research and innovation grant schemes run by UKRI e.g. moving to single institutional “Impact Acceleration Accounts” for all future funding rounds and maximising the standardisation of Terms and Conditions.”
NIHR research processes will be moving in a similar direction.
What does this all mean?
We’ve got the complete common room checklist of gripes addressed here, and many academics of a certain persuasion will be utterly delighted with some of the more crowd-pleasing elements. But there are some underlying themes that should concern us.
We are still seemingly committed to a data-driven regulatory framework, but with fewer checks and balances. Regulation now sits further from the English sector than it has since the 80s, but has greater powers to challenge and chastise providers than ever before. Barber’s old concept of a “responsive regulator” appears to be greatly diminished – a leaner and less interactive OfS will make decisions from afar, under a growing amount of ministerial direction.
As regards actual quality assurance – looking at what providers are doing, rather than just at their outputs – we similarly find ourselves at a very low ebb. If Michelle Donelan is concerned about the quality of online learning during the next pandemic, there will be less immediate insight than ever.
While we’re used to ministerial statements expressing some unorthodox views, the section on the NSS is remarkable for an official publication. This is, after all, part of the official government information set – yet it reads as a collation of worries we would be itching to unpick if they turned up in a think tank blog.
There are some completely weird parts, too. Should it really surprise the author that providers analyse their NSS results?
“There is a sense that the level of activity it drives in universities and colleges has become excessive and inefficient. For example, we are aware that some providers employ analysts to drill down into NSS performance, in some cases at module level, and investigate any sub-par performance.”
Clearly they are also surprised that there are small providers where all the students say they are satisfied – a function of small cohort sizes rather than anything sinister. All providers have courses where all the students say “yes” to Q27; it just stands out in FECs and alternative providers.
The intriguing statements are the ones that begin “The OfS is required in law to…”. This is not burden that OfS has invented, but burden imposed through (badly drafted) primary legislation. For example, the Transparency Condition requires, by law, the publication of admissions data specifically by gender, ethnicity, and socio-economic group – it is not optional.
This suggests that the minister is happy for primary legislation to be ignored (i.e. breaking the law in a very specific and limited way).
(It will take primary legislation to change HERA 2017 – this sort of requirement should have been left to secondary legislation, giving DfE and OfS some flexibility.)
The OfS response is interesting. I think we may have the makings of another Ofqual-style “our minister said THIS, but our statutory duty says THAT” argument. This government seems to specialise in creating them.