It’s fair to say that reviewing the NSS was not top of the Office for Students’ list of priorities this autumn.
Until recently, the expectation had been that the NSS would expand – to first- and second-year undergraduates, and to postgraduates (there was even a pilot study). In a regulated, customer-focused sector like HE, there was a sense that regular data on customer satisfaction was a prerequisite. NSS was built into TEF, and was a central part of the OfS’s regulatory warning systems.
And now this review. Mandated by the DfE, it represents a retreat from the “students at the heart of the system” narrative that has dominated higher education policymaking for the last 10 years. It represents a return to an earlier idea – often argued for in the grey academic literature – that learning need not be entertaining or immediately satisfying.
The big news is on the 2021 survey – it will still go ahead (because OfS has already paid Ipsos MORI a very large sum of money), but providers no longer need to promote the survey internally – and we might see a different level or mode of publication to align with the new direction of travel (and to stop troublemakers like me plotting results, I imagine…)
That remit in full
The brief announcement today, following the September OfS board meeting, makes clear just how significant this review is considered to be. The terms of reference are:
- Assess the bureaucratic burden the NSS places on providers and how this could be reduced.
- Explore the unintended consequences of the NSS for provider behaviour and how these could be prevented, including whether the NSS drives the lowering of academic standards and grade inflation.
- Examine the appropriate level at which the NSS could continue to provide reliable data on the student perspective on their subject, provider and the wider system, and what could be done without depending on a universal annual sample.
- Examine the extent to which data from the NSS should be made public, including the implications of Freedom of Information (FoI) legislation.
- Ensure the OfS has the data it needs to regulate quality effectively.
- Ensure the NSS will stand the test of time, and can be adapted and refined periodically to prevent gaming.
- Ensure the UK wide role of the survey is considered in any recommendations.
Most of the immediate action is around the first three points, though arguably the latter terms will turn out to be more significant.
The NSS is an externally administered survey. Providers are currently asked to promote the survey to final year students during the spring term, and in return are given access to detailed responses at provider and subject level to give them the opportunity to design and implement enhancement. The first part of this can in no way be seen as burdensome – so we must assume that it is the use of the results that causes issues.
Universities can and do investigate areas of provision where NSS data signals that students are experiencing problems – remember, only one question out of 27 relates specifically to course satisfaction; the others examine key “hygiene” issues like resource availability and course organisation. Good universities – that is to say, most of them – will have early warning of these issues through their own systems of module and course surveys and other student feedback mechanisms.
To take the popular “university as gym” metaphor, you would want to know if customers were saying the rowing machine was broken and one of the lights was out in the changing rooms – it has an impact on satisfaction, but also on retention and quality. And if the NSS does not provide such information, it will be gathered elsewhere.
The idea that providers would act in suboptimal ways given student feedback is a curious one, and of all the strangeness in the DfE announcement this is the bit that seems to have been plucked from the air. I am unaware of any credible evidence that providers are making courses easier or less rigorous based on NSS results. There is, to be clear, an argument that NSS results are not a great basis for applicants to make course decisions (the same could be said for salary and employment data) but there is little evidence that survey results have this kind of influence.
And the idea that NSS – a survey filled in by final year students in the spring of their graduating year – somehow drives grade inflation is almost laughable. Again, the first task of the OfS review should be to identify evidence for this criticism. Here’s my attempt to help:
You’re looking at the OfS’s own “unexplained first and 2:1” metric against the difference between the provider’s Q27 agree score and the benchmark in NSS. There is no relationship whatsoever between these two variables.
NSS is a survey with a very high response rate. Around 70 per cent of all final year university students complete it – giving us (for the most part) numbers so large that it can be treated (with caveats) as a population study. Certainly, the response rate is in the same statistical ballpark as the Destination of Leavers from Higher Education survey (77 per cent) which provides the currently fashionable data on “highly skilled employment”.
There are issues when you drill right down to lower-level subject areas – issues that the rubric for the Unistats dataset elegantly deals with by stepping up a level when looking at small populations. But for most courses at most providers enough students in a large course or subject area will respond to offer a useful split metric. And without an “annual universal system” (actually a very large annual sample) it would not be possible to do this.
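The “step up a level” fallback works roughly like this – a minimal sketch, assuming a hypothetical respondent threshold of 10 and an illustrative three-level hierarchy (the actual Unistats rubric differs in its thresholds and level names):

```python
# Sketch of a small-population fallback: if too few respondents at the
# detailed subject level, report at the next level up instead.
# Threshold and hierarchy names are illustrative assumptions.

MIN_RESPONDENTS = 10  # hypothetical publication threshold


def pick_reporting_level(counts: dict, hierarchy: list) -> str:
    """Return the most detailed level with enough respondents.

    `hierarchy` runs from most to least detailed, e.g.
    ["subject", "subject group", "provider"].
    Returns None if even the broadest level is too small (suppress).
    """
    for level in hierarchy:
        if counts.get(level, 0) >= MIN_RESPONDENTS:
            return level
    return None


counts = {"subject": 6, "subject group": 48, "provider": 900}
level = pick_reporting_level(counts, ["subject", "subject group", "provider"])
print(level)  # "subject group" – the subject level is too small to publish
```

The design choice here is the point: rather than suppressing small-course data outright, the rubric trades granularity for reliability, which is only possible because the underlying sample is so large.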
An allegations procedure exists to provide a warning of “inappropriate influence” within a subject area or provider. The OfS (in England) will receive feedback on concerns from an NSS allegations panel, investigate credible allegations, and make amendments where necessary. Though this is a secretive process (with no publication requirement or feed into regulatory conditions) I have never seen any evidence of data being suppressed or restated after publication – perhaps more openness around these instances would help.
The last four terms are more interesting to me. One of the strengths of the existing NSS is that it is nationally published to a high independently-audited standard, to the requirements of Official Statistics. Publication makes the data source a single source of truth – no-one can claim a satisfaction score higher (or lower) than it actually is, and the information is available to academics, support staff (including SUs), and managers equally. A retreat from this position would be puzzling – the only beneficiaries would be providers with low or falling scores.
Term five opens a huge can of worms – what data does the OfS need to regulate quality effectively? Well, for a while it needed multiple student data collection points and now apparently it does not. Once it needed a TEF to provide an incentive to improve the quality of provision above a baseline, now… well, we’ll see. Once it definitely and emphatically needed a system of regular inspections of provider quality processes, unlike most other European countries – and I feel like this may be up for debate too.
The NSS role in regulation is complex. It is an indicator of student (customer) concern, and has recently played a part in understanding differences in satisfaction across student and course characteristics. It shouldn’t really be used as an absolute cut off for direct intervention (as in regulatory condition B1) but it should certainly indicate where some actual quality assurance may need to be considered.
The term on standing the test of time is curious – surely the fact that it has survived 15 years with periodic refinements (made not to prevent gaming, for which there is very little evidence, but to improve the usefulness of the data) would suggest there is not a huge problem here.
But there is a huge problem around term seven. The NSS (along with big parts of the HESA data collection) is UK-wide – the three other regulators will have strong views on how the survey changes, and may wish to strike out in another direction. This was a problem as soon as this review was announced – it is about to become a larger one.