This article is more than 3 years old

Policymakers claiming the NSS is driving down standards should get their own house in order

The government has announced a major review of the National Student Survey. Paul Ashwin questions the assumptions behind the review.

Paul Ashwin is Professor of Higher Education at Lancaster University.

It is very strange to set up a review of a longstanding element of the higher education system and explicitly tell those on the review what they are expected to find. Yet the government’s announcement of the review of the National Student Survey does exactly that, spelling out, in fewer than 500 words, precisely what the review is expected to conclude.

More disturbingly, the evidence base for these expected findings ranges from the deeply contentious to the downright non-existent. For anyone committed to evidence-informed policy making, it is a profoundly worrying document.

Standards are slipping

The first blithely asserted bombshell is:

Since its inception in 2005, the NSS has exerted a downwards pressure on standards within our higher education system.

If we were to take this assertion seriously we’d be asking what on earth universities have been doing to allow educational quality to be corroded for 15 years. In reality, it is very difficult to imagine how providing universities with information about students’ experiences of studying has forced standards down.

All we are offered in support of this is the assertion:

There is valid concern from some in the sector that good scores can more easily be achieved through dumbing down and spoon-feeding students, rather than pursuing high standards and embedding the subject knowledge and intellectual skills needed to succeed in the modern workplace.

This is a rhetorical device that will be familiar to anyone who has read government documents relating to the Teaching Excellence Framework: an unidentified group is cited who just happen to agree with the government’s position, and their concerns are thereby validated.

In essence, the case seems to be that universities face a stark choice between either maintaining high standards or seeking student satisfaction. Large scale reviews of the NSS – such as this ten year study or this large-scale analysis – simply do not support this assertion. Student satisfaction with courses most strongly relates to the quality of the teaching and the organisation of the programme.

For the avoidance of doubt, I should point out that, despite the government’s assertion, poor teaching and disorganised degree programmes do not provide a secret path to the maintenance of academic standards.

Helpful tool

The second shock is the claim that the NSS was never intended as an instrument to inform students’ choice of degree programmes. We must all have imagined the resources and effort poured into Unistats and Discover Uni, because apparently the NSS was only ever intended to be “a helpful tool for providers and regulators.”

This revisionism informs the request for the review to consider whether NSS data should be made public. This is amazing (and not in a good way) on two levels. First, it is deeply unethical to ask students for their views of the quality of their degree programmes but not to make those views public. Second, it would appear to go against all principles of data protection if those who are asked to comment on the quality of a degree programme are not allowed to see the outcomes based on their data.

Next we learn that another problem with the NSS is that it:

[does] not correlate well with other, more robust, measures of quality, with some of the worst courses in the country, in terms of drop-out rates and progression to highly skilled employment, receiving high NSS scores.

As David Kernohan points out, retention rates and employment outcomes don’t correlate with each other either, which more than slightly undermines the case made here.

There is, however, plenty of evidence that employment outcomes are shaped by the state of the economy, gender, ethnicity, disability, geography, and social class more than they are by the quality of the course you study – here’s one example from many.

We are living in strange times indeed, when the NSS, a measure of quality which has been developed over decades based on an instrument validated in government backed reviews in Australia and the UK, is dismissed because it does not relate to the government’s currently favoured quality measure.

Goodness only knows how the authors of this document will react when, based on this logic, they realise that the employment outcomes during the Covid-19 pandemic tell them that there has been a complete collapse in the quality of university education over the last three years.

It is only fair to note that some institutions have tried to game their performance on the NSS and some have introduced overly bureaucratic approaches to manage this. But it is a complete overreaction to use this to rubbish an instrument that has provided students with useful insights into the quality of degree programmes, and is used by course teams across the country to inform efforts to improve the quality of their courses.

Grade inflation

The final blow comes at the end when, from out of nowhere, we suddenly learn that, not only has the NSS lowered standards, but it has also fuelled grade inflation. The only possible explanation for this is that the NSS has so blinded universities to their educational responsibilities that they just give students whatever degree outcomes they ask for. Unsurprisingly, no evidence is offered to support this link.

Developing measures of educational quality that are not open to institutional gaming and genuinely inform students about the quality of courses is very important. This is why the agenda set for this review is so troubling, and why it is so shocking that a radical root-and-branch review has been set up on such falsehoods, while being given less than three and a half months to reach its conclusions.

What becomes clear is that there may well have been a decline in standards: a decline in the standards of policy makers in setting the terms of an important review of a key element of the higher education system.

6 responses to “Policymakers claiming the NSS is driving down standards should get their own house in order”

  1. Thank you for this article. I completely agree with all you say. The justification and nature of the NSS reassessment is in line with the government’s ham-fisted and incoherent management of the Covid crisis.

  2. The NSS has had some serious deleterious effects in some universities and has further pushed the commodification of higher education and the concept of students as ‘customers’ whose wishes have to be satiated. This does lead to lazy students pushing ‘the university’ to lower standards, especially when marketing and PR get involved.

    From the very few UK students who make it into our PhD research department, I very much suspect it is those who are sufficiently geeky to want to learn, and to work hard to do so independently, who are suffering most from the general lowering of standards. The overseas students always seem to be more well-rounded, probably because they are not treated as immature, demanding children and given in to by those focused on getting good satisfaction scores.

    One also needs to question the teaching before university, and how much the pedagogical practice of recently university-‘educated’ and ‘trained’ teachers enables the further enfeebling of young minds in state schools, where teaching to the lowest common denominator is commonplace. And the expectation of that continuing at university too?

  3. Given that this article rightly criticises the government for using a rhetorical device, it’s a pity that it goes on to use a blatant straw man argument a few sentences later! (The government does not assert that poor teaching and disorganised degree programmes provide a secret path to the maintenance of academic standards.)

    “It is only fair to note that some institutions have tried to game their performance on the NSS” – is there evidence for how widespread this is?

  4. Chris, you are correct. I should have said it was implied by the government’s position – if satisfaction indicates a lowering of standards, then by implication so do well-designed courses and good teaching, given how strongly they correlate with satisfaction.

  5. The government’s intervention may be well-intentioned, but it is surely crude and disingenuous. That said, I would urge colleagues who want to defend the NSS as some genuine measure of good teaching to appreciate how and why this notion pays lip service to a neo-liberal hijack of tertiary learning that is dismantling the pedagogical principles of higher education proper. For further comment, see my piece:
