
Wonkhe surveys a survey about a student survey

The Office for Students has launched its consultation on the future of the National Student Survey. David Kernohan surveys the survey about the survey.

David Kernohan is Deputy Editor of Wonkhe

“The effect of the NSS has been in tangible improvements to assessment and feedback, ensuring universities and colleges have opportunities to hear their students, and offering transparent information to prospective students.”

Not my words. The words of the Office for Students, in an insight brief published on 19 February this year. A lot has clearly changed in nine months.

Principally, what has happened is that the Department for Education seems to have taken a poorly evidenced dislike to the venerable National Student Survey (we’ve covered the long history of the NSS before on the site). Far from the suspicion, voiced by OfS in April 2018, that providers did not act on the results, the presumption in Whitehall is that the NSS is powerful enough to drive down standards and fuel grade inflation, all while being gamed by providers.

That’s the story so far. But this week has seen consultation activity begin in earnest. A series of small events for students (we heard 20 were on one, despite being told the event was fully booked) has been joined by a very peculiar online survey. It’s not being run as a full consultation with a formal publication – unless you follow OfS on Twitter or regularly refresh the NSS review page on the OfS site, you may not even know it is happening. And you only have until 13 November to complete it – get moving.

Two for one

The survey logic means that there are actually two surveys – one for staff at providers (be they academics, student support staff, administrators, senior managers, or “other”) and one for SU representatives and staff. Institutional staff can respond off their own bat (giving their “personal views as an HE professional”) as well as on behalf of their provider – SU types only get to make a corporate response. There seems to be no way for prospective students, those who advise them, or even just regular students to respond to the consultation.

For both, after you set out what your role is and which provider or union you are from, you plunge into thinking about what the NSS is used for.

For universities it is reckoned that this might include:

  • Understanding the student perspective
  • Identification of areas of improvement
  • Attracting prospective students to your provider/marketing
  • Comparing results against other providers
  • Strategic planning and/or policy making
  • Performance management

There’s an “other” box, but no definitions here and no space to caveat any of the options. Strategic planning, I would argue, is quite different from policy making (the former may also include comparisons against other providers) – whereas understanding the student perspective is pretty much a given for any attempt to survey students. Respondents are asked to rate how “helpful” the NSS is for each of these areas, on a five-point scale plus “don’t know/don’t use” – if you are thinking that this last should have been split into “don’t know” and “not applicable”, I would agree.

For students’ unions, the things it is reckoned the NSS might be used for are to:

  • Lobby for improvements to university/college facilities (eg libraries, computer facilities, sports facilities)
  • Lobby for changes to university/college policies
  • Lobby for improvements to the student academic experience
  • Hold the university/college to account
  • Compare results with other universities or colleges
  • Inform union campaigns and priorities
  • Understand differences in student experience of their course by student characteristics
  • Understand differences in satisfaction across courses

These at least feel like something based on actual practice – but there is no chance to rate helpfulness, just yes or no.

Such a burden

It quickly becomes apparent that the provider survey is trying to get you to say that the NSS is a lot of work to run and use (and otherwise to provide ammunition for someone looking to change it in certain ways), whereas the students’ union survey is just about the way the survey is used.

The Cabinet Office publishes guidelines for government consultations – these have been cut down substantially over the years, with the latest (2018) iteration spanning two short pages. Even here, point B is germane:

“Take consultation responses into account when taking policy forward. Consult about policies or implementation plans when the development of the policies or plans is at a formative stage. Do not ask questions about issues on which you already have a final view.”

To be fair, if OfS has a final view on the NSS, it is probably close to that expressed in February. It may be a different matter for DfE – the very vehemence of the ministerial commissioning, and the apparently settled need for “radical root and branch reform”, suggests that minds are very much set on big changes. If this is the case, this consultation goes against Cabinet Office guidelines – it is literally an unnecessary burden on the sector if the decision has already been made.

OfS frames the two-stage internal review (of which this survey is part) as focusing initially on responding to the concerns raised by ministers, with a wider look (including at the questions themselves, updated as recently as 2017) to follow in the new year. This in itself suggests that, in Nicholson House at least, there is some hope that DfE SpAds will be distracted by the next shiny thing they see and forget about their NSS rage by the new year. Maybe Brexit will be really brilliant, or there’ll be a leadership contest, or something. I despair.

Sample and hold

But what are the proposals under consultation? Or is it just a matter of trying to gather quotable material on how awful the whole thing is? One thing that crops up in both surveys is the idea of a smaller sample for the NSS. Rather than trying to get all students to respond, what if the decision was made to target 5, 25, 50, or 75 per cent?

As things stand, the NSS response rate is a very creditable 69 per cent of all eligible final year students – with the survey sent to all of them. It functions, in effect, as population data, meaning we can get quite deep into the subject area weeds without losing statistical rigour (the safeguards on response rates are, if anything, over-strict – but they do mean we can rely on what we learn). This allows the NSS to power things like Discover Uni (the other sources of course level data are less reliable as a guide to specific subject outcomes), as well as allowing you to benchmark your niche course against other courses around the UK in that niche.

Sending the survey to fewer students would lower the total number of responses, increasing the margin of error for headline stats and removing entirely the ability to drill down in depth. We would also face difficulties when benchmarking – accounting for differences based on population characteristics is very difficult with a small and possibly unrepresentative sample. Sure, when you go super-deep the statistical reliability gets a little wobbly – but if you have tiny numbers in a micro-subject area that’s going to happen anyway.

I’ve plotted this for percentage agree and disagree on all questions here.

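To put rough numbers on the sampling trade-off, here is a minimal sketch in Python – the cohort sizes are illustrative assumptions of mine, not OfS figures, though the 69 per cent response rate is the real one – showing how the margin of error on a “per cent agree” figure grows as the sampling fraction shrinks:

```python
import math

def ci_halfwidth(p, n, pop, z=1.96):
    """95% confidence interval half-width for a proportion,
    with a finite population correction."""
    if n < 2 or n > pop:
        return float("nan")
    se = math.sqrt(p * (1 - p) / n)          # simple random sampling error
    fpc = math.sqrt((pop - n) / (pop - 1))   # error shrinks as n approaches pop
    return z * se * fpc

RESPONSE_RATE = 0.69  # the current NSS headline response rate (from the article)
P_AGREE = 0.80        # an illustrative "per cent agree" on a single question

# Both population sizes below are assumptions for illustration only
for label, pop in [("national final-year cohort", 300_000), ("one niche course", 120)]:
    print(label)
    for target in (1.00, 0.75, 0.50, 0.25, 0.05):  # sampling fractions mooted
        n = round(pop * target * RESPONSE_RATE)    # responses actually received
        print(f"  sample {target:4.0%}: n = {n:>7,}, margin ~ +/-{ci_halfwidth(P_AGREE, n, pop):.1%}")
```

At national level even a five per cent sample looks survivable; the damage is done at course and subject level, which is exactly where the NSS currently earns its keep.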

Anyway, the questions are focused (for staff at least) on whether or not this would mean more or less work, rather than on whether the output of the survey would be any use (SUs do get to have an opinion here, which is nice).

Less often, less open, less options

Other ideas on the table appear to involve running the survey less often (as in, not every year) and publishing less information – either not releasing results down to subject level, or not allowing for publicly viewable comparison of providers. Would either of these things restrict the ability of providers to improve the student experience, make the NSS less useful for prospective students, or ruin public accountability? Yes, clearly. Would there be a negative impact on the usefulness of NSS data if you were able to do less with it? It would appear so.

For SU types, this crops up in the list of agree/disagree statements – worth listing in full to give you a sense of the mood:

  • The NSS helps improve the student experience at my university/college
  • If the NSS was published less frequently, we would be able to use it to improve the student experience as we do now
  • If the NSS was only available internally to the university/college and not published, it would affect our ability to lobby our university/college
  • If the NSS was abolished, we would be able to enhance the student experience as we do now
  • If data about individual courses were no longer available, we would be able to enhance the student experience as we do now
  • If the NSS was abolished, graduate incomes data would be a good measure of quality for undergraduate degrees
  • The work generated by the NSS outweighs the positive things we get out of it

And for providers:

  • Student responses to the NSS are influenced by the most recent grades they have received
  • Overall, the NSS has contributed to improving the student experience
  • The NSS creates pressure on providers to inflate the grades of students
  • The NSS is a useful means for ensuring provider accountability
  • The NSS helps applicants make better informed choices
  • Academic standards are negatively influenced by the NSS

What you’ll have spotted there is that SU types are being asked about the usability of data (save for one point, on the off chance they think salary data tells us anything), whereas providers get the NSSanon contrarian talking points. If I may venture some advice here, perhaps it would be better to ask students, rather than provider staff, whether their NSS responses are influenced by their grades. Or maybe commission some actual research into the links. Or spend 15 seconds looking at a graph on Wonkhe?
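For what it’s worth, that 15-second check is easy to sketch. A minimal example – the provider-level figures below are randomly generated placeholders, to be swapped for real published NSS agreement rates and degree classification shares – of the correlation test the claim invites:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Placeholder provider-level figures -- swap in real published data, e.g. each
# provider's NSS overall "% agree" and its share of firsts and 2:1s awarded.
n_providers = 120
pct_good_honours = rng.uniform(60, 90, n_providers)  # hypothetical grade data
pct_satisfied = rng.uniform(70, 90, n_providers)     # hypothetical NSS data

r, p_value = stats.pearsonr(pct_good_honours, pct_satisfied)
print(f"Pearson r = {r:+.2f} (p = {p_value:.3f})")
# With real data, a weak or absent correlation would undercut the claim that
# the NSS pressures providers into inflating grades.
```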


There’s another theme at provider level – I’d characterise it as a lingering suspicion that the mere cancellation of the NSS wouldn’t stop universities surveying students, either internally or via one of the myriad external student surveys. I’d give credence to this idea. Providers survey students all the time about all kinds of things; this is not a behaviour you can stop. I had wondered whether the current government would be happier with one or more private surveys (a marketplace, indeed) than the single statist NSS – but my instinct is that they’d rather have some degree of control over the ridiculous university league table industry than none.

Responses welcome

Please do respond to the OfS survey. It may be a political response to a political decision, but an injection of actual evidence into the process can’t harm things. You don’t have to be in England – people do need reminding that the NSS is a UK-wide initiative. You have until 13 November – not long – to get a response in. SUs have the chance to explain how they use NSS data to do what we assume is the good kind of “niche campaigning”, while providers do the meat of the work of rebutting DfE’s odd conspiracy theories. This should, we hope, be followed by a sensible look at the NSS questions next year.

I worry sometimes that my exasperation at the steady-as-she-goes 2020 NSS – even if it was just a writerly device to set up my real point about students taking the NSS more seriously, and thinking altruistically about how it is used – played a tiny part in the unexpected DfE hostility. Was there a hope that we’d see the strike show up in a way that allowed for some anti-union attack lines? Was there an expectation that “bad” courses would be much more visible and show a clear correlation with – well – anything?

But the NSS is worth fighting for – as a time-series, as a spectacularly detailed comparative tool, and as a path into understanding the pain points in the academic student experience. It’s not perfect (question 26 is a mess, for instance), but it is the closest we have ever come to putting the student at the heart of the system. We’d be sorry to lose it, after all:

“The high overall satisfaction rating reflects a broader consensus that, in many respects, UK higher education is world-leading. Simultaneously, the willingness of the sector to make changes in response to NSS results shows that this success has not bred complacency.”

Not my words.

5 responses to “Wonkhe surveys a survey about a student survey”

  1. Fewer than 20 at the consultation event I went to – not even sure we made it to double figures.
    And yes, it was blatantly obvious they were trying to get us to agree that it was not needed, or could be scaled down, or not published. Really shocking example of leading questions.
    All students present gave robust answers, and to my mind, well argued ones.
    I am quite proud of myself for not completely and utterly losing my rag when it was suggested LEO was an appropriate alternative – it was close!

  2. Unlike previous similar exercises there seems to be absolutely no verification process should you wish to claim you are providing the official institutional response for your own institution, or for that matter any other institution that takes your fancy. This rather implies that official institutional responses will carry no additional weight.

  3. The direction of travel is clear – it’s going to be very blunt metrics from here on – progression data and graduate outcomes. Very hard times coming in a slimmed-down sector as courses lose access to the student loan book.

  4. Interesting – and annoying – that there doesn’t seem to be any mention of what a lot of people were saying right at the start: that to measure improvement you need to survey cohorts longitudinally, i.e. asking an intake of students the same questions in their 1st/2nd/3rd/additional years.

  5. Again, another excellent article. As a programme leader for most of my academic career I never had any issues with the survey. Sure, some of the questions were/are problematic, but from my perspective the questions covering Assessment and Feedback helped to drive up standards on my courses. Long live the NSS
