Ten positive ideas for the future of the NSS

Phase One of OfS’ National Student Survey review has now been published.

Jim is an Associate Editor at Wonkhe

Elsewhere on the site DK looks in detail at what’s in the review and what it all means. Here I’ve scratched together ten positive ideas for the future of the NSS, and would welcome more in the comments below.

  1. Every year hundreds of thousands of free-text comments are collected that are never analysed nationally. This was noticed a couple of years ago, but despite promises in the OfS business plan we never got to see the big themes that emerged from all those typed words. Wouldn't that help us understand why some of the scores are the way they are? Shouldn't this become a standard annual publication?
  2. Last time OfS' Director of External Relations Conor Ryan wrote for Wonkhe, he set out a powerful case for including Postgraduate Taught students in the annual exercise. A pilot followed, only to disappear down the back of the sofa in Nicholson House; PGRs have never been mentioned; and Phase One is silent on postgraduates generally. Surely it's time for the National Student Survey to actually survey all students?
  3. Despite Phase One's determination to have Q26 on students' unions rewritten so that it makes more sense, it still wouldn't make much sense to ask about something that doesn't exist in a lot of providers. Given the diversity in "who does what", isn't it time we just had general questions on collective student representation, extra-curricular activities, welfare and wellbeing services, and facilities on campus?
  4. On a related theme, the NSS has a narrow focus (outside of the supplementary bank) on academic aspects – but wider aspects of provision and the student experience, like sport, careers support and facilities, are also crucial, educational if not "academic", and are all "sold" to students. It would be great to see the NSS widening its focus.
  5. This fascinating national analysis shows sector-level NSS results split by six student and course characteristics, alongside calculated benchmark values. Why, for example, isn't it straightforward for a provider to work out its local version of the national gap for Black students on "staff are good at explaining things"?
  6. For some reason the analysis above does not include gaps between home and international students, despite (for example) international students facing a 10 per cent attainment gap. Don't we need that split to be produced? Wouldn't it be useful to see it nationally, and surely it would be helpful to have it institutionally too in OfS' proposed whizzy new tools?
  7. One of the missed opportunities of a national survey is using it to determine the prevalence of an issue both locally and nationally. For example, surely we need to know about the prevalence of harassment and sexual misconduct – it would be very useful locally, there may be important equality gaps to consider, and we'd know whether OfS' (heavily delayed) strategies were working.
  8. Another major missed opportunity right now (and so a source of extra polling costs) is the potential use of the survey to test contemporary attitudes or opinions on a range of issues that we might not need to know about at provider level yet. Given the content in Gravity Assist, for example, a quarter of all respondents could be asked about student access to the right kit and space, with other quarters asked about other important things.
  9. And on that issue, one way of looking at the NSS is that it's basically a sector-wide consensus statement on what makes a good (academic) student experience, like a bill of academic rights. If that's the case, shouldn't we publish it to students at the start of their course as a set of expectations they should have, rather than just ask them about it at the end? Wouldn't it be a great international recruitment tool for UK HE? Shouldn't it link to the Quality Code and OfS' Quality and Standards definitions? And shouldn't we also take steps to ask students which of the elements are more, less or not at all important to them, so we can capture and understand the diversity of students and providers?
  10. The underpinning theory of the survey is that it can and should be used for enhancement. But don't we need to know whether that's true, and if so, how? The smart thing would be for NSS results to be published, and then for there to be a period in which providers are invited to reflect on the results, consult with students on why the scores are the way they are, and produce a report and action plan based on them that includes an independent element from the SU. OfS wouldn't be checking the conclusions and actions, but that the process had happened. And a national summary of those reports would tell a great story about how the sector responds to student input.
