Your bus is an hour late, you don’t know if the catering on campus will be open, and your now-online exam has just been yanked because the remote proctoring company is short-staffed.
Ideal circumstances in which, then, to launch the National Student Survey 2022. If you thought that Covid chaos would largely be over by February and you’re a provider that opted for an earlier collection period, you’ll be kicking yourself now.
But NSS 2022 isn’t the big news here. Alongside minor revisions to the documents that accompany the launch at this time of year, the Office for Students (OfS) has published a proposed question set for 2023 onwards as a kind of staging post in its protracted review of the survey.
There’s a further, UK-wide NSS consultation to look forward to in the summer, ahead of potential changes in the 2023 iteration. This will be informed by a pilot survey, with some lucky 2022 respondents getting the chance to respond to one of two scenarios, and yet another round of engagement events examining potential changes to data publication – starting this month with SUs and other student representative bodies.
This is all just the latest iteration of a process that started in December 2020 – the interim (phase 1) findings very helpfully moved things away from the bizarre scorched-earth language from DfE that kicked this whole review off, and thus far we have been looking at some tweaks to wording and the removal of the word “satisfaction” as the major changes to come. Two scenarios are presented here.
We’ve tabulated the current questions, alongside “scenario 1” and “scenario 2”, here – drawing links between similar questions where we can:
| Number | 2022 Text | 2023 S1 | 2023 S1 Text | 2023 S2 | 2023 S2 Text |
| --- | --- | --- | --- | --- | --- |
| Q01 | Staff are good at explaining things. | 1 | Teaching staff are good at explaining the course content | 1 | Are teaching staff good at explaining course content? |
| Q02 | Staff have made the subject interesting. | 2 | Staff have made the subject engaging | 2 | Do teaching staff make the subject engaging? |
| Q03 | The course is intellectually stimulating. | 3 | My course is intellectually stimulating | 3 | Is the course intellectually stimulating? |
| Q04 | My course has challenged me to achieve my best work. | 4 | My course has challenged me to achieve my best work | 4 | Do you feel challenged by your course? |
|  |  | 5 | There is an appropriate balance of breadth and depth in the content of my course | 8 | Does your course contain the right balance of depth and breadth? |
|  |  | 6 | The balance of directed and independent study on my course supports my learning well | 9 | Does your course contain the right balance of directed and independent study? |
| Q05 | My course has provided me with opportunities to explore ideas or concepts in depth. | 7 | My course has provided me with opportunities to explore ideas or concepts in depth. |  |  |
| Q06 | My course has provided me with opportunities to bring information and ideas together from different topics. | 8 | My course has provided me with opportunities to bring information and ideas together from different topics. | 6 | Have you had the chance to bring together information and ideas from different topics? |
| Q07 | My course has provided me with opportunities to apply what I have learnt. | 9 | My course has provided me with opportunities to apply what I have learnt. | 5 | Have you had the chance to apply the theories and concepts that you have learnt? |
| Q08 | The criteria used in marking have been clear in advance. | 11 | The criteria used in marking have been clear in advance. | 13 | Were you given the marking criteria in advance? |
| Q09 | Marking and assessment has been fair. | 12 | Marking and assessment has been fair. | 11 | Has marking and assessment been fair? |
| Q10 | Feedback on my work has been timely. | 13 | Feedback on my work has been timely. |  |  |
| Q11 | I have received helpful comments on my work. | 14 | I have received helpful comments on my work. |  |  |
|  |  | 15 | Assessments have allowed me to demonstrate what I have learned on my course | 10 | Have assessments allowed you to demonstrate what you have learnt? |
|  |  |  |  | 12 | Did you understand the marking criteria used to assess your work? |
|  |  |  |  | 14 | Has feedback helped you improve your work? |
| Q12 | I have been able to contact staff when I needed to. |  |  | 15 | Are you able to contact teaching staff when you need to? |
| Q13 | I have received sufficient advice and guidance in relation to my course. |  |  |  |  |
|  |  |  |  | 16 | How well have teaching staff supported your learning? |
| Q14 | Good advice was available when I needed to make study choices on my course. |  |  | 17 | Are you able to get good advice about study choices? |
| Q15 | The course is well organised and running smoothly. |  |  | 18 | Is the course well organised? |
| Q16 | The timetable works efficiently for me. |  |  |  |  |
| Q17 | Any changes in the course or teaching have been communicated effectively. |  |  | 19 | Have changes to the course been clearly communicated? |
| Q18 | The IT resources and facilities provided have supported my learning well. |  |  |  |  |
| Q19 | The library resources (e.g. books, online services and learning spaces) have supported my learning well. |  |  |  |  |
| Q20 | I have been able to access course-specific resources (e.g. equipment, facilities, software, collections) when I needed to. |  |  |  |  |
|  |  | 16 | It has been easy to access learning resources (digital and physical) provided by my institution when I needed to. | 20 | Have you been able to access the learning resources (either digital or physical) that you need? |
|  |  | 17 | Learning resources (digital and physical) provided by my institution have supported my learning well | 21 | How well have the physical and/or digital resources supported your learning? |
| Q21 | I feel part of a community of staff and students. |  |  |  |  |
| Q22 | I have had the right opportunities to work with other students as part of my course. | 10 | I have had the right opportunities to work with other students as part of my course | 7 | When working with other students as part of your course, was this helpful for your learning? |
| Q23 | I have had the right opportunities to provide feedback on my course. |  |  | 22 | Do you get the right opportunities to give feedback on your course? |
| Q24 | Staff value students’ views and opinions about the course. |  |  | 23 | Do staff value students’ opinions about the course? |
| Q25 | It is clear how students’ feedback on the course has been acted on. |  |  | 24 | Do staff act on students’ feedback? |
| Q26 | The students’ union (association or guild) effectively represents students’ academic interests. | 18 | Overall, I am content with the students’ union (association or guild) at my institution | 25 | Has the Students’ Union (Association or Guild) had a positive impact on your experience? |
| Q27 | Overall, I am satisfied with the quality of the course. | 19 | Overall, the quality of my course has been good. | 26 | Overall, how would you rate the quality of your course? |
|  |  |  |  | 27 | On a scale of 0 - 10 how likely are you to recommend your course to a friend or a colleague? |
|  |  | 20 | My institution has made me aware of services to support my mental wellbeing | 28 | Are you aware of services at your university/college to support your mental wellbeing? |
|  |  | 21 | My institution’s services to support my mental wellbeing were available when I needed them | 29 | How easy is it to access your university or college’s mental wellbeing services? |
|  |  | 22 | My institution provides a free environment for the expression of ideas, opinions and beliefs | 30 | During your studies, have you felt free to express your ideas, opinions, and beliefs? |
|  |  | 23 | My course has given me the knowledge and skills I think I will need for the future | 31 | Has your course given you the knowledge and skills you think you will need for your future? |
The core survey
In its current form, the NSS is a questionnaire that is very much about the “academic” student experience.
You’ll see that scenario one is quite close to the current model (using the same response framework) – what’s curious is the omission of all the “organisation and management” questions. These have long been demonstrated to be a close determinant of overall satisfaction – they matter hugely to students on the basis that disruption is the enemy of learning – so their omission is worrying.
You’ll also see that in one scenario almost all of the questions about student feedback and representation are missing, and in another all of the questions about staff supporting learning outside of formal teaching tasks have been expunged. The message seems to be “the sector can have one or the other”, without any decent evidence on why these two areas are less of a priority than others.
In format terms, scenario two moves to posing questions rather than seeking agreement, using more active and student-centric language and a four-point Likert scale in an approach that is, ironically, much closer to the orthodoxy around consumer satisfaction. Both scenarios at least differentiate between “don’t know” and “not applicable” – an improvement on current practice.
And both new versions tidy up confusion in questions around learning resources by generalising in ways that might blur local accountability – moving away from questions based on specific services (libraries, IT) to a question on ease of access and one on the usefulness of the resources.
Gone from both scenarios is the question on feeling part of a community of staff and students – a problem in principle given how important other students are to the learning experience, and a problem in practice given that we’ve previously shown how close a predictor that question is of student mental health.
Beyond the old core focus on the student academic experience, things become harder to understand. Both scenarios add a question on free speech – one option asks how free students have felt themselves in expressing their “ideas, opinions, and beliefs”, another version asks them about the extent to which they think their institution provides a free environment for the expression of ideas, opinions and beliefs.
Surely we can agree that the survey should at least stick to asking how the respondent has experienced their provider rather than their views on what they’ve heard happens on someone else’s course?
Mental health was always going to be tricky. Here both options include awareness of services and access to those services. Maybe it’s too hard to do in the NSS – but missing the mental health impact both of what happens on the course itself and of the peer community and culture on campus seems a real omission.
Scenario two raises the prospect of a net recommender score by provider – asking students to rate how likely they are to recommend their course on a scale of 0-10. The big question for league table compilers will either be “overall, the quality of my course has been good” (S1) or “overall, how would you rate the quality of your course” – moving the goalposts from the current subjective and personal assessment of satisfaction to a comparison between the course as experienced and a Platonic ideal of higher education. Good in comparison to what?
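For readers unfamiliar with the mechanics, the conventional way a 0-10 recommendation question becomes a “net recommender” figure is the net promoter score: the percentage of respondents scoring 9-10 minus the percentage scoring 0-6. The OfS documents don’t specify how the question would be aggregated, so this is a sketch of the standard convention, not the regulator’s confirmed method:

```python
def nps(scores):
    """Net promoter score under the standard convention (an assumption here):
    % of promoters (9-10) minus % of detractors (0-6); 7-8 are passives."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Four promoters and four detractors out of ten cancel out exactly:
print(nps([10, 9, 8, 7, 6, 5, 0, 10, 9, 3]))  # -> 0.0
```

Note that the two passives (the 7s and 8s) simply dilute the score rather than count either way – one reason such scores are a blunt instrument for comparing providers.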
And the revised questions on students’ unions are amusing. Having received the edict not to use the word “satisfaction”, and having had a lot of feedback from SUs on the old wording of the union “representing academic interests”, students will instead be invited either to convey the positive impact that the SU had had on them, or more bizarrely, express that they are “content” with the SU. What will it mean if 50 percent of students are “strongly contented”? And why are we still asking a question about something that the majority of providers on the register don’t have?
Using the survey results
For all the way the NSS has crept into regulation and external quality assurance, the real value of the survey is in transparently holding a provider to account. SUs have long used responses to press for improvements and rethinks at a micro level, and anyone who has run a course or subject area knows the horror of being called before the Dean or HoD to explain dips and discrepancies.
With this in mind, the continued existence of an overall quality question – and, god forbid, net recommendation scores – feels like a sop to the misuse of a formative survey by regulators (and, indeed, journalists). There’s scant evidence of the use of NSS scores by applicants in deciding where to study – though they do turn up on Discover Uni for whatever that is worth.
The new questions do seem to have been designed with regulatory use in mind – the “chilling effect” will be illustrated with national statistics so let’s hope that one has been cognitively tested properly, and the mental wellbeing one smacks of ammunition for regulatory intervention – although questions on identity and safety that were floated in the 2019 PGT pilot haven’t made it in.
Likewise, the question on directed and independent study could very easily be parlayed into another “contact hours” panic – or, at least, it could if it wasn’t phrased so ambiguously that a student who felt there was too little directed study and a student who felt there was too much would tick the same box!
Looking across the two scenarios, there’s both a lack of a rationale or justification for changes, and plenty of inconsistencies. The whole student experience, or just the academic experience? My experience, or the culture on campus? What happens on my course, or what happens around here? Feedback on the experience, or what I think the outcomes will be? General, or broad? Contentment? Satisfaction? Recommendation? And so on.
What’s missing altogether is evidence of clear strategic decision making – all we get on the big questions of the sort posed here is that the pilot “was developed following a series of workshops with a range of stakeholders and cognitive testing with students”. As we’ve noted on the site before, without clarity over purpose and direction, synthesised from the responses to a consultation of that sort, you just end up with a mess.
There’s also no word on free-text comments and analysis (it’s surely handy to know why students feel the way they do), no word on why the questions don’t link explicitly to the emergent definitions of quality in the regulatory framework, no sense of why we’re asking every student in the country about institution-level aspects that could just as easily have a sample frame instead, and no clarity on whether we’ll introduce students to what amounts to a national statement of a “good” student experience at some point before the end of their course.
Whatever it was that Michelle Donelan and her advisors at DfE wanted to come out of the NSS review, it very clearly is not what we now have in front of us.
Far from exposing the “downward pressure on standards” caused by the NSS, the first phase of the review highlighted how the survey was used to drive standards upwards within providers. There has been no evidence – despite some very leading questions to SUs – of widespread “gaming” of the survey, and the lack of correlation between NSS and other measures of quality must at least admit the possibility that other measures of quality are not, in fact, measures of quality.
Though there is a bureaucratic burden on providers, it would be replicated by the use of other survey tools even if NSS was axed tomorrow – and the other three UK regulators feel that the current arrangements are fine.
And of course, the “radical review” which was to be concluded by the end of 2020 is neither especially radical nor complete even as we move into 2022. The whole episode has been one long embarrassment for DfE (particularly) and OfS.
Survey questions can and should iterate as understanding develops, but this should be balanced with maintaining a clear rationale and the need to preserve a time series. Where questions are unclear (arguably the issues around learning resources have been for a while) changes should be made, but making wording changes for the sake of it is hardly good research practice.
Maybe some will find sensible changes here, but it is not clear they are transformative enough to outweigh the loss of historic comparators.