The new NSS will leave you asking why

The Office for Students has published proposed new questions for the National Student Survey for 2023. David Kernohan and Jim Dickinson want answers.

David Kernohan is Deputy Editor of Wonkhe


Jim is an Associate Editor (SUs) at Wonkhe

Your bus is an hour late, you don’t know if the catering on campus will be open, and your now-online exam has just been yanked because the remote proctoring company is short staffed.

Ideal circumstances in which, then, to launch the National Student Survey 2022. If you think that Covid chaos will largely be over by February and you’re a provider that opted for an earlier collection period, you’ll be kicking yourself now.

But NSS 2022 isn’t the big news here. Alongside minor revisions to the documents that accompany the launch at this time of year, the Office for Students (OfS) has published a proposed question set for 2023 onwards as a kind of staging post in its protracted review of the survey.

There’s a further, UK wide NSS consultation to look forward to in the summer, ahead of potential changes in the 2023 iteration. This will be informed by a pilot survey, with some lucky 2022 respondents getting the chance to respond to one of two scenarios, and yet another round of engagement events examining potential changes to data publication – starting this month with SUs and other student representative bodies.

This is all just the latest iteration of a process that started in December 2020 – the interim (phase 1) findings very helpfully moved things away from the bizarre scorched-earth language from DfE that kicked this whole review off, and thus far the major changes to come have amounted to some tweaks to wording and the removal of the word “satisfaction”. Two scenarios are presented here.

Questions

We’ve tabulated the current questions, alongside “scenario 1” and “scenario 2”, here – drawing links between similar questions where we can:

| Number | 2022 text | 2023 S1 | 2023 S1 text | 2023 S2 | 2023 S2 text |
|---|---|---|---|---|---|
| Q01 | Staff are good at explaining things. | 1 | Teaching staff are good at explaining the course content | 1 | Are teaching staff good at explaining course content? |
| Q02 | Staff have made the subject interesting. | 2 | Staff have made the subject engaging | 2 | Do teaching staff make the subject engaging? |
| Q03 | The course is intellectually stimulating. | 3 | My course is intellectually stimulating | 3 | Is the course intellectually stimulating? |
| Q04 | My course has challenged me to achieve my best work. | 4 | My course has challenged me to achieve my best work | 4 | Do you feel challenged by your course? |
| | | 5 | There is an appropriate balance of breadth and depth in the content of my course | 8 | Does your course contain the right balance of depth and breadth? |
| | | 6 | The balance of directed and independent study on my course supports my learning well | 9 | Does your course contain the right balance of directed and independent study? |
| Q05 | My course has provided me with opportunities to explore ideas or concepts in depth. | 7 | My course has provided me with opportunities to explore ideas or concepts in depth. | | |
| Q06 | My course has provided me with opportunities to bring information and ideas together from different topics. | 8 | My course has provided me with opportunities to bring information and ideas together from different topics. | 6 | Have you had the chance to bring together information and ideas from different topics? |
| Q07 | My course has provided me with opportunities to apply what I have learnt. | 9 | My course has provided me with opportunities to apply what I have learnt. | 5 | Have you had the chance to apply the theories and concepts that you have learnt? |
| Q08 | The criteria used in marking have been clear in advance. | 11 | The criteria used in marking have been clear in advance. | 13 | Were you given the marking criteria in advance? |
| Q09 | Marking and assessment has been fair. | 12 | Marking and assessment has been fair. | 11 | Has marking and assessment been fair? |
| Q10 | Feedback on my work has been timely. | 13 | Feedback on my work has been timely. | | |
| Q11 | I have received helpful comments on my work. | 14 | I have received helpful comments on my work. | | |
| | | 15 | Assessments have allowed me to demonstrate what I have learned on my course | 10 | Have assessments allowed you to demonstrate what you have learnt? |
| | | | | 12 | Did you understand the marking criteria used to assess your work? |
| | | | | 14 | Has feedback helped you improve your work? |
| Q12 | I have been able to contact staff when I needed to. | | | 15 | Are you able to contact teaching staff when you need to? |
| Q13 | I have received sufficient advice and guidance in relation to my course. | | | | |
| | | | | 16 | How well have teaching staff supported your learning? |
| Q14 | Good advice was available when I needed to make study choices on my course. | | | 17 | Are you able to get good advice about study choices? |
| Q15 | The course is well organised and running smoothly. | | | 18 | Is the course well organised? |
| Q16 | The timetable works efficiently for me. | | | | |
| Q17 | Any changes in the course or teaching have been communicated effectively. | | | 19 | Have changes to the course been clearly communicated? |
| Q18 | The IT resources and facilities provided have supported my learning well. | | | | |
| Q19 | The library resources (e.g. books, online services and learning spaces) have supported my learning well. | | | | |
| Q20 | I have been able to access course-specific resources (e.g. equipment, facilities, software, collections) when I needed to. | | | | |
| | | 16 | It has been easy to access learning resources (digital and physical) provided by my institution when I needed to. | 20 | Have you been able to access the learning resources (either digital or physical) that you need? |
| | | 17 | Learning resources (digital and physical) provided by my institution have supported my learning well | 21 | How well have the physical and/or digital resources supported your learning? |
| Q21 | I feel part of a community of staff and students. | | | | |
| Q22 | I have had the right opportunities to work with other students as part of my course. | 10 | I have had the right opportunities to work with other students as part of my course | 7 | When working with other students as part of your course, was this helpful for your learning? |
| Q23 | I have had the right opportunities to provide feedback on my course. | | | 22 | Do you get the right opportunities to give feedback on your course? |
| Q24 | Staff value students’ views and opinions about the course. | | | 23 | Do staff value students’ opinions about the course? |
| Q25 | It is clear how students’ feedback on the course has been acted on. | | | 24 | Do staff act on students’ feedback? |
| Q26 | The students’ union (association or guild) effectively represents students’ academic interests. | 18 | Overall, I am content with the students’ union (association or guild) at my institution | 25 | Has the Students’ Union (Association or Guild) had a positive impact on your experience? |
| Q27 | Overall, I am satisfied with the quality of the course. | 19 | Overall, the quality of my course has been good. | 26 | Overall, how would you rate the quality of your course? |
| | | | | 27 | On a scale of 0-10 how likely are you to recommend your course to a friend or a colleague? |
| | | 20 | My institution has made me aware of services to support my mental wellbeing | 28 | Are you aware of services at your university/college to support your mental wellbeing? |
| | | 21 | My institution’s services to support my mental wellbeing were available when I needed them | 29 | How easy is it to access your university or college’s mental wellbeing services? |
| | | 22 | My institution provides a free environment for the expression of ideas, opinions and beliefs | 30 | During your studies, have you felt free to express your ideas, opinions, and beliefs? |
| | | 23 | My course has given me the knowledge and skills I think I will need for the future | 31 | Has your course given you the knowledge and skills you think you will need for your future? |

The core survey

In its current form, the NSS is a questionnaire that is very much about the “academic” student experience.

You’ll see that scenario one is quite close to the current model (using the same response framework) – what’s curious is the omission of all the “organisation and management” questions. These have long been demonstrated to be a close determinant of overall satisfaction – they matter hugely to students on the basis that disruption is the enemy of learning – so their omission is worrying.

You’ll also see that in one scenario almost all of the questions about student feedback and representation are missing, and in another all of the questions about staff supporting learning outside of formal teaching tasks have been expunged. The message seems to be “the sector can have one or the other”, without any decent evidence on why these two areas are less of a priority than others.

In format terms, scenario two moves to posing questions rather than seeking agreement, using more active and student-centric language and a four-point Likert scale in an approach that is, ironically, much closer to the orthodoxy around consumer satisfaction. Both scenarios at least differentiate between “don’t know” and “not applicable” – an improvement on current practice.

And both new versions tidy up confusion in questions around learning resources by generalising in ways that might generate confusion in local accountability – moving away from questions based on specific services (libraries, IT) to a question on ease of access and one on the usefulness of the resources.

Gone from both scenarios is the question on feeling part of a community of staff and students – a problem in principle given how important other students are to the learning experience, and a problem in practice given that we’ve previously shown how close a predictor that question is of student mental health.

The margin

Beyond the old core focus on the student academic experience, things become harder to understand. Both scenarios add a question on free speech – one version asks students how free they themselves have felt to express their “ideas, opinions, and beliefs”, the other asks about the extent to which they think their institution provides a free environment for the expression of ideas, opinions and beliefs.

Surely we can agree that the survey should at least stick to asking how the respondent has experienced their provider rather than their views on what they’ve heard happens on someone else’s course?

Mental health was always going to be tricky. Here both options cover awareness of services and access to those services. Maybe it’s too hard to do in the NSS – but missing the mental health impact both of what happens on the course itself and of the peer community and culture on campus seems a real omission.

Scenario two raises the prospect of a net recommender score by provider – asking students to rate how likely they are to recommend their course on a scale of 0-10. The big question for league table compilers will either be “overall, the quality of my course has been good” (S1) or “overall, how would you rate the quality of your course” (S2) – moving the goalposts from the current subjective and personal assessment of satisfaction to a comparison between the course as experienced and a Platonic ideal of higher education. Good in comparison to what?
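
For those unfamiliar with the format, the 0-10 recommendation question mirrors the commercial Net Promoter Score. The OfS documents don’t say how a “net recommender score” would actually be calculated, so the sketch below assumes the standard NPS convention (9-10 counts as a promoter, 0-6 as a detractor); treat it as illustrative rather than a confirmed methodology.

```python
# A sketch of the conventional Net Promoter Score calculation.
# The 9-10 promoter / 0-6 detractor thresholds are the standard NPS
# convention, NOT a confirmed NSS methodology.

def net_promoter_score(ratings: list[int]) -> float:
    """Return an NPS in the range -100 to 100 from a list of 0-10 ratings."""
    if not ratings:
        raise ValueError("no ratings supplied")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# A cohort clustered in the "passive" 7-8 band nets out near zero,
# even if nobody is actively unhappy.
print(net_promoter_score([10, 9, 8, 8, 7, 7, 6, 5]))  # 0.0
```

One quirk worth noting: under that convention a provider whose students mostly answer 7 or 8 scores close to zero, which is another reason to be wary of such scores in league tables.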

And the revised questions on students’ unions are amusing. OfS has received the edict not to use the word “satisfaction”, and plenty of feedback from SUs on the old wording about the union “representing academic interests” – so students will instead be invited either to convey the positive impact the SU has had on them or, more bizarrely, to express that they are “content” with the SU. What will it mean if 50 percent of students are “strongly contented”? And why are we still asking a question about something that the majority of providers on the register don’t have?

Using the survey results

For all the ways the NSS has crept into regulation and external quality assurance, the real value of the survey is in transparently holding a provider to account. SUs have long used responses to press for improvements and rethinks at a micro level, and anyone who has run a course or subject area knows the horror of being called before the Dean or HoD to explain dips and discrepancies.

With this in mind, the continued existence of an overall quality question – and, god forbid, net recommendation scores – feels like a sop to the misuse of a formative survey by regulators (and, indeed, journalists). There’s scant evidence of the use of NSS scores by applicants in deciding where to study – though they do turn up on Discover Uni, for whatever that is worth.

The new questions do seem to have been designed with regulatory use in mind – the “chilling effect” will be illustrated with national statistics, so let’s hope that one has been cognitively tested properly, and the mental wellbeing one smacks of ammunition for regulatory intervention – although questions on identity and safety that were floated in the 2019 PGT pilot haven’t made it in.

Likewise, the question on directed and independent study could very easily be parlayed into another “contact hours” panic – or, at least, it could if it wasn’t phrased so ambiguously that a student who felt there was too little directed study and a student who felt there was too much would tick the same box!

Looking across the two scenarios, there’s a lack of rationale or justification for the changes, and plenty of inconsistencies. The whole student experience, or just the academic experience? My experience, or the culture on campus? What happens on my course, or what happens around here? Feedback on the experience, or what I think the outcomes will be? General, or broad? Contentment? Satisfaction? Recommendation? And so on.

What’s missing altogether is evidence of clear strategic decision making – all we get on the big questions of the sort posed here is that the pilot “was developed following a series of workshops with a range of stakeholders and cognitive testing with students”. As we’ve noted on the site before, without clarity over purpose and direction extracted from the synthesis of a consultation of that sort, you just end up with a mess.

There’s also no word on free-text comments and analysis (it’s surely handy to know why students feel the way they do), no word on why the questions don’t link explicitly to the emergent definitions of quality in the regulatory framework, no sense of why we’re asking every student in the country about institution-level aspects that could just as easily have a sample frame instead, and no clarity on whether we’ll introduce students to what amounts to a national statement of a “good” student experience at some point before the end of their course.

Satisfactory?

Whatever it was that Michelle Donelan and her advisors at DfE wanted to come out of the NSS review, it very clearly is not what we now have in front of us.

Far from exposing the “downward pressure on standards” caused by the NSS, the first phase of the review highlighted how the survey was used to drive standards upwards within providers. There has been no evidence – despite some very leading questions to SUs – of widespread “gaming” of the survey, and the lack of correlation between NSS and other measures of quality must at least admit the possibility that other measures of quality are not, in fact, measures of quality.

Though there is a bureaucratic burden on providers, it would be replicated by the use of other survey tools even if NSS was axed tomorrow – and the other three UK regulators feel that the current arrangements are fine.

And of course, the “radical review” which was to be concluded by the end of 2020 is neither especially radical nor complete even as we move into 2022. The whole episode has been one long embarrassment for DfE (particularly) and OfS.

Survey questions can and should iterate as understanding develops, but this should be balanced with maintaining a clear rationale and the need to preserve a time series. Where questions are unclear (arguably the issues around learning resources have been for a while) changes should be made, but making wording changes for the sake of it is hardly good research practice.

Maybe some will find sensible changes here, but it is not clear they are transformative enough to outweigh the loss of historic comparators.

5 responses to “The new NSS will leave you asking why”

  1. The current NSS uses the term “staff” throughout. But the new scenarios can’t decide whether it’s “staff” or “teaching staff” only. There is inconsistency right at the start of scenario 1. Q01 is about “teaching staff”, but Q02 uses the broader term “staff” – looks like a typo, or a sly piece of survey testing! The morphing between “staff” and “teaching staff” also occurs when comparing the current Q12 with scenario 2 Q15, where only contacting “teaching staff” now matters. Similarly the old general advice & guidance Q13 goes, to be replaced only in Scenario 2 with Q16 asking about “teaching staff” supporting learning. All of this points to confusion about who enables, enthuses and supports student learning. There should be due recognition that professional service colleagues can have a profound impact on student learning, outcomes and experience. One hopes it’s always a positive impact, but students should be able to give a view in the NSS, not one restricted to “teaching staff” only. And then by Q23 and Q24 in Scenario 2 we’re back to all “staff” valuing and acting upon student feedback.

  2. I disagree with your assessment that the new version ‘tidies up’ confusion around learning resources. With these new questions, we won’t know whether students are reflecting on library resources, IT, or other resources shared by academics such as powerpoint slides or lecture recordings.

  3. Not sure I agree, David – I would suggest that the whole episode has been one long embarrassment for (particularly) OfS, or it would be if that were an emotion that OfS were capable of. I don’t see that any of the revised questions are clearer or more useful than those which went before. It demonstrates some very muddled (or addled) thinking, and you’d expect the simultaneous publication of a document which provides some sort of rationale. Except, as this is OfS, you wouldn’t. Sed quis custodiet ipsos custodes? (But who will guard the guards themselves?)

  4. “During your studies, have you felt free to express your ideas, opinions, and beliefs?”

    So a student who feels that they have not had the opportunity to discuss “the great replacement” or why the Holocaust wasn’t real will answer no?
