It’s been easy to write off the Office for Students’ review of the NSS as something of a damp squib.
We’ve come a long way from the apocalyptic language in the original bureaucracy memo – quite where those ideas came from may only be revealed in Michelle Donelan’s hotly anticipated autobiography – and phase two of the review left us looking at an evolution rather than a revolution.
Since then OfS has been testing both statements and direct questions on a subsample of 2022 students. There was stuff to welcome (on mental health), stuff to ponder (the loss of the “community” question and the possible move to questions rather than statements), and stuff to grin and bear (the inevitable “free speech” question). And of course the all-important replacement of the word “satisfaction” with the more nebulous “the quality of my course” (compared to what, exactly?).
This year’s results were useful in highlighting the issue of course overcrowding at more “prestigious” universities. Across the sector course leaders and student representatives are using the data to address student concerns in future iterations – alongside, of course, the myriad other data sources gathered from students throughout the course.
What to keep
Or, first up, what isn’t up for debate?
There’s no chance to comment on the aims and purpose of the survey, or the approach to surveying students (NSS remains a population survey). Publication of results is deferred to a future technical consultation next year – hopefully OfS will have sorted out publication more generally by that point. The long rumoured extension of NSS to cover all years of study is not up for debate, having been out of scope for the initial review. And there’s still no sign of a PGT NSS (announced here by OfS’ Director of External Relations in 2018), PGRs have barely ever been mentioned, and those yearning for national pre-arrival questionnaires or a consistent way of capturing the views of those who “drop out” can go whistle.
The core criteria for questions – that anything asked in the NSS should concern the academic experience (and especially learning and teaching), inform student choice, enhance the student experience, and ensure public accountability – are up for grabs, but the default proposal is that these should not change from those published in 2017. We do look forward to seeing people argue that the current moral panic about freedom of expression and freedom of speech is an issue for “learning and teaching”, or an “issue of enduring importance in UK higher education rather than a transient policy interest”, or indeed that it is a “measurable and valid issue”.
Satisficing
The first key proposal is a shift to direct questions rather than statements. There’s further testing and piloting to come, which will result in tweaks to wording and ensure that each question deals with only one issue. We get a bunch of examples of how this might be done – for example question 2 (“Staff have made the subject interesting”) might become “How often do teaching staff make the subject engaging?”.
To me, at least, that’s worse rather than better. What if staff made the subject interesting once at the start of year one, to the extent that students were enthralled enough to work through three intense years of physiology (fascinating, of course, but maybe not always)? That would score highly on the old question and poorly on the new one. And we’d be using a frequency scale – we have no idea how this would be derived, or whether it would resort to the “always, sometimes… never” type construction that seldom survives cognitive testing.
Why are they doing this? Well, the claim is that agree/disagree scales can lead to acquiescence bias (subjects agree because they want to be agreeable) – this is backed up with a link to Jon Krosnick’s seminal paper on “satisficing”. In very broad layman’s terms this sets out that surveys are work, and respondents are likely to take measures to make answering easier. The bit about agreeing with everything refers only to binary (agree/disagree) responses and generally applies to interview-style surveying – there are also sections about endorsing the status quo (not really applicable here) and selecting the first acceptable alternative response (which doesn’t apply either).
In other words, the research cited doesn’t back up the claim made. Poor show. And, funnily enough, OfS’ own pilot showed that students were less likely to be able to answer direct questions, and that there was no “satisficing” effect recorded.
Summative
The overall question – “Overall, I am satisfied with the quality of the course” – could be removed in England. Why? It would be – and we quote directly from paragraph 43 here – “too consumerist in nature”. They’re also unhappy that people are using it in league tables.
And yes, that is just the OfS position. Regulators in Scotland, Wales, and Northern Ireland are apparently fine with it. You are not dreaming, you are awake and reading an article on Wonkhe.
In OfS regulation (for example in TEF and conditions B1, B2, and B4) we’re already seeing the use of individual questions and scales. In contrast, SFC uses the overall question as a key performance indicator, HEFCW uses it in regulatory analysis, and DfENI uses the summative question in annual provider review reports. All four UK regulators are keen to keep the survey comparable UK wide – but apparently Q27 is the hill on which OfS wish to die.
the benefits that some have identified in maintaining the same summative question across the UK is outweighed in England by the need to ensure clear links between the information provided by the NSS and the aspects of quality that are subject to regulation.
It feels worth noting that no regulator currently uses every question in the NSS, and yet somehow the sector survives.
Speech
Freedom of expression was, apparently, “a theme consistently raised by stakeholders”. We wonder which ones – seriously, there’s nothing on this in the phase 1 report, or in the spartan Annex E of this consultation that covers stakeholder engagement. Here’s the token paragraph of justification:
Freedom of expression is an essential element of students’ higher education experience. They are entitled to be taught by staff holding a wide range of views, even where these may be unpopular or controversial, and to similarly express their own views. The OfS receives notifications from staff and students who identify a ‘chilling effect’ on their ability to express their cultural, religious or political views without fear of repercussions.
Students understand “freedom of expression” as a concept better than “freedom of speech” – so the proposed question will be something like:
During your studies, how free did you feel to express your ideas, opinions and beliefs?
With the opportunity to indicate that you felt very free, fairly free, not very free, or not at all free to do what you want, any old time. (I’m a new creation.)
Does that question, we have to ask, address the issue as identified? It doesn’t ask anything about the views held by staff, and it doesn’t mention repercussions – though you can of course have freedom of speech alongside repercussions, especially if the repercussions are themselves freedom of speech. (DK tells Jim that Eurovision is terrible, Jim tells DK that he’s a fool to think so, DK claims that he has been “silenced” and writes a column for Unherd.)
Curiously, the “freedom of expression” question will be the last thing that students (in England, at least) answer in the survey – after the core NSS, free text response, and optional banks. We’re not given a reason for this placement, but there is a chance to feed back on it.
Support
A wide range of stakeholders, including students, raised mental wellbeing as an issue that should be examined via the NSS. Two questions were originally tested: one on the adequacy of mental health provision, and one on awareness of mental health support. The former resulted primarily in “don’t know” responses during the pilot, suggesting that only a minority of students have direct experience of mental health support services, so the proposed question will focus on awareness.
Here’s the example given:
How well communicated was information about your university or college’s mental wellbeing support services?
(Seriously? The editor in us wants to change this: “How well has information about your provider’s mental wellbeing support service been communicated to you?” at least reads well…)
Anyway, for some reason students would get six shades of response here (extremely well, very well, fairly well, fairly badly, very badly, extremely badly), along with the standard “this does not apply to me”. As feedback on provider communications with students we guess this works – but it does illustrate the issue of using a population survey to ask about issues that only affect a minority of students.
Clearly we can’t survey only students who have experienced mental health or wellbeing issues (and these are two separate issues – we don’t know why they have been lumped together) on a national level without ethical problems, but it is not clear what – if anything – this alternative does. Ironically, the question on feeling part of a community of staff and students (the “belonging” question) – which we’ve previously found to have a strong correlation with wellbeing and continuation – has been removed with no justification at all.
Cyclical
Reviewing things every four years or so is generally a good idea. Once we did it with universities. In the past NSS reviews have been ad hoc, so this proposal puts changes on a defined cycle. It’s not expected that changes would be made each time – we’re still keen to develop a meaningful time series so we can track trends (not least in regulation).
Because of planned links to Data Futures, the survey window will be slightly shorter if proposals in this consultation are accepted – starting in early February and ending in early April. Publication would still happen in July. The benefit here is reducing provider burden, allowing for the reuse of data already collected. Many providers already start the survey in mid-February, and this does not appear to have had an impact on response rates.
There’s also a set of questions on the impacts proposed changes to the survey will have on the use of the Welsh language. It is a little surprising that there is no narrative in here – it shouldn’t have been impossible to ask a Welsh speaker with expertise in higher education (there are quite a few in HEFCW) to examine some of the likely issues.
Structure
We are told that the “preferred” version of the survey is basically option one, as trialled with students earlier this year. We tabulated these, including links to existing questions, in this article. In actual fact there have been a fair few changes since then. Here’s where we are:
Group | Number | Proposed question | Scale |
---|---|---|---|
Teaching on my course | 1 | How often are teaching staff at explaining course content/things? (sic) | Frequency |
Teaching on my course | 2 | How often do teaching staff make the subject engaging? | Frequency |
Teaching on my course | 3 | How often is the course intellectually stimulating? | Frequency |
Teaching on my course | 4 | How often does your course challenge you to achieve your best work? | Frequency |
Learning opportunities | 5 | To what extent have you had the chance to apply theories and concepts that you have learnt? | Extent |
Learning opportunities | 6 | To what extent have you had the chance to explore ideas or concepts in depth? | Extent |
Learning opportunities | 7 | To what extent have you had the chance to bring together information and ideas from different topics? | Extent |
Learning opportunities | 8 | When working with other students as part of your course, how helpful was this for your learning? | "Helpfulness" |
Learning opportunities | 9 | To what extent does your course introduce subjects and skills in a way that builds on what you've already learnt? | Extent |
Learning opportunities | 10 | To what extent does your course have the right balance of directed and independent study? | Extent |
Learning opportunities | 11 | How well has your course developed your knowledge and skills that you think you'll need for your future? | "How well" |
Assessment and feedback | 12 | How often have assessments allowed you to demonstrate what you have learnt? | Frequency |
Assessment and feedback | 13 | How clear were the marking criteria used to assess your work? | Clarity |
Assessment and feedback | 14 | How fair has the marking and assessment been on your course? | "Fairness" |
Assessment and feedback | 15 | How timely was your feedback? | "Timeliness" |
Assessment and feedback | 16 | How often has feedback helped you to improve your learning? | Frequency |
Academic support | 17 | How easy was it to contact teaching staff when you needed to? | "Easy" |
Academic support | 18 | How well have teaching staff supported your learning? | "How well" |
Academic support | 19 | How often were you able to get good advice about study choices? | Frequency |
Organisation and management | 20 | How well organised is your course? | "Organised" |
Organisation and management | 21 | How clearly were any changes to the course communicated? | "Clearly" |
Learning resources | 22 | How often have you been able to access the learning resources (either digital or physical) that you need? | Frequency |
Learning resources | 23 | How well have the physical and/or digital resources supported your learning? | "How well" |
Learning resources | 24 | How well have the IT resources and facilities supported your learning? | "How well" |
Learning resources | 25 | How well have the library resources (e.g. books, online services and learning spaces) supported your learning? | "How well" |
Student voice | 26 | To what extent do you get the right opportunities to give feedback on your course? | Extent |
Student voice | 27 | To what extent are students' opinions about the course valued by staff? | Extent |
Student voice | 28 | How clear is it that students' feedback on the course is acted on? | "Clearly" |
Student voice | 29 | How effectively does the students' union (association or guild) represent students' academic interests? | "Effectively" |
Summative | 30 | Overall, how would you rate the quality of your course? (S/W/NI only) | "Good" |
Open text | 31 | Open text | Open text |
Mental wellbeing services | 32 | How well communicated was information about your university or college's mental wellbeing support services? | "How well" |
Freedom of expression | 33 | During your studies, how free did you feel to express your ideas, opinions and beliefs? | "How free" |
This is far from a settled set of questions, and notes in annex D of the consultation offer some rationales for changes. We learn that the frequency scale we slagged off above is “used to facilitate averaging out views over the student’s experience” – there are a lot of “how often” constructions creeping in, and I’m a little nervous about focusing on frequency rather than impact.
There’ll be some consolidation in the “learning opportunities” category, with the group work question also needing further thought. “Breadth and depth” has been replaced with an idea of “building on what you have already learned”, and there are questions on “the balance of directed and independent study” and on “employment skills” that have been sneaked in without consultation (you should probably feed back on that).
The new SUs question is remarkable – we know from research that asking about students’ unions’ impact on students’ “academic interests” confuses them, so the pilot tested “contentment” with the SU in general (where contentment was probably too close to satisfaction) and a variant based on the “impact” of the SU on the academic experience, which performed really badly under test conditions. The settled proposal here is to ask how effectively the students’ union (association or guild) represents students’ academic interests – which, given that the vast majority of providers on the OfS register don’t have an SU, and the whopping third of students who answer “neither agree nor disagree” on the current academic interests question, feels entirely the wrong place to have ended up.
Compared to the current survey we have lost questions on timetabling (despite the consistent link over time to overall satisfaction), feeling part of a community (thank god no-one’s worried about belonging at the moment), and the overall question – and the three “resources” questions have been expanded into four.
Surprise
So we are looking at some fairly major changes to the survey, meaning we can no longer construct time series going back beyond 2023. This is par for the course in updating a survey – and we already struggled to look past 2018. We do feel like it would be interesting to examine if student relationships with and expectations of their provider have changed permanently since the pandemic, and whether this varies by subject, provider, and student characteristics.
There is a lot to welcome. The demise of the five-point Likert scale is in line with current best practice in survey design, the tweak to the start date does not appear to cause completion issues and helps reduce burden on providers, and putting reviews of the survey on a cyclical footing helps avoid nasty surprises in future.
The shift to direct questions, frankly, doesn’t convince. A lot of the wording is clumsy, we’re adding in ideas of frequency where these aren’t germane to what is being asked, and we are breaking a lot of question-level time series that could otherwise be very helpful in understanding longer term trends.
The “freedom of expression” question is outside of the core survey criteria, is phrased badly, and has most likely been added at the instigation of ministers. We would be interested to see a population survey on student attitudes towards freedom of expression – but we’d want to see this done in a way that could be compared with the general population, perhaps in collaboration with ONS.
There’s also scope to work with ONS on wellbeing questions – how good the posters in the toilets are is probably not what we need to be measuring in order to better support student mental health. We would argue that questions directly addressing the wellbeing of final year students would be a far better use of survey space.
The SU question needs changing desperately – it barely encompasses the breadth of work of SUs nowadays.
One can only assume that the proposed Q1 is an exercise in testing whether the consultation works. The errors are in the original document at p. 33. Presumably “often” should read “good”, and the “good” scale should be deployed.
“Work on this question is ongoing.”
They’ve now corrected it: “How often are teaching staff good at explaining course content/things”. So if it’s not exactly clear, it’s at least a bit less unclear.
I think the “at” may be rogue.
The OfS ran a PGT survey pilot with a number of providers this year. Next steps are awaited.
‘During your studies, how free did you feel to express your ideas, opinions and beliefs?’
From previous experience of the difficulty of designing survey questions that presume student knowledge of current debates, I don’t think the majority of students will understand this question as phrased in the spirit it is intended – they will instead take it literally.
I think if the question is phrased like this, arts and humanities subjects will be more likely to score highly and STEM subjects low – because students will assume that the question means expressing their ideas during assessments and formative projects (of course there are some STEM subjects that are creative and so will let students express their ideas, but broadly speaking!). It will be interesting to see if my prediction is true and the government get the opposite result to the one they are hoping for!