This article is more than 3 years old

This is no way to review the National Student Survey

A shortened review, done in difficult times and without proper representation on the review panel, will not improve the National Student Survey, says Gwen Van der Velden.

Gwen van der Velden was until recently deputy pro vice chancellor for education at the University of Warwick. She is currently on sabbatical as co-leader of the Leadership for Educational Transformation programme for 40 education leaders from Ukraine

Last week the Office for Students published an update to the membership of the NSS Review Group.

One of the terms of reference of the OfS NSS Working Group is to “ensure the UK wide role of the survey is considered in any recommendations”. And so the question arose: how many representatives from Scotland would you expect to be on the group?

The answer is “none”. Northern Ireland also appears not to be represented. Somehow, it seems, the Review Group continues to struggle with representation.

What has happened?

We should assume this has occurred for understandable reasons, such as the pressures the current pandemic puts on the system and therefore the people in that system. Or there may be other considerations at play. Northern Ireland often follows an English policy position, but that’s hardly the same for Scotland. Perhaps invitations were made and declined.

I find myself wondering whether it makes any sense to continue this review with such a limited review group, and while the sector is undergoing massive change. I have serious reservations, having myself been an expert member of the review of the NSS in 2017, a HEPSIG member, and later chair of the NSS subgroup of HEFCE’s Student Information Advisory Group (SIAG).

In that earlier review, which ended just three years ago, we ensured all UK countries were represented, all mission groups, all types of HE providers and, above all, students. It is unclear why this is no longer a valid approach. In fairness, the sizes of the review group and advisory groups were considerable, which was time-consuming and at times caused great debate. Every aspect of each proposed change was questioned by some, and supported by others. It took quite some effort at times to ensure progress.

But it was also what ensured that every aspect of the survey was exposed to extensive scrutiny. Its applicability to all parts of the sector, and its relevance to students, was tested. It remains unclear how the same level of rigour is being achieved in this review, as it presses on during pandemic times with no current students on the panel and limited representation. And it is not as if we do not know that diversity leads to better decisions.

A fixed star

Taking a review approach informed by a wide range of voices from the sector matters even more right now, when providers’ approaches to delivery are going through a sea change. Good quality data is crucial to understanding and evaluating what happens in the sector right now.

Institutions are grappling with the impact of having moved assessment online this summer, and having shifted teaching and learning towards a myriad of blended and hybrid learning forms. We are all working out how new methods have impacted the student experience, and we are seeing gains for some student groups, and losses for others. Our internal feedback efforts tell us students largely want a return to on-campus study, but not to the extent of returning to delivery and engagement “as before”. We’re rethinking platforms, assessment routines, student data use and above all, student engagement with study in a very new setting.

Next year’s NSS could help us understand initial reactions to the revised student experience. We can compare this year’s exceptional cohort to previous cohorts, how different groups have responded, and crucially, how different approaches across different parts of the sector have created more or less success. I am aware that several universities have changed their open questions to hear from students directly what they feel should be maintained of the changes that were made this year. Imagine the learning that can come from this across the sector and how it will inform a better student experience.

No going back

Knowing full well that the sector will not return to a bygone era, we need to rely on sound data to inform the ongoing transformation of the sector. We need to see how the next few cohorts of finalists respond. In 2021-22, feedback from our current second years – who have experienced our traditional approaches, then a year of blended learning, and in their final year the first version of a “new normal” – could be of enormous influence on the sector. In 2022-23 we will receive feedback from students who started in exceptional circumstances and will have seen our increasing understanding of new methods of teaching. It will tell us more about the pedagogical transformation across the sector than any other mechanism. After all, the NSS still provides the only constant data collection across the UK sector that allows all institutions to learn from each other – independent of whether they can afford to take part in any private survey offers. The next few years will only help the sector further on that – but only if there is a continuity of NSS data we can rely upon.

Ironically, the first of the terms of reference of the review itself states that the review must “Clarify the purpose of the NSS in the current higher education landscape.” That landscape contains all four countries, all manner of different institutions and before we forget it: students. And we cannot make the mistake of interpreting the HE landscape as just being the UK. It must also relate to the international standing of the UK sector for the education it offers.

The view overseas

As we are about to become much more isolated in the global HE scene than we have ever been before, the UK’s higher education reputation greatly matters. We may have gained credibility on the research side with the recent and welcome vaccine research at Oxford, yet some of our responses to the pandemic have not done us proud in similar ways. The NSS contributes to our international standing, as one of a limited number of countries where students’ experiences of study and their university actually matter – a welcome message to parents and students nationally and internationally. As a sector we can and do provide evidence that very high numbers of our students, year on year, record high levels of satisfaction on core aspects of our offer. Go to any international recruitment fair, look at our prospectuses or simply the websites we all produce. Agree with the NSS or not, we know it helps us to state how well our students think our universities do.

Fortunately, the sector knows and understands most of this. As in previous reviews of the NSS, I expect the consultations (the survey on the survey) to tell the review group exactly what they have told review groups previously. The sector understands the need for continuity and supports it. Universities rely on the data and want the student voice to remain in the policy game. I fully expect that, at this time of major change and insecurity, this is exactly what the review group will have been told by the sector again.

3 responses to “This is no way to review the National Student Survey”

  1. No invitation was issued to Scotland. Requests from Scotland to be formally part of Phase 1 WG have been declined.

  2. I think this takes a somewhat rosy view of the NSS; the survey is so laden with biases caused by student characteristics, social background and study context as to be virtually useless as a tool for comparison.

    Designing a survey tool by committee is an approach that I hope the majority of students who study psychological and social phenomena would see as flawed and unacademic.
