As a new intake of students excitedly buzzes around campus during induction week, I think about the messages we give them during this time, all with a focus on success, engagement, support, development and partnership. During the earliest weeks of the semester, first-year students are recruited to become course reps: we will empower them to represent their fellow students and make a positive difference to the experience of those they represent… but aren’t we just lying to them?
While all universities have mechanisms for feedback and student representation, the latest NSS results make it clear that though we provide the opportunities, we fail to demonstrate the impact of the student voice. Question 25, “It is clear how students’ feedback has been acted on”, is the lowest-scoring institutional question in the NSS nationally (excluding Q26 regarding students’ unions), and with the subject-level TEF on its way this should be a matter of real concern.
The changing student voice
There have been attempts over the years to define the relationship between universities and students, resulting in a spectrum of views from “students as customers” at one end to “students as partners” at the other. As highlighted earlier by Kate Little on Wonkhe, the inclusion of questions regarding ‘student voice’ in the NSS was a positive indication that the sector was beginning to catch up with the QAA’s push for student partnership – outlined in Chapter B5 of the Quality Code.
We see consensus across the sector – in the UK, but also in the US – that student engagement both leads to positive development of institutional policies and practices and helps to create a sense of community on our campuses, which in turn results in higher levels of retention, academic success and student satisfaction.
However, NSS results demonstrate that while we provide feedback opportunities, we fail at a local level to demonstrate the impact of the student voice. The risk of feedback without impact is that, even at the most local level such as module evaluations, it becomes little more constructive and developmental than automatically giving your Uber driver a 5* rating. I acknowledge that many institutions run campaigns to “close the feedback loop”, such as “You said, We did”, and I would argue that universities have taken huge steps to begin embedding a sense of partnership with their students – but it obviously is not enough.
TEF: Shouting into the void
Unfortunately, the issue goes beyond individual universities. Jo Johnson has halved the weight of the NSS in the next TEF, which, while welcomed by some, signals to students that their voice is less important than before – despite their entering a system they are supposedly at the heart of. The introduction of the subject-level TEF is supposed to “provide even better information to students, and be an even more powerful driver for quality and value”, and yet the very metrics used for ranking include a diminished student voice. That is a poor signal to send, as we should be drawing on the student voice to bring about positive change within our institutions.
So it would be simple for institutions to put the student voice agenda on the back burner following the TEF reforms, and given the reluctance of the National Union of Students to engage further with the NSS, you could understand the grounds on which such a tacit decision might be made. But given the research available, we cannot lose sight of question 25 when considering the subject-level TEF.
Question 25 does not form part of the specific TEF metrics, yet I see it as one of the most important questions within the NSS, as it gives an indication of how students truly see their relationship with the institution, even at a programme level. While I wish to avoid any accusation of encouraging “customer” reviews, if the TEF were truly to provide information to students about the university at which they intend to study, then surely the voice of students needs to be acted on, not just heard. Given the existing metrics used, the NSS results for the ‘student voice’ institutional questions should become a priority at all levels.
If we can build a true partnership with our students, one which demonstrates our willingness both to listen and to adapt our practices to their needs, then maybe the TEF Gold Award will follow… or at least a gold star.
Hi Zak, interesting article.
My only comment (and potentially a controversial one) on Q25 would be that, as is the case with many of the NSS questions, the devil is in the detail…
Q25 asks how well the university has responded to feedback, but ultimately this depends on how engaged the student is in the first place. Students who barely attend or ignore online fora may simply not know how to respond to this question and mark it “neither agree nor disagree” – so 3/5. However, that might be within the context of an institution doing a hell of a lot of responding to students, or nothing at all.
The NSS will only be truly useful once we start understanding how engaged students are when they give feedback; otherwise we are lumping together students who do little towards their degree with those who do a lot – i.e. those who have barely taken part with those who have a true understanding of what the university has to offer. HEFCE would argue that the mean allows institutions to know what the average student feels – but let’s be honest, it is more important that we know what the students who have actually fully taken part in their degree feel, not those who didn’t attend.
It’s time HEFCE started looking at the NSS with more finesse in its focus if it is to be used by government to assess teaching quality.