And so the 2019 National Student Survey results have been released! This is the 15th year of NSS and the 11th for the Postgraduate Taught Experience Survey (PTES), and up and down the country university managers and course leaders have been nervously awaiting the results.
Marketing departments across HE have spent months during the autumn formulating strategies to maximise completion rates and, hopefully, generate positive outcomes. Academic managers and teaching staff have been constantly reminded of the importance of NSS for TEF metrics, and of NSS and PTES for league tables. This means there have to be not only good student completion rates but good results!
It is hoped that the internal surveys run throughout the year to monitor student satisfaction by module and course across the different levels of study (surveys that try to mimic the national ones) have made a difference in preparing students to complete the national surveys and, hopefully, in “forecasting” their outcomes.
But how well do these surveys actually reflect and measure the student experience, and what real impact do they have? These questions are not new and have been asked over the years, but in the current marketised, metric-driven higher education environment they have never been more important to revisit and discuss.
A question of validity
I am a great supporter of surveys that generate small and big data to help inform practice-based improvement – especially if the surveys genuinely have an impact for the cohort undertaking them. NSS and PTES were originally designed to provide institutions with independent intelligence on how students felt about their experience, in order to aid change. What was valuable to student experience practitioners like myself when the surveys were first introduced was that they put the undergraduate and postgraduate student experience on institutions’ agendas and cajoled them into paying more than mere lip service to it.
However, given the continuing marketisation of higher education, the contribution NSS now makes to TEF (at institutional and subject level), and the potential for PTES to become part of a Postgraduate Taught TEF in the future, I argue that it is time to revisit when the surveys take place, to ensure the maximum validity of the responses and of their impact.
Timing and impact
NSS takes place when it does in order to maximise completion rates – but it happens (January-April) at precisely the time when undergraduate students in their final year of study are under the most pressure to deliver some of the most pivotal work of their higher education career. For anywhere between three and five years they have been working towards this climax, yet at this key moment they get telephone calls and emails from Ipsos MORI, and face pressure from their lecturers to complete a survey. They may even find that they are simultaneously asked to complete institutional module or course satisfaction questionnaires, resulting in survey fatigue.
The majority of postgraduate taught students study for one year. Full-time entrants are asked to complete the survey when they are only just over a third of the way through their course, and it becomes very complicated to include entrants who start at different points in the academic year. Students, like any other survey respondents, are likely to give answers that reflect how they are feeling at that moment, which may not be an accurate reflection of how they have felt throughout.
The data collected from students about their time at university would be better if they were given the chance to step back and reflect on their experience. This could be done between completion of the course (the last assessment submission) and confirmation of their degree, or after confirmation. This is likely to reduce the completion rate, as students are contacted via their university email address and, after their last assessment submission, may well not monitor or access their university emails. However, not only could this approach generate data that is a much better reflection of how students are feeling about their course, it would also be a great way of encouraging graduands to start becoming active alumni – and provide a route for university alumni teams to engage with them. Accurate, quality data is far more valuable than mere quantity.
Relying so heavily on the experience of those exiting to shape the experiences of those entering must be done with caution. With increasing cohort diversity (e.g. entry requirements, domiciled status…), each entry cohort will have its own unique learning and support experience, which affects its progression (another TEF metric) and satisfaction levels. In addition, changes in course leadership can affect NSS results. As a result, it is not unusual to see satisfaction levels within a course fluctuate over time. We need to adjust what we do based not just on NSS scores but also on our entry cohort, which is why introducing an entry-to-study survey is so logical.
Neutrality in action
Of course, some disciplines require students to develop a neutral stance when responding to certain situations; journalism is an example. It shouldn’t be a surprise when such students select the “neither agree nor disagree” option. Although this gives the student a way of not making a decision, it skews the results and tells us nothing. And with subject-level TEF being piloted, getting the right type of data that is representative, reflective, and impactful has never been more important.
Surveys play a critical role in providing data intelligence to help generate change, but all good initiatives need to evolve and adapt. As Einstein said: “We can’t solve problems by using the same kind of thinking we used when we created them.” PTES, and especially NSS, feel like they have become tick-box quality assurance tools that are less about providing meaningful data than about weighing, measuring and judging.
After 15 years the NSS is now so established that any radical change would be seen as politically sensitive, but after 15 years of significant change in HE we should also recognise that the NSS itself is showing the need for change.