Postgrad pilot NSS results finally revealed

I’m over in the Republic of Ireland this week, where over the past six years 232,450 students have completed StudentSurvey.ie, each answering more than 60 questions about their experiences of higher education.


One of the things that marks out that approach as distinct from that adopted in the UK is that the main survey covers first year undergraduate, final year undergraduate, and taught postgraduate programmes – while a related survey covers PGR students every two years.

It also effectively combines the “opinions on the quality of their courses” approach in the National Student Survey with the “amount of time and effort students invest in their studies and how students engage with learning” approach embedded in the (optional) UK Engagement Survey run by Advance HE, with elements of the Postgraduate Taught Experience Survey (PTES) and Postgraduate Research Experience Survey (PRES) for good measure.

There are easy-to-access provider reports, whizzy tools for analysis and a grab bag of examples of how providers use the results to change things. Questions cover all sorts of aspects of engagement (including a bunch of community and belonging related questions), learning activities, institutional approach issues, quality of interactions, knowledge, skills and personal development, and even stuff on considering withdrawal. It feels, overall, just a lot more useful than the thing we’ve ended up with in the UK.

We’ve never heard anything from the Office for Students (OfS) on a PGR survey, but it has of course promised a PGT NSS a couple of times – back in 2018 its director of comms told us how important it was to listen to the PGT student voice as a way of launching a pilot, and then we got another, much bigger pilot survey in April and May 2022 that involved over 100,000 students across 42 providers.

But then… nothing. No announcement, even, of an intention to knock it on the head – nothing. And that leaves us in a position where not only do we only have UG metrics feeding the TEF, we have sites like POSTGRAD.com intimating to students looking for PG courses that they should check out universities’ NSS scores – without ever saying that not a single PG student will have been surveyed.

Results – revealed!

We’ve also never seen the results from the two pilots – until now. Via the magic of the Freedom of Information Act, we’ve got the national results from two waves of the first pilot (using the old five-answer Likert scale with the “neither agree nor disagree” option in the middle), and one wave of the second pilot (with a format more closely resembling the current NSS).

In the big 2022 exercise, as well as a question on “motivation” for undertaking postgraduate study (that would perhaps be more illuminating if we had student characteristic or subject breakdowns), there’s pretty much 80 per cent plus positivity all round – with some notable exceptions.

As in the NSS, course organisation comes in for a kicking, with almost one in four students saying their course wasn’t well organised. Ditto on feedback helping students to improve their work.

The 2019 results are also fairly similar to aggregate NSS results of the time – but it’s on the new (now old) questions where things get interesting.

Both “my university/college cares about my mental health and wellbeing” and “there is sufficient provision of student wellbeing and support services to meet my needs” languish down at circa 60 per cent on both waves – interesting given that both questions are more useful than the NSS question on awareness of mental health support (regardless of quality), which got a 75 per cent approval when asked last year.

The 2019 exercise also included a more appropriately EDI-themed freedom of expression question – “I feel comfortable being and expressing myself at university/college”. That got 75/77 per cent positivity – it would obviously be fascinating to see that broken down by student characteristic. “I feel safe at university/college” got 90 per cent – with just 3/1 per cent disagreeing.

Back in the day OfS used to have a goal around value for money – the 2019 exercise asked respondents to agree or disagree with “I believe that my course offers me good value for money”, and got 55/57 per cent positive – much higher than the (undergraduate) HEPI/Advance HE Student Academic Experience Survey result of 37 per cent in 2023, but still the worst scoring question on the survey by a long chalk.

All of this has the potential to be much more useful at subject level by provider, especially if there were some overall strategy being set in conjunction with Advance HE. We also need to know whether there’s an international/home difference, as there appears to be in the NSS.

And while OfS is busy with lots of responsibilities, it probably does owe the sector an answer on why it’s decided not to go forward – and why we’ve ended up with prospective PGT students having to rely on a UG survey for this kind of information.

2018’s Student information, advice and guidance strategy very much feels like a forgotten thing here in 2024.

3 responses to “Postgrad pilot NSS results finally revealed”

  1. One of my regrets on leaving the OfS was the lack of progress on a taught PG survey – something that is surely essential for a regulator to understand the lived academic experience of students. To be fair to the OfS, the sector has hardly been pushing for this, fearing that having a survey could damage international recruitment: even if the results are objectively positive, the lack of international comparisons means good may be viewed as mediocre. If OfS were serious about listening to students they would push forward with this work rather than leaving it on the back burner; they might also find a way to use the surveys to better support baseline regulation rather than just the TEF and student information.

  2. Internal student surveys at universities can be more insightful than a national survey. Internal surveys allow for targeted improvements and quicker feedback loops specific to each programme. Perhaps existing frameworks like PTES (Postgraduate Taught Experience Survey) could be strengthened, and institutions encouraged to focus more on the granular data found in module and programme level surveys, rather than implementing a top-down national survey. More at https://evasys.co.uk/value-of-internal-surveys-in-tackling-pgt-student-experience/
