
Student views, student surveys, and getting better at responding

Jim Dickinson explores what impact the NSS has had on the student experience, and student unions.

Jim is an Associate Editor at Wonkhe

Within students’ union circles, the National Student Survey has always been a bit Marmite. For some it’s a dangerous, discriminatory and unreliable tool of neoliberalism. For others, a source of rich evidence to bolster arguments about the need for improvement.

One of the tenets of Barber’s “deliverology” is that data put into the hands of users who then wield it for accountability purposes can drive improvements that the data alone cannot generate. If that’s the case, students need much more support in accessing, using, interpreting and wielding the data than they ever got from HEFCE. This is especially true for those studying at private providers and FEIs, where there are not the same traditions of students’ union activity, and where regulation on what counts as a students’ union often does not apply.

This all suggests wider questions that surround the direction of travel for an exercise that finds itself being looked after by a radically different regulatory beast. Take assessment and feedback. The sector’s continued failure to get a grip on this basic and crucial component of provision is problematic. But where HEFCE’s job was to report the numbers and fund enhancement, OfS’s job is to regulate.

If almost three in ten students still can’t agree that marking criteria were clear in advance, over a quarter don’t agree that marking and assessment have been fair, and one in four don’t agree that feedback comments were helpful, it is perfectly possible to argue that the sector isn’t up to the baseline that the new Quality Code implies.

That might mean policy implications that go beyond the inclusion of assessment and feedback as a TEF metric. The Competition and Markets Authority – whose initial guidance on HE assumed that the service/product on offer was a course’s teaching – will surely at some stage broaden its focus to the complexity of the services wrapped up in a provider’s offer. Given how important assessment and feedback are to students for both learning and sorting, intervening to enable students to enforce rights to have them done properly is crucial.

What happens to NSS findings?

It has never been clear why we go to the trouble of organising a national survey without requiring providers to publish a response and action plan. This has been done successfully on access. Institutions can publish statements alongside their data on Unistats. But these are hardly a public commitment to improve, and reading them smacks more of TEF-appeal-style special pleading than of a genuine plan.

Merely publishing satisfaction data is imperfect, whereas a comprehensive plan (with input from student representatives) would drive more balanced improvement while retaining institutional autonomy. It’s telling that the OfS business plan includes an action for the year ahead on the development of “measures to regularly review the impact of the NSS on institutional behaviours and the student experience”.

At an institutional level, the heart of many of the current NSS questions is tied up in the management of performance. There has been plenty of Higher Education Academy work on the pedagogy that underpins effective practice in areas like assessment and feedback and academic support. But the real issues are in understanding the complexity of administrative systems, academic workload, and the cultural difficulties involved in leaders wanting academics to perform to a standard and deadline – all in a context of deteriorating industrial relations. These issues require real work, of the sort only possible if the promise of an integrated Advance HE is realised.

The questions beyond the curriculum

There are also questions about scope. In his pamphlet on “Protecting the public interest in higher education”, Bedfordshire VC Bill Rammell argued that we need a framework that enables universities (in partnership with students’ unions) to “articulate and evidence their development of co-curricular and extra-curricular learning environments”, given the contribution of this activity to personal and civic development of students.

But the driving agenda behind the last set of major changes to the NSS in 2017 was the sector’s desire to focus more closely on the student academic experience, dropping questions on personal development in the process. This was a strange move given the wider outcomes focus of TEF planning, and it left large parts of the HE sector’s provision (both academic services and co-curricular provision) oddly unevaluated. OfS may find it hard not to re-broaden.

The shift in focus also left a clear dilemma for the question on students’ unions. In the previous NSS, Q23 had asked students to indicate the extent to which they agreed with the statement “Overall I am satisfied with my students’ union (association or guild)”. But on the assumption that all unions have a representative role to play in students’ education, cognitive testing of a suitable question proved difficult. HEFCE’s own research suggested that initial iterations of what is now Q26 confused students, some of whom appeared not to associate the students’ union with having a role in the student academic experience at all – and the final version was a last-minute compromise. Both NSS 2017 and 2018 bear out the testing, with high “neither agree nor disagree” scores. A question on students’ union impact should probably survive – but not if it is this easily misinterpreted.

25 + 26 = trouble?

It does mean that Q26 has been the sector’s lowest-scoring item since its introduction. In the bulk of providers this low score is challenged only by the relative failure of institutions to communicate the impact of feedback compared to their willingness to gather it, represented by a similarly poorly scoring Q25. The score was so poor this year that OfS abruptly disaggregated it from the rest of the student voice category, presumably at the behest of providers complaining that it wasn’t their fault.

But in most providers there is a strong correlation between Q25 and Q26. Taken together, they cast doubt on the effectiveness of student input and on whether it is being effectively communicated back to students. There will always be a lag between student intervention and views and the changes that result, but given the relative level of expertise and investment in marketing to applicants (HE choosers) compared with marketing to current students (HE users), there is space for professional development in this area for institutions and students’ unions alike. At the very least, HE leaders need to be more open to giving credit to student reps who have long argued for a change or investment when it is announced.

Given the results, it would be tempting to argue that the Q26 score indicates systemic failure and the need for serious reform if students’ unions are to continue to command funding and access to university decision-making bodies. But maybe the real story is not how poorly students’ unions score compared to libraries or academics, but how well they score when doing all this (and all the other things we expect them to do) on pretty paltry budgets.

One of the sector’s secrets is the enormous contribution made by many students’ unions to the rest of their provider’s scores in the NSS. Sector-level actors often assume that the mere publication of data causes behavioural change at a granular level, but it is usually the use of the data by student representatives that levers the results.

If the publication of data to applicants helps drive change for choosers, then OfS also needs to understand what can drive change in the interests of users – supporting the students’ unions and reps within a provider who convert research into action and drive improvement across all of the NSS’s questions in the student interest.
