This article is more than 7 years old

Understanding student surveys: a student perspective

In order to get reliable survey data, we have to be confident that students are clear about what is being asked. And, as Julian Porch from York SU explains, this is not as easy as it may sound.

Julian Porch is Academic Officer at the University of York Students' Union.

Last year was an exciting time to be a Sabbatical Officer in an SU, but this year looks set to beat it. Under our feet, the old assumptions about student voice (that it is exercised principally through elected student representatives) are being challenged by a new model and a new regulator that seem to favour outcomes and metrics over stories and speeches.

For me it’s the combination of reps and data that is powerful. Take the TEF. I can’t find a student who believes the metrics and weightings currently selected for indicating quality and/or satisfaction are the right ones, nor can I find a student who thinks they are articulated in a particularly useful way. Imagine the disappointment and anger of a student who, on applying to a TEF gold-rated institution, discovers that their course has the worst staff-student ratio in the entire institution, is based in a portacabin on a satellite site, and is under notice of being wound up before the first semester is complete, leaving them totally displaced.

The potential of data

But data does matter. We ought to have access to huge, malleable datasets. Students’ unions, university management, students (current and prospective) and Government could use them to explore and interrogate the data that interests them in a way that can generate substantive outcomes. If a prospective student wants an institution that has lots of contact hours, an Ultimate Frisbee club and lower accommodation costs, they should be able to search for that. If a student representative wants to use the data to consider whether there is a correlation between student recruitment demographics and the local jobs market, they should be able to do that. If a university student counselling service wants to explore the volume of appointments it handles relative to another institution’s comparative data, it should be able to do that. If TEF’s single achievement is to generate banners for the front of campus with gold balloons to indicate a gold rating, it will have failed; but if it gives students and their reps the data and tools to advocate for change and hold universities to account, we will be getting somewhere.

Or take the NSS. Behind the hype about whether the response rates were up or down, how the NUS boycott affected the validity of the data, and the changes to the survey, the real story for students was that one in four were not happy with the way they are assessed or the feedback they get. Universities remain seemingly unable to deal with this essential component of learning and teaching. Almost a third of students do not agree that marking criteria were clear in advance, over a quarter do not agree that marking and assessment has been fair, and one in four do not agree that comments were helpful. If an institution is unable to give students an understanding of how they are being assessed, any celebration over positive scores elsewhere would be like a GP’s surgery congratulating itself on a very short waiting list after forgetting to tell patients how to book an appointment.

New in this year’s NSS are metrics that look at student voice, and Q26 asks specifically about satisfaction with students’ unions’ effectiveness at representing students’ academic interests. I was disappointed with the result at my own institution, and looking across the country I could see similar frustration. On the face of it, the data suggests widespread dissatisfaction, but beneath the headline there are some interesting contradictions that may indicate a more complex problem.

York scores an impressive 90% for ‘I have had the right opportunities to provide feedback on my course,’ and a strong 80% for staff valuing students’ views and opinions about the course. Both results suggest the Union is performing well in its core business of ensuring students’ academic interests are represented. Yet the figure drops to around 62% when students are asked whether they are clear on how feedback on their course is acted on, and to around 50% for satisfaction with the Union’s ability to represent their academic interests. Clearly students understand ‘representing their academic interests’ differently than we do. Or they are confused (and perhaps don’t care) about the extent to which the University and the Union are the key actors in the culture and processes that ensure they can give feedback on their course, that staff value their views and opinions, and that feedback on their course is acted on.

Research needed

To help us understand what’s going on, 18 students’ unions commissioned research earlier this year which sought the views of over 17,000 student respondents. This provided a source of insight and reflection for the sector to start to develop both tactical and strategic actions. In cognitive testing, it found significant variance in how students understood the language of question 26 and what they expected ‘academic interests’ to mean. HEFCE’s own research suggested that previous iterations of Q26 confused students, some of whom appeared not to associate the students’ union with having any role in the student academic experience at all. Sure enough, the final data from NSS 2017 bears out the testing, with a higher “neither agree nor disagree” score for question 26 than for any other question.

The concept and terminology of “interests” is also a problem for students. While the “student interest” is relatively common terminology at sector level, in the Union Futures study a substantial number of respondents appeared to interpret the concept literally, evaluating their union on the extent to which it reflected the subjects and activities they were ‘interested in’. Even when students did understand the concept, the evidence suggests that a significant proportion of students consider the effectiveness of their Union at delivering co-curricular educational opportunities rather than at influencing institutional academic provision.

There is a strong correlation in NSS 2017 between questions 25 and 26, both of which vie for last place in most institutions’ results. Taken together, they measure not only satisfaction with the effectiveness of student input but also success in communicating its outcome and impact to students. There will always be a delay in implementing change driven by student intervention and views, but it is vital that university and union clearly communicate (to both prospective applicants and current students) the process of engaging with students’ feedback, how it is acted on, and that much of it is done by the students’ union. At the very least, HE leaders need to be more open to giving credit to students’ union officers and staff who have long argued for a change or investment when that change or investment is announced.

The role of the student rep

The research also told us a range of other interesting things that should shape our work. Student reps are the most visible and accessible source of representation for students within the academic sphere. Six in ten survey respondents in the study stated that they would be likely or very likely to contact a student rep to support them in their academic interests. Where reps belong to the students’ union, they can therefore make or break perceptions of the union’s performance on Q26. The huge disparity in investment in student representation between institutions deserves investigation, but more broadly the overall lack of investment is an important story. The overwhelming majority of students’ unions have budgets that deliver the academic representation function on little more than a couple of junior staff and an elected sabbatical officer, so for it to be judged effective by some 50% of students is miraculous, given its nominal resource and the need to clamour for attention alongside sports teams, volunteering programmes, events, bars, colleges and much more besides.

We also found that student reps must have sufficient power to make the student voice audible, be seen as approachable by both staff and students, be properly trained, understand the role of the students’ union and their function within it, and be true ambassadors. This requires real and sustained resource, and a culture in which University and Union empower these voices rather than squabbling over who owns the student voice and to what end.

Ultimately, we can’t simply assume that the mere publication of data causes behavioural changes at an institutional or programme level. It is much more likely that the use and application of the data by student representatives across a wide range of meetings, committees and forums is what generates change. The role of student representatives as a crucial user of data needs to be far better recognised by students’ unions, students, universities and the sector bodies engaged in metrical assessment of higher education. The OfS, if true to its title, has a role in securing that understanding and communication across the sector and in policy making in future.

