Surveys can best inform student support when they reflect students’ individual experiences

As more students struggle with their academic progress and wellbeing, Bruce Johnson explains why the majority view is not always the key data point

Bruce Johnson is managing director at evasys

Parliamentary democracy in the UK may be all about getting a majority, as Labour has just resoundingly demonstrated. But there’s no reason why student surveys should follow the same principle.

There is no policy diktat that says that student surveys have to be about uncovering the most prevalent view. When polling becomes all about determining and responding to the majority view, those in the minority can feel their voice is less influential.

Evidence from large-scale national surveys such as the Student Academic Experience Survey shows that average scores can mask an increasing bifurcation in the student experience between those who are thriving and those who are just about surviving. This raises questions about how useful it is to take an average score or a “majority positive sentiment” as a measure of “the student experience.”

The alternative – and go with me on this – is to deploy survey tools and techniques to hear from students as individuals: their views, their desires, their struggles, their concerns. Student surveys can be a powerful tool for individual or small-cohort student voice, where surveys are treated as a meaningful mode of communication between students and their institutions, prompting a considered response at the individual level.

We are all individuals

Prompted by strategies to improve student outcomes and overall success, many institutions are making great inroads into the use of learning engagement analytic data to identify students whose engagement patterns suggest they may be struggling and/or at risk of leaving their course. The information that students might choose to share through surveys can play a similar role in identifying when an individual student might need an intervention.

If you know, for example, that a student has indicated that they are feeling low, facing financial challenges, or reporting negative sentiment across a range of learning-related issues, then that could indicate a student in need of additional support. And imagine the potential for connecting those behavioural signals on learning engagement with insight students have chosen to share through survey responses about how they are feeling, or what is going on in their lives.

To make the most of student survey data in this way requires intentional design of the survey process, its content, and the data handling policies around it. For some, the idea of using individual student data in this way trespasses on established principles of survey anonymity. But it is possible to set up surveys in ways that put guardrails around student anonymity without sacrificing the possibility of intervening for students who may need it. If a survey is asking questions about student wellbeing and a student discloses that they are struggling with suicidal ideation, it is all but imperative that that student receives an offer of support and a referral to specialist help.

Typically, student responses are assigned a generic marker so that anyone analysing the whole dataset cannot identify individual respondents, but the markers link back to individual student records so that a student at risk can be traced and contacted. If the intention is to use the survey data to inform student support more broadly than harm mitigation – for example, to inform personal tutors about individual student circumstances or sentiment – then it is also sensible to solicit the student’s active permission in the body of the survey itself.
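As a rough illustration of that marker pattern – the structures and names below are hypothetical, not any particular platform’s implementation – responses can be stored under a random marker while a separate, access-controlled table maps markers back to student records:

```python
import secrets

# Access-controlled lookup: marker -> student record ID. In practice this
# table would live behind stricter permissions than the response data.
_marker_to_student: dict[str, str] = {}

def record_response(student_id: str, answers: dict) -> dict:
    """Store a survey response under a random marker, not the student ID."""
    marker = secrets.token_hex(8)  # generic, non-identifying marker
    _marker_to_student[marker] = student_id
    return {"marker": marker, "answers": answers}

def reidentify(marker: str, reason: str) -> str:
    """Resolve a marker back to a student, e.g. where there appears to be a
    risk to health or welfare. Calls should be logged and restricted to a
    safeguarding role."""
    print(f"Re-identification of {marker}: {reason}")  # stand-in for an audit log
    return _marker_to_student[marker]
```

Keeping the mapping separate means routine analysis never touches identities, while a logged re-identification route remains available for safeguarding.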

For example, students who complete the Being, Belonging, Becoming survey, which we piloted with 15 institutions this year, are told:

All data collected in this survey is confidential and will be held securely. Please do not identify yourself or others (including staff) in your comments. The survey contains a final question seeking confirmation on whether you are happy to be contacted by university staff about your answers (where appropriate), for example, to arrange contact with your personal tutor. However, the university does retain the right to identify a respondent under certain circumstances, for example, where there appears to be a risk to health or welfare.

Taking students’ pulse

At module level, the traditional end-of-module survey doesn’t lend itself especially well to the individual approach. By the time it has become clear that a student has been struggling, whether personally or academically, the opportunity to support them to course correct has been missed for that module. Moreover, any feedback that students have shared about their module experience may or may not be relevant to the next cohort of students taking that module.

During the Covid-19 pandemic, we developed a module pulse survey, designed as an early check-in that gives students the chance to reflect on their experience and catch any issues before they become ingrained. Students want to see that they matter as individuals, and they want to feel that sharing something meaningful with their institution will make a difference to their experience. Taking someone’s pulse can be quite an intimate experience, requiring personal contact and careful listening. Surveys can be a tool for taking students by the hand and listening carefully to what they have to say.

We suggested asking students to respond on a Likert scale (agree-disagree) to questions like:

“I am confident I can succeed on this module”

“I feel connected to other students and teaching staff on this module”

“I believe I am contributing to and engaging effectively with this module”

“I understand how my learning will be assessed on this module”

If a student answered negatively to, say, three or more of these questions, that could be flagged as cause for concern. There is also the opportunity to solicit students’ qualitative feedback – for example, one thing they would like to start, one thing they would like to stop, and one thing they would like done differently to improve their module experience. This creates the opportunity for a constructive discussion between module leaders and students about adapting where possible and appropriate – or explaining why things are done the way they are.
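A minimal sketch of that flag rule might look like the following, assuming – as an illustration rather than a fixed design – that the “three or more” threshold and the answer labels are configurable:

```python
# The answer labels and the "three or more" threshold are assumptions here.
NEGATIVE = {"disagree", "strongly disagree"}

def needs_check_in(answers: dict[str, str], threshold: int = 3) -> bool:
    """Flag a student who answered negatively to `threshold` or more items."""
    negatives = sum(1 for a in answers.values() if a.lower() in NEGATIVE)
    return negatives >= threshold

# Negative on confidence, connection and engagement -> flagged for follow-up.
pulse = {
    "confident_to_succeed": "Disagree",
    "feel_connected": "Strongly disagree",
    "contributing_effectively": "Disagree",
    "understand_assessment": "Agree",
}
print(needs_check_in(pulse))  # True
```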

At a larger cohort level, some of these questions might still be relevant, but the individual approach lends itself more to targeted academic and pastoral support-related questions such as:

“I know how to access information and support from the university about my health, safety and wellbeing”

“I know how to access information and support about my academic progress”

“I am on top of my wellbeing: physically, mentally, financially”

Or a survey might include a yes/no option in which a student could indicate whether they have specific concerns about their physical, mental, or financial wellbeing. It’s less appropriate to ask these broader wellbeing questions as part of a module check-in, because students would then be asked about their general wellbeing multiple times in quick succession – reducing the authenticity of the question and giving the impression it’s a stock item rather than a genuine request for actionable information.

It is also possible to use survey tools to undertake needs assessment, as Middlesex University does with its pre-arrival survey. Students who had received an offer from the university were surveyed on their feelings about starting university and their information and support needs, and the insight gained was used to inform academic induction and to produce personalised action plans for students based on the information they had provided. Setting up a dialogue from the outset in this way prepares students to be offered regular opportunities to self-assess their current situation and solicit support if required.
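To make the “personalised action plan” idea concrete, here is a hypothetical sketch – the questions, answers, and actions are invented for illustration, not Middlesex’s actual mapping – of simple rules turning pre-arrival answers into suggested induction actions:

```python
# Hypothetical rules mapping pre-arrival answers to induction actions.
ACTION_RULES = {
    ("anxious_about_starting", "yes"): "Invite to a pre-arrival welcome session",
    ("wants_study_skills_support", "yes"): "Book an academic skills workshop",
    ("money_worries", "yes"): "Share hardship fund and budgeting guidance",
}

def build_action_plan(responses: dict[str, str]) -> list[str]:
    """Collect the actions triggered by a student's answers."""
    return [action for (question, answer), action in ACTION_RULES.items()
            if responses.get(question) == answer]

print(build_action_plan({"anxious_about_starting": "yes", "money_worries": "yes"}))
# ['Invite to a pre-arrival welcome session', 'Share hardship fund and budgeting guidance']
```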

Pace over process

As the Middlesex example shows, making provision for swift processing of the data and taking action is essential to realise the benefits of surveying students in this way. This requires deciding who needs to see particular datasets, configuring your survey platform to deliver insights automatically to the people who need them, and being clear with those individuals what the expectation is for action. For example, a personal academic tutor who could see the headline sentiment scores of all their tutees for their early module check-ins could prioritise setting up meetings with those who have expressed more negative sentiment.
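As a sketch of what delivering insights directly to the right people could mean in practice – the field names and sentiment scale below are assumptions, not any platform’s API – check-in records might be grouped by tutor and sorted so the most negative sentiment surfaces first:

```python
from collections import defaultdict

def tutor_digests(checkins: list[dict]) -> dict[str, list[dict]]:
    """Group check-in records by personal tutor, most negative sentiment
    first, so each tutor gets a prioritised list of tutees to contact."""
    digests: dict[str, list[dict]] = defaultdict(list)
    for record in checkins:
        digests[record["tutor"]].append(record)
    for tutees in digests.values():
        tutees.sort(key=lambda r: r["sentiment"])  # most negative first
    return dict(digests)

digest = tutor_digests([
    {"tutor": "Dr Patel", "marker": "a1", "sentiment": -0.6},
    {"tutor": "Dr Patel", "marker": "b2", "sentiment": 0.4},
])
print(digest["Dr Patel"][0]["marker"])  # "a1" – the tutee to contact first
```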

There may also need to be a pragmatic review of who actually needs to see the results of these pulse-style surveys. If there is an ingrained culture that every student survey must be scrutinised by a small and/or senior team, with a response formulated and discussed in depth, there will be an automatic drag on the capacity of the institution to implement something that is designed to offer a quick health check and catch emerging issues before they become systemic. Empowering personal tutors, module leaders, and relevant professional teams to take quick action on the insight that comes back will let students know their voices have been heard.

Another issue that can come up in asking students directly about their wellbeing is the worry that more students will indicate concern than the institution is capable of responding to. This is a reasonable concern, but it is probably better to know the scale of the issue than to tolerate the risks of not knowing. The key is to avoid raising expectations beyond what can be met: links and referrals to services might be a sufficient response rather than direct personal contact, except in the most pressing or high-risk cases.

It will always be important to listen to students’ voices and views about their learning experience. But the more we understand about how students’ experiences are filtered through their individual circumstances, and their sense of belonging and connection, the more the idea of using that collective voice as a means of assuring quality in retrospect feels like a low-impact use of resources.

This article is published in association with evasys. Find out more about how evasys can support you to transform your approach to collecting feedback and achieving actionable insights.
