The importance of supporting student mental health is well-recognised and is now one of Universities UK’s (UUK) top strategic priorities.
As you would expect, it was high on the agenda at the AMOSSHE (the student services organisation) winter conference earlier this month – which looked at the role of data and technology in student services.
Intuitively, data and technology occupy a very different space from the human-centred communities, structured conversations, and therapies which can help to support mental health. However, over the past year or so, we’ve begun to see how data, often data already sitting in various systems around the university, might, when pulled together and presented to humans, be able to tell an important story about individual students and their wellbeing.
Data in one place
Looking at many of the implementations of learning analytics that Jisc has been involved in, it has struck me just how far having accurate “in-flight” data on your students can get you when it’s all in a single place. If we roll back a few years, much of the talk was about what could be achieved with machine learning and predictive modelling – identifying students at risk of dropping out, say, due to patterns the AI observed in previous years of data.
We’ve found that to get to that point requires two sets of enabling activities, both very useful in themselves: one sorting out the data, and the other sorting out the processes. The data “wrangling” involves cleaning up data, pulling it together, applying some human-interpretable flags and rules, and displaying it in a comprehensible way to tutors and others who may need to act on it. The processes and policies define the kinds of student engagement which the university expects to see coming through in the data (for example, are students required to attend lectures?) and, crucially, whose responsibility it is to look at the aggregated data, and what they then do about it.
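The “human-interpretable flags and rules” described above can be sketched very simply. This is an illustrative example only – the record fields, thresholds, and flag names are invented, not taken from any real learning analytics product:

```python
from dataclasses import dataclass

@dataclass
class EngagementRecord:
    """A hypothetical weekly snapshot pulled together from several
    university systems (attendance monitoring, library, VLE)."""
    student_id: str
    lectures_attended: int
    lectures_scheduled: int
    library_visits: int
    vle_minutes: int

def engagement_flags(rec: EngagementRecord) -> list[str]:
    """Apply simple, human-readable rules and return named flags for a
    tutor to review -- no predictive modelling involved."""
    flags = []
    if rec.lectures_scheduled > 0:
        attendance = rec.lectures_attended / rec.lectures_scheduled
        if attendance < 0.5:  # illustrative threshold
            flags.append("low attendance")
    if rec.library_visits == 0 and rec.vle_minutes < 30:
        flags.append("little independent study activity")
    return flags

rec = EngagementRecord("s123", lectures_attended=2, lectures_scheduled=10,
                       library_visits=0, vle_minutes=10)
print(engagement_flags(rec))
# ['low attendance', 'little independent study activity']
```

The point of keeping the rules this transparent is that a tutor can see *why* a flag was raised, which matters when the conversation it prompts may touch on wellbeing.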
Having done that, and without necessarily choosing to use predictive modelling, institutions are able to use learning analytics to support retention and progression: spotting students at risk of dropping out so that they can be helped to continue with their studies, or at the very least, supported through a managed withdrawal process.
A partial story
However, many of the data flags for a student at risk of dropping out could also indicate a student suffering from wellbeing challenges or mental health difficulties. So, for example, low or no attendance, or suddenly reduced attendance at lectures and seminars, combined with very little time spent in the library or on the virtual learning environment, could indicate a student who feels disengaged from their course, or someone suffering from depression or anxiety; or just someone who has been home for a visit and has been studying from printed books – the data only ever tells a partial story.
The picture painted by the data is enough to enable personal or academic tutors, student support advisers (or whoever else provides first-level pastoral or academic support) to have data-informed conversations with students themselves about how they are engaging and progressing. These conversations have a better chance of uncovering important issues, with studies or wellbeing, as tutors can ask questions from an informed – though partial – picture of the student’s engagement, rather than from a blank sheet.
This means, of course, that staff having those conversations with students need to have appropriate training, support and structures in place, so that they can carry out their role professionally and confidently, directing students to any help they need with wellbeing. This is an important consideration to avoid increasing staff stress, and an illustration of an aspect of the “whole university” approach that UUK advocates.
So, does this approach just mean even greater numbers of students being referred into already very stretched support services, such as counselling? Potentially yes, but it also offers the chance to triage need, and to offer appropriate support to students at different points on the mental health continuum. In the most acute cases, early data flags may offer an opportunity to reach out to and help students at risk of suicide who may not otherwise have made contact with student services – it’s reported that only a third of people who take their own lives have a history of contact with mental health services.
The data typically gathered for learning analytics gives some useful indicators, and James Murray, whose son Ben tragically took his own life at university last year, is advocating for universities to ensure that relevant data from across a student’s academic record, and other interactions with a university, are brought together, reviewed sensitively, and acted upon.
In less acute cases, data-informed “nudges” can be helpful. The University of Northumbria described at the Data Matters conference last month how their student progress teams have been sending encouraging and supportive messages to groups of students, nuanced slightly, depending on how the students are engaging and performing academically. The messages may point students to sources of help, for example study skills resources, or offer advice on revision, or managing stress, without marking students out as being in difficulty or underperforming. These messages have already prompted responses from students about the issues they’re facing, enabling them to be supported by the student progress team or referred to another appropriate support service, including disability support, counselling, and mental health support.
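The kind of nuanced nudge Northumbria described might look something like the sketch below. The engagement bands, thresholds, and message wording are all invented for illustration – the real messages would be crafted with student services and communications teams:

```python
def nudge_message(attendance_rate: float, on_track: bool) -> str:
    """Return a supportive message variant, nuanced by a student's
    (hypothetical) engagement band. No student is singled out as
    'in difficulty' -- every variant simply points to available help."""
    if on_track and attendance_rate >= 0.8:
        return ("Great work this term. If exams are on your mind, "
                "our revision-planning guide may help.")
    if attendance_rate < 0.5:
        return ("We know term can get busy. If anything is getting in the "
                "way of your studies, study skills and wellbeing support "
                "are available -- just reply to this message.")
    return ("A reminder that study skills resources and advice on "
            "managing stress are available whenever you need them.")
```

Keeping every variant framed as an offer of help, rather than a warning, is what allows messages to prompt students to disclose issues without feeling marked out.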
As with the use of learning analytics for retention, the approach becomes particularly powerful if the outcomes of conversations, and other interventions, are logged. These can then be analysed alongside the original data, both to explore how often the data flags do accurately indicate a student in need of support, and to evaluate which interventions are more effective, in which contexts. This is an area that some institutions are beginning to look at now, though the data protection and ethical issues become more complex when interventions are health-related.
Balancing ethics, privacy, wellbeing and practicality in this space is always going to be a challenge. It is something that each institution needs to consider with its student body in order to establish an approach in which students – and staff – feel supported, but not constantly watched and tracked. It is also crucial that institutions have the processes and resources to be able to act appropriately, to provide support when data flags are raised.
Right at the outset of our work on learning analytics, Jisc produced a code of practice to help guide institutions through their decisions. We are now working on an annex to that, specifically addressing the use of data-informed approaches to support wellbeing. We discussed key aspects of that with AMOSSHE delegates, drawing on Jisc chief regulatory officer Andrew Cormack’s analysis of the possible approaches, in line with the Data Protection Act. Although the legal (data protection) position can be complex, especially at the boundary where learning-related data becomes potentially health-related data, it shouldn’t block work in this space, as long as the processing of any health data is overseen by a health professional, and institutions take an ethical, risk-based approach to assessing the impact of any additional data processing.
A key first step in any institutional implementation is to develop a clear understanding of how the data picture would be interpreted and used, in order to evaluate how much more the institution could do to support students’ wellbeing on the basis of the data than it could do otherwise.