
How to support students to feel valued and understood in an increasingly digital world.

Sunday Blake dives into the latest in learning analytics and engagement data, and asks how universities can act upon it to make our interactions with students more human.

Sunday Blake is associate editor at Wonkhe

Jeff Grabill, Deputy Vice Chancellor of Student Education at the University of Leeds, says that his promise to all prospective students is, “if we admit you, we will graduate you.”

It’s a bold statement – particularly in a turbulent period in which institutions are grappling with the return to teaching after pandemic restrictions, and with concerns about low student attendance and engagement. This courageous principle fed directly into the University of Leeds’ 2025 Access and Student Success Strategy, in which Vice Chancellor Simone Buitendijk claims that providing “students from all backgrounds with fair opportunities to study and thrive” is a “moral obligation.”

Leeds is one of the universities that want to enhance the learning experience while addressing questions around belonging and inclusion, and acting upon the evidence that low engagement predicts poorer outcomes. The task Grabill set towards fulfilling his promise was to take the university’s commitment to student success and merge it with an enthusiastic embrace of digital innovation in learning. The aim is to provide timely and useful information to educators, students and support staff about what is going on in each learning environment. With a growing student body, Grabill stresses that he needs to know which students need his attention. At the same time, students need to be given the tools to understand what they can do to develop and succeed academically. The intention is to create a collaborative approach which works for each student, so that they are in control of their own success.

So, how exactly can they support students through an increasingly digital and hybrid student experience? How could real-time data help students navigate their digital footprint and use it to reach their potential? And could this technology make interactions with students more meaningful, and systems more human, in the process?

Engagement

Before the pandemic, student engagement at Leeds was monitored solely through physical attendance on campus. A new system was required during lockdowns, and staff began monitoring engagement online through logins to the virtual learning environment. However, this approach limited the engagement data available to the university: students might log on in groups through one portal, or a student might log on once at the beginning of term and download all the resources at once. A more accurate understanding of a student’s engagement at Leeds would require a more holistic system drawing on broader touchpoint data.

Building on the rapid adoption of online provision since 2020, and accelerating its existing advances in technological provision, the university has reimagined the way it engages with students to better facilitate their success. Grabill notes that, as an early adopter of digital learning, Leeds sought not simply to create new digital platforms or systems, but to synthesise existing digital resources into a digital “ecosystem” that gives better insight into student engagement. Data from the digital interfaces of the physical university buildings, the virtual learning environment, library and ebook use, student records, submissions, and various active learning platforms were brought together to give individual reports on how students are actively using these resources.
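To make the idea of that “ecosystem” concrete, here is a minimal sketch – not Leeds’ actual pipeline, and using invented feed names – of how hypothetical event records from several touchpoints could be merged into a per-student count of interactions with each resource type.

```python
# Minimal sketch only: hypothetical touchpoint feeds, not Leeds' real data sources.
from collections import defaultdict

# Invented event records: (student_id, source) pairs drawn from separate systems.
building_swipes = [("s001", "campus_buildings"), ("s002", "campus_buildings")]
vle_logins = [("s001", "vle"), ("s001", "vle"), ("s002", "vle")]
library_loans = [("s002", "library")]
submissions = [("s001", "submissions")]

def build_engagement_profiles(*event_feeds):
    """Merge touchpoint event feeds into one activity profile per student."""
    profiles = defaultdict(lambda: defaultdict(int))
    for feed in event_feeds:
        for student_id, source in feed:
            profiles[student_id][source] += 1
    return {student: dict(sources) for student, sources in profiles.items()}

profiles = build_engagement_profiles(building_swipes, vle_logins, library_loans, submissions)
for student, sources in sorted(profiles.items()):
    print(student, sources)  # e.g. s001 {'campus_buildings': 1, 'vle': 2, 'submissions': 1}
```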

James Pickering, Academic Director at the University of Leeds, explained that the intended approach was not to embrace technology for the sake of technology. Instead, it was to find ways to “understand how students are behaving within [existing] systems temporally, throughout the duration of their course, [and at] various different time points”, as well as to understand how “different cohorts are engaging with different resources at different time points.” This is vital if an institution is to take a wellbeing approach to student disengagement rather than a disciplinary one. The insights are gathered into three main categories:

Curriculum analytics

Used to understand and enhance the learning environment, improve learning, teaching, inclusion and outcomes, and inform overall education strategy.

Cohort analytics

Used to understand how students are behaving at different points in time, across different cohorts, and to build on pedagogical insight. This can support senior, department, and module leaders to identify patterns of behaviour to support equity of outcomes and address differential degree attainment. The pilot informed interventions that could then be implemented locally to support particular student demographics.

Personal analytics

Used to give students the tools to take ownership of their own learning. Students can use the derived insights independently or with an academic personal tutor, allowing informed conversations aligned to that student’s own measures of success. A readily available dashboard showing the student’s engagement with university touchpoints, including those specific to their course, is then used to create space for open, non-punitive, supportive conversations based on up-to-date information.

These engagement ratings show students, staff, and academic personal tutors how each student is progressing in real time throughout their programme. The system can also be used to better understand student disengagement at an aggregate level and to identify which students may need support before the point of crisis.

Having insight into the specifics of any disengagement allows targeted conversations with students about the aspects of the learning experience they are struggling with – which is both more supportive and more efficient than simply asking, generically, how they are. For example, if a student’s engagement score is low, their personal tutor can see that they may not be accessing the library as much as their cohort average, which could lead to a conversation about anxiety or their confidence in accessing archives. This is also significantly more positive and student-centred than a legalistic “unsatisfactory attendance” email, let alone an automated poor-engagement procedure.
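As an illustration of that kind of comparison – with invented figures rather than StREAM’s real data model – the sketch below ranks the touchpoints where a hypothetical student sits furthest below the cohort average, which is the sort of detail that can steer a supportive conversation.

```python
# Illustrative only: assumed touchpoint names and counts, not real StREAM data.
cohort_average = {"vle": 40, "library": 12, "submissions": 5}
student_activity = {"vle": 38, "library": 2, "submissions": 5}

def biggest_gaps(student, cohort):
    """Return touchpoints ordered by how far the student sits below the cohort average."""
    gaps = {
        source: cohort[source] - student.get(source, 0)
        for source in cohort
        if student.get(source, 0) < cohort[source]
    }
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

for source, gap in biggest_gaps(student_activity, cohort_average):
    print(f"{source}: {gap} interactions below cohort average")
# Library use stands out here - the cue for a supportive conversation rather
# than a generic "unsatisfactory attendance" email.
```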

Making this ambitious project a success involved embracing learning analytics through collaboration around, and the roll-out of, StREAM – a student engagement analytics platform. StREAM draws together multiple data sources and applies an algorithm to them to produce an individual engagement score for each student.
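Solutionpath does not publish the detail of StREAM’s scoring, so the following is only an illustrative stand-in for the general technique – assumed weights and touchpoint names throughout – in which counts are normalised against a cohort maximum and combined into a single score out of 100.

```python
# Not StREAM's actual algorithm: an assumed weighted combination of normalised counts.
WEIGHTS = {"vle": 0.4, "submissions": 0.3, "library": 0.2, "campus_buildings": 0.1}

def engagement_score(student_activity, cohort_maximums, weights=WEIGHTS):
    """Normalise each touchpoint against the cohort maximum, then combine by weight."""
    score = 0.0
    for source, weight in weights.items():
        maximum = cohort_maximums.get(source, 0)
        if maximum:
            score += weight * min(student_activity.get(source, 0) / maximum, 1.0)
    return round(100 * score)

print(engagement_score({"vle": 38, "library": 2, "submissions": 5},
                       {"vle": 60, "library": 20, "submissions": 5, "campus_buildings": 30}))
```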

Practice and evaluation

For this collaborative initiative to be a success and to align with the university’s values, two things were needed. First, a code of practice was agreed, stipulating that the data collected would only be used to support student wellbeing, education, and outcomes – never for HR purposes. The code of practice also guides staff and students in the appropriate and ethical use of learning analytics data. Second, the university would need to work on an ongoing basis to evaluate the impact of StREAM across the institution, examine how staff and students were using the data, and decide what data should be made available to both students and staff.

Bronwen Swinnerton, Senior Research Fellow in Digital Education at the University of Leeds, has been evaluating the use of the platform across the institution through staff and student focus groups and interviews, and institution-wide surveys. Her research looked at the impact of learning analytics on the student experience and on the effectiveness of academic personal tutoring. Did it affect performance, and did providing students with their own data help them become more aware of their educational journey, support themselves better, and get the external support they needed?

There were also other key decisions to be made. These included whether students should have access to cohort engagement measures, in addition to their own individual data, so that they can see where their engagement sits against the cohort average. Swinnerton explains that:

there’s evidence that says that students may feel anxious if they’re provided with a dashboard which suggests their engagement is below average, and that was something that concerned us right from the outset – students seeing their dashboards, possibly at home, at midnight, with no one else to talk to about it, and then getting worried about what their engagement levels were. […] There’s some research that suggests that when students are faced with seeing their engagement as below average, for some it can increase their motivation and get them to work harder to try to increase that level – but for others, it can be demotivating.

The answer was to give students the option to switch the cohort average on or off and, wherever comparative engagement is shown, to provide advice and guidance on what they can do to improve their engagement. Swinnerton is clear that students must receive practical insights through this process, including where to go for support:

Academic personal tutors need to have conversations with students to ask them what they think about cohort average, and then talk them through how they would deal with it.
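A rough sketch of that design decision – with invented field names and a placeholder support link, rather than StREAM’s actual configuration – might look like this: the cohort average is only rendered when the student has switched it on, and any comparison is accompanied by a pointer to support.

```python
# Hypothetical rendering logic, not the real StREAM dashboard.
def render_dashboard(student_score, cohort_average, show_cohort_average,
                     support_url="https://example.ac.uk/support"):
    """Build the text shown to a student, respecting their cohort-average preference."""
    lines = [f"Your engagement rating this week: {student_score}"]
    if show_cohort_average:
        lines.append(f"Cohort average: {cohort_average}")
        if student_score < cohort_average:
            lines.append("You are below the cohort average. Talk to your academic "
                         f"personal tutor, or see {support_url} for support options.")
    return "\n".join(lines)

print(render_dashboard(57, 68, show_cohort_average=True))
```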

Another key decision was whether demographic data and personal characteristic information should be used in students’ engagement algorithms and displayed on their dashboards. Demographic data is used by institutions to predict grades and drop-out rates, and is even used by the Office for Students to benchmark against grade inflation. The issue, Swinnerton explains, is that while demographic data may be useful for identifying ‘at-risk’ students and for understanding whether different groups of students have different patterns of engagement, there is a risk of discrimination and of self-fulfilling prophecy. The solution agreed with Solutionpath – the developers of StREAM – was to use demographic data in cohort and curriculum analytics, as large, anonymised data sets, but not in personal analytics. Furthermore, Leeds’ implementation of StREAM only holds data collected from existing university systems.
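To illustrate that governance split – again with invented field names and a made-up small-group suppression rule, not Leeds’ actual schema – the sketch below keeps demographic attributes in cohort-level aggregates only and leaves them out of the personal view entirely.

```python
# Hypothetical data-governance split: demographics in aggregates only, never in the personal view.
from statistics import mean

records = [  # invented, already pseudonymised records
    {"student": "s001", "score": 57, "demographic_group": "A"},
    {"student": "s002", "score": 72, "demographic_group": "A"},
    {"student": "s003", "score": 64, "demographic_group": "B"},
]

def cohort_analytics(records, min_group_size=2):
    """Average engagement by demographic group, suppressing groups too small to stay anonymous."""
    groups = {}
    for record in records:
        groups.setdefault(record["demographic_group"], []).append(record["score"])
    return {group: round(mean(scores), 1)
            for group, scores in groups.items() if len(scores) >= min_group_size}

def personal_analytics(record):
    """The personal view carries only the student's own engagement - no demographics."""
    return {"student": record["student"], "score": record["score"]}

print(cohort_analytics(records))       # {'A': 64.5}; group B is suppressed
print(personal_analytics(records[0]))  # {'student': 's001', 'score': 57}
```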

Swinnerton reports some issues raised by international students around engagement monitoring and visa regulations, which she believes can be solved through better communications – noting that the use of StREAM so far has been a “soft launch.” There were also some concerns among staff and students, though these tended to be reported alongside an appreciation of the benefits of StREAM data. One student noted:

[there is] basically nowhere to hide, because you go to your academic personal tutor meeting and you can’t just go “yeah, yeah, yeah, I’m doing loads” – it’s all broken down, [which] makes it really obvious what you’re actually doing or what you’re not doing.

Innovation can often be a source of concern, especially when datasets like this are involved. The pilot implementation of StREAM at Leeds showed that sophisticated data-driven tools can support both staff and students. But perhaps more importantly, it showed that these systems will work well only where they are deployed carefully, within a transparent structure that takes account of concerns about privacy and unintended consequences.

If the focus is student-centred, with concern for their academic welfare (and indeed welfare overall) baked in from the start, nested in university systems which have those same objectives, then the latest digital learning analytics can provide timely information that is of use to academic and support staff, while simultaneously giving students insights into their own learning patterns, styles, engagement, and environment.
