Aaron Porter is Director of Partnerships at Wonkhe

Data in higher education has become increasingly important, a shift driven in part by regulatory oversight.

But data also has an essential role to play as institutions seek to better understand their performance and the experience of students. Institutions have begun not only to use learning analytics to draw out the correlation between engagement and outcomes, but also to deepen their analysis of how they support students and what works, allowing providers to evidence the relationship between what students do and the experience and outcomes they achieve.

As the use of analytics becomes more sophisticated, institutions are able to target their support and interventions more precisely and to evaluate where they can make the biggest difference for students.

The state of the art

In recent years, students have adopted different ways to participate and engage in their learning. And whilst this is not without challenges, for many institutions it has become an opportunity to rethink how they support student engagement and outcomes.

As institutions continue to reflect on what practice is effective in this changing environment, providers are developing a more nuanced appreciation of what engagement means for their students, and of how data drawn from students’ digital and in-person footprint can demonstrate it. This is leading to more integrated thinking through analytics.

Institutions with access to engagement insight are better able to study a student’s participation in their academic learning, through their interaction with learning resources such as the VLE, library and lecture capture, and to triangulate this with student outcomes and with information on where support interventions will be most effective.

The scale of the issue

At a sector level, the rise in total student numbers, and the growth which some providers have seen, has brought new challenges. Institutions have had to grapple with offering personal and pastoral student support at scale, against a backdrop of a societal rise in wellbeing and mental health issues that has been mirrored in higher education. More generally, as many institutions have increased in size, there are new challenges in ensuring that support and services across a university are joined up so that no student is left behind.

This growth has also strengthened the case for clear, rapid, real-time insight into which students are engaged and which are less so, and into what might be done to proactively drive engagement among those at risk of non-continuation or poor academic performance, with a view to helping students achieve their goals and move towards positive outcomes.

More generally, we know that some institutions have been pioneers in their use of learning and data analytics. Nottingham Trent University in particular is widely seen as having developed its approach, something that was referenced when its Vice Chancellor, Edward Peck, was appointed as the government’s Student Support Champion.

Our project

To better understand this growing focus on learning analytics, Wonkhe and Solutionpath partnered to launch a series of action research projects, inviting six universities currently using Solutionpath’s student engagement analytics platform, StREAM, to become part of a community of practice and to explore how analytics were being used, what impact they were having, and what lessons could be learnt.

The project ran over the academic year 2021-22, and this report brings together findings from three of these projects, at University College Birmingham (UCB), the University of the West of England (UWE), and Teesside University. In it we draw links between insight and practice, highlighting how engagement analytics is being used to support student success.

Central to this project was the use of action research. This is a method of systematic enquiry that allows practitioners to be researchers of their own practice. Our aim was to use the range of participants to identify, in further detail, successful practices for addressing student disengagement. We wanted to look more closely not only at the student engagement score categories, but also at how insight into students’ patterns and characteristics of engagement could be used to add context.

Solutionpath invited users of StREAM, their student engagement analytics platform, to take part in the project.

About StREAM

StREAM is a student engagement analytics platform which provides educators with student engagement insight at cohort, course and individual level in a single platform.

StREAM takes data from a range of digital interactions a student makes with their learning activities that represent academic engagement, such as the virtual learning environment, library and e-book use, student records, submissions, and lecture capture, and uses a unique algorithm to transform this data into a live engagement score for each student, ranging from “none” to “very high”.
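To make the idea of an engagement score concrete, the sketch below shows one way several normalised activity signals could be combined into a categorical band. It is purely illustrative: StREAM’s actual algorithm is proprietary and not described here, so the signal names, equal weighting and banding in this example are assumptions, not a description of how the platform works.

```python
from statistics import mean

# Illustrative sketch only: the signals, equal weighting and banding below are
# hypothetical assumptions, not StREAM's proprietary algorithm. The point is
# simply that several digital interaction signals are combined into a single
# categorical engagement score per student.

BANDS = ["none", "very low", "low", "good", "high", "very high"]

def engagement_band(signals: dict[str, float]) -> str:
    """Map normalised activity signals (each between 0.0 and 1.0) to a band."""
    if not signals or all(value == 0 for value in signals.values()):
        return "none"
    score = mean(signals.values())  # naive equal weighting, for illustration only
    # Spread the 0-1 range across the five non-zero bands
    index = min(int(score * 5) + 1, len(BANDS) - 1)
    return BANDS[index]

# Example: moderate VLE use, some library activity, no lecture capture views,
# and one recent submission
print(engagement_band({
    "vle": 0.6,
    "library": 0.3,
    "lecture_capture": 0.0,
    "submissions": 0.5,
}))  # prints "low" under this naive equal weighting
```

In practice a platform like StREAM would weight sources differently, account for course context, and update the score continuously as new interaction data arrives.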

Because engagement acts as a proxy for progression, university staff can use disengagement as a tool for identifying who may be struggling and in need of support. Moreover, students can use their engagement scores to self-reflect.

Although identifying students at risk of disengagement was a key area of interest within this project, there was also an interest in understanding which kinds of intervention helped which students; whether the timing of interventions, how they were delivered, and who delivered them made an impact; and how this data could inform practice.

The main findings

Early engagement matters

Across the projects it became clear that early intervention improves engagement, and that engagement data can be a helpful first step in targeting support toward those who need it most. For many institutions, being able to identify where to direct finite resources (a busy personal tutor, or staff in student services) was important for using those resources efficiently, and it provided an accurate place to start intervention.

In the context of the Condition B3 metrics being rolled out by the Office for Students (OfS), which have established thresholds for continuation, completion, and progression, being able to identify which students, courses or cohorts are at risk of not progressing is shaping institutional approaches to supporting students. Doing so earlier may help institutions ensure more students progress to successful outcomes.

Data from all three action research sets found that early indications of changes in engagement behaviour were an effective predictor of both retention and attainment. Teesside University found that the earlier the initial intervention is made with a low-engaged or unengaged student, the less likely the student is to withdraw. Students with low, very low or no engagement who went on to receive two or more Student Success Programme (SSP) sessions saw an improved engagement measure (73.3 per cent). Interestingly, this was reflected in students’ feedback, which showed that students felt less likely to leave the course and/or university at the end of the SSP interventions than at the start, when engagement was low.

UCB demonstrated that both early engagement and attendance were significant predictors of later retention; however, engagement and attendance at four weeks were likely to give the most accurate predictions of retention. When looking at the relationship between engagement and attainment, they found that an increase of one average engagement rating (e.g. from good to high) increased the average assignment mark by seven marks. This suggests that encouraging engagement early in the academic year can affect attainment outcomes.

Data from the University of the West of England suggests that this window for intervention is even more time-sensitive. They found that 46 of the 61 students identified as having no engagement in StREAM did not register on their VLE until midway through the term or, in some cases, afterwards. As a result, they missed important communications from the university and were then more likely to remain non-engaged in semester two.

Cohorts and demographics

At UCB, comparing aggregated data between different cohorts proved invaluable: they identified clear differences in retention and attainment between cohorts that began in September and those that began in February. This may have been because students starting in September benefited from the full scope of Welcome Week and other onboarding activities. As a result, UCB is re-evaluating the timing of some of its communication campaigns. UWE had similar success: using student data, they found that 50 per cent (31 students) of ‘hard to reach’ students flagged as having low engagement were also resitting a year. The research strongly suggests that students who are repeating years need targeted support, such as being prioritised for contact hours.

The interaction of StREAM data with other characteristics was a keen area of interest, particularly whether it could be used to shape interactions that lead to better outcomes for students who come from, on average, lower-attaining groups. The use of demographics when analysing data can help senior, departmental, and module leaders to identify patterns of behaviour, supporting equity of outcomes and addressing differential degree attainment.

It is evident that learning analytics give a picture of predicted student success, but demographic insight supports their application to policy and practice.

It is important to note that the StREAM algorithm does not use demographic data to define a student’s engagement, although demographic data may be used to draw out relationships in later analysis. This is in line with a key principle of using data to reflect what students ‘do’ and not ‘who’ they are, creating more actionable data based on activity that both the student and the university staff member can positively change.

Next steps

The data which institutions hold is vast and constantly changing. Engagement analytics provides an increasingly sophisticated way to understand this learning landscape and what it means in the context of each unique institution. Using this information to guide practice and interventions is of increasing importance, driven by financial and regulatory requirements and a focus on student welfare.

The findings from this project demonstrate that institutions can feel confident in using the growing body of data and evidence being generated to make informed decisions about where to move resource and how to shift the dial on continuation and attainment.

To be increasingly proactive, institutions need to better understand the ebb and flow of student participation and engagement in the context of their unique student population. This deeper understanding can add nuance to the approach institutions take and allow them to better evaluate the impact and efficacy of what they do. In each research set, universities were able to identify areas for improvement, which prompted agile thinking and a rapid response in creating new ideas and solutions. Over time, institutions should continually use this data to identify what is working and what is less effective, so they can improve and iterate their approach.

Research is increasingly providing evidence to support engagement data and analytics as a powerful tool for:

  • Creating a near real-time picture of student engagement at individual level
  • Identifying which students to focus support on
  • Initiating proactive conversations with students
  • Bringing forward the point of intervention
  • Evidencing impact to assess the effectiveness of different support and learning and teaching approaches and initiatives
  • Informing policy, practice and future strategy

At a time when increasing questions are being asked of institutions about the quality and personalisation of the support they offer, when budgets are being squeezed and when regulation is placing even more focus on continuation and progression, having a sophisticated approach to analytics can help institutions on these fronts and, more importantly, ensure no student is left behind.

Read the full report on the Solutionpath site
