Learning analytics are a part of the “what works” toolkit

Rob Summers explains the results of UK higher education’s first large-scale randomised controlled trials of learning analytics-prompted interventions

Rob Summers is Research Manager at TASO

Data analytics are changing the world as we know it, from influencing what we see on social media to improving public health.

Similarly, analytics have the power to highlight issues and prompt change in higher education, and to contribute towards reducing equality gaps between student groups.

In particular, learning analytics systems which use student-generated data – for example, data about attending lectures or checking out library books – can help providers understand student engagement and learning.

When this data is combined with demographic data, analytics can help providers identify equality gaps, monitor the success of interventions designed to reduce them, or understand the effects on the engagement and attainment of sub-groups of students when, for example, pandemic-related lockdowns forced face-to-face teaching and assessment to suddenly pivot online.
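As a toy illustration of the kind of calculation involved – comparing mean engagement between two demographic groups to surface a gap – here is a minimal sketch in Python. All field names, group labels, and figures are invented for illustration and are not drawn from either trial.

```python
# Hypothetical sketch: combining engagement data with demographic data
# to surface an equality gap. Field names and figures are invented.

records = [
    {"id": "s001", "group": "first_generation", "engagement": 62},
    {"id": "s002", "group": "first_generation", "engagement": 58},
    {"id": "s003", "group": "continuing_generation", "engagement": 71},
    {"id": "s004", "group": "continuing_generation", "engagement": 69},
]

def mean_engagement(rows, group):
    """Average engagement score for one demographic group."""
    scores = [r["engagement"] for r in rows if r["group"] == group]
    return sum(scores) / len(scores)

gap = (mean_engagement(records, "continuing_generation")
       - mean_engagement(records, "first_generation"))
print(f"Engagement gap: {gap:.1f} points")  # -> 10.0 points
```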

To pick up the phone or not

This student-generated data can be used to proactively identify students who may be at risk of withdrawing from their studies or failing some (or all) of their course due to short-term periods of low or no engagement. When that happens, how can higher education providers help students reconnect with their learning?

Increasingly, higher education providers are contacting these students and directing them to support services either through email alerts or phone calls. But how effective are these interventions? That was the question that two TASO-funded randomised controlled trials, at Nottingham Trent University and Sheffield Hallam University, tried to answer.

After the intervention, student engagement – as measured by each institution’s learning analytics system – was pretty much identical regardless of whether a student only received an email or also received a phone call.

These trials are the first large-scale randomised controlled trials of learning analytics-prompted interventions in the UK – the Nottingham Trent University analysis included over 2,100 students – and we’re confident in the robustness of the findings.

Engagement and belonging

Before you rush off to cancel your institution’s phone-calling service, there are a few things worth noting that complicate the interpretation of these results.

Post-intervention interviews with students indicated that the phone-call contact made them feel like they mattered to the institution. Support is one of four foundations of belonging at university, and students taking part in the trials welcomed the personal nature of the coaching phone call because it felt like “the university cared” – though one student framed it instead as the “kick” they needed!

It’s also important to note that the trials ran over a single autumn term, so they cannot yet speak to the long-term outcomes of attainment, continuation, and progression.

And finally, because students’ access to support services was not comprehensively tracked, we don’t know whether the signposting was effective or whether the support services themselves were meeting student needs.

Perhaps the relationship between belonging and engagement ratings is complicated, and the latter are not sensitive enough to pick up the effects of a phone call in the short term. Maybe in the long term, the attainment, continuation and progression data will reveal some positive effects of the phone call on student success.

Data infrastructure and causal evaluation

While the causal evidence base for learning analytics systems in supporting student success is weak, these platforms can be powerful tools for evaluating student-support interventions. However, current systems lack the capabilities needed to facilitate evaluation, such as randomising students to different support streams or different kinds of intervention, and then reporting on student outcomes other than attainment, such as use of support services.
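To make the missing capability concrete, here is a minimal sketch of what stratified randomisation to support streams could look like. The function, record fields, arm names, and strata are all hypothetical – this is not a feature of any existing analytics platform.

```python
import random

# Hypothetical sketch: stratified randomisation of flagged students to
# intervention arms (e.g. email-only vs email-plus-phone-call).
# Student records, strata, and arm names are illustrative only.

def randomise(students, arms, stratum_key, seed=2024):
    """Assign each student to an arm, balancing allocations within strata."""
    rng = random.Random(seed)  # fixed seed keeps the allocation auditable
    assignments = {}
    strata = {}
    for s in students:  # group students by stratum, e.g. faculty
        strata.setdefault(s[stratum_key], []).append(s)
    for members in strata.values():
        rng.shuffle(members)
        for i, s in enumerate(members):  # round-robin keeps arms balanced
            assignments[s["id"]] = arms[i % len(arms)]
    return assignments

flagged = [
    {"id": "s001", "faculty": "Arts"},
    {"id": "s002", "faculty": "Arts"},
    {"id": "s003", "faculty": "Science"},
    {"id": "s004", "faculty": "Science"},
]
print(randomise(flagged, ["email_only", "email_plus_call"], "faculty"))
```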

Nottingham Trent University vice chancellor Edward Peck and the Department for Education student mental health taskforce have identified that analytics can play a role in identifying students in need of mental health or wellbeing support. It will be crucial to robustly test the impact of any wellbeing interventions that analytics systems may trigger.

TASO will continue to explore the potential for better evaluation of post-entry support, including how to use and develop institutional data infrastructure to facilitate robust causal evaluation of post-entry student success initiatives.

After all, analytics have the ability to change the world. But analytics alone aren’t enough.

When used effectively and consistently, analytics can highlight where there are issues and where interventions can be applied. And, paired with action and evaluation, analytics can galvanise the changes we want to see – including closing equality gaps in higher education.

10 responses to “Learning analytics are a part of the “what works” toolkit”

  1. Great work Rob, and timely. I wasn’t clear though, are you in support of randomized trials of this type of intervention in HE? I am. Much effort is put into a profusion of well-intentioned initiatives but only RCTs can show us what really works and what doesn’t

  2. Thank you. Yes, I absolutely support the use of RCTs. But they’re only part of the picture, as this trial shows – they tell you what does or doesn’t work, but they don’t tell you why. You need a combination of secondary data analysis and qualitative research that tests the assumptions of the intervention’s Theory of Change.

  3. Rob, interesting article and work. The elephant in the room here also concerns supporting colleagues to develop a curriculum that enables students to matter, before hitting the phones to find out why they didn’t. If using AI-RCT combos – BTW, I am not wholly a fan – the crucial aspect concerns assumptions within the interpretation. As you rightly point out, effective evaluation often uses a few different strands to ensure rigour. Thank goodness that we are not always predictive!

    1. Yes, the current focus of these learning-analytics-prompted interventions is that students must change their behaviour, when perhaps it is (also) the institution’s responsibility. The beauty of these systems is that you can see how changes to a curriculum affect students’ behaviour too, and you don’t necessarily need RCTs to see those effects – e.g. a quasi-experimental difference-in-differences design can be an effective tool for generating causal evidence in these situations, as the sketch below illustrates.
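    For illustration, here is a minimal difference-in-differences calculation on invented engagement figures (all numbers and cohort labels are hypothetical):

```python
# Hypothetical difference-in-differences sketch: compare the change in
# mean engagement for a cohort whose curriculum was redesigned against
# a comparison cohort with no change. All figures are invented.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """DiD = (change in treated group) - (change in control group)."""
    return (treated_post - treated_pre) - (control_post - control_pre)

effect = did_estimate(
    treated_pre=54.0, treated_post=63.0,   # redesigned-curriculum cohort
    control_pre=55.0, control_post=58.0,   # comparison cohort
)
print(f"Estimated effect of the curriculum change: {effect:+.1f} points")
# -> +6.0 points, valid only under the parallel-trends assumption
```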

  4. I think the comment above from Stella Jones-Devitt is important in this arena. I am not always sure we should just be focussing on the students and seeing their lack of engagement as the problem. It’s difficult to change an individual’s nature. Trials looking at different forms of teaching delivery, assessment impacts or use of different media might be as informative as looking at different forms of intervention after problems have arisen.

    Themes such as block teaching, truly applied forms of assessment, avoiding bunching of assessments, using participation as part of the grading might keep more people engaged than a phone call after disengagement and be better RCT targets.

  5. @Steve W I agree with that. It can be informative to use analytics to see how a cohort as a whole engages with certain types of teaching, rather than concentrating on the individual with low engagement. That way we can give everyone a better student experience and hopefully retain more students. However, generally, the more we can understand about what works, the better.

    1. Thank you Bart Rienties, I can only apologise that despite many literature searches I have somehow missed this RCT. I wonder if the difference might be due to distance learning vs in-person? There certainly seem to be stronger relationships reported between a student’s digital footprint and attainment for distance-learning courses vs in-person courses.

  6. A single phone call on top of an email doesn’t sound like much of an intervention for a student at risk. But as the comments in the back of the paper testify, the impact of a single phone call could be very variable:
    (1) it’s from someone you know and respect, perhaps a teacher, and maybe they know you a bit or a lot too;
    (2) the caller doesn’t know the student but conveys concern and interest, and follows it up later;
    (3) it’s one of the jobs on someone’s list, they don’t know the student, they don’t follow it up personally, but the ‘phone call’ box is checked.
