
Waiting for perfect data won’t help the students thinking about leaving their course

When it comes to tracking student engagement, don’t let the perfect be the enemy of the good, says Richard Gascoigne.

Richard Gascoigne is co-founder and managing director of Solutionpath.

University leaders have spent the last six months anxiously wondering if students will be prepared to take up their university place under the conditions imposed by the Covid-19 pandemic.

In the event, following the A level results debacle – and, for that matter, the BTEC results delays – overall UK student numbers are looking not just healthy but, in some cases, overwhelming. Nor have we yet seen a move to defer study among continuing students on a significant scale.

We won’t know the overall outcome until later in the autumn, and there’s always a risk that some of the confirmed students don’t make it through to registration. But with those caveats, it looks like there will be more than one finance director heaving a sigh of relief this month.

Yet an over-supply of students brings its own set of problems. Since the removal of the student number cap in England, some universities have chosen to grow at pace, and media headlines have focused on practical issues such as a lack of student accommodation or overcrowded lecture theatres.

What’s less well understood in the public debate – though widely acknowledged inside universities – is the anonymity of the crowd, and the personal toll on students of being just one of a faceless throng. A student can slip out of reach before anyone even notices there is a problem, much less musters the resources to do anything about it.

And while universities are doing all they can to provide face-to-face academic interaction and social opportunities for students, the grim reality of Covid-19 is that the moments of personal connection that forge community and support student engagement will be hard to keep alive on the Covid campus.

As with so much else – digital transformation, pedagogy – Covid-19 is bringing a new urgency to trends and challenges that were already there. In a mass system of higher education, with many diverse students registered on different academic pathways, it can’t be assumed that those students will organically develop relationships with their tutors and course peers with the depth to prompt the sorts of conversations a struggling student might need to have.

Nor can it be assumed that students arrive at university with the self-efficacy and confidence to understand their own need for those sorts of conversations and seek out the right people to have them. It's unfair to expect an academic or personal tutor to initiate a conversation with a student about how they are doing when they know essentially nothing about them – and far too easy for students to shrug off or deflect offers of support.

That’s where data can help.

The tyranny of metrics

Over the past few decades universities have been incentivised – even conditioned – to use external metrics, such as the National Student Survey (NSS), HESA data on retention, and employability data, as triggers for internal action.

These data sets are robust, as far as they go. But they are more likely to capture the views or experiences of engaged students; they measure cohorts, not individuals; and, vitally, there's no opportunity to go back in time, find out why any particular student wasn't succeeding, and address the issue.

The promise of learning analytics is that it allows universities to track and monitor individual students’ learning interactions, and build an evidence-based picture of which students might be disengaged, struggling or otherwise at risk of early exit.

There’s – rightly – debate about the ethics of using data in this way. I’ve written for Wonkhe in the past about why student demographics should not be included in any predictive model for students at risk of lower attainment, early exit or any other less-desirable outcome.

But I believe there's a bigger cultural issue at play. If we hold internal, real-time data on student learning interactions to the same standards as we expect from NSS or HESA data, there's a risk of pouring resources into perfecting the data, rather than focusing on building consensus on what should be done with it to support student engagement. And in the meantime, students aren't getting the help they need.

The imperfect tense

Having worked with universities across the UK, I can confidently say that nobody is sitting on perfect data sets on student learning interactions.

There are datasets that were created for purposes other than measuring student engagement – like attendance monitoring for international students' visa compliance. There's data that produces a lot of noise and not much signal, like logging into the virtual learning environment (VLE) or swiping into the library. Student submission of assessed work can tell you something, but not everything, about how students are coping.
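To make that concrete, here's a minimal sketch, emphatically not a description of any particular product, of how noisy signals like these might be combined into a single weekly score, with stronger signals (attendance, submissions) weighted more heavily than weaker ones (logins, swipes). The field names, weights and caps are all invented for illustration.

```python
# A hypothetical weekly engagement score built from readily available,
# imperfect signals. Weights, caps and fields are illustrative only.
from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    vle_logins: int        # logins to the virtual learning environment
    library_swipes: int    # swipes into the library
    submissions: int       # pieces of assessed work submitted
    attendance: float      # proportion of timetabled sessions attended (0-1)

# Noisy signals count for less than stronger ones.
WEIGHTS = {"vle_logins": 0.15, "library_swipes": 0.10,
           "submissions": 0.40, "attendance": 0.35}

def engagement_score(week: WeeklyActivity) -> float:
    """Return a 0-1 score; caps stop one busy signal dominating."""
    signals = {
        "vle_logins": min(week.vle_logins / 5, 1.0),      # cap at 5 logins/week
        "library_swipes": min(week.library_swipes / 3, 1.0),
        "submissions": min(week.submissions, 1.0),        # any submission counts
        "attendance": week.attendance,
    }
    return sum(WEIGHTS[name] * value for name, value in signals.items())

# e.g. a quietish but basically engaged week scores around 0.67
print(engagement_score(WeeklyActivity(vle_logins=2, library_swipes=0,
                                      submissions=1, attendance=0.6)))
```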

We’ve become pretty expert at uncovering datasets that universities didn’t even know existed, or hadn’t considered might help build a picture of student engagement in real time.

In doing so, we’ve found that the key to making rapid progress is not working to perfect these individual datasets, but having the strategic conversations that produce a shared definition of student engagement, underpinned by the data that is available.

Different universities – and often different courses within universities – have different learning cultures and different expectations of students. By making those assumptions explicit at course or university level, it becomes possible to explore how data can give some insight into whether students are, indeed, engaged.
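As a hypothetical illustration of what making those expectations explicit could look like in practice: a seminar-led humanities course and an online-first computing course can declare quite different baselines, and the same raw activity is then read against the right one. The course codes and numbers below are invented for the example.

```python
# Hypothetical per-course expectations: the same activity can be healthy
# on one course and a warning sign on another.
COURSE_EXPECTATIONS = {
    "HIST101": {"weekly_vle_logins": 2, "attendance": 0.8},  # seminar-led
    "COMP230": {"weekly_vle_logins": 8, "attendance": 0.5},  # online-first
}

def below_expectation(course: str, logins: int, attendance: float) -> bool:
    expected = COURSE_EXPECTATIONS[course]
    return (logins < expected["weekly_vle_logins"]
            or attendance < expected["attendance"])

# True: 3 logins would be fine on HIST101 but is low for COMP230
print(below_expectation("COMP230", logins=3, attendance=0.7))
```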

Rather than asking "what data do we need to understand student engagement?", it's better to ask "how can the data we already have drive action to improve student engagement?"

Keep in mind that the essential issue is agreeing how a student with low engagement, or “at risk”, is flagged, and what support is triggered by that flag. Yes, some students may not be caught by the system, but you’ll catch many more than if you had no system at all. Likewise, if a flag prompts a personal tutor to reach out to a student for a chat about their progress and it turns out they are basically fine, then no harm has been done. Anyone with responsibility for supporting students should understand what a flag might mean and what they are expected to do about it.
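Here's a sketch of what such a flag rule might look like, assuming a weekly engagement score like the one above. It flags only sustained low engagement, so one quiet week doesn't trigger outreach, and it attaches the agreed support action to the flag itself, so everyone knows what they are expected to do. The threshold, window and action wording are assumptions for illustration.

```python
# A hypothetical flag rule: sustained low scores trigger a named,
# agreed action. Threshold and window are invented for the example.
from typing import NamedTuple

class Flag(NamedTuple):
    student_id: str
    reason: str
    action: str

FLAG_BELOW = 0.35        # course-level threshold, agreed with academic staff
SUSTAINED_WEEKS = 2      # one quiet week shouldn't trigger outreach

def review_student(student_id: str, recent_scores: list[float]) -> Flag | None:
    """Flag only sustained low engagement; a false positive costs one chat."""
    if len(recent_scores) >= SUSTAINED_WEEKS and all(
        score < FLAG_BELOW for score in recent_scores[-SUSTAINED_WEEKS:]
    ):
        return Flag(student_id,
                    reason=f"score below {FLAG_BELOW} for {SUSTAINED_WEEKS} weeks",
                    action="personal tutor invites student to a progress chat")
    return None

print(review_student("s123", [0.6, 0.3, 0.2]))  # flagged
print(review_student("s456", [0.6, 0.3, 0.5]))  # None: not sustained
```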

I’m also an advocate of transparency when it comes to monitoring student engagement – I believe students should know their university’s expectations when it comes to their engagement with learning, have access to their own engagement data in real time, and understand why they might have been flagged as in need of a sympathetic ear. If the goal is to put prompts in place to enable important conversations to happen, it helps a lot if there’s equal access to information and the conversation isn’t one-sided.

The art of the possible

On a completely pragmatic level, right now, nobody has the time or resource to roll out a major learning analytics project, when so much energy is being poured into preparation for the term ahead. Much better, then, to put something in place, using the available data, that can give universities a real-time picture of student engagement and can develop and grow in response to the unfolding situation.

Once the system is up and running there's absolutely nothing to stop universities improving the data – in fact, simply running the system will give much more actionable insight than trying to create a perfect system from the outset. But, arguably more importantly, thinking about what "at risk" means will change over time, the quality of the conversations will improve as people's confidence and experience grow, and students themselves may start to use the data in unforeseen and productive ways.

People and organisations are messy, complex things, so it makes no sense to think they can be captured and interpreted with Platonic ideals of datasets. Instead, data must serve the larger purpose of real, systemic action to reach out and connect with the students who need it – this year more than ever.

This article is published in association with Solutionpath. You can find out more about how Solutionpath can support universities with tracking student engagement in real time here.

Richard will be speaking at The Secret Life of Students on 17-18 September – you can get tickets here.
