
Should we abandon algorithms in education?

Universities are about to meet a cohort of students for whom algorithms and data-driven decision-making are potent and personal. Let's make this a turning point, says Chris Thomson

Chris Thomson is a subject specialist at Jisc, the UK's education and technology not-for-profit, focusing on digital practice.

The last few weeks saw many young people at the mercy of a system they barely understood, in circumstances beyond their control.

Many had their plans for the future thrown into disarray. Once the dust settles, the long-term impact could be a significant shift in the way students view data.

A turning point

Universities are about to meet a cohort of students for whom algorithms and data-driven decision-making are a huge and personal issue.

Of course, you could argue this is nothing new. As a society, our interaction with systems, platforms and organisations has been impacted by the use of large datasets, algorithms, machine learning and artificial intelligence (AI) for years, and there’s a thriving critical discourse about the impact of that on human behaviour and social justice.

But despite the level and quality of expert debate, use of our data hasn’t been particularly tangible. Sure, we know that corporations and governments use data to shape their services (for good or ill). We see Spotify suggesting songs based on previously played tracks, and we know Amazon recommends products “people like you” previously bought – but it’s hard to engage with an issue when the processes are so opaque, the experience so varied, and the implications unclear.

Lessons to be learned

Unlike our entertainment choices, the stakes when it comes to exams are high. If your future is at the mercy of a system that appears unjust, that’s traumatic – especially when it comes after six months of uncertainty and distress.

At this stage, we have a lot of information about the algorithm that was used to grade exams, but we still have little understanding of its implementation, as Tom SF Haines, a lecturer in machine learning at the University of Bath, has pointed out.
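
To make the mechanics a little more concrete, here is a deliberately simplified sketch of the kind of standardisation that was widely reported: a centre's teacher-supplied rank order of students is mapped onto the grade distribution that centre achieved in previous years. The function and data below are hypothetical, and the real model involved further steps (such as prior-attainment adjustments and special treatment of small cohorts), so this is an illustration of the general idea rather than Ofqual's actual implementation.

```python
# Simplified illustration (not Ofqual's model): assign grades by mapping each
# student's position in the teacher-supplied rank order onto the centre's
# historical grade distribution.

def standardise_centre(ranked_students, historical_distribution):
    """ranked_students: list of student IDs, strongest first.
    historical_distribution: ordered mapping of grade -> share of the
    centre's past cohorts, e.g. {"A*": 0.1, "A": 0.3, "B": 0.4, "C": 0.2}."""
    # Turn the shares into cumulative cut-offs: A* up to 0.1, A up to 0.4, ...
    boundaries = []
    cumulative = 0.0
    for grade, share in historical_distribution.items():
        cumulative += share
        boundaries.append((grade, cumulative))

    grades = {}
    n = len(ranked_students)
    for i, student in enumerate(ranked_students):
        percentile = (i + 0.5) / n  # this student's position in the cohort
        for grade, cutoff in boundaries:
            if percentile <= cutoff:
                grades[student] = grade
                break
        else:
            grades[student] = boundaries[-1][0]  # guard against rounding
    return grades

# A centre that historically awarded 10% A*, 30% A, 40% B and 20% C
history = {"A*": 0.1, "A": 0.3, "B": 0.4, "C": 0.2}
ranking = ["s01", "s02", "s03", "s04", "s05", "s06", "s07", "s08", "s09", "s10"]
print(standardise_centre(ranking, history))
# {'s01': 'A*', 's02': 'A', ..., 's09': 'C', 's10': 'C'}
```

Note how, even in this simplified picture, a student's grade depends as much on their centre's past results as on their own position in the rank order.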

In addition, the process has been managed by a government experiencing low levels of approval and trust. No wonder, in a year of unprecedented firsts, students have been demonstrating, carrying banners and shouting “f*ck the algorithm”.

Fairness and transparency

Managing any sector, such as education, at scale requires the use of data; without it, the task would be impossible, chaotic and unfair. But data and people interact in complex ways, so there must be trust both in the processes involved and in the people in positions of power. As the philosopher Onora O'Neill said:

"First, be trustworthy. Second, provide others with good evidence that you are trustworthy."

Such trust can't be created as a political or PR exercise. The government actually has its own data ethics framework for leaders and practitioners in data projects. Focusing on the ethics of AI, this guidance summarises the characteristics of effective and just projects under the principles of FAST (fairness, accountability, sustainability, transparency), as set out by the Alan Turing Institute. The framework suggests that, by carefully reviewing the FAST principles, university and college leaders can help ensure a project is fair, help prevent bias or discrimination, and safeguard public trust in a project's capacity to deliver safely and reliably.

Jisc has also produced its own codes of practice around the use of data analytics in a learning setting, most recently relating to issues of wellbeing and mental health.

Working in partnership

Transparency is key. Do students know what data is collected on them and how it is used?

That's a good first step – but there are further opportunities to involve learners in the design and implementation of data-informed systems and services. This gives them agency in the process. It also gives students an opportunity to develop their awareness of the ways in which data is used in modern life, better preparing them to play an active role in their learning and beyond. And where this involvement brings together the designers of a system, the staff implementing it and the people affected by it to work in partnership, so much the better.

There is guidance on reviewing such work from the likes of the Alan Turing Institute and the ICO, but as much of that advice is aimed at technical specialists and leaders, there's room for more support to enable and empower the rest of us – along the lines of The Dstl Biscuit Book which, while it's about AI projects, covers a lot of useful common ground. Jisc has a mini-MOOC planned for September 2020 along similar lines.

Play to data’s strengths

The controversy around exam grading this year hasn't all been focused on data and algorithms; the energy in the debate has come from human experiences. These individual stories could be the narratives that stick, especially when considering outcomes for disadvantaged students and the impact on individuals – whether they are people we know or not. BBC Newsnight editor Lewis Goodall highlighted such stories on Twitter, illustrating that while a clear, objective understanding of the mechanics of this situation is essential, it's personal experiences that answer the question "so what?" and fuel most people's reactions and decisions.

We're already starting to see the fallout from the results debacle extend into other areas of society. Councils are starting to look at the role of AI in benefit and welfare decisions, according to the Guardian – and there's anxiety in the education sector too. But I'd suggest we shouldn't be too hasty about a comprehensive rollback. The University of Gloucestershire, for example, has been exploring how analysis of the data generated by students' interactions with key systems and activities – gathered with their permission – can be used to watch for early warning signs of learners getting into difficulty. This enables informed discussions as part of the university's pastoral role.
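
As a purely hypothetical illustration of that general pattern (not the University of Gloucestershire's actual system), an early-warning signal can be as simple as comparing a student's recent engagement with their own earlier baseline and flagging large drops for a tutor to follow up.

```python
# Hypothetical sketch of an early-warning check: flag students whose recent
# engagement has dropped well below their own earlier baseline. All names,
# thresholds and data here are illustrative, not any institution's real system.

from statistics import mean

def flag_students(weekly_activity, recent_weeks=2, drop_threshold=0.5):
    """weekly_activity: student ID -> list of weekly engagement counts
    (e.g. VLE logins plus library visits), oldest week first.
    Flags students whose recent average falls below drop_threshold
    times their earlier baseline."""
    flagged = []
    for student, counts in weekly_activity.items():
        if len(counts) <= recent_weeks:
            continue  # not enough history to establish a baseline
        baseline = mean(counts[:-recent_weeks])
        recent = mean(counts[-recent_weeks:])
        if baseline > 0 and recent < drop_threshold * baseline:
            flagged.append(student)
    return flagged

activity = {
    "student_a": [12, 10, 11, 9, 10, 11],  # steady engagement
    "student_b": [14, 15, 13, 12, 5, 4],   # sharp recent drop
}
print(flag_students(activity))  # ['student_b']
```

Crucially, in the approach described above the output is a prompt for a human conversation, not an automated decision.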

There's a delicate balance to be struck, but combining analytics with strong relationships and careful interventions can – when done right – be a useful tool for supporting a positive student experience. And, given that regular contact with students in physical spaces will be unreliable over the coming months, universities will need a wide range of tools to safeguard students' wellbeing.

A model for society

The psychologist Jerome Bruner wrote extensively about how making sense of the world through objective, rational means gives only a portion of the picture. The rest is made up of narrative, which is just as important in understanding motivations, reactions and behaviours. As our ability to collect and analyse data at scale and at speed increases, do we risk losing sight of these stories? Are we dismissing them as subjective and unreliable, forgetting that the use of data, machine learning and AI is not free of bias, discrimination and error? Data and stories are closer in nature than most people think.

Universities, colleges and schools exist to prepare students to thrive in the world, but also equip them to shape it. As such, educational institutions should provide a model for how society should be – and, after students’ experiences this summer, I hope that will increasingly mean using data ethically and with transparency.

One response to "Should we abandon algorithms in education?"

  1. Assessment was the key test. In many ways universities were given the same choice about assessment as Government was. I’m hoping that someone is collating what we did, but my sense is that, all around the sector, we chose to use real assessment.

    The various 'no detriment' schemes used data, but mostly the student's own data. I know that we did a lot of modelling of previous cohorts' performance (in a very short time) to help us understand patterns of how students might perform in the future, but we let the systems run on students' own performance. If anyone abandoned assessment in the summer term in favour of predicting students' performance against previous years' cohorts, then they kept it very quiet.

    Big data is great for targeting support, for ensuring that interventions are right and that evaluations can be done. It has some way to go to show that it is right at the individual scale. A quick parallel with admissions: POLAR helps target interventions at school or ward level, but it's really bad at spotting individuals with disadvantage (contrary to the minister's understanding). Therefore most people avoid using it alone to determine a course of action for an individual (and to their credit the OfS warns against it too).
