Demographics in Higher Education are important. They can be used to identify disparities in academic achievement, support targeted provision for less privileged students, and track demographic change in student populations, aiding institutional planning.
Demographics can also feed into a number of key strategic issues, such as recruitment and resource planning, Higher Education Statistics Agency (HESA) reporting, and access and participation targets. It’s therefore no surprise that in our work supporting sector planning teams, demographic splits are regarded as crucial data points for university decision making.
Learning analytics
With the rise of data-led machine learning techniques, it is now possible to use data to enhance our understanding of learners and their learning environments. The result has been the rapid emergence of ‘Learning Analytics’ within the sector: algorithm-driven analysis that supports decision making and offers new areas of insight. These developments mean that we can now measure and codify student activity to support progress, attainment and retention at institutional and individual levels, often through targeted, real-time interventions specific to an individual student’s needs.
There are clear attainment differences across many demographic groups, so the starting point for many universities is to consider ‘demographic factors’ as an influence within any analytical model. But what do demographics do in this context? Typically, they reinforce what we already know, and they can predict a student ‘outcome’: students from postcode X might do less well, so a system predicts underperformance.
If nothing changes along a student’s learning pathway, then the algorithm will be predictive: a factory production line with the same inputs and processes will deliver the same outputs every time. But to change a student’s trajectory, either the university or the student has to do something different. And because demographics do not necessarily change, we should focus on those factors that are within each individual student’s gift to manage.
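To make this concrete, here is a minimal sketch in Python of how a demographic input behaves inside a predictive score. The feature names, weights and scoring rule are entirely hypothetical, not any real institution’s model; the point is simply that the demographic term is fixed at enrolment, so only the behavioural terms offer a lever anyone can pull.

```python
# A minimal, hypothetical risk score. Feature names and weights are
# illustrative only, not any real institution's model.

WEIGHTS = {
    "vle_logins_per_week": -0.4,  # behavioural: the student can change this
    "library_loans":       -0.2,  # behavioural: the student can change this
    "from_postcode_x":      0.9,  # demographic: fixed at enrolment
}

def risk_score(student: dict) -> float:
    """Higher score = predicted to be more 'at risk'."""
    return sum(weight * student.get(feature, 0)
               for feature, weight in WEIGHTS.items())

# Two students with identical behaviour, differing only by postcode.
a = {"vle_logins_per_week": 5, "library_loans": 3, "from_postcode_x": 0}
b = {"vle_logins_per_week": 5, "library_loans": 3, "from_postcode_x": 1}

print(risk_score(a))  # -2.6
print(risk_score(b))  # -1.7: flagged as higher risk purely because of postcode
```

Student B is ranked as higher risk for something neither they nor the university can alter; whatever B does, the postcode term follows them through every run of the model.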
The dangers in execution
In the work we do with universities, one danger we have identified is that data analysis algorithms that embed demographic factors hold the potential for bias. Whenever a mathematical model is created, weight is attributed to certain conditions, which then indicate an outcome. Gender and racial bias in algorithms are topics of heated debate in the sector, and latent bias exists in many Learning Analytics approaches that we have seen.
Algorithms that rank a student’s ‘potential’ lower because of a demographic attribute showing that their particular group hasn’t fared as well are biased. What if a BAME student was written off by a tutor because, in their view, intervention was a pointless endeavour, and the ‘system’ allowed them to confirm that view? What if a white student experiencing mental health issues was overlooked simply because they came from a privileged background?
If the starting point (demographics) is an issue, then the end point may also need further consideration. Blanket assumptions about what constitutes ‘success’ also influence the goals in an algorithm. If we only measure success by grade outcome, we fail to recognise the full value of higher education, reducing it to a classification at the end of the process. And focussing on ‘predicting’ a grade award can create negative perceptions among students: why should a Black and Minority Ethnic student be potentially downgraded in their ‘outcome predictions’ because of their heritage?
Solutions to problems
There are solutions to these problems. By working with KPMG, Solutionpath offers the benefit of experience in managing transformation and student journey projects, and helps providers with the challenges associated with driving service adoption. This includes the ethical as well as the operational aspects of data.
In our work with Nottingham Trent University (NTU), the university gives students access to their own data, and has done so right from the start. The project team (including students’ union officers and a representative from the equality and diversity unit) was worried about bias in scoring, and resolved that it would be profoundly demotivating to use student demographics in the engagement algorithm. As Ed Foster, NTU’s Student Engagement Manager, makes clear:
“Two students engaged in precisely the same way (took out the same textbooks, logged in to the VLE the same amount), where one student was from a disadvantaged background, would be at risk of ending up with a lower score.” For NTU, what their students do is more important than who they are.
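A sketch of that design choice, again with hypothetical signals and weights rather than NTU’s actual algorithm: the engagement score is computed only from things a student does, so identical behaviour always yields an identical score, whatever the student’s background.

```python
# Hypothetical behaviour-only engagement score. Signals and weights are
# illustrative; this is not NTU's actual model.

BEHAVIOUR_WEIGHTS = {
    "vle_logins_per_week": 0.5,
    "library_loans":       0.3,
    "lecture_attendance":  0.2,
}

def engagement_score(activity: dict) -> float:
    # Demographic attributes are deliberately absent from the inputs.
    return sum(weight * activity.get(signal, 0)
               for signal, weight in BEHAVIOUR_WEIGHTS.items())

# Same textbooks, same VLE logins, same attendance: same score,
# regardless of either student's background.
activity = {"vle_logins_per_week": 5, "library_loans": 3, "lecture_attendance": 8}
print(engagement_score(activity))  # 5.0 for any student with this activity
```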
The General Data Protection Regulation (GDPR) obliges institutions to share how automated decisions are derived and to offer students a means to challenge them. This could have huge consequences: when a tutor is asked why they have requested an academic review with a student, are they ready to justify the “at risk” flag? And does the institution even understand how the calculation has been reached well enough to explain it?
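One practical step towards meeting that obligation is to make each flag self-explaining. Below is a minimal sketch, reusing the hypothetical weights from the earlier example, that breaks a linear score into per-feature contributions a tutor could read back to a student.

```python
# Hypothetical per-feature breakdown of a linear risk score, so a tutor
# can see why a student was flagged, not just that they were.

WEIGHTS = {
    "vle_logins_per_week": -0.4,
    "library_loans":       -0.2,
    "from_postcode_x":      0.9,
}

def explain_score(student: dict, weights: dict) -> list[tuple[str, float]]:
    contributions = [(f, w * student.get(f, 0)) for f, w in weights.items()]
    # Largest contributions first, so the main drivers of the flag lead.
    return sorted(contributions, key=lambda fc: abs(fc[1]), reverse=True)

student = {"vle_logins_per_week": 5, "library_loans": 3, "from_postcode_x": 1}
for feature, contribution in explain_score(student, WEIGHTS):
    print(f"{feature}: {contribution:+.2f}")
# vle_logins_per_week: -2.00
# from_postcode_x:     +0.90
# library_loans:       -0.60
```

A breakdown like this also makes latent bias visible: a demographic term showing up among the main drivers of an “at risk” flag is precisely the kind of justification a tutor should be uncomfortable giving.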
No university wants bad press, and this factor alone could confine Learning Analytics to research and closed-door planning. That would be a shame because, in our experience, Learning Analytics done well can become a truly democratic tool, offering students a genuine route to successful change.
The sector has been slow to adopt analytics more broadly, but learning analytics offer real potential to help prevent students from dropping out. Following the Facebook data scandal, however, ethical concerns must remain paramount.