Enhancing the curriculum with learning analytics

Following the Data Matters conference, Paul Bailey from Jisc looks at the potential for learning analytics to improve teaching practice and the student experience.
Paul Bailey is a Senior Co-Design Manager at Jisc.

Learning activity data is playing a growing role in education, with providers adopting learning analytics systems to help improve the retention and attainment of students.

While services like the one recently launched by Jisc in nearly 30 UK institutions can help providers to achieve their strategic goals, there are also opportunities to use this data to benefit academic teaching. The datasets Jisc uses for learning analytics can provide a basis to improve the student learning experience, by helping to adapt teaching and learning approaches in light of this new information.

Jisc is currently exploring how curriculum analytics can be used to support academics to improve teaching and curriculum design. Academic staff are often time-poor and most are not data scientists, but a data-informed approach can help staff to understand the impact of a particular approach, and adapt the future design of modules and courses to maximise student engagement.

Work by the Open University finds that it is possible to capture the digital traces of the learning activities of students and academics in virtual learning environments, offering educators potentially valuable insights into how students react to different learning designs.

Teachers can also use analytics in real time to monitor how students are engaging, gathering micro-feedback that reveals what learners are struggling with and gives academics insight into how to respond most effectively.

Real life

Of course, tutors already receive real-time feedback from learners every time they ask in person, “How is it going?”. The difference with learning analytics is that it automatically captures how students are actually behaving, then analyses and presents that data in a way that helps academics – and students – to judge how effective their approach is.

Learning analytics services currently provide lots of course and module-level data that would be useful for curriculum analytics, including descriptive data about attendance, use of resources and engagement online.
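As a rough illustration of what working with that data might look like, the sketch below summarises a hypothetical weekly export of attendance and online activity for a module. The column names, figures and thresholds are assumptions for the example; Jisc's actual data model is not described here.

```python
# Illustrative sketch only: the columns and values below are hypothetical,
# standing in for the kind of module-level export an analytics service provides.
import pandas as pd

# Hypothetical activity export: one row per student per week.
activity = pd.DataFrame({
    "student_id":  ["s1", "s1", "s2", "s2", "s3", "s3"],
    "week":        [1, 2, 1, 2, 1, 2],
    "attended":    [1, 1, 1, 0, 0, 0],
    "vle_minutes": [45, 60, 20, 5, 0, 0],
})

# A per-week engagement summary a module leader might review.
summary = activity.groupby("week").agg(
    attendance_rate=("attended", "mean"),
    median_vle_minutes=("vle_minutes", "median"),
)
print(summary)
```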

This data-informed approach tends to work best in blended learning. The digital footprint of technology-enhanced learning activities makes it much easier to measure the success of teaching than with traditional analogue activities.

Jisc is now exploring how popular voice interfaces and learning analytics data can be brought together to make it easier for staff to engage with the data. Instead of poring over spreadsheets, a teacher could simply ask an intelligent assistant, such as Amazon’s Alexa, which students are struggling, and it would provide an answer.
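To make the idea concrete, here is a minimal sketch of the kind of back-end lookup a voice assistant's intent handler might call, assuming a pre-computed engagement score per student. The function, scores and threshold are all hypothetical, not Jisc's actual integration.

```python
# Illustrative sketch only: the article doesn't describe the real voice
# integration, so this shows just the back-end lookup an Alexa-style intent
# handler might call. All names, scores and thresholds are hypothetical.

def students_struggling(engagement_scores: dict[str, float],
                        threshold: float = 0.4) -> str:
    """Turn per-student engagement scores (0-1) into a spoken-style answer."""
    flagged = [name for name, score in engagement_scores.items()
               if score < threshold]
    if not flagged:
        return "No students are currently flagged as struggling."
    return ("The following students show low engagement and may be "
            "struggling: " + ", ".join(sorted(flagged)) + ".")

# An intent handler would pass the teacher's question through to something
# like this and speak the returned string.
print(students_struggling({"Aisha": 0.8, "Ben": 0.3, "Chen": 0.35}))
```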

The only way is ethical

However, there are legal and ethical issues to be considered. For example, teachers would be rightly concerned if the very data that they use to inform and assist them was also used to monitor or appraise their performance. There is also the question of training and digital literacy; staff need the right skills in order to ask the right questions of these datasets in teaching and to be able to trust and act on the resulting insights.

Then there is the question of how much trust we put in the analytics. In my workshop on curriculum analytics at last week’s QAA/HESA/Jisc Data Matters event, we discussed the merits of qualitative and quantitative data for judging interventions. One participant felt that, when it came to getting feedback on teaching, the best approach was to ask the students. “Why look at data first,” he asked, “when you can just talk to your students?”

In light of that question I was interested to hear Harrods’ customer insights director, David Boyle, comment in his keynote for Data Matters that people working in combination with computers can produce better results than either a person or a computer alone.

What Boyle meant is that data doesn’t always provide context. For example, when a computer flags low attendance, it’s important not to make assumptions based solely on that set of data. It could have been snowing during a week of low attendance, or an assignment deadline may have been looming on another module, or the lecture may have been on a topic not appearing in that year’s exam. Scenarios such as these are why it is always important to follow up “red flags” with questions to teaching staff, rather than simply reacting to the data.
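One way to build that principle in is to have the analytics produce prompts for human follow-up rather than automatic interventions. The sketch below flags low-attendance weeks and routes them to teaching staff with a reminder to check the context first; the threshold and wording are illustrative assumptions.

```python
# Illustrative sketch only: a red-flag rule that surfaces low attendance for
# human follow-up rather than triggering an automatic intervention.
# The threshold and prompt wording are hypothetical.

WEEKLY_ATTENDANCE_THRESHOLD = 0.6

def review_attendance(weekly_rates: dict[int, float]) -> list[str]:
    """Return follow-up prompts for weeks with unusually low attendance."""
    prompts = []
    for week, rate in sorted(weekly_rates.items()):
        if rate < WEEKLY_ATTENDANCE_THRESHOLD:
            prompts.append(
                f"Week {week}: attendance {rate:.0%}. Check with teaching "
                "staff for context (weather, deadlines on other modules, "
                "topic relevance) before acting."
            )
    return prompts

for prompt in review_attendance({1: 0.85, 2: 0.45, 3: 0.9}):
    print(prompt)
```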

Whichever way round you choose to do things, analytics can help us make our decisions faster and more effectively. But ultimately, it’s people who make the decisions.