After all the political drama of the past few weeks, it might be easy for the higher education sector to forget that we are deep in discussions about the delivery of the TEF. With the technical consultation for year 2 now closed, many wonks will have devoted a significant amount of thought to matters of metrics and logistics, and now BIS will have the job of synthesising the many and varied responses.
Jisc and many others leading on policy and decision-making believe that learning analytics will have a significant role in the development of the TEF, and BIS has already cited it on its indicative list of evidence. Learning analytics can, first, provide a robust source of evidence about excellence in the learning environment, and second, help deliver fairer and more equitable learning outcomes.
The first point is easily explained, and may already be familiar to Wonkhe readers: learning analytics involves mining datasets related to student activity to gain insight into the progress of learners, with the goal of improving student experience and success. Analytics can therefore provide evidence of excellent practice for the TEF.
But learning analytics can also help focus efforts on the success of non-traditional or disadvantaged students. A shift towards more data-driven teaching practice and intervention might help close gaps in attainment, retention, and employment among particular groups. To take one example, Jisc’s analytics service, currently in development, will identify ‘at risk’ students to personal tutors so that they can manage interventions promptly. Tutors will also have access to visual dashboards where they can view wider student engagement, make cohort comparisons, and benchmark students appropriately, helping institutions tailor and market their services to future learners from different backgrounds.
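For readers curious what an 'at risk' flag might look like in practice, the sketch below shows one very simple approach: comparing each student's engagement score against a cohort benchmark. The field names, threshold, and method are purely illustrative assumptions for this post, not a description of Jisc's actual model, which draws on far richer data.

```python
# Illustrative sketch only: flag students whose engagement falls well
# below the cohort average. All names and thresholds are hypothetical,
# not Jisc's actual analytics model.
from statistics import mean

def flag_at_risk(engagement, threshold=0.6):
    """Return student IDs whose engagement score is below
    `threshold` times the cohort mean."""
    cohort_mean = mean(engagement.values())
    cutoff = threshold * cohort_mean
    return sorted(sid for sid, score in engagement.items() if score < cutoff)

# Hypothetical weekly VLE activity scores for a small cohort
cohort = {"s001": 42, "s002": 35, "s003": 9, "s004": 50, "s005": 12}
print(flag_at_risk(cohort))  # → ['s003', 's005']
```

A real service would of course combine many signals (attendance, submissions, library use) and surface the results through a tutor dashboard rather than a script, but the underlying idea is the same: a transparent, reproducible rule for directing attention where it is needed.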
Over the next few months, Jisc will work with higher education providers using our learning analytics service to pilot candidate ‘TEF-ready metrics’ that relate to the above areas. In the first instance, these could be volunteered by institutions in their first TEF submissions, used perhaps as evidence for TEF commendations. Once the robustness of these new metrics is established and more widely recognised, institutions might begin to phase out more traditional forms of teaching assessment, such as pro-forma completions and panel visits.
Although expert peer judgement will, and should, always be a key part of TEF assessments, using ‘TEF-ready metrics’ could significantly reduce the bureaucratic burden, improve efficiency and effectiveness, lower the total cost of running the TEF, and reduce any friction or discontent the process causes.
The choice to use TEF-ready metrics will always rest with the individual institution. Where that option is exercised, universities will have a transparent view of how that data is used within the wider TEF process. We feel this aligns well with the TEF consultation document, which states that “providers will… be expected to judge for themselves how best to make their case using their own choice of indicators of impact and effectiveness.”
While other countries and industries may have taken the lead in using analytics, UK higher education is catching up fast. Jisc’s national pilot programme already has 85 expressions of interest from higher education institutions, and we are still a year away from its national launch. Significant efforts have been made to support ethical use, and as the benefits of analytics become more widely known, I am confident it will become a bedrock for understanding teaching excellence.