Looking back through the Student Travel Window

The Student Travel Window should have seen mass LFD testing of students, but what actually happened?

David Kernohan is Deputy Editor of Wonkhe

Students hoping to travel away from their term-time home for the Christmas break were asked to do so between 3 December and 7 December. In support of this, mass testing – via the then-new Lateral Flow Device (LFD) Covid-19 tests – was to be used to reduce the risk of transmission.

A glance at the Wonkhe data dashboard suggests that cases were rising nationally during this period, but the number of cases identified in student areas (at Middle Layer Super Output Area, or MSOA, resolution) was not a significant part of this. LFD test results are not shown in the figures I plot over there, but the rubric was that a positive LFD test led to self-isolation and a confirmatory (and more reliable) Polymerase Chain Reaction (PCR) test – it is these PCR results that feed the main case totals.

So there are two possible explanations – either mass testing of students using LFD tests found very few Covid-19 cases, or the mass testing wasn’t really as “mass” as we might have thought.

Public Health England releases daily data, by lower tier local authority district, on the number of LFD tests conducted. Here I’ve plotted the total number of new tests between 3 December and 7 December against the number of students estimated to be living in each area (2018-19 data, via Jisc Tailored Data from the HESA return).

[Chart: new LFD tests, 3–7 December, against estimated student population, by lower tier local authority district]
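For the curious, the underlying wrangling is simple enough to sketch. Below is a minimal Python (pandas and matplotlib) version of the join and scatter plot described above – note that the file names and column names are hypothetical placeholders, not the actual schema of the PHE extract or the Jisc Tailored Data.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file and column names - adjust to whatever schema your
# extracts of the PHE and Jisc/HESA data actually use.
lfd = pd.read_csv("phe_lfd_tests_by_ltla.csv", parse_dates=["date"])
students = pd.read_csv("hesa_students_by_ltla.csv")  # 2018-19 term-time estimates

# Total new LFD tests per lower tier local authority over the travel window.
window = lfd[(lfd["date"] >= "2020-12-03") & (lfd["date"] <= "2020-12-07")]
totals = window.groupby("ltla_code", as_index=False)["new_lfd_tests"].sum()

# Join on the local authority code and plot tests against estimated students.
merged = totals.merge(students, on="ltla_code")
merged.plot.scatter(x="estimated_students", y="new_lfd_tests")
plt.xlabel("Estimated students living in area (2018-19)")
plt.ylabel("New LFD tests, 3-7 December")
plt.title("LFD tests vs student population by lower tier local authority")
plt.show()
```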

Almost nowhere with appreciable numbers of students can claim to have tested even a majority of them during this period – the environs of Liverpool (for example, the Wirral) are the only exceptions, and the trial of mass public testing in Liverpool will have had an impact there. Not all of these LFD tests would have been linked to university students – care homes, for instance, were also being tested with LFDs at the time.

To be clear, the DfE position was:

Tests will be offered to as many students as possible before they travel home for Christmas, with universities in areas of high prevalence prioritised.

We also understand that the availability of other testing was a factor, as was the proportion of high-risk students at each provider. Students, of course, had to volunteer to take these tests before leaving – there was no requirement for each provider to test its entire student body. It appears that the tests were not popular, possibly for the reasons we discussed at the time. Remember also that providers had to volunteer to administer these tests, and to cover the associated costs – doing, by all accounts, an amazing job.

There have been widely reported concerns about the accuracy of these tests outside clinical environments – it does appear that accuracy improves with staff experience and training. That said, data from the University of Liverpool suggests that only 59 per cent of tests were accurate – better than tossing a coin, but only slightly.
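To put “better than tossing a coin, but only slightly” into rough numbers, here is a back-of-the-envelope sketch. It assumes, purely for illustration, that the 59 per cent figure can be read as the chance that an infected student returns a positive LFD result; the cohort size is hypothetical, and the real Liverpool evaluation is more nuanced than this.

```python
# Illustrative only - not a claim about the Liverpool data. Treat the
# 59 per cent figure as the chance an infected student tests positive,
# and compare an LFD screen with a coin toss for a hypothetical cohort.
test_hit_rate = 0.59   # assumed, per the figure quoted above
coin_hit_rate = 0.50
infected = 1_000       # hypothetical number of truly infected students

missed_by_test = round(infected * (1 - test_hit_rate))
missed_by_coin = round(infected * (1 - coin_hit_rate))

print(f"LFD screen misses ~{missed_by_test} of {infected} infected students")
print(f"A coin toss would miss ~{missed_by_coin}")
# ~410 missed versus ~500 - better than the coin, but only slightly.
```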

In essence, we gave a small number of students a not-particularly-useful test and then sent them all home within the same five-day window. It was only the low case numbers among students at that particular point in the cycle (for which we can thank the majority of students who followed guidance and self-isolated when required) that meant we avoided yet another national public health disaster.
