Why is lateral flow testing in universities shrouded in secrecy?
Jim is an Associate Editor (SUs) at Wonkhe
That’s the conclusion of a British Medical Journal (BMJ) investigation published today that raises real questions about the cost and efficacy of the various lateral flow device testing schemes that have been running since the “driving home for Christmas” exercise began back in late November.
We’ve written quite a bit about the higher education offshoot of Dom Cummings’ Moonshot over the weeks, but one of the issues we’ve had has been getting hold of information or data about the exercise – and it looks like we’re not alone.
The BMJ has had to resort to Freedom of Information requests to get data, and of the 69 institutions that disclosed three months’ worth, it found that just 1,649 positive results were reported from 335,383 tests carried out – a rate of 0.5%.
That reluctance to share information about the costs of the scheme and its effect on containing the virus is worrying. The BMJ says that more than three-quarters of institutions refused to disclose how much money they’d had from the government to set up testing (with some citing confidentiality agreements with the DHSC) – and when you couple that with the lack of published data on efficacy or participation by students, you end up with Allyson Pollock, professor of public health at Newcastle University, arguing that mass testing in the sector is “haphazard, fragmented, disjointed, and absolutely the antithesis of public health”.
Just 16 institutions disclosed complete data on their funding, the number of tests carried out, and the number of positive results. These showed that the government spent roughly £3,000 per positive test result yielded. But the BMJ points out that experts said that this was likely to be a vast underestimate of the full cost – because it did not take into account factors such as the staffing of testing sites. Bizarrely, of the 111 institutions that gave details on data collection, 60 per cent said they were not collecting data on the number of people being tested (as opposed to the number of tests), and nearly a third were not recording the number of positive tests.
This evaluation of LFD testing in Wales said the average cost of community testing per test was £20 – and if you apply that to the latest figures for LF testing in England, you end up with a cost of £20,000 per positive. And even then we have no idea how many of those are actual positives – because the Westminster government dropped getting a PCR test to confirm the less reliable LFD test back in February, something even the BMJ hasn’t spotted.
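The cost-per-positive arithmetic here is simple enough to check on the back of an envelope: cost per test multiplied by tests carried out, divided by positives found. A minimal sketch, reusing the BMJ’s FOI figures from above and the Welsh evaluation’s £20-per-test estimate (the article’s £20,000 figure applies the same sum to later English data with a lower positivity rate):

```python
# Back-of-envelope cost per positive = (cost per test x tests) / positives.
# Figures: £20/test from the Welsh community testing evaluation;
# tests and positives from the BMJ's FOI data covering 69 institutions.
cost_per_test = 20          # pounds per LFD test (Welsh estimate)
tests = 335_383             # tests disclosed via FOI
positives = 1_649           # positive results reported

positivity = positives / tests
cost_per_positive = cost_per_test * tests / positives

print(f"positivity rate: {positivity:.1%}")
print(f"cost per positive: £{cost_per_positive:,.0f}")
```

On the FOI data this comes out at roughly £4,000 per positive; at the much lower positivity rates seen in the latest English figures, the same sum produces numbers of the order the article quotes.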
Angela Raffle, a consultant in public health quoted in the investigation, adds:
“The fact that they’re not routinely collecting data by person rather than by test, that you need an FOI in order to get any oversight of what’s happening, that they’re not all contributing data into a central, scientifically sound means of knowing what the outcomes are from doing this testing—all this says to me is that the whole thing is a desperate exercise in trying to get favourable publicity for Number 10, trying to get rid of the Innova [rapid flow] test mountain, and trying to change the culture in this country so that we start to think that regular tests for everybody is a worthwhile use of public resources, which it isn’t.”
And then of course there are participation rates. Everyone in the sector says that many more thousands of students are “back” on campus than were supposed to be. But a couple of weeks ago in England, just 100,000 LF tests were carried out. However you look at it, that’s a tiny percentage of students getting tested twice weekly.
Some would argue that any cases picked up make the whole exercise worth it. But the capacity and time consumed – both in government and in universities – are vast. As Pollock says:
“We don’t know what percentage of those cases would have been picked up anyway, what percentage of those cases are actually false positives, and how many are missed. What universities should do if they are serious about public health is focus on getting the message out to symptomatic students about the importance of isolating and testing, drop mass testing, and insist on a proper evaluation.”
There’s no doubt that getting the scheme up and running at pace was a major achievement by UK HE. But the sector has been let down here by poor governance, transparency and reporting – and without those, the whole thing is starting to look like a huge waste of time and money when both are in short supply.