The first set of statistics on lateral flow device (LFD) testing conducted in education settings in England has been published by the government.
Prof James Hargreaves, Professor of Epidemiology and Evaluation, London School of Hygiene and Tropical Medicine, said:
“These initial data show encouraging early evidence of the scale up of the lateral flow testing programme in educational institutions. Over 2,500 tests have identified evidence of current infection through the programme and those receiving positive test results will have been advised not to attend school, thus reducing the risk of in-school transmission. This is to be welcomed.
“However, in the coming weeks it will be critical to track a number of other aspects of the programme. First, the coverage of the programme – the percentage of those attending schools who participate in the testing programme – as high coverage will be essential to minimise the risk of in-school transmission. Second, it will be important to understand the impact of any false negative tests on in-school transmission risk. Third, it will be critical to track the impact of the programme on school attendance and identify ways to ensure education access for those who test positive and their contacts. Finally, the burden on schools and teachers of implementing the programme successfully must also be tracked, and the most feasible successful approaches prioritised.”
Prof Jon Deeks, Professor of Biostatistics and head of the Biostatistics, Evidence Synthesis and Test Evaluation Research Group, University of Birmingham, said:
“The number of positive LFTs in secondary school children for the period from Thursday 4th March to Wednesday 10th March, during which most secondary schools were undertaking their rounds of supervised lateral flow testing, has just been reported. During that period 2,762,775 tests on secondary school children (excluding colleges and sixth forms) were reported to have produced 1,324 positives. This is 0.05%, or 1 in 2,087 pupils testing positive. Figures were similar overall (including sixth form colleges and staff): 2,039 positives out of 3,725,655, i.e. 0.06% or 1 in 1,827 testing positive.
“These figures are much lower than those that the Government has been using to plan this policy. Based on this Anthony Browne MP article (https://www.conservativehome.com/platform/2021/03/anthony-browne-the-governments-covid-testing-policy-for-schools-seems-strange-but-rests-on-good-science.html), the Government assumed that the prevalence would be 0.5%, that the test would detect 50.1% of cases, and that only 0.03% of those without Covid would get false positive results. These figures predict 7,745 cases where 1,324 were observed, and 10,445 where 2,039 were observed. Clearly these Government predictions are seriously wrong.
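The prediction Prof Deeks refers to can be reproduced with a short calculation. The sketch below uses the planning figures quoted in the article (prevalence 0.5%, sensitivity 50.1%, false-positive rate 0.03% — 0.5% being the prevalence that reproduces the quoted totals of 7,745 and 10,445):

```python
def expected_positives(n_tests, prevalence, sensitivity, fp_rate):
    """Expected LFT positives: true positives from infected test-takers
    plus false positives from uninfected test-takers."""
    true_pos = n_tests * prevalence * sensitivity
    false_pos = n_tests * (1 - prevalence) * fp_rate
    return true_pos + false_pos

# Secondary-school pupils only (4-10 March)
print(round(expected_positives(2_762_775, 0.005, 0.501, 0.0003)))  # 7745 predicted vs 1,324 observed
# Including sixth form colleges and staff
print(round(expected_positives(3_725_655, 0.005, 0.501, 0.0003)))  # 10445 predicted vs 2,039 observed
```

The gap between predicted and observed positives is what drives the rest of the argument: either the assumed prevalence or the assumed sensitivity (or both) must be substantially wrong.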
“This result is not a surprise, as the test has been reporting similarly low rates in school children for the past weeks, evidence that appears to have been ignored. The likely explanations are that the ability of the test to detect infection is compromised in children, or that many fewer children have asymptomatic infection than was predicted. It is essential that we find out which.
“Testing in school children has been introduced without evaluation. Although ‘pilots’ of testing in schools have been undertaken, opportunities to evaluate the accuracy of the test in children have been missed. Thus we have no data on the accuracy of the test in school-aged children, and even the findings of the University testing of students have not been released by the DHSC. Now that we see the test performing in unexpected ways, it is essential that proper evaluations are undertaken and that all data and reports are made publicly available.
“Tests which perform poorly can do harm. False positives will make up a higher proportion of those testing positive, who are wrongly isolated together with their bubble and families, exacerbated by the policy of not confirming positive cases with PCR. Misplaced confidence that testing is helping prevent transmission may discourage people from getting tested when symptomatic and from continuing with infection control behaviours. The ongoing failure during the pandemic to plan and evaluate tests in each setting where they are intended to be used is putting public health at risk.
“Mass testing for rare diseases may be a poor use of limited resources. Should the cost of each test, including staff time, be between £10 and £20, then these figures suggest that we are spending between £20K and £40K for each positive result. As many of these will be false positives, it seems likely that the cost per true case detected may well be above £50K. It is relevant to question whether these funds could be better spent in other ways to detect more cases and make a greater impact on the pandemic.”
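The cost arithmetic here is a rough illustration rather than a costing exercise. Under the per-test cost range assumed in the quote (£10–£20) and the reported positivity of 1,324 positives in 2,762,775 tests, it can be sketched as:

```python
def cost_per_positive(cost_per_test, positivity_rate):
    # Each positive result "costs" all the tests taken to find it,
    # i.e. roughly 1/positivity_rate tests per positive.
    return cost_per_test / positivity_rate

positivity = 1324 / 2_762_775              # observed rate, about 1 in 2,087
low = cost_per_positive(10, positivity)    # roughly £21K per positive
high = cost_per_positive(20, positivity)   # roughly £42K per positive
```

Because the exact positivity is 1 in 2,087 rather than 1 in 2,000, the computed range sits slightly above the quoted £20K–£40K; the further step to "above £50K per true case" then follows from assuming that a sizeable fraction of the positives are false.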
Dr Alexander Edwards, Associate Professor in Biomedical Technology, Reading School of Pharmacy, University of Reading, said:
“This shows the scale at which rapid asymptomatic testing is being rolled out, but doesn’t yet provide much information about test performance or highlight any benefits/disadvantages of the scheme. Further analysis of more detailed data is required, and I look forward to seeing these as they are published.
“There doesn’t seem to be enough information provided to interpret performance, and indeed they have a big disclaimer stating that the data presented can’t be interpreted. For example, they include the total number of tests completed, but of course lots of people are tested frequently, so this doesn’t tell us how many people were tested. This is what they mean about the data not being “de-duplicated”.”
Prof Sheila Bird, Formerly Programme Leader, MRC Biostatistics Unit, University of Cambridge, said:
“The emerging data from INNOVA Lateral Flow Test (LFT) thrice screening in secondary schools are consistent with test sensitivity (how often a test correctly gives a positive result for someone infected) for asymptomatic secondary pupils being less than 40%. This assumes that the specificity of the tests (how often a negative result is correctly given for someone uninfected) is genuinely as high as 9,997 per 10,000 (ie only 3 false LFT-positives per 10,000 uninfected pupils) and that the prevalence of asymptomatic infection is very low: 1 per 1,000.
“As forewarned by the Royal Statistical Society, it is likely that at least half of secondary pupils’ LFT-positives would be PCR-negative if they had had the confirmatory PCR-test. Not allowing secondary pupils and their families to check their LFT results with a PCR test is outrageous.
“Prudent parents and schools have arranged PCR tests to confirm LFT results. I would like to see NHS Test & Trace urgently match these PCR results obtained by parents and schools to today’s reported LFT-positives for secondary pupils who returned to school on 8 March 2021. Please disclose urgently the number of matched PCR-tests and the percentage that were PCR-negative.
“DfE has led us into a statistical cause célèbre. Our children deserve better. Even now, re-instate PCR-adjudications. And brush up on statistical thinking.
“But hats off to head-teachers for how well secondary schools have managed the LFT thrice screening of their pupils who returned to school on 8 March 2021 after winter lock-down.”
Last week, on 10 March, NHS Test & Trace/PHE released an analysis which reported that the specificity of the INNOVA Lateral Flow Test for screening asymptomatic secondary pupils was at least 99.97%; that figure is hereafter taken at face value. But please see earlier critiques by Bird & Deeks.
DfE/PHE have failed to disclose their prior planning assumptions on:
Today, NHS Test & Trace acknowledged, in effect, that England does not know how many of the 2,762,775 INNOVA screening tests taken by secondary pupils (excluding those in colleges and 16-19 schools) in the week of 4 to 10 March were first LFTs on return to school. Difficulties included: that some LFT-screening was already in place for secondary-age children of key workers; and that some well-organised schools began their LFT-monitoring of secondary pupils in the week preceding 8 March. Both difficulties should have been anticipated by designing an efficient, minimal data collection from secondary schools to monitor the uptake of DfE’s monitored-thrice-LFT policy.
Let’s subtract secondary pupils’ LFTs in the week of 25 February to 3 March from those reported for 4 to 10 March, to account for testing of children who were not newly returned to school. We are left with 2,442,721 negative LFTs, 1,143 LFT-positives and 3,119 LFT-voids.
On 5 March 2021, the Royal Statistical Society’s COVID-19 Taskforce issued a statement on schools, see https://rss.org.uk/RSS/media/File-library/News/2021/RSS-statement-on-surveillance-in-schools-5-March-2021.pdf, in which we warned that half the asymptomatic secondary pupils’ LFT-positives could be false positives. Consistent with ONS Infection Survey for secondary-age pupils (4 per 1000 infected), we assumed that 2 per 1000 would be asymptomatic infections.
Revisiting RSS’s illustration, still with prevalence of 2 per 1000 asymptomatic infections, we’d have:
2 per 1000 of 2,199,642 asymptomatic pupils are infected: 4400 infected (rounded up)
Different options for INNOVA-sensitivity for asymptomatic infections in secondary pupils
Option 1: 10% => 440 LFT-positives who are truly infected;
Option 2: 20% => 880 LFT-positives who are truly infected;
Option 3: 40% => 1760 LFT-positives who are truly infected.
Options 2 & 3 are inconsistent with secondary schools’ data. Option 1 for sensitivity seems too low even for INNOVA LFT . . .
Perhaps the prevalence of asymptomatic infections was lower still by 4 March, say 1 per 1000 asymptomatic infections? Then, we’d have:
1 per 1000 of 2,199,642 asymptomatic pupils are infected: 2200 infected (rounded up)
Different options for INNOVA-sensitivity for asymptomatic infections in secondary pupils
Option 1: 10% => 220 LFT-positives who are truly infected;
Option 2: 20% => 440 LFT-positives who are truly infected;
Option 3: 40% => 880 LFT-positives who are truly infected.
Let’s now apply NHS T&T’s version of specificity, according to which we expect only 3 per 10,000 LFT-positives among pupils who are uninfected. Then we’d expect 3 per 10,000 of 2,197,442 uninfected asymptomatic pupils to test LFT-positive, namely 659 LFT-false-positives (QED for Option 2: 440 true positives plus 659 false positives gives about 1,099 expected LFT-positives, close to the 1,143 observed).
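Prof Bird's back-of-envelope check can be sketched as a short calculation. The figures below are those used in the text: roughly 2,199,642 asymptomatic pupils screened, specificity 99.97% (3 false positives per 10,000 uninfected, taken at face value from NHS Test & Trace), and 1,143 observed LFT-positives after subtracting the preceding week:

```python
PUPILS = 2_199_642
FP_RATE = 3 / 10_000  # NHS T&T specificity figure, taken at face value

def expected_lft_positives(prevalence, sensitivity):
    """Expected positives = true positives among the infected
    plus false positives among the uninfected."""
    infected = PUPILS * prevalence
    true_pos = infected * sensitivity
    false_pos = (PUPILS - infected) * FP_RATE
    return true_pos + false_pos

observed = 1143  # LFT-positives, preceding week subtracted

# Prevalence 1 per 1,000, sensitivity 20% (Option 2):
print(round(expected_lft_positives(0.001, 0.20)))  # 1099, close to 1,143
# Prevalence 2 per 1,000, sensitivity 40% (Option 3):
print(round(expected_lft_positives(0.002, 0.40)))  # 2418, well above 1,143
```

This is the sense in which the data are "consistent with sensitivity below 40%": only the lower-sensitivity/lower-prevalence combinations produce expected totals near the observed 1,143.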
Prof Gary McLean, Professor in Molecular Immunology, London Metropolitan University, said:
“I think this is a very good sign for several reasons. Firstly, the numbers identified as positive are low. The proportion of positive tests is also very low and well under the estimated prevalence of infections currently. And finally, the lateral flow tests have identified 2588 positives that may not have been identified otherwise. It is difficult to know if any are false positives until confirmatory PCR is reported.
“Undoubtedly the lateral flow tests will miss cases as the window for this test to work is more limited than PCR and the detection mechanism has lower sensitivity. In addition self-administering tests will inevitably result in some not being performed optimally.
“Nevertheless, it is a very good sign that cases are low, even if the figure is probably an under-estimate.”
Dr Simon Clarke, Associate Professor in Cellular Microbiology at the University of Reading, said:
“Lateral flow tests are not perfect, and there are risks if people change their behaviour, mistaking a negative test result for ‘having the all-clear’, which it is not. Lateral flow tests are not as sensitive as ‘gold standard’ PCR tests, and sometimes do not pick up cases, particularly in people with few or no symptoms.
“However, as a way to provide early warning signs of outbreaks, or to survey large populations as you have in a school, they can be very useful. While only 2,500 positive cases in more than 4 million tests may not sound like much, and may well have missed cases or provided some false positives, the mass testing of populations could provide a vital part of our route out of lockdown. This needs to be part of an integrated policy of testing, tracing contacts of positive cases, and isolating those at risk of being infected, to prevent further spread.
“We need as many early warnings as possible to allow schools to stay open, or efforts to restart other parts of society and the economy will inevitably have to be put back.”
Prof Jon Deeks: “Jon Deeks is Professor of Biostatistics at the University of Birmingham and fully funded by the University of Birmingham. He leads the international Cochrane COVID-19 Diagnostic Test Accuracy Reviews team summarising the evidence of the accuracy of tests for Covid-19; he is a member of the Royal Statistical Society (RSS) Covid-19 taskforce steering group, and co-chair of the RSS Diagnostic Test Advisory Group; he is a consultant adviser to the WHO Essential Diagnostic List; and he receives payment from the BMJ as their Chief Statistical advisor.”
Prof Sheila Bird: “SMB chaired the Royal Statistical Society’s Working Party on Performance Monitoring in the Public Services, serves on the RSS’s COVID-19 Taskforce and chairs the RSS/DHSC Panel on Test and Trace. Since mid-January, SMB also serves on the Testing Initiatives Evaluation Board.”
Prof Gary McLean: No conflict of interests to declare.
None others received.