Animal studies are a vital part of scientific endeavour, particularly in the run-up to clinical trials, but it is essential that the information gleaned from them is robust and that animals are not put through unnecessary procedures. Researchers have conducted a systematic review of animal studies, assessing them for their statistical robustness. The research, published in PLoS Biology, looks at issues such as randomisation and blinding, which increase rigour and reduce the risk of bias. These Roundup comments accompanied a briefing.
Prof. Paul Flecknell, Director of Research Animal Facilities, Newcastle University, said:
“I was aware of the work from conversations with Malcolm and from hearing some of his seminars. I have read the manuscript, and tend to agree with pretty much all that is said. What I did not pick up was a very full discussion of whether the lack of reporting in journals represents a lack of application of the factor in study design or a lack of detail in the materials and methods section. In other words, were the studies randomised, blinded etc. but this was not reported, or were these elements not reported because they were lacking? I think the suggestion that the elements were lacking may be at least partly correct, based on other studies showing that greater effect sizes are reported in papers that lack detail of study design. Similar issues have been raised in the past about other areas of research, notably clinical medical research and especially clinical trials, and major efforts have been made to address them. As pointed out in the paper, adoption and, most importantly, observance of the ARRIVE guidelines would be a major step forward. I review for a number of journals, including seeing all the papers that involve research animals for one medical journal. Although this journal is signed up to ARRIVE, many papers do not report all of the items that are required (e.g. randomisation, blinding, power calculations), let alone reporting on the animals used, their housing, the experimental methods etc. in appropriate detail. When requested, this information is often provided by the authors. This makes me think that the recommendations in the paper, namely better education of all scientists in study design and pressure on journals to require information showing that studies have been properly conducted, are to be supported (and they no doubt apply to all biomedical research, not simply in vivo studies).”
Dr Vicky Robinson, Chief Executive, National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs), said:
“Though sobering, the findings of this paper are not a surprise, as they add to the existing body of evidence on the need for more rigorous assessments of the experimental design and methodology used in animal research. This is another wake-up call for the scientific community. There is no excuse or justification for using animals in studies that are poorly designed and can never be reproduced. Not only is this a waste of animals but it also has serious implications for the translation of preclinical data to the clinic. Delivering sustained improvements is a major challenge and the NC3Rs is leading efforts nationally and internationally to address this, including the launch last week of an interactive web resource to help scientists use the minimum number of animals and avoid bias in their studies.”
Prof. Marcus Munafò, Professor of Biological Psychology, University of Bristol, said:
“This is an important study which highlights the need for animal research to be conducted in a way that minimises the risk of bias, and for the details to be described thoroughly when the research is published. While the present study focused on animal studies, it is likely that similar issues exist for other kinds of research as well. The fact that measures intended to reduce bias are often not reported does not necessarily mean that they were not in place when the experiment was conducted. However, if this information is not included in the publication that describes the research, we cannot know. If editors and reviewers were to require authors to explicitly state whether or not those measures were in place, we would be in a better position to accurately judge whether the results of an experiment are at high risk of bias.
“If you need a reference for the point about these issues extending to other kinds of research you could use this: Carp J. (2012). The secret lives of experiments: methods reporting in the fMRI literature. Neuroimage; 63(1): 289-300.”
Prof. Chris Chambers, Professor of Cognitive Neuroscience, Cardiff University, said:
“This is a very important study but it is equally important that news outlets don’t twist the outcomes to undermine animal research. UK animal research is vital across the full spectrum of biomedicine, from understanding the basics of the body and brain through to treating diseases. What these results show is that animal research can, and must, be conducted to higher standards precisely because it is so important.
“Steps are already underway to improve transparency and reproducibility in animal research and beyond. New initiatives have been launched by journals and funders to promote study preregistration, data sharing, and disclosure of methods. For example, an increasing number of journals are now reviewing research proposals before the studies are conducted and then agreeing to publish the outcomes, regardless of whether the results are exciting or boring. This new type of scientific article is called a Registered Report and prevents several forms of bias that Macleod and his team identify.
“One aspect of this study particularly troubles me. I find it disturbing that the Higher Education Funding Council for England (HEFCE) was able to prevent the researchers from accessing the details of publications submitted as part of the 2014 Research Excellence Framework. To be clear, the authors were requesting, for the purposes of research, the names of scientific articles that already exist in the public domain and which were funded by UK taxpayers. Denying the release of this data to the researchers is a failure of transparency and accountability, and is clearly against the public interest. HEFCE must be called on to provide a full and public justification for withholding this important data.”
*14/10/15 Clarification from Prof. Chambers: “The data originally withheld by HEFCE has since been made publicly available.”
Prof. Kevin McConway, Professor of Applied Statistics, The Open University, said:
“The authors make it clear that their conclusions can only be about what is published, and not directly about what researchers actually do. But, make no mistake, even if most researchers are using correct methods to reduce the risk of bias and just not publishing them, that’s a big problem. How are readers of the research papers to judge how far the results can be relied on?
“This paper does generally show that things have been improving over time, in relation to some (but not all) measures to reduce the risk of bias. But there’s still clearly a big gap to bridge.
“In clinical trials involving human patients, these bias-reducing measures have been standard for a long time, and they will almost always be reported. There are many reasons for that, including the ethical review processes that clinical trials must go through before they begin, the existence of international guidelines for reporting trials, and strict publication policies of leading journals. All of these exist for in vivo animal experiments too. Ethical review for such experiments is extremely strict in the UK and most other countries, but perhaps the process has not concentrated on these aspects of experimental design to the extent that is usual in ethical review for clinical trials. Perhaps publication guidelines are not as comprehensive or well established as for clinical trials. And the editorial policies of at least some journals clearly haven’t dealt with some of these issues at all. There’s work to be done on all these fronts.”
‘Risk of bias in reports of in vivo research: a focus for improvement’ by Malcolm Macleod et al., published in PLoS Biology on Tuesday 13 October 2015.
All experts declared no conflicts of interest.