
expert reaction to study examining the reproducibility of cancer research

A study published in the journal eLife has estimated the accuracy of reported effect sizes for a specific anti-cancer drug (sunitinib) in pre-clinical studies. The authors report that those studies overestimated the drug's effect and suggest publication bias as a contributing factor.


Prof. Marcus Munafò, Professor of Biological Psychology, University of Bristol, said:

“This is an important study which highlights the need for animal research to be conducted in a way that minimises the risk of bias, for the details to be described thoroughly when the research is published, and for studies which show little or no effect to still be published so that a full picture of all the research conducted is available. While the present study focuses on animal studies of cancer drugs, it is likely that similar issues exist for other kinds of research as well. The fact that measures intended to reduce bias are often not reported does not necessarily mean that they were not in place when the experiment was conducted. However, if this information is not included in the publication that describes the research we cannot know. If editors and reviewers were to require authors to explicitly state whether or not those measures were in place, we would be in a better position to accurately judge whether the results of an experiment are at high risk of bias.”


Prof. Chris Chambers, Professor of Cognitive Neuroscience, Cardiff University, said:

“Once again we are faced with an area of biomedical research suffering from poor research practices, this time in preclinical cancer studies. Preclinical animal research is vital for producing effective medications, which is why the quality of research in this field is paramount. The report follows a depressingly familiar pattern. First a major problem with reproducibility is identified; then we hear about a major organisation obstructing attempts to better understand the problem – this time the Food and Drug Administration – and finally, we hear a call for reform, in particular more guidelines to stimulate better research practice.”

“The problem is that guidelines alone will never fix these problems. Scientists already know how to do better science; they simply have no reason to do it. We need to radically alter the incentive structure of academia, dumping a system that rewards scientists for doing the minimum necessary to generate publishable results in favour of one that rewards doing everything reasonable to produce reliable science. A major part of this solution is to decide which scientific studies should be published before the results of studies even exist. This initiative, known as Registered Reports (https://osf.io/8mpji/wiki/home/), prevents many forms of research bias and has now been adopted by 20 scientific journals including eLife and Royal Society Open Science. Until transparency and reproducibility are embedded within all stages of science, from funders to journals, we can expect to see more reports of how biomedical research is failing public expectations.”


‘A meta-analysis of threats to valid clinical inference in preclinical research of sunitinib’ by Valerie Henderson et al. was published in eLife on Tuesday 13th October 2015.


Declared interests

Prof. Marcus Munafò and Prof. Chris Chambers declare no interests.
