
expert reaction to Nobel laureate Randy Schekman’s call to ‘break the tyranny’ of top science journals

Randy Schekman, a winner of this year’s Nobel Prize in Physiology or Medicine, has declared a boycott of top science journals, claiming that they distort the scientific process.


Miranda Robertson, Editor of BMC Biology, said:

“Randy Schekman has drawn welcome attention to some of the problems faced by authors because of current practices in scientific publishing and the evaluation of research.

“There are lots of journals out there for academics to publish in, and many of them embrace open access as a publishing model. BioMed Central was the first major journal publisher to release authors from the space constraints of print and to make all research publications accessible without charge on the web. It offers portability of peer review and a diversity of peer-review policies aimed at maintaining the integrity of scientific publication and satisfying the needs of the scientific community, funders and institutions. To answer the increasing need for measures that do not depend on the reputation of the journal or the vagaries of the impact factor, article-level metrics are provided for all research publications in BioMed Central journals.

“We actively encourage open debate between editors, publishers and the scientific community to promote the evolution of better ways of serving the community’s needs.”


Dr Phil Campbell, Editor of Nature, said:

“Nature has worked with the scientific community for over 140 years, and our validation that we are serving their needs comes from our readers, and the authors and peer reviewers who continue to choose to work with us. We select research for publication in Nature on the basis of scientific significance. That in turn may lead to citation impact and media coverage, but Nature editors aren’t driven by those considerations, and couldn’t predict them even if they wished to do so. Nature has been publishing research using essentially the same criteria for decades: original scientific research, of outstanding scientific importance, which reaches a conclusion of interest to an interdisciplinary readership. It is up to the scientific community and evaluators to decide how much importance they want to place on papers that appear in the journal. Nature is the world’s most cited scientific journal, but it also publishes many gems of scientific significance which receive few citations.

“The research community tends towards an over-reliance, when assessing research, on the journal in which it appears, or on that journal’s impact factor. In a survey Nature Publishing Group conducted this year of over 20,000 scientists, the three most important factors in choosing a journal to submit to were: the reputation of the journal; the relevance of the journal content to their discipline; and the journal’s impact factor. My colleagues and I have expressed concerns about over-reliance on impact factors many times over the years, both in the pages of Nature and elsewhere. On nature.com, we share data about individual articles – citations, news and social media mentions – and encourage the use of all metrics in context.

“I don’t think it is helpful to conflate open access with selectivity. Open access is an access model and a business model. Nature’s publisher, Nature Publishing Group, has embraced that business model where it is viable and now has over 60 journals with open access options and, with Frontiers, is now the fifth largest publisher of open access journals.”


Dr Jeremy Farrar, Director of the Wellcome Trust, said:

“Randy Schekman, whose work at eLife we are proud to support, is right that impact factor has a distorting effect on science, and that the research community too often treats the journal brand in which scientists publish as a proxy for quality of science. The Wellcome Trust is committed to avoiding this, and to judging applicants for our support by the actual quality of their past and proposed work, not by the journals where they publish or their impact factors. We acknowledge, however, that we do not always get this right, and that applicants for our support do not always trust us to. We are determined to do better, which is why we helped to found eLife, and why we were among the original signatories to the San Francisco Declaration on Research Assessment. Science needs to develop better ways of assessing the quality of research outputs, and we welcome the very necessary conversation that has begun.”
