
the trouble with conferences

This is a guest post by Fiona Lethbridge, Senior Press Officer at the Science Media Centre.

 

‘E-cigs are just as bad for your heart as smoking fags as they damage key blood vessels, say experts’.

‘Mediterranean diet better for the heart than taking statins, major study suggests’.

‘Red Bull gives you better mental health? Additive in Red Bull ‘EASES’ the symptoms of psychosis’.

‘How a glass of red wine every night could cut risk of diabetes’.

‘Another reason to ditch Diet Coke! Low-calorie sweeteners ‘could trigger deadly diabetes’.

‘Giving babies Calpol could raise risk of teenage asthma’.

These are just some recent headlines from the national news media.  Some even made the front page.  They are also all about very, very preliminary research that hasn’t yet been through the normal checks and balances of science – neither peer reviewed nor published.

They are stories from scientific conferences – international meetings where scientists get together to discuss their work, seek feedback from their peers, and forge collaborations.  Early-career scientists are often encouraged to attend conferences as part of their training and to present their work to senior colleagues from across the world.  They are undoubtedly important and useful events for scientists and for science.

But are they fit for public consumption?  Lots of conferences employ press officers to do PR and get media coverage for the conference.  And journalists are given that rare opportunity to escape their desks for a few days to go and find stories (and, having been allowed to go, must get some).  But the research being presented at conferences has not been through peer review – it hasn’t been checked for errors, nor had its stats scrutinised, nor been given the once-over by an experienced scientist in the field.  (There are some rare exceptions where research is presented at a conference and simultaneously published in a journal, so it has been fully through peer review – those exceptions are not what I’m talking about here.)

We know peer review isn’t perfect of course – bad science does make it through to journal publication, and retractions do still have to be made.  A quick scan through a handful of SMC Roundups will show there are plenty of limitations and caveats, and sometimes serious flaws, in peer-reviewed papers.  But as the saying goes, peer review, like democracy, is the least bad system we have.  It exists for good reason and represents the self-correcting nature of science.  Critically, peer review often has the effect of toning down the authors’ claims.  Conference talks are science that hasn’t yet been corrected – like unmarked homework, something you probably wouldn’t submit for your exam.  And – quite rightly, given what conferences are for – the research often comes from students and junior researchers.  I gave a talk at a conference as a PhD student – I quickly finished making the graphs on the plane over, and I never published.  It’s widely believed that most research presented at conferences never makes it to publication.

The science and health journalists in the UK are good and responsible and will often ask us to gather comments from senior scientists about new research being presented at conferences, which we are of course very happy to do.  But those scientists often say how difficult it is to provide a decent analysis of the work because there is so little information to go on.  Some get exasperated.  Unlike with journal publications, where the full paper (methods, stats analyses, figures and tables etc.) is available, often along with supplementary material giving more detail about how the work was carried out, the conference research will consist of a poster presentation or a short abstract summary, very light on detail.  Often scientists are reluctant to comment for this reason – we have to persuade them that without their comments journalists will only have the short abstract or poster to go on and will have no option other than to write it up uncritically.  What journalists want is a third-party scientist to say whether the research is any good or not and what its implications are – a very fair ask, but scientists are just unable to do this.

Here are just some examples of what scientists have said in SMC Roundup quotes about press-released conference abstracts:

“This research is unpublished, so has not been through peer review, and only a small amount of information is available; for example we don’t know about the quality of the studies included… it’s difficult to draw any conclusions at this stage”;

“It really is a mug’s game trying to make a careful assessment of a study like this, where no full research paper is yet available, and the study has not yet gone through full peer review by other experts in the field.”;

“There is not enough information presented here to be at all sure that the reported findings are conclusive.  We can’t tell whether this research is of a high quality or not, or whether the data is solid”.

This last one, about red wine and fertility, was covered prominently in five national news outlets.

Prof Sir David Spiegelhalter blogged about one example he was particularly upset about – a conference poster linking pyrethroid pesticides to autism.  In the end this one didn’t make the news – it was widely press released, third-party experts were highly critical of it (of both the details that were given and those that were missing), and UK journalists did not write it up.  A good outcome, and more evidence that journalists are responsible and spend a lot of time sorting the wheat from the chaff.  But are conference proceedings even ripe enough to be picked?

Most conferences have a committee of scientists who review abstracts before the conference programme is put together – but this often involves picking abstracts for novelty or quirkiness, and does not represent a review of the quality of the work (which again would be impossible to do given the scant information presented).  Allan Pacey, Professor of Andrology at the University of Sheffield, said, “The role of an abstract review committee for a conference is not the same as the role of an editor of a journal.  In deciding which abstracts should or should not be presented at a conference, or which should be given a poster presentation rather than a podium slot, the rule base is very different.  The abstract review committee is trying to find a blend of abstracts that might fit the conference theme, or be able to be grouped into sessions to make for an interesting conference that people will want to attend.  It may need to give an early career researcher the opportunity to present their early work and get feedback, or there may be a prize session of the best abstract from a particular group (e.g. nurses or PhD students) to encourage them to participate in research.  Abstracts may be submitted from researchers working in emerging fields who need the feedback from the conference delegates to know where to take the work next or get some rapid informal peer review.  All of this has to be balanced with the abstracts that sound the most promising given more prominence in the programme than the ones which sound more routine.  It’s in no-one’s interest to reject abstracts and inhibit people coming to the conference as the collaborations and connections which can be made can lead to something which is much better.  None of this is done with media in mind.”

Research presented at conferences is often partially formed and asks more questions than it can answer.

So, what’s the solution?  How can we help the public navigate how much confidence to place in such early, unfinished work?  In my ideal world, work that is at such a preliminary stage wouldn’t be press released at all and we would all wait until journal publication.  If these abstracts and posters are good, they will go on to be published in journals and will at that point get the media coverage they deserve.  But I realise this system has existed for years, and so unfortunately that seems unlikely to happen.

The next best thing is to encourage press officers and journalists to report conference research responsibly – and to remind scientists and students that their very early work could end up in the news.  At the very least, conferences shouldn’t be churning out simple ‘A causes B’ stories based on data which might never see the light of day in a peer-reviewed journal.  It would be great if it wasn’t just sexy, quirky stories that got picked but rather the more involved research that is of importance to the field and reflects the direction of travel of research in the area.  Good conference press officers (and there are plenty of those) will help ensure their press releases are cautious, knowing they are publicising work which may not stand the test of science.  But we still get plenty of ‘A causes B’ stories from conferences, and that risks misinforming the public, who may make behavioural or lifestyle changes based on what they read.

And good science journalists (and there are plenty of those too) will recognise that posters and abstracts presented at conferences are merely talking points and discussions of work in progress rather than anything the public can rely on to inform their lifestyle choices.  My ideal world needn’t mean an end to media coverage of conferences – if journalists could spend two or three days with scientists at the conference and get to the bottom of a detailed story or new field of research, that would be fascinating.  But the pressures on news journalists make that nearly impossible, and more often than not abstracts are turned into stories without much further investigation, and in many cases without the journalist attending the conference at all.

Recent evidence shows scientists are still held in high regard by the public, with no sign that they’ve had enough of experts – but to maintain these high levels of trust scientists need to think about what the public might do with findings that reach the news headlines.  We already know this matters.  There is evidence for example that following negative media stories about statins, fewer prescriptions than normal were picked up.  I would like scientists to think twice about whether their work is ready to hit the news, and to consider waiting a few months until their findings are published.

The conference problem is one of the reasons we developed the labelling system for press releases – with the ‘not peer-reviewed’ label designed specifically for press releases on conference proceedings.  We developed this system having been asked to in a report by the Academy of Medical Sciences after their survey found that only 37% of the public said they trusted evidence from medical science – we don’t know why it’s so low, but could a contributing factor be confusion around the claim and counter-claim they read in the news (e.g. statins safe then not safe, several butter U-turns, the safety and the danger of HRT…)?  This noise is not just the fault of conferences of course – even published science is iterative and can be contradictory.  A newly published tiny observational study one day might have the opposite finding to a newly published huge meta-analysis of RCTs the following week, and those could be covered with equal prominence.  But at least those studies have been finished, peer-reviewed and accepted for publication by a quality journal.

Some journalists may refuse to cover work that hasn’t been peer reviewed.  But if journalists do cover these stories, ideally it’d be clearly signposted to the public that the science is much more preliminary than they might think.  In reality I’m not sure how much information “presented at a conference” gives to readers – better still would be to state what that means: that the work is unchecked and unpublished.  In the same way as many scientific organisations are agreeing that preprints should not be press released because they are not yet fit for public consumption, similar considerations should be made for conference proceedings, which are at an even earlier stage.

Conferences are vital for the scientific community and could be a great place for journalists to get new leads, to delve deep into a particular field, and to make contacts.  But we should all want science coming out of conferences to be treated more cautiously than peer-reviewed, published research.  If not, we’re doing a disservice to the public struggling to navigate what to eat, drink, and take.  A conference poster can never give that sort of advice.

20 Responses to the trouble with conferences

  1. Carl May says:

    What an excellent article! I couldn’t agree more.

  2. Philippa Saunders says:

    A few comments:
    If the only work presented at conferences was already published (peer reviewed) there would be little justification for the environmental impact of attending!
    The dissemination of information from scientific studies needs to be carefully considered and balanced with better drafting of press releases – whether peer review guarantees a study is without flaw is another debate to be had.
    On the flip side, at a conference last week a presenter emphasised that the information he was presenting was not yet published and that s/he was sharing it so that others could challenge the results if they saw fit. A journalist approached the presenter and said they were going to write them up as a story and would not take ‘no’ for an answer. If the journalist writes a story, this may compromise the ability of the presenter to get funding. All scientists need to explain their work and the media can help them in this task, but premature reporting or overclaiming in press releases are both unfortunate and risk losing public trust in research.

    • This is very true. As scientists we take unpublished data from conferences with a huge pinch of salt. Another issue that makes this material unreliable is that abstracts have to be submitted several months before the conference, so the abstract is often based on very preliminary results. The data presented at the conference will usually be at a more advanced stage, but will rarely be at the state where it is ready for submission to a journal, let alone peer reviewed.

  3. Ron Dixon says:

    good thought-provoking post Fiona

    Some scientists promote their work through conferences to claim the territory so that others will not work in the area! Others promote ‘promising’ results to provide evidence for funding campaigns – both will usually over-inflate the claims for these reasons.

  4. Petroc Sumner says:

    Good blog Fiona. Do you know what proportion of news from conferences comes via press releases or via journalists actually attending? Is the key target for caveats and labels the press releases from these conferences?

  5. Professor Michael Swash says:

    excellent comments – but the major source of misinformation in everyday life is the daily press. And a cursory read of the women’s pages makes one want to scream: weird potions, weirder diets, and expensive hair products – all totally untested objectively! This allies with the extraordinary usage of street drugs – none consisting of measured doses or known potency – and, at the same time, anxiety about tested Pharma products or vaccines. What a world!

    • Irene Stratton says:

      A recent article in a new women’s magazine claimed that eating apples would improve your sex life – even mentioned a highly respected journal. I looked for an article on Pubmed but wasn’t able to find it……..

  6. Margaret Stanley says:

    An excellent piece on the hazards of believing everything you hear – never mind read! Conferences vary hugely, but one of the hazards is those in which the results of small clinical trials are presented without the caveats and reservations of the methodology and design, often because of time constraints. If these are on important public health interventions, significant damage can be done, and there are some recent examples. Good science and medical journalists check with experts before writing the article, but if the topic is really human interest then the articles may be written by a different journalist. Catchy headlines and careful science rarely go together.

  7. Pete Castle says:

    Great blog Fiona. I am cautious however of discouraging scientists from talking about their work in public, even at early stages. Incorrectly-communicated ‘A causes B’-style presentations are not only unhelpful; if they are wrong, they are just wrong.

    One way to solve this is to help scientists, of all stages, present their work correctly and unambiguously, throughout the research process. For this they need training and a clear understanding of expectations and their responsibilities.

    I think the wider public need to see more behind the scenes of science, not less, if we as a society are to better understand how scientific progress works. It’s harder and costs more than last-minute, easy to digest, ‘ta-dah!’ science comms, but we would all benefit from the enterprise.

  8. Melanie Davies says:

    Agree, and it’s inevitable that the media want to report novel and dramatic findings – that’s what their audiences will read. I do feel that the standard of reporting has risen greatly over the years, so that “we think there may be an association between A and B” isn’t translated to “A causes B”.

  9. Geraint Rees says:

    Sympathetic to the points made but disciplines can have quite different traditions. Beyond a solely biomedical perspective, many conferences are primary means of academic communication and have blind peer-reviewed abstracts, significant debate and discourse and a much smaller press presence (if at all). That doesn’t invalidate the points made but might suggest being a bit more cautious about generalisation.

  10. Adam Vaughan says:

    Interesting post Fiona. I guess the key question is what is the best language in an article to reflect the ‘unchecked and unpublished’ nature of a poster/talk at a conference, without pulling the rug out from the story or having a paragraph explaining what that means. I’m not entirely sure of the best answer – usually I’ve gone down the ‘presented at conference XYZ’ route.

  11. Jonathan Amos says:

    Nice blog, Fiona. It reminds me of the time a few years ago when a group would regularly turn up at the American Geophysical Union meeting with yet more “evidence” of Noah’s Ark and the Great Flood. They would be given a corner in the poster hall to present their “research”. The point I’m making is that some conferences have a very deliberate policy of openness. Crackpot ideas crash and burn when they are exposed to scrutiny, not when they’re “filtered out” by organising committees. The American Physical Society is another meeting that makes room for “the fringe”… because just occasionally (very occasionally) the outlier becomes the next big thing. But as a journalist, you swim in those waters at your own peril.

  12. Martin Rose says:

    Some good points made, but in the interest of balance, it would be easy to argue about the positive reasons for conference attendance. Networking, meeting colleagues, exchanging research ideas with contemporaries, social aspects. I’m concerned that managers increasingly think it is sufficient to watch an occasional webinar, and that they trivialise the benefits and breadth of experience gained at conferences. Plus opportunities to visit trades / vendors and learn about new innovations…

    The positives are also numerous, and in my mind still outweigh the negatives! Worth a contrasting article I feel!

  13. I completely agree with the sentiment of the post.

    However, it’s misleading to say that conferences are never peer reviewed. In Computer Science, conferences are one of the main mediums of publication, are stringently peer reviewed, with papers longer than many a medical journal paper, and acceptance rates as low as 10%.

    Labelling all conferences as “not peer-reviewed” unnecessarily stigmatises the T and E in STEM – both of which use highly competitive conferences as publication venues. I’ve published in conferences in the S, M, and T fields, and the difference between S and M conferences on one hand and T conferences on the other is massive.

    Happy to give you more detail, and it would be great if you could amend your post, which will obviously be highly shared and cited.

  14. Bravo, your article hits the sweet spot.
    All we need is ‘radio ga ga, radio ga ga’, as the Queen song says. You chose nicely apt initial headline claims derived from congress abstracts. But will this erudite piece result in a revolution in more responsible reporting? I am afraid not. As is clearly shown in the book “Factfulness”, ‘aeroplane lands safely’ does not attract news; ‘aeroplane crashes’, no matter how rare, does. We are in the world of entertainment and measurement sometimes manifesting as science. Add a pinch of salt, stir and hey presto!

  15. Daniel Brison says:

    Great article Fiona, I am totally in agreement as you know. I can understand the journalistic drive to catch stories when they are brand new – this seems to be the real appeal of conferences. In light of that, perhaps you could do the next blog on the rise of preprint servers such as bioRxiv? These of course also release data into the public domain without peer review. This has two effects: one is that journalists can write more detailed stories on data which are still not peer reviewed, and the second is that when the data do come out in peer-reviewed journals newspaper editors are less interested because the story isn’t new! So perhaps you could have a go at squaring that circle 🙂

  16. ursula arens says:

    And my particular gripe: headlines and text that claim diet-health links, and then reveal ‘mouse study’ data.

  17. Fiona Lethbridge says:

    Thanks everyone for your comments – very useful to hear your thoughts, feedback and experiences. Thanks especially to those pointing out that things work differently in computer sciences than in health – that’s really useful to know and I wasn’t aware of that before, so thank you for that helpful information. We had a lot of responses to this blog (via email as well as here on the blog), so are taking stock of these and will think about what else we might do in this area.

  18. Tom Whipple says:

    This is a really difficult topic. As a journalist, I’m not wholly sure what the solution is. We can state in articles that something is at a conference and hasn’t been through peer review, but I don’t know how much disclaimers like that really change how people read it.

    The problem is, as with arXiv etc., if something is out there it’s out there. This is, obviously, doubly the case if it’s press released.

    When it isn’t press released though, it’s still out there. I’ve heard of situations where scientists have tried to forbid journalists from quoting them – believing they can stop people from writing down something said in a public forum. That’s not how it works.

    I’ve also personally had a situation where I’ve heard a really interesting talk, tried repeatedly to follow up with the scientist to get some very basic information, and been completely ignored. But the entire world’s press were there; we had to cover it. When his co-authors found out (we didn’t know who they were) they were very cross (with him, not us).

