Cautious headlines, claims and explicit limitations in press releases: the evidence so far.

16th August 2019

This is a guest blog from Prof Petroc Sumner, Professor of Neuroscience at Cardiff University, and member of InSciOut, a group of scientists and journalism academics studying how science gets reported in the press and the processes that create misunderstandings and exaggerations.

Since we started on the daunting task of analysing hundreds of press releases and news articles in 2012, there have been several studies, including our own, presenting evidence on the topic of caution vs hype in news and press releases. The main conclusions from this research are:

Cautious press releases still get news. The main motivation for researchers or press officers to worry about being cautious seems unfounded. Three studies, involving 1300 press releases and over 3000 news stories, found as much news for cautious, non-exaggerated press releases as for overly strong ones1-3. In our new study, we report on a collaboration with press officers to run a trial in which some press release headlines and claims were made more cautious (e.g. with words like ‘may’), and some press releases had explicit caveats added (e.g. ‘this research was correlational, so we cannot be sure if X causes Y’). The press releases that were cautious in these ways got at least as much news as those with stronger headlines or no caveats. Note that the cautious headlines, claims and caveats in these studies were relatively concise and clear; long-winded caution was not tested, and should probably be avoided. (There has also been one study in the Netherlands4, which found 45% more news for exaggerated Dutch press releases. The reason for the difference is unclear, because in most other ways the Dutch press releases and news stories are similar to those in the UK4,5.)

Explicit caveats do not harm reader interest. General readers in internet-based research rated stories just as interesting even when they had strong caveats6. The caveats had the expected effect of lowering ratings of confidence in the researchers’ conclusions; thus they communicated caution without lowering interest.

Cautious tone is not ignored. It is not true that news will always opt for the pithiest, most strongly stated headline if given a reason to choose a more cautious one. For example, news headlines were about twice as likely to use words like ‘might’ and ‘may’ for correlational findings if press releases did so3. Conversely, if press releases used stronger wording than the peer-reviewed research warranted, news was 2 to 4 times more likely to do the same1,2,7. There are many exceptions, of course, but on average the news followed the press release’s lead in both cautious and stronger wording (to maximise this relationship, it is probably important to provide a form of words that conveys caution deftly; see the advice below).

Explicit caveats can also get into news (as does other key information). Although news does not carry an explicit caveat as often as we might like (the rate is below 20%), neither, in fact, do press releases2,8. And news is very much more likely to do so if the press release does1-3. The increase was 20-fold in our new study. Likewise, information such as absolute rates tends to make it into news if prominent in the press release9.

Quotes from independent sources are rare in news. Although all journalists aspire to provide balance and critique, time pressures in reality impose severe constraints. One outcome is that quotes from independent experts unrelated to the study appeared in under 8% of news stories in both the UK and Dutch samples5. Interestingly, when independent quotes did appear, they were associated with lower rates of exaggeration. It is debatable what role press officers (beyond the Science Media Centre) could play in improving the seeking of independent quotes, given that their role is often to promote one university or journal.

Handling correlations

Many of the most eye-catching stories about health, diet and lifestyle are based on correlational data10,11. Although correlations are often presented as causal evidence12,13, it is very difficult to know whether the relationship is causal or whether some other factor is at play. More than half of news-reported conclusions based on correlations turn out to be unsupported by subsequent meta-analyses14. Handling this uncertainty in a clear and concise way is not easy, and the English language lends itself to inadvertent causal statements. Our advice is as follows:

Learn to spot them. Once you learn to spot the clues that data is correlational even though the claim is causal, you will see them everywhere (including in peer-reviewed journal articles). Think twice before repeating the causal claim. Discuss it with the researchers.

Use ‘might’ and ‘may’. Sentences or headlines that explicitly describe correlational findings are longer and more cumbersome than simple causal statements. But you can use a simple ‘might’ or ‘may’ to convey the same thing (e.g. ‘eating chocolate may reduce heart disease risk’). Evidence suggests readers interpret such sentences the same way as the longer correlational sentences (e.g. ‘eating chocolate is associated with lower heart disease risk’)15. This allows you to use a deft ‘may’ in the headline and then explain the correlation in the main text. As noted above, mights and mays can make it into news headlines and claims3.

Include an explicit caveat about cause. The evidence shows that even very explicit caveats, such as ‘this study was correlational and therefore we cannot make a causal conclusion’, can get reproduced in news3,8. Importantly, press releases with such strong caveats still got at least as much news3, and readers judge stories with caveats to be as interesting as those without6.

References

  1. Sumner, P. et al. Exaggerations and Caveats in Press Releases and Health-Related Science News. PloS one 11, e0168217 (2016).
  2. Adams, R. C. et al. Claims of causality in health news: a randomised trial. BMC Medicine 17, 91 (2019).
  3. Schat, J., Bossema, F. G., Numans, M., Smeets, I. & Burger, P. Overdreven gezondheidsnieuws. Relatie tussen overdrijving in academische persberichten en in nieuwsmedia [Exaggerated health news. The relationship between exaggeration in academic press releases and in news media]. openaccess.leidenuniv.nl (2018).
  4. Bossema, F. G. et al. Expert quotes and exaggeration in health news: a retrospective quantitative content analysis. Wellcome Open Research 4, 56 (2019).
  5. Bott, L. et al. Caveats in science-based news stories communicate caution without lowering interest. Journal of Experimental Psychology: Applied (2019). doi:10.1037/xap0000232
  6. Yavchitz, A. et al. Misrepresentation of Randomized Controlled Trials in Press Releases and News Coverage: A Cohort Study. PLoS Med 9, e1001308 (2012).
  7. Wang, M. T. M., Bolland, M. J. & Grey, A. Reporting of Limitations of Observational Research. JAMA Intern Med 175, 1571–1572 (2015).
  8. Schwartz, L. M., Woloshin, S., Andrews, A. & Stukel, T. A. Influence of medical journal press releases on the quality of associated newspaper coverage: retrospective cohort study. BMJ 344, d8164 (2012).
  9. Haber, N. et al. Causal language and strength of inference in academic and media articles shared in social media (CLAIMS): A systematic review. PloS one 13, e0196346 (2018).
  10. Wang, M. T. M., Bolland, M. J., Gamble, G. & Grey, A. Media Coverage, Journal Press Releases and Editorials Associated with Randomized and Observational Studies in High-Impact Medical Journals: A Cohort Study. PloS one 10, e0145294 (2015).
  11. Haneef, R., Lazarus, C., Ravaud, P., Yavchitz, A. & Boutron, I. Interpretation of Results of Studies Evaluating an Intervention Highlighted in Google Health News: A Cross-Sectional Study of News. PloS one 10, e0140889 (2015).
  12. Prasad, V., Jorgenson, J., Ioannidis, J. P. A. & Cifu, A. Observational studies often make clinical practice recommendations: an empirical evaluation of authors’ attitudes. Journal of Clinical Epidemiology 66, 361–366.e4 (2013).
  13. Dumas-Mallet, E., Smith, A., Boraud, T. & Gonon, F. Poor replication validity of biomedical association studies reported by newspapers. PloS one 12, e0172650 (2017).
  14. Adams, R. C. et al. How readers understand causal and correlational expressions used in news headlines. Journal of Experimental Psychology: Applied 23, 1 (2017).
© Copyright Stempra