With the WHO declaring an ‘infodemic’ of false information in the wake of COVID-19, our misinformation event aimed to give science communicators and press officers some useful tools to help combat the problem.
Our first speaker was Dietram Scheufele, Taylor-Bascom Chair at the University of Wisconsin-Madison and the Morgridge Institute for Research. He started by questioning how much we actually know about the infodemic and its impact.
Facts can be elusive during a pandemic, or any new scientific challenge, as researchers work towards consensus and results take time to interpret. This can lead to the dissemination of opposing ‘facts’ and views, which makes it difficult to ensure people act or respond appropriately. How this phenomenon has affected behaviours linked to COVID-19 (such as mask-wearing) is still unknown.
Our identity, beliefs, and confirmation bias (the tendency to search for, interpret, and favour information that confirms prior beliefs) shape how we perceive any information that we receive. This means that countering misinformation with accurate information may not be successful in ensuring people have a proper understanding of the situation. An example of confirmation bias was seen when polling Americans about their government’s response to COVID-19: 63% of Democrats believed there were more deaths than were being reported (with 29% saying about the same); 40% of Republicans believed there were fewer deaths than were being reported (with 36% saying about the same). Both political sides had the same information but were more likely to interpret it in line with their own values.
Ultimately, facts alone do not shape opinion or policy, which can confuse things further when politicians say they are sticking to the science. Societal debates informed by science help us respond to questions that do not have (exclusively) scientific answers.
Author and journalist Tara Haelle continued the conversation, focussing on her work around antivaccine sentiment.
Responding to a question on the live chat asking why some memes containing misinformation are so persistent, Tara explained that memes are based on symbolism and storytelling, which have been central to how humans communicate and understand ideas since the earliest days of human communication. The stronger a meme’s emotional and storytelling salience, the more robustly it persists.
Adding to what Tara said, Dietram explained that the more resonance there is with shared cultural schemas, the more effective the meme. That’s why the ‘frankenfood’ frame for GM crops was shared so much, because it plays to a widely understood story about Dr Frankenstein creating a monster from different parts (linking this to transgenics), and the monster getting out of the lab (perceived unintended consequences of GM).
Tara highlighted her recent article on the ‘plandemic’ conspiracy theory to call for us all to speak up about misinformation when we see it while being careful how we do this. Often, people can respond to misinformation with anger. However, it is essential to consider emotions as well as facts. Empathetic responses which aim to understand why someone is sharing misinformation may be more successful.
Conspiracy theories are successful because they tap into people’s uncertainty, anxiety and need for answers. COVID-19 has made the world an uncertain place, and change can be uncomfortable. Conspiracies and misinformation can provide comfort.
Using a concept introduced by Aristotle over two thousand years ago, Tara explained that conspiracy theories can be successful because they understand how to persuade and motivate people using ethos, pathos and logos. Ethos appeals to the desire for credibility and authority; misinformation and conspiracy theories often claim to be from credible sources. Pathos is an appeal to emotion, and sources of misinformation often use more emotional communications than those delivering scientific findings. Logos (the appeal to facts and logic) is often omitted, or replaced with misinformation presented as truth once a strong emotional connection has been made. One way of responding to people posting misinformation is to build trust, perhaps by asking why they find the information they are sharing so persuasive and starting the discussion there.
The closing part of the event discussed resources that could help (see the list at the end of the article).
Tom Phillips is Editor at Full Fact and discussed their methods for fact-checking and countering disinformation. Full Fact is an independent fact-checking charity launched by a cross-party group, with a board that includes representatives of different political parties and viewpoints. They are an official fact-checker for Facebook. They ask people to correct the record when they get things wrong, develop new technology to counter misleading claims, and campaign for better information in public life. Full Fact have a range of resources that are useful for communications professionals.
The Full Fact toolkit provides simple, practical tools anyone can use to identify bad information as well as links to fact-checkers in many different countries. Tom also pointed to some useful blogs on the Full Fact website, including one focussing on tackling health misinformation.
Other useful resources discussed included:
Ways to analyse an account to see if it is likely to be a bot.
A misinformation colloquium held in Irvine, CA on April 3-4, 2019: https://www.youtube.com/playlist?list=PLGJm1x3XQeK0iQVY1nQ2yarJcCEmxxsG7
Ecsite resources for tackling misinformation.
An article on how to help those around you be better informed about the pandemic.
The Science Media Centre’s wealth of resources.
It was great to receive so much positive feedback from the event. Richard Ashby, founder of Dotkumo, said, “I attended the recent online Stempra event ‘Managing Misinformation’ as a guest and found it very useful with outstanding speakers. I really enjoyed it and now have a lot more resources to explore. There was a wealth of expertise in the virtual room, although the meeting always felt inclusive and welcoming with ample opportunities for questions.”