The scientific community needs to think carefully now about how it chooses to communicate with the wider world and those it struggles to reach.

Image: Iterations of a Generative Adversarial Network (GAN) learning to create abstract art

If knowledge is the soil from which all human endeavors grow, then our modern society has sprung from field upon field of scientific discovery. Science has cultivated much of the progress our species has made over the past few centuries – allowing us to treat diseases, build computers and visit other worlds.

But is trust in the scientific way of thinking – which has been so important since the Enlightenment – eroding away? The director general of the World Health Organization has described how the world is currently fighting not just the Covid-19 pandemic but also an “infodemic” – where data is distorted and tangled with falsehoods so perniciously that it is causing harm to people’s health and lives. Misinformation about vaccines, for example, has interfered with the uptake of our best defence against the virus, while some people have died after following inaccurate advice or taking fabricated treatments.

There are concerns that the digital ecosystems we use to communicate are allowing misinformation and disinformation to spread faster, and in ways they never have before.

Not all areas of scientific work, however, are subject to misinformation campaigns and conspiracy theories. Few hoax rumours circulate on social media and internet forums about quantum computing or gravitational waves. So why do other topics such as climate change, 5G technology and vaccines become the subject of sustained misinformation campaigns?

While they encompass quite different areas of science and have differing impacts on people's lives, if we look closely it is possible to see some common threads running through the myths that persist around science.

Invariably they start with a kernel of truth. They are underpinned by a fact or observation that makes the arguments that follow feel more plausible. They often use scientific-sounding reasoning too. But while scientists build hypotheses that can be tested, adapting their theories based on their observations, conspiracy theories and fake news tend to be rooted in emotions. They trigger emotional reactions, such as moral panic, that cloud how evidence is interpreted. Emotional messages travel faster and use a form of logic that is very difficult to counter with facts, even when there is a preponderance of evidence. Once a kernel of truth is planted in the right fertile ground, it is watered and fed by this different way of thinking.

But what creates this fertile ground that allows misinformation to flourish?

Research suggests that misinformation is more likely to spread at times of great uncertainty, such as during the current pandemic, when it becomes difficult for people to assess a claim’s credibility. In the absence of a good explanation, it is natural to seek out alternative information to fill the gaps in our mental model of the world.

People’s own preconceptions also make them more prone to believe misinformation – if a claim confirms our existing beliefs and biases, then we are less likely to interrogate its truthfulness. Misinformation is also more likely to spread if it has a direct impact on the people who are reading and sharing it. Certain topics, such as those involving our health or those that trigger moral outrage, proliferate more readily.

It would also be wrong to dismiss those who are vulnerable to misinformation too readily. Many communities still carry the weight of inequality and discrimination that has left them with a deep distrust of conventional sources of information. The medical community, for example, should be aware of how its past mistakes affect the willingness of certain groups to listen to it. Instead, these groups may seek alternative sources of information on the internet, on social media, or from within their own communities.

The sources we obtain our information from matter, and how closely we question their validity varies accordingly. We are more likely to trust our friends and family at the core of our social networks, so in communities that already distrust the authority of the medical community, it is hardly surprising that we see higher levels of misinformation being spread.

Armed with this understanding of how falsehoods can become an alternative reality, it may even be possible to predict which topics will become fake news in the future. The question then becomes: what to do about it? There is some evidence that simply rebutting misinformation may only serve to amplify the claim.

Scientists and engineers instead need to listen and try to empathise with the people they are trying to communicate with. It is impossible to win an argument driven by fear and personal values with facts alone. Facts matter, of course, but they are not enough. The words used, and how they are communicated, are what will make the difference.

In the words of the American paediatrician and educator Wendy Sue Swanson: “I knew if I wanted to change the minds of my patients' parents, I had to change the conversation.”

It is something I have tried to do myself.

When Covid-19 hit, I saw some of my own friends and family back in Kentucky, where I am from, dismiss the existence of a global pandemic entirely. At first I tried to debate with them on social media, sending them links to information from the WHO and research papers. It only ended up enraging me, so I decided to change how I was using social media. Rather than withdrawing from the conversation altogether, I began posting the occasional picture of a flower along with a short, personal commentary on how the pandemic was unfolding in Oxford at the time. Then the most extraordinary thing happened: those same people I had been fighting with started engaging and asking questions about whether they should get a vaccine. All that had happened was that I had changed the conversation.

The scientific community needs to think carefully now about how it chooses to communicate with the wider world and those it struggles to reach. The stakes are high – nothing less than our trust in the ability of science to deliver progress, our trust in governments and society, and finally our trust in each other.

Further reading

This blog is one of a series of perspective pieces published to support the Royal Society's Online information environment report, which provides an overview of how the internet has changed, and continues to change, the way society engages with scientific information, and how it may be affecting people’s decision-making behaviour.

Authors

  • Professor Gina Neff

    Gina is the Executive Director of the Minderoo Centre for Technology & Democracy at the University of Cambridge and Professor of Technology & Society at the University of Oxford. Her research focuses on the effects that the rapid expansion of the digital information environment has had on our lives. She has also written a number of books on the topic.