Misinformation about scientific issues, from vaccine safety to climate change, can cause harm to individuals and society at large.

[Image: iterations of a Generative Adversarial Network learning to create abstract art]

Information is arguably one of the most abundant resources of our age – never before has so much of it been available to so many people.

With the press of just a few keys and a couple of mouse clicks, it is possible to access entire libraries' worth of knowledge, decades of news reports, vaults full of documents and records, speeches, images and videos. And, in the current pandemic, it has delivered the genome sequence of a novel coronavirus and a torrent of research preprints released ahead of peer review. Where once it would take days for news to pass from town to town, weeks for it to travel across nations, and months to cross oceans to other continents, the internet can deliver anything we want to know in seconds. There is no doubt that digital technology has transformed our ability to be informed and to inform others.
But it is not just high-quality information that is being shared. 

Published today (19 January), the Royal Society's report, The online information environment, warns that misinformation about scientific issues, from vaccine safety to climate change, can cause harm to individuals and society at large.

Inaccurate, misleading and completely false information is shared – both unwittingly by some and maliciously by others – online in large volumes. Fictional stories end up being passed around as truth, conspiracies gain weight as they pass through the rumour mill and science becomes mangled beyond recognition. 

Misinformation and fake news are not new. Alongside this report, we are publishing two literature reviews looking at the spread of misinformation about water fluoridation and vaccination in the 20th century, well before the emergence of the modern information environment. What online technologies have changed, however, is the scale and speed at which misinformation spreads.

The Royal Society's mission since it was established in 1660 has been to promote science for the benefit of humanity, and a major strand of that is communicating science accurately. But false information is interfering with that goal. It has fuelled mistrust in vaccines, been wielded by those determined to confuse discussions about tackling the climate crisis, and increased opposition to genetically modified crops.

Science stands on the edge of error. It is a process of dealing with uncertainty, of prodding and testing received wisdom, and it challenges us continually to assess and revise our understanding of the world. It depends on protecting and encouraging free speech and open debate, and on prioritising the best data and most trustworthy information. A safe and healthy online information environment is needed to allow robust and open scientific debate, and balancing these necessities is one of the key aims of this report.

Of course, this report can only consider part of a problem as broad and complicated as how to improve the quality of the information environment. Misinformation problems are, in part, irreducibly political and social in nature; in free and diverse societies we will always have some version of them. In this report we have focused on the part of the problem where the Royal Society, as the national academy for the natural sciences, speaks with the greatest authority: how science is communicated online, and the technologies underpinning that.

Fact checking is especially important, and this is an area where the scientific community can help. National academies and learned societies can react to new misinformation threats by quickly providing accurate summaries of what we know. To do this, researchers need better access to platform data so that they can identify topics of misinformation early in the process of amplification.

This alone will not be enough to counteract the algorithmic amplification of polarising misinformation in an attention economy that incentivises the spread of sensational stories rather than sound understanding. The EU Digital Services Act is an example of legislation designed to address the incentives of business models. Scientists will need to work with lawyers and economists to make sure that the sensitivities of scientific misinformation are considered when such legislation is framed.

The scientific community has its own issues to address in this regard. The incentives for scientific publication and communication need careful consideration, to ensure that novelty isn't overstated simply to grab attention. Open access has been a boon, but in an age of information overload we need tools to identify questionable publishers. And scientists need to be clear and transparent about whether they are seeking to inform or seeking to persuade.

The online information environment report provides an overview of these issues, highlights key challenges for creating a healthy online information environment, and makes a series of recommendations for policymakers, academics, and online platforms.

Further reading

This blog is one of a pair published today to mark the release of the Society's report, The online information environment. The other is a response to the report from Ed Humpherson, Director General for Regulation at the UK Statistics Authority, who makes the case for optimism in the face of misinformation.


Authors

  • Professor Frank Kelly FRS

    Professor of the Mathematics of Systems in the University of Cambridge