Introduction by Croakey: The toxic effects on democracy of misinformation and the associated attacks on scientists call for corrective political action, according to an article recently published in the journal, Current Opinion in Psychology.
The article, ‘Misinformation and the epistemic integrity of democracy’, explores connections between climate denialism, COVID misinformation, attacks on scientists and right wing libertarianism.
The authors recommend regulation of digital platforms, noting European Union legislative efforts to curtail misinformation and hate speech online, as well as interventions to boost the public’s resistance to misinformation, such as media literacy tips.
The article below is by two of the article’s six authors, and was first published by The Conversation under the headline, ‘Disinformation campaigns are undermining democracy. Here’s how we can fight back’.
Below it are details of other recent publications on misinformation interventions, including one that calls for public health experts to be more involved in designing such interventions, and for development of a health misinformation typology.
Stephan Lewandowsky and John Cook write:
Misinformation is debated everywhere and has justifiably sparked concerns. It can polarise the public, reduce health-protective behaviours such as mask wearing and vaccination, and erode trust in science. Much misinformation is spread not by accident but as part of organised political campaigns, in which case we refer to it as disinformation.
But there is a more fundamental, subversive damage arising from misinformation and disinformation that is discussed less often.
It undermines democracy itself. In a recent paper published in Current Opinion in Psychology, we highlight two important aspects of democracy that disinformation works to erode.
The integrity of elections
The first of the two aspects is confidence in how power is distributed – the integrity of elections in particular.
In the United States, recent polls have shown nearly 70 percent of Republicans question the legitimacy of the 2020 presidential election. This is a direct result of disinformation from Donald Trump, the loser of that election.
Democracy depends on the people knowing that power will be transferred peacefully if an incumbent loses an election. The “big lie” that the 2020 US election was stolen undermines that confidence.
Depending on reliable information
The second important aspect is that democracy depends on reliable information about the evidence for various policy options.
One reason we trust democracy as a system of governance is the idea that it can deliver “better” decisions and outcomes than autocracy, because the “wisdom of crowds” outperforms any one individual. But the benefits of this wisdom vanish if people are pervasively disinformed.
Disinformation about climate change is a well-documented example.
The fossil fuel industry understood the environmental consequences of burning fossil fuels at least as early as the 1960s. Yet they spent decades funding organisations that denied the reality of climate change. This disinformation campaign has delayed climate mitigation by several decades – a case of public policy being thwarted by false information.
We’ve seen a similar misinformation trajectory in the COVID-19 pandemic, although it happened in just a few years rather than decades. Misinformation about COVID varied from claims that 5G towers rather than a virus caused the disease, to casting doubt on the effectiveness of lockdowns or the safety of vaccines.
The viral surge of misinformation led to the World Health Organization introducing a new term – infodemic – to describe the abundance of low-quality information and conspiracy theories.
A common denominator of misinformation
Strikingly, some of the same political operatives involved in denying climate change have also used their rhetorical playbook to promote COVID disinformation. What do these two issues have in common?
One common denominator is suspicion of government solutions to societal problems. Whether it’s setting a price on carbon to mitigate climate change, or social distancing to slow the spread of COVID, contrarians oppose policies they consider an attack on personal liberties.
An ecosystem of conservative and free-market think tanks exists to deny any science that, if acted on, has the potential to infringe on “liberty” through regulations.
There is another common attribute that ties together all organised disinformation campaigns – whether about elections, climate change or vaccines. It’s the use of personal attacks to undermine people’s integrity and credibility.
Election workers in the US were falsely accused of committing fraud by those who fraudulently claimed the election had been “stolen” from Trump.
Climate scientists have been subject to harassment campaigns, ranging from hate mail to vexatious complaints and freedom-of-information requests. Public health officials such as Anthony Fauci have been prominent targets of far-right attacks.
The new frontier in attacks on scientists
It is perhaps unsurprising there is now a new frontier in the attacks on scientists and others who seek to uphold the evidence-based integrity of democracy. It involves attacks and allegations of bias against misinformation researchers.
Such attacks are largely driven by Republican politicians, in particular those who have endorsed Trump’s baseless claims about the 2020 election.
The misinformers are seeking to neutralise research focused on their own conduct by borrowing from the climate denial and anti-vaccination playbook. Their campaign has had a chilling effect on research into misinformation.
How do we move on from here?
Psychological research has contributed to legislative efforts by the European Union, such as the Digital Services Act and the Code of Practice on Disinformation, which seek to make democracies more resilient against misinformation and disinformation.
Research has also investigated how to boost the public’s resistance to misinformation. One such method is inoculation, which rests on the idea people can be protected against being misled if they learn about the rhetorical techniques used to mislead them.
In a recent inoculation campaign involving brief educational videos shown to 38 million citizens in Eastern Europe, the videos improved people’s ability to recognise misleading rhetoric about Ukrainian refugees.
It remains to be seen whether these initiatives and research findings will be put to use in places like the US, where one side of politics appears more threatened by research into misinformation than by the risks to democracy arising from misinformation itself.
Author details
Stephan Lewandowsky is Chair of Cognitive Psychology, University of Bristol.
John Cook is Senior Research Fellow, Melbourne School of Psychological Sciences, The University of Melbourne.
We’d like to acknowledge our colleagues Ullrich Ecker, Naomi Oreskes, Jon Roozenbeek and Sander van der Linden who coauthored the journal article on which this article is based.
Further reading
Croakey readers may also be interested in these recent publications, shared recently by the Covering Climate Now collaboration.
A Systematic Review Of COVID-19 Misinformation Interventions: Lessons Learned
Abstract, Health Affairs: Governments, public health authorities, and social media platforms have employed various measures to counter misinformation that emerged during the COVID-19 pandemic. The effectiveness of those misinformation interventions is poorly understood.
We analysed 50 papers published between January 1, 2020, and February 24, 2023, to understand which interventions, if any, were helpful in mitigating COVID-19 misinformation. We found evidence supporting accuracy prompts, debunks, media literacy tips, warning labels, and overlays in mitigating either the spread of or belief in COVID-19 misinformation.
However, by mapping the different characteristics of each study, we found levels of variation that weaken the current evidence base. For example, only 18 percent of studies included public health–related measures, such as intent to vaccinate, and the misinformation that interventions were tested against ranged considerably from conspiracy theories (vaccines include microchips) to unproven claims (gargling with saltwater prevents COVID-19).
To more clearly discern the impact of various interventions and make evidence actionable for public health, the field urgently needs to include more public health experts in intervention design and to develop a health misinformation typology; agreed-upon outcome measures; and more global, more longitudinal, more video-based, and more platform-diverse studies.
Abstract, Scientific Reports: Building misinformation resilience at scale continues to pose a challenge. Gamified “inoculation” interventions have shown promise in improving people’s ability to recognise manipulation techniques commonly used in misinformation, but so far few interventions exist that tackle multimodal misinformation (for example, videos, images).
We developed a game called Cat Park, in which players learn about five manipulation techniques (trolling, emotional manipulation, amplification, polarisation, and conspiracism), and how misinformation can spread through images.
To test the game’s efficacy, we conducted a conceptual replication (N = 380) of Roozenbeek and van der Linden’s 2020 study about Harmony Square, with the same study design, item set, and hypotheses. Like the original study, we find that people who play Cat Park find misinformation significantly less reliable post-gameplay (d = 0.95, p < 0.001) compared to a control group, and are significantly less willing to share misinformation with people in their network (d = 0.54, p < 0.001). These effects are robust across different covariates.
However, unlike the original study, Cat Park players do not become significantly more confident in their ability to identify misinformation (p = 0.204, d = −0.13). We did not find that the game increases people’s self-reported motivation and confidence to counter misinformation online.
See Croakey’s extensive archives on misinformation and disinformation