Apr 4, 2023

Worried About Misinformation? Prebunking’s Got You Covered

Misinformation has been the bane of online platforms, but a few simple tools can help inoculate people against it.

By Sander van der Linden

The internet is rife with misinformation, ranging from conspiracy theories about stolen elections to fake papers claiming to disprove climate change to simply made-up COVID cures. There's a lot of misinformation out there, and we're all susceptible to it. The human brain struggles to identify falsehoods, and online misinformation operates a bit like a virus hijacking cells: it propagates quickly and can be hard to remove once ensconced. Fortunately, we're beginning to develop effective tools to inoculate people against online misinformation. One of these interventions, "prebunking," shows particular promise when deployed at scale on platforms such as YouTube.
 
Cannily presented misinformation can distort our memories, subtly influence our judgment, or, depending on how pernicious and omnipresent a particular lie may be, take over our decision-making and behavior entirely, to the point of radicalization. There are now countless cases of otherwise rational people who, through repeated contact with online misinformation, have been radicalized into joining terrorist groups, denying scientific facts like climate change, or campaigning against vaccines.
 
In one particularly extreme case, a fake petition claiming to have been signed by 30,000 scientists who rejected climate change circulated online and became a viral story on Facebook. This piece of misinformation relied on what's known as the fake expert technique and stole the letterhead of the National Academy of Sciences to make itself look legitimate. The signatures were from real people, but they weren't experts in climate change, or even practicing scientists. They were just random people with a college degree who were presented as experts in an attempt to mislead people about climate change.
 
As with viruses, people can be susceptible to misinformation, infected with it, and recover from it. The viral comparison isn't just a metaphor, either: the same models used in epidemiology apply to the spread of misinformation. Misinformation spreads like a virus between people, because when somebody shares fake news, it goes on to infect another person. With social media, that person can reshare it and reach many more people. Yet, in the same way that the spread of a virus can be disrupted by various means, including vaccination, we can also arrest the spread of misinformation and reduce the intensity of infection when people encounter it.
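To make the epidemiological analogy concrete, here is a minimal sketch in Python of a standard SIR (susceptible-infected-recovered) compartment model applied to sharing behavior. The model structure is textbook epidemiology; the parameter values and the mapping of "infected" onto "actively sharing misinformation" are illustrative assumptions for this sketch, not figures from the research described here.

```python
# Minimal SIR-style simulation of misinformation spread.
# "Susceptible" users haven't seen the false claim, "infected" users
# are actively sharing it, and "recovered" users have stopped sharing.
# All parameter values are illustrative, not fitted to real data.

def simulate_sir(beta=0.3, gamma=0.1, i0=0.01, days=100, dt=1.0):
    s, i, r = 1.0 - i0, i0, 0.0  # population fractions, summing to 1
    history = [(s, i, r)]
    for _ in range(days):
        new_sharers = beta * s * i * dt  # susceptibles exposed by sharers
        stopped = gamma * i * dt         # sharers losing interest or being corrected
        s -= new_sharers
        i += new_sharers - stopped
        r += stopped
        history.append((s, i, r))
    return history

# Prebunking acts like vaccination: by making users less likely to pass
# misinformation on, it effectively lowers beta and flattens the curve.
for beta in (0.3, 0.15):
    peak = max(i for _, i, _ in simulate_sir(beta=beta))
    print(f"beta={beta}: peak fraction actively sharing = {peak:.2%}")
```

In this toy model, halving the transmission rate sharply reduces the peak number of active spreaders, which is the intuition behind intervening before exposure rather than after.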
 
Prebunking Theory
There are two primary ways of combating misinformation. One is trying to block it entirely by removing it from social networks and media platforms. This can be a bit like a game of whack-a-mole and frequently collides with free speech concerns. The other method is building up people’s capacity to avoid infection by giving them the tools to identify potential misinformation, view it critically, and make informed decisions not to share it. By preemptively vaccinating people against misinformation, it’s possible to prevent it from infecting the population at scale. One of the most effective methods of psychological inoculation is a technique which I have dubbed “prebunking.”
 
Prebunking is the result of years of research and collaboration with major tech platforms, including Google, and is explored at length in my new book Foolproof: Why We Fall for Misinformation and How to Build Immunity. When scientists develop vaccines against viruses, they frequently use an inactive strain, which triggers an immune response and the production of antibodies when the body is exposed to it. Prebunking takes a similar approach by exposing people to an inactive or weakened strain of misinformation and refuting it in advance, priming them to spot and reject actual misinformation encountered in the wild.
 
This is a fundamentally different approach from the one typically taken by journalists, news organizations, and media platforms. Historically, these groups have relied on fact-checking and disclaimers to try to counter the messaging of misinformation. Unfortunately, in many cases these techniques require that people seek out an alternative explanation after they have encountered misinformation, and they do little to build people's innate resistance to it. With prebunking, however, we seek to expose people to a weakened dose of a falsehood, refuting it in advance and giving people the tools, the psychological antibodies, they need to dismantle it themselves.
 
How Prebunking Works
There are, of course, challenges in prebunking misinformation. Platforms obviously don't want to share potentially controversial material themselves, and there's a commercial imperative to avoid political or divisive material. So whatever material is used to prebunk misinformation must itself be essentially innocuous. There are several ways to achieve this.
 
In the case of the fake climate change petition, for instance, we ran a prebunking experiment in which people were told in advance, before they saw the petition, that politically motivated actors try to mislead people on the issue of climate change. We explained that there are people who create fake petitions and use the fake expert technique, combined with credibility stolen from a real organization, to make them appear legitimate. This kind of forewarning helped activate people's psychological immune systems: they were more critical of the fake petition when they did encounter it and found "signatures" from the Spice Girls and Charles Darwin.
 
Of course, the problem with this approach is that the prebunking has to be tailored and matched to the specific piece of misinformation, which makes it difficult to deploy at scale. To achieve that scale, we developed an even more attenuated form of prebunking, which can be used on a much wider basis across a platform such as YouTube (Jigsaw, an incubator within Google, collaborated in developing this technique). Social media and video platforms, including YouTube, have long been home to videos peddling misinformation, including those seeking to radicalize people toward more extreme modes of thinking or even to join terrorist groups.
 
To counter these videos, we developed short videos that could be shown, as non-skippable ads, before videos that might contain misinformation or extremist content. The goal of these videos is to forewarn people that they might encounter attempts to manipulate them, and then to show them an example of what such a technique might look like. It's essentially a micro-dose of the misinformation technique employed in the actual video. And it's a completely inactive micro-dose, meaning it uses references from pop culture (South Park or Star Wars clips), rather than divisive content that the platform might not want to deploy.
 
For instance, extremist groups often promote false dilemmas, false dichotomies, and scapegoating as part of their messaging, such as "either join ISIS or you're not a good Muslim" or "we need to fix the homelessness problem in San Francisco before we can think about immigrants." In the prebunking video, we therefore present an example of a false dichotomy drawn from familiar pop culture, such as the clip from Star Wars: Revenge of the Sith in which Anakin Skywalker tells Obi-Wan Kenobi, "If you're not with me, then you're my enemy," to which Obi-Wan replies, "Only a Sith deals in absolutes." The prebunking video then explains that this is a false dichotomy. This primes viewers to recognize these sorts of fallacies before they encounter them in extremist content.
 
What we found when these prebunking videos were deployed across millions of users on YouTube was that they were effective at activating people’s psychological defenses against misinformation (albeit at a lower level than in the lab). These techniques are now being rolled out to hundreds of millions of people in Eastern Europe and Germany. Prebunking isn’t a panacea, and there’s still substantial work to be done to combat misinformation, but it is an example of how simple techniques can be deployed at scale and have a measurable, positive impact by building up people’s own innate abilities to spot misinformation.


Social psychologist and author Prof. Sander van der Linden, whose work on The Cambridge Overcoming Polarization Seminar is funded by Templeton World Charity Foundation's (TWCF) Listening and Learning in a Polarized World, sees misinformation as a defining problem of our times.

Listen to Prof. van der Linden in this recent podcast with Posthoc Salon's Susan MacTavish Best.


This article was edited by Benjamin Reeves, a New York-based writer, filmmaker and journalist. Learn more at BenjaminReeves.com or follow him on Twitter.