Inoculation Theory - Wikipedia
Inoculation theory is a social psychological/communication theory that explains how
an attitude or belief can be protected against persuasion or influence in much the same way a body can be protected against disease–for example, through pre-exposure to weakened versions of a stronger, future threat.
The theory uses medical inoculation as its explanatory analogy—applied to attitudes (or beliefs) rather than to a disease.
- The theory posits that weak counterarguments generate resistance within the receiver, enabling them to maintain their beliefs in the face of a future, stronger attack.
- Following exposure to weak counterarguments (e.g., counterarguments that have been paired with refutations), the receiver will then seek out supporting information to further strengthen their threatened position.
- The held attitude or belief becomes resistant to a stronger attack, hence the medical analogy of a vaccine.
Inoculation is a theory that explains how attitudes and beliefs can be made more resistant to future challenges. For an inoculation message to be successful, the recipient must experience threat (a recognition that a held attitude or belief is vulnerable to change) and be exposed to and/or engage in refutational preemption (preemptive refutation, that is, defenses against potential counterarguments). The counterarguments presented in an inoculation message must be strong enough to initiate the motivation to maintain current attitudes and beliefs, but weak enough that the receiver is able to refute them.
Medical inoculation works by exposing the body to a weakened form of a virus—strong enough to trigger a response (that is, the production of antibodies), but not so strong as to overwhelm the body's resistance. Attitudinal inoculation works the same way: expose the receiver to weakened counterarguments, triggering a process of counterarguing that confers resistance to later, stronger persuasive messages. This process works like a metaphorical vaccination: the receiver becomes immune to attacking messages that attempt to change their attitudes or beliefs. Inoculation theory suggests that by sending out messages with weak counterarguments, one can help individuals build immunity to stronger messages and strengthen their original attitudes toward an issue.
Most inoculation theory research treats inoculation as a preemptive, preventive (prophylactic) messaging strategy—used before exposure to strong challenges. More recently, scholars have begun to test inoculation as a therapeutic inoculation treatment, administered to those who have the "wrong" target attitude/belief. In this application, the treatment messages both persuade and inoculate—much like a therapeutic vaccine that treats an existing infection while also protecting against future ones. More research is needed to better understand therapeutic inoculation treatments—especially field research that takes inoculation outside of the laboratory setting.
Pre-bunking
It is much more difficult to eliminate the influence of misinformation once individuals have seen it, which is why debunking and fact checking often fall short.
Pre-bunking (or prebunking) is a form of inoculation theory that aims to combat various kinds of manipulation and misinformation spread around the web. In recent years, the spread of misleading information has become an increasingly prevalent issue. Whereas standard inoculation theory aims to combat persuasion generally, pre-bunking targets misinformation by providing a harmless example of it; this exposure builds future resistance to similar misinformation.
Because attitudinal inoculation is a form of psychological manipulation, the counterarguments used in the process do not need to accurately represent the opposing belief in order to trigger the inoculation effect. This amounts to a straw man fallacy, and it can be used effectively to reinforce beliefs that have less legitimate support.
https://en.wikipedia.org/wiki/Inoculation_theory
Across various fields, researchers and policy-makers have sought to find ways to reduce the spread and influence of misinformation, from legal and policy interventions to post hoc corrections such as debunking and fact checking. Policy interventions may include public authorities directly intervening through regulating the media environment or making social media companies liable for third-party content (Alemanno 2018).
Alternatively, post hoc corrections or fact checking interventions involve exposing news consumers to factual information or, in some cases, a more detailed “debunking” message that puts forth strong arguments for why previously seen information is false (Chan et al. 2017).
Evidence on the effectiveness of these interventions is mixed, but they continue to be widely used (Walter and Murphy 2018; Nyhan et al. 2020).
One particular problem with these post hoc measures is that it is harder to eliminate the influence of misinformation after people have been exposed to it. Indeed,
- misinformation often continues to influence inferential reasoning even after it has been formally retracted or corrected, a phenomenon known as the continued influence effect (Lewandowsky et al. 2012);
- the mere presence of misinformation in people's news environment may undermine accurate information, as the persuasive impact of facts can be neutralized by misinformation (Cook, Lewandowsky, and Ecker 2017; van der Linden et al. 2017); and
- once exposed, people may be directionally motivated to seek out further misinformation in ways that confirm their social identity (Van Bavel et al. 2021).
As such, researchers have taken to studying how we can prevent misinformation from influencing people in the first place, a process known as inoculation or prebunking. Importantly, we note that prebunking and inoculation are related but not synonymous terms. Some scholars argue that the difference between debunking and prebunking is simply defined by timing, that is, when someone is exposed to a correction.
For example, Brashier et al. (2021) distinguish between featuring warning labels on a social media post either before (prebunk) or after (debunk) exposure. However, the timing of a fact check (see also Grady et al. 2021) is not a conventional conceptualization of either inoculation or debunking (see Lewandowsky et al. 2020; Jolley and Douglas 2017). Inoculation requires much more than a simple warning.
https://journals.sagepub.com/doi/10.1177/00027162221087936
The Continued Influence Effect
Examining the role of information integration in the continued influence effect (CIE) using an event segmentation approach
Abstract - Misinformation regarding the cause of an event often continues to influence an individual’s event-related reasoning, even after they have received a retraction. This is known as the continued influence effect (CIE). Dominant theoretical models of the CIE have suggested the effect arises primarily from failures to retrieve the correction. However, recent research has implicated information integration and memory updating processes in the CIE. As a behavioural test of integration, we applied an event segmentation approach to the CIE paradigm. Event segmentation theory suggests that incoming information is parsed into distinct events separated by event boundaries, which can have implications for memory. As such, when an individual encodes an event report that contains a retraction, the presence of event boundaries should impair retraction integration and memory updating, resulting in an enhanced CIE. Experiments 1 and 2 employed spatial event segmentation boundaries in an attempt to manipulate the ease with which a retraction can be integrated into a participant’s mental event model. While Experiment 1 showed no impact of an event boundary, Experiment 2 yielded evidence that an event boundary resulted in a reduced CIE. To the extent that this finding reflects enhanced retrieval of the retraction relative to the misinformation, it is more in line with retrieval accounts of the CIE.
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0271566
Countering Misinformation and Fake News Through Inoculation and Prebunking
Abstract - There has been increasing concern with the growing infusion of misinformation, or “fake news”, into public discourse and politics in many western democracies. Our article first briefly reviews the current state of the literature on conventional countermeasures to misinformation. We then explore proactive measures, based on the psychological theory of “inoculation”, to prevent misinformation from finding traction in the first place. Inoculation rests on the idea that if people are forewarned that they might be misinformed and are exposed to weakened examples of the ways in which they might be misled, they will become more immune to misinformation. We review a number of techniques that can boost people’s resilience to misinformation, ranging from general warnings to more specific instructions about misleading (rhetorical) techniques. We show that, based on the available evidence, inoculation appears to be a promising avenue to help protect people from misinformation and “fake news”.
- attitude polarization: when a disagreement becomes more extreme even though the different parties are exposed to the same evidence
- belief perseverance: when beliefs persist after the evidence for them is shown to be false
- the irrational primacy effect: a greater reliance on information encountered early in a series
- illusory correlation: when people falsely perceive an association between two events or situations.
Strengthen Existing Attitudes & Beliefs, & Resist Attempts at Persuasion.
As in medical science, a weak version of the threat (a weakened virus) is injected into the body so that it can resist stronger attacks or infections in the future. Similarly, in attitudinal inoculation, a weak version of the threat (a counter-attitudinal message) is employed to strengthen the mind against stronger future attacks or persuasive messages.
In order to prevent giving in to persuasion, strengthening one's original attitudes and beliefs is essential. The receiver should be given a warning about the imminent danger or persuasive attack. To that end, a possibility of threat is created and realized, which initiates a defense against future attempts at persuasion. McGuire believed that mere counterarguments to the receiver's attitude would not suffice: it has to be a potent threat or a real risk that encourages deeper belief in preexisting attitudes and prepares the individual to resist real persuasion in the future.
https://psychologenie.com/explanation-of-inoculation-theory-with-examples
Inoculation Theory in the Post-truth Era: Extant Findings and New Frontiers for Contested Science, Misinformation, and Conspiracy Theories
Inoculation theory explains how immunity to counter-attitudinal messages is conferred by preemptively exposing people to weakened doses of challenging information. The theory has been applied in a number of contexts (e.g., politics, health) in its 50+ year history. Importantly, one of the newest contexts for inoculation theory is work in the area of contested science, misinformation, and conspiracy theories. Recent research has revealed that when a desirable position on a scientific issue (e.g., climate change) exists, conventional preemptive (prophylactic) inoculation can help to protect it from misinformation, and that even when people have undesirable positions, “therapeutic” inoculation messages can have positive effects. We call for further research to explain and predict the efficacy of inoculation theory in this new context to help inform better public understandings of issues such as climate change, genetically modified organisms, vaccine hesitancy, and other contested science beliefs such as conspiracy theories about COVID-19.
Josh Compton, Sander van der Linden, John Cook, Melisa Basol
https://compass.onlinelibrary.wiley.com/doi/10.1111/spc3.12602
Communication Theory & The Inoculation Effect
Inoculation theory was proposed by McGuire in response to situations where the goal is to persuade someone not to be persuaded by another. The theory is a model for building resistance to persuasion attempts by exposing people to arguments against their beliefs and giving them counterarguments to refute those attacks. The theory therefore offers mechanisms by which communication can be used to help people defend their beliefs.
Comparison with Vaccination
Inoculation theory draws a comparison with the concept of vaccination. In a normal vaccination, a weakened form of a virus is injected into an individual in order to build resistance to the disease. A similar procedure is used to ‘inoculate’ an individual against attacks on their beliefs.
According to the theory, a weakened or smaller dose of a contrary argument called the inoculation message is given to the people. These individuals who have been exposed to the weaker argument develop a defense system that helps them to retain their beliefs and not change their attitudes when they are confronted with a stronger form of the argument.
Research has shown that the inoculation process is more successful than simply reinforcing original beliefs with stronger evidence.
Components of an Inoculation Message
According to Pfau, the following are the two major components in an inoculation message.
a) Threat: A threat here is a forewarning of a possible attack on one’s attitudes and beliefs. The person is aware of his/her vulnerability to a persuasive attack. The perception that there is an impending threat psychologically motivates a person to defend his beliefs and attitudes.
Compton and Pfau conducted an experiment to show how threat could provide inoculation. In their experiment, they created a fictitious credit card company and started a campaign to get college students to sign up for the card. Some of the students were then sent inoculation messages containing a threat that warned them of the lengths to which credit card companies might go to encourage more people to sign up. The message also described how credit card companies could cause people to rethink their cautious attitudes towards credit cards. The remaining students received no warning at all. At the end of the campaign, the students who received the inoculation message seemed less likely to sign up with the fictitious credit card company and overall showed more positive attitudes towards cautious use of credit cards to avoid debt. This was in stark contrast to the control group. Further testing showed that even low levels of threat, or warnings that were not very strong, also provided inoculation.
b) Refutational Preemption: Potential targets for persuasion attacks should not only be forewarned but the inoculation message should also preempt what the possible counterattack will be. While preparing the inoculation message, the arguments that the other side will put forward should be anticipated and ways of countering them should be prepared.
For example, in the credit card experiment mentioned above, Compton and Pfau kept in mind the popular arguments that credit card companies used to lure students. They then came up with statistics and arguments that refuted the claims the companies were making, such as ‘increase financial security’ and ‘build a strong credit history.’
https://www.communicationtheory.org/inoculation-theory/
If Misinformation Behaves Like a Virus - We Can Create a Vaccine
Why is inoculation necessary?
Debunking and fact checking are good, but they have limitations; when people have already been infected, it’s just very hard to sort of undo the damage. And so that's why we have focused on inoculation.
In a medical vaccine, you expose people to a weakened or inactivated strain of a virus to try to trigger the production of antibodies to help confer resistance against future infection.
The amazing thing, to me, is that you can do the same with information. You expose people to a weakened dose of the misinformation, and then – this is the crucial part – you help people’s psychological immune system to recognise and neutralise it.
Forewarned is forearmed, so you have to tell people that there are actors out there trying to manipulate them, and that they use certain strategies to achieve that goal. And once you have people motivated to pay attention, you then give people the tools to deal with the misinformation and recognise why it is wrong. For those who don’t like the medical metaphor, this process is sometimes known as ‘pre-bunking’.
There’s a lot of flexibility with this. You can do fact-based inoculations – where you tackle a specific claim – or technique based-inoculations that target the general strategies that people use to spread misinformation.
So how do you go about inoculating people?
It’s possible to pre-bunk claims with simple text. For example, you can tell people that they’ll hear that there’s a lot of disagreement among scientists about climate change. Then we can explain that this is called ‘casting doubt on the consensus’ when in fact, 97 per cent of scientists agree on the causes of climate change. Then when people come across this tactic of sowing doubt, they've been inoculated against it.
A more interesting way of combating misinformation is ‘active inoculation’. A lot of research shows that when people have agency and control over what they're doing, they're more likely to remember things and to use what they learn. And so we’ve created some games in which you play the role of a manipulator, and you find out how bad actors make use of six broad techniques that are used to spread misinformation, such as polarising people, using emotions to stoke fear and outrage, and floating conspiracy theories.
We know that conspiracy theories have certain ingredients, like there's always some evil actor working behind the scenes to dupe people. It's always casting doubt on the mainstream narrative. There's usually some persecuted victim in the story. And there’s usually a causal story created around random events. For example, they would often talk about the apparent link between 5G phone masts and Covid outbreaks, right? A third (non-conspiratorial) factor, population density, links the two – when there are more people there are more phone masts and more outbreaks. But conspiracy theorists can tie it into a nice causal story by falsely suggesting that 5G is causing Covid.
We found that you can take these ingredients and let people build their own conspiracy theory. And then, over time, it turns out that they can better recognise these building blocks in new variants of conspiracy theories that they haven't seen before. You can’t fact-check everything, but this technique-based inoculation gives a broader spectrum of immunity.
Prebunking Can Reduce Susceptibility To Misinformation Across Cultures
This study finds that the online “fake news” game, Bad News, can confer psychological resistance against common online misinformation strategies across different cultures. The intervention draws on the theory of psychological inoculation: Analogous to the process of medical immunization, we find that “prebunking,” or preemptively warning and exposing people to weakened doses of misinformation, can help cultivate “mental antibodies” against fake news. We conclude that social impact games rooted in basic insights from social psychology can boost immunity against misinformation across a variety of cultural, linguistic, and political settings.
Common approaches to tackling the problem of online misinformation include developing and improving detection algorithms (Monti et al., 2019), introducing or amending legislation (Human Rights Watch, 2018), developing and improving fact-checking mechanisms (Nyhan & Reifler, 2012), and focusing on media literacy education (Livingstone, 2018). However, such interventions present limitations. In particular, it has been shown that debunking and fact-checking can lack effectiveness because of the continued influence of misinformation: once people are exposed to a falsehood, it is difficult to correct (De keersmaecker & Roets, 2017; Lewandowsky et al., 2012). Overall, there is a lack of evidence-based educational materials to support citizens’ attitudes and abilities to resist misinformation (European Union, 2018; Wardle & Derakshan, 2017). Importantly, most research-based educational interventions do not reach beyond the classroom (Lee, 2018).
Inoculation theory is a framework from social psychology that posits that it is possible to pre-emptively confer psychological resistance against (malicious) persuasion attempts (Compton, 2013; McGuire & Papageorgis, 1961). This is a fitting analogy, because “fake news” can spread much like a virus (Kucharski, 2016; Vosoughi et al., 2018). In the context of vaccines, the body is exposed to a weakened dose of a pathogen—strong enough to trigger the immune system—but not so strong as to overwhelm the body. The same can be achieved with information by introducing pre-emptive refutations of weakened arguments, which help build cognitive resistance against future persuasion attempts. Meta-analyses have shown that inoculation theory is effective at reducing vulnerability to persuasion (Banas & Rains, 2010).
https://misinforeview.hks.harvard.edu/article/global-vaccination-badnews/
Threat & Refutational Preemption - Two Elements of Inoculation Theory
The threat component of an inoculation treatment raises the possibility that a person may encounter persuasive challenges to existing attitudes. It is designed to get people to acknowledge the vulnerability of existing attitudes to potential change. Threat functions as the motivational catalyst to resistance. Once a person accepts that attitudes are vulnerable to change, they will expend the effort to strengthen attitudes. The refutational preemption component of an inoculation treatment raises—and then refutes—specific arguments contrary to attitudes. It is designed to provide the specific content that people can use to defend attitudes and to provide people with a model or script for how to defend attitudes.
Studies by McGuire in the 1960s proved, convincingly, that inoculation works. Subsequent studies by Michael Pfau indicated that inoculation works, in part, through the theorized mechanisms of threat and counterarguing, but also by eliciting anger, making attitudes more certain, rendering attitudes more accessible, and altering the structure of associative networks.
Evidence of threat’s motivational role in resistance is found in the consistency of findings by McGuire and Pfau that inoculation-same and inoculation-different treatments are equally effective in conferring resistance to attacks. Refutational-same inoculation treatments cover the same counterarguments raised in later attacks, whereas refutational-different treatments employ counterarguments that are completely different from those raised in subsequent attacks. Because inoculation-different treatments feature unique content, effectiveness cannot be attributed to the refutational-preemption component of the treatment; instead, it can only be explained by the threat component, which motivates people to bolster their attitudes. The power of inoculation stems from the fact that treatments spread a broad umbrella of protection—not just against the specific counterarguments refuted in the treatment, but against all potential counterarguments.
Applications of Inoculation Theory
Inoculation is an interesting and useful theory. Research during the past 20 years has revealed numerous real-world applications of inoculation theory. For example, studies indicate that it is possible to inoculate political supporters of a candidate in a campaign against the influence of an opponent’s attack ads; citizens against the corrosive influence of soft-money-sponsored political attack ads on democratic values; citizens of fledgling democracies against the spiral of silence, which can thwart the expression of minority views; commercial brands against the influence of competitors’ comparative ads; corporations against the damage to credibility and image that can occur in crisis settings; and young adolescents against the influences of peer pressure, which can lead to smoking, underage drinking, and other harmful behaviors.
http://psychology.iresearchnet.com/social-psychology/social-psychology-theories/inoculation-theory/
Persuading Others to Avoid Persuasion: Resistant Health Attitudes
For example, an inoculation message designed to discourage teen cigarette smoking (e.g., Pfau et al., 1992) might begin with a warning that peer pressure will strongly challenge their negative attitudes toward smoking, then follow this forewarning with a handful of potential counterarguments they might face from their peers (e.g., “Smoking isn't really bad for you”) followed by refutations of these counterarguments (e.g., “Actually, smoking is harmful in a number of ways…”). This inoculation format can be adapted to a number of issues, so long as (1) the intended attitude or position is already in place with message recipients, and (2) message designers are aware of some counterarguments that might be employed in attack messages in order to provide weakened, or refuted, counterarguments in the inoculation treatment message (see Ivanov, 2012, for more information on message design).
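The message format just described (a forewarning of threat followed by counterargument/refutation pairs) can be sketched as a simple data structure. The snippet below is a minimal, hypothetical Python sketch and is not drawn from any of the cited studies; the class and field names are illustrative assumptions only.

```python
# Hypothetical sketch of the inoculation-message format described above:
# a threat forewarning plus weakened counterarguments, each paired with a refutation.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class InoculationMessage:
    forewarning: str                    # threat component: warns of the coming persuasive attack
    refutations: List[Tuple[str, str]]  # refutational preemption: (counterargument, refutation) pairs

    def render(self) -> str:
        """Assemble the message: forewarning first, then each weakened
        counterargument immediately followed by its refutation."""
        parts = [self.forewarning]
        for counterargument, refutation in self.refutations:
            parts.append(f'You may hear: "{counterargument}"')
            parts.append(f"But in fact: {refutation}")
        return "\n".join(parts)


# Example wording loosely based on the teen-smoking illustration above (illustrative only).
message = InoculationMessage(
    forewarning="Peer pressure will strongly challenge your negative attitude toward smoking.",
    refutations=[
        ("Smoking isn't really bad for you.",
         "Smoking is harmful in a number of ways, including damage to the lungs and heart."),
    ],
)
print(message.render())
```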
Interestingly and importantly, as part of refutational preemption, message designers do not need to raise and refute every potential future counterargument to be successful (Pfau, 1995). As originally argued by McGuire (1964) and confirmed by a meta-analysis of inoculation theory studies (Banas and Rains, 2010), both refutational different (i.e., where the treatment refutes challenges that do not specifically appear in a future attack) and refutational same (i.e., where the treatment refutes specific challenges that are raised) treatments confer protection, supporting the premise that inoculation messages provide “umbrella protection” against subsequent attacks. Also, encouragingly, research indicates that inoculation treatments are effective regardless of whether refutations are provided by the advertiser/messenger (i.e., “passive” refutations) or are generated by the recipient (i.e., “active” refutations; Banas and Rains, 2010). Note, too, that inoculation research has indicated a number of characteristics that make inoculation messages more effective at conferring resistance, including perceived credibility of the inoculation message source (An and Pfau, 2003) and message language that frames future attacks as threats to freedom (Miller et al., 2013).
Clearly, inoculation treatments involve dynamic, powerful processes that ultimately lead to resistance to influence. But how do these treatments differ from other techniques commonly employed by health psychologists and practitioners?
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4746429/?report=classic
Inoculation Improves Resilience Against Misinformation on Social Media
Online misinformation continues to have adverse consequences for society. Inoculation theory has been put forward as a way to reduce susceptibility to misinformation by informing people about how they might be misinformed, but its scalability has been elusive both at a theoretical level and a practical level. We developed five short videos that inoculate people against manipulation techniques commonly used in misinformation: emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks. In seven preregistered studies, i.e., six randomized controlled studies (n = 6464) and an ecologically valid field study on YouTube (n = 22,632), we find that these videos improve manipulation technique recognition, boost confidence in spotting these techniques, increase people’s ability to discern trustworthy from untrustworthy content, and improve the quality of their sharing decisions. These effects are robust across the political spectrum and a wide variety of covariates. We show that psychological inoculation campaigns on social media are effective at improving misinformation resilience at scale.
https://www.science.org/doi/10.1126/sciadv.abo6254
The typical straw man argument creates the illusion of having refuted or defeated an opponent's proposition through the covert replacement of it with a different proposition (i.e., "stand up a straw man") and the subsequent refutation of that false argument ("knock down a straw man") instead of the opponent's proposition.
https://en.wikipedia.org/wiki/Straw_man
Reactance is an unpleasant motivational reaction to offers, persons, rules, or regulations that threaten or eliminate specific behavioral freedoms. Reactance occurs when an individual feels that an agent is attempting to limit one's choice of response and/or range of alternatives.