Using psychological interventions to navigate disinformation | DW | 14.04.2024

Disinformation research

Using psychological interventions to navigate disinformation

Carolin-Theresa Ziemer explains her research on psychological interventions against disinformation, such as inoculation and prebunking, and their impacts.


Psychological interventions against disinformation are becoming more common, but which ones work?

Although the evidence about the exact impact of disinformation on specific events such as electoral outcomes is mixed, numerous studies indicate its general influence on perception and behavior. Plenty of correlational evidence links misperceptions and behavior. Examples include cancer patients seeking alternative medicine (Johnson et al., 2018) or believers of vaccine rumors refusing to get vaccinated (Larson, 2020). But only a few studies establish a causal relation between the two. Because of the complex relationship between beliefs and behavior, it is difficult to prove that a certain behavior stems solely from a certain misperception. However, when it comes to events like the Capitol Hill insurrection, it is reasonable to argue that disinformation, in this case the claim that the election was fraudulent, added to the motivation of Trump supporters to storm the US Capitol in January 2021. Even if disinformation is only partly responsible for behavior, it is worth addressing.


There is general consensus that disinformation on social media helped fuel the storming of the US Capitol by Donald Trump supporters on January 6, 2021.

Measures targeting content on social media, such as moderation, deplatforming and legal regulations like the European Union's Digital Services Act, are commonly adopted to fight disinformation. In addition, a rising number of psychological interventions against disinformation are being trialed to target the susceptibility of individuals.

To examine what kinds of psychological interventions are being used, whether these interventions are tested, what their impacts are and how long they last (among other things), our research team at Friedrich Schiller University Jena in Germany comprehensively reviewed more than 3,000 scientific articles.

5 main psychological interventions against disinformation 

We grouped the psychological interventions against disinformation into five main categories: boosting, inoculation, identity management, nudging and fact-checking.

Boosting, inoculation and identity management are known as prebunking approaches, which take place prior to contact with disinformation. Nudges, on the other hand, are presented alongside the postings or articles in question. Fact-checking interventions, in contrast, aim to correct misconceptions that stem from prior contact with disinformation.

Fact-checking 

The most common and best examined intervention type is fact-checking. Fact-checks label disinformation as such and/or correct it. We distinguish between three forms: labels with no further information (Flags), text-based corrections by other social media users (Social Invalidation) and text-based corrections by professional fact-checkers (Expert Correction).

Even though fact-checking is well established in journalism, it has several limitations. From a psychological perspective, correcting existing misperceptions is not trivial: several psychological phenomena, such as the continued influence effect, make it tricky for fact-checks to correct and replace misperceptions (Ecker et al., 2022). Also, fact-checking, if not automated, has a Sisyphean character, as its efforts are always outpaced by the sheer amount of disinformation circulating.

Lastly, and this might be its biggest disadvantage, a meta-analysis demonstrated that fact-checking is limited by the (in)congruence of the correction with the recipient's worldview (Walter et al., 2020). In practice, this means that if I am wrong about something and the correction is not in line with my ideological thinking, I am more likely to reject the fact-check than to update my incorrect belief.
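
To make the automation point above concrete: one way fact-checking can scale is by matching incoming posts against claims that have already been debunked, so each manual fact-check is reused many times. The sketch below is purely illustrative; the claim database, verdict labels, similarity method and threshold are all hypothetical assumptions, not a description of any real fact-checking system.

```python
# A minimal, hypothetical sketch of reusing published fact-checks at scale:
# incoming posts are matched against claims professional fact-checkers have
# already debunked, so each manual fact-check is applied many times.
# Production systems use far more robust semantic matching.
from difflib import SequenceMatcher

# Hypothetical database of claims that have already been fact-checked
FACT_CHECKED = {
    "the election was stolen through mass mail-in fraud": "false",
    "the vaccine contains microchips to track people": "false",
}

def match_fact_check(post: str, threshold: float = 0.6):
    """Return (claim, verdict) of the closest debunked claim, or None."""
    scored = [
        (SequenceMatcher(None, post.lower(), claim).ratio(), claim, verdict)
        for claim, verdict in FACT_CHECKED.items()
    ]
    score, claim, verdict = max(scored)
    return (claim, verdict) if score >= threshold else None

print(match_fact_check("The election was STOLEN through mass mail-in fraud!"))
# -> ('the election was stolen through mass mail-in fraud', 'false')
```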

Fact-checking's structural disadvantage of being doomed to chase lies is bypassed by prebunking interventions. Instead of correcting disinformation that already circulates widely, prebunking immunizes recipients against disinformation before they come in contact with it.


Boosting 

Boosting is a more general educative approach that strengthens knowledge and skills to reduce individuals' susceptibility to disinformation. We distinguish between simply providing knowledge on topics where disinformation frequently occurs, such as climate change (Knowledge), and imparting skills in dealing with media, information and scientific findings (Literacy).

Literacy boosting, for example, can be achieved by providing internet users with tips on how to improve their disinformation detection skills, such as by checking the source or being cautious when facing unusual formatting. 

Inoculation 

Inoculation is a technique for building mental resistance against disinformation. It's often illustrated using a vaccination metaphor. Recipients receive small doses of refuted disinformation, which build up immunity against subsequent encounters with disinformation. According to the theory, inoculation consists of a warning of upcoming disinformation, which activates the recipient's defense system, followed either by arguments refuting the disinformation content (Classic Inoculation) or by an explanation of strategies typically applied in disinformation (Strategic Inoculation).

In a subsequent disinformation attack, inoculated recipients are immunized and less vulnerable. Some inoculation interventions only warn individuals of upcoming disinformation (Warning). This operating principle of warnings and pre-emptive refutations has been applied to more user-friendly, engaging formats such as online games and videos.

In Bad News, for example, a game available in 24 languages, players take on the role of an online troll. With each level, they learn a new technique for effectively misinforming their audience, thereby becoming experts in spotting disinformation (Roozenbeek & van der Linden, 2019). Several meta-analyses have demonstrated the effectiveness of inoculation treatments (Lu et al., 2023; Banas & Rains, 2010). Most importantly, inoculation also seems to protect against belief-congruent disinformation, for example among Republicans confronted with disinformation about climate change (Cook et al., 2017).

Identity management 

Identity management interventions address the problem that corrections are less effective when their content is mismatched with the recipient's worldview. They aim to reduce the negative feelings triggered by information that conflicts with individuals' beliefs and to make these individuals more open to belief-incongruent corrections.

Self-affirmations bolster individuals' self-worth, for example by prompting them to reflect on a valued character trait. As a consequence, subsequent confrontations with belief-incongruent corrections are perceived as less unpleasant.

Another way to create more openness towards belief-incongruent but correct information is to actively encourage individuals to take a different perspective on a situation (Perspective Taking).

Currently, identity management interventions are scarcely researched; existing findings are heterogeneous (Lyons et al., 2021), and they have not yet been applied in real-world settings like social media platforms. We know from motivated reasoning (Kunda, 1990), one of the most prominent theories explaining why individuals are vulnerable to disinformation, that our identity plays a significant role in how we judge information. However, it may be difficult to influence someone's identity with a quick, impersonal intervention.

Nudges 

Nudges are cues in the communication environment that increase the likelihood that an individual will identify disinformation. To count as a nudge, an intervention must incentivize a desired behavior while allowing it to be avoided without consequences. We distinguish between prompts to check information for accuracy (Accuracy Nudge), visually highlighting sources in social media posts (Credibility Nudge), emphasizing the importance of norm compliance in dealing with disinformation (Social Norm Nudge) and suggesting that users verify information using other sources (Lateral Reading Nudge).

The biggest advantage of nudges is that they are small-scale and easy to implement. A recent study in 16 countries across six continents found promising results underlining the effectiveness of accuracy nudges for diverse cultural contexts (Arechar et al., 2023). 

Why this isn't enough 

Addressing disinformation with psychological interventions is promising but not enough. The current state of research has several limitations, both in how its findings should be interpreted and in how they can be applied.

Even though the research field on psychological interventions is growing quickly, the majority of interventions are not theoretically integrated. This makes it difficult to understand why some interventions succeed while others fail, and it hinders tailoring them to specific contexts.

Moreover, most interventions are developed and tested in Global North countries, raising doubts about their applicability to countries of the Global South.


Studies also primarily focus on adult populations, leaving us with limited insights into the effectiveness of interventions for children, adolescents and the elderly.

We also know little about how long intervention effects last and how well lab findings translate into real-world scenarios.

For instance, a recent study by Google Jigsaw implementing inoculation videos on YouTube found that discernment of anti-Ukrainian disinformation improved by 2-5% among inoculated participants (Jigsaw, 2023). Is this a reason for optimism, and what outcomes can we realistically expect?
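
What does a 2-5% improvement in "discernment" mean? In such studies, discernment typically refers to the gap between how accurately participants rate true versus false content. Below is a minimal sketch of one common operationalization; the ratings are invented, and this simplified metric is an assumption for illustration, not Jigsaw's exact measure.

```python
# Illustrative computation of "discernment": the mean perceived accuracy of
# true items minus the mean perceived accuracy of false items.
# Ratings are invented: 1 = judged accurate, 0 = judged inaccurate.
from statistics import mean

def discernment(true_ratings, false_ratings):
    """Mean accuracy rating for true items minus that for false items."""
    return mean(true_ratings) - mean(false_ratings)

# One hypothetical participant, before and after watching an inoculation video
before = discernment(true_ratings=[1, 1, 0, 1], false_ratings=[1, 0, 1, 1])
after = discernment(true_ratings=[1, 1, 1, 1], false_ratings=[1, 0, 0, 1])

print(f"before: {before:.2f}, after: {after:.2f}")  # before: 0.00, after: 0.50
```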

In a similar vein, we must consider that most successful intervention studies rely on clickworkers who have been paid to participate. As people spend their time online mostly for entertainment purposes (Allen et al., 2020), they may have limited motivation to engage in interventions that require cognitive effort without an immediate reward.

Finally, we must also consider the desired impact of interventions against disinformation. Is the aim solely to enhance individuals' ability to discern fake from fact, or should it also extend to shaping attitudes and behaviors surrounding politicized events targeted by disinformation?

Recent findings indicate a complex relationship between knowing the facts and adjusting attitudes and behavior (Swire et al., 2017; Ziemer et al., under review).

Recommendations 

Our research has several findings relevant to media professionals and others working in the media sector:

  • Adhere to high-quality research and reporting standards. If you want to be credible and trustworthy, refrain from using strategies often employed by disinformation campaigns, such as clickbait.

  • Don't rely only on fact-checking to tackle disinformation. Rather, make use of the variety of existing intervention categories, such as boosting. If you do employ fact-checking, there are some caveats to consider for effectively correcting existing misperceptions. For example, an elaborated correction should dominate the disinformation in both quantity and design, warn of the disinformation before mentioning it and illuminate the motives behind it (Lewandowsky et al., 2020).

  • Use complementary measures on multiple levels. Psychological interventions against disinformation constitute an indispensable component within a broader toolkit of countermeasures.

  • Advocate for the regulation of (digital) environments where disinformation proliferates. While the recent engagement of tech companies in combating disinformation is commendable, regulations like the Digital Services Act play a crucial role in holding them accountable. 


Ultimately, addressing disinformation with a holistic, multi-level approach is the way forward. This should also include reducing polarization, enhancing media trust and improving education systems, as these contribute to society's resilience against disinformation (Humprecht et al., 2020).

This article by Carolin-Theresa Ziemer is a guest contribution. Ziemer is a PhD researcher at Friedrich Schiller University Jena in Germany. Her research focuses on disinformation interventions and ideological bias. 


This article is part of Tackling Disinformation: A Learning Guide produced by DW Akademie.

The Learning Guide includes explainers, videos and articles aimed at helping those already working in the field or directly impacted by the issues, such as media professionals, civil society actors, DW Akademie partners and experts.

It offers insights for evaluating media development activities and rethinking approaches to disinformation, alongside practical solutions and expert advice, with a focus on the Global South and Eastern Europe.

References 

Allen, J., Howland, B., Mobius, M., Rothschild, D., & Watts, D. J. (2020). Evaluating the fake news problem at the scale of the information ecosystem. Science Advances, 6(14), eaay3539. https://www.science.org/doi/10.1126/sciadv.aay3539 

Arechar, A. A., Allen, J. N. L., Berinsky, A., Cole, R., Epstein, Z., Garimella, K., Gully, A., Lu, J. G., Ross, R. M., Stagnaro, M., et al. (2022). Understanding and Combatting COVID-19 Misinformation Across 16 Countries on Six Continents. https://doi.org/10.31234/osf.io/a9frz

Banas, J. A., & Rains, S. A. (2010). A Meta-Analysis of Research on Inoculation Theory. Communication Monographs, 77(3), 281–311. http://www.communicationcache.com/uploads/1/0/8/8/10887248/a_meta-analysis_of_research_on_inocultion_theory.pdf 

BBC (2023, February 1). Russia in Africa: How disinformation operations target the continent. https://www.bbc.com/news/world-africa-64451376

Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS ONE, 12(5), e0175799. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0175799

Eady, G., Paskhalis, T., Zilinsky, J., Bonneau, R., Nagler, J., & Tucker, J. A. (2023). Exposure to the Russian Internet Research Agency foreign influence campaign on Twitter in the 2016 US election and its relationship to attitudes and voting behavior. Nature Communications, 14(1), 62.  https://www.nature.com/articles/s41467-022-35576-9 

Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., Kendeou, P., Vraga, E. K., & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), 13–29. https://www.nature.com/articles/s44159-021-00006-y

Humprecht, E., Esser, F., & Van Aelst, P. (2020). Resilience to Online Disinformation: A Framework for Cross-National Comparative Research. The International Journal of Press/Politics, 25(3), 493–516. https://www.researchgate.net/publication/338809208_Resilience_to_Online_Disinformation_A_Framework_for_Cross-National_Comparative_Research 

Jigsaw (2023). Defanging Disinformation's Threat to Ukrainian Refugees. https://medium.com/jigsaw/defanging-disinformations-threat-to-ukrainian-refugees-b164dbbc1c60 

Johnson, S. B., Park, H. S., Gross, C. P., & Yu, J. B. (2018). Use of Alternative Medicine for Cancer and Its Impact on Survival. Journal of the National Cancer Institute, 110(1). https://doi.org/10.1093/jnci/djx145

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498. https://fbaum.unc.edu/teaching/articles/Psych-Bulletin-1990-Kunda.pdf 

Larson, H. J. (2020). Stuck: How Vaccine Rumors Start – and Why They Don't Go Away. Oxford University Press.

Lewandowsky, S., Cook, J., Ecker, U. K. H., Albarracín, D., Amazeen, M. A., Kendeou, P., Lombardi, D., Newman, E. J., Pennycook, G., Porter, E., Rand, D. G., Rapp, D. N., Reifler, J., Roozenbeek, J., Schmid, P., Seifert, C. M., Sinatra, G. M., Swire-Thompson, B., van der Linden, S., Vraga, E. K., Wood, T. J., & Zaragoza, M. S. (2020). The Debunking Handbook 2020. DOI: 10.17910/b7.1182. https://sks.to/db2020

Loomba, S., de Figueiredo, A., Piatek, S. J., de Graaf, K., & Larson, H. J. (2021). Measuring the Impact of COVID-19 Vaccine Misinformation on Vaccination Intent in the UK and USA. Nature Human Behaviour. https://doi.org/10.1038/s41562-021-01056-1 

Lu, C., Hu, B., Li, Q., Bi, C., & Ju, X.-D. (2023). Psychological Inoculation for Credibility Assessment, Sharing Intention, and Discernment of Misinformation: Systematic Review and Meta-Analysis. Journal of Medical Internet Research, 25, e49255. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10498317/

Lyons, B. A., Farhart, C. E., Hall, M. P., Kotcher, J., Levendusky, M., Miller, J. M., Nyhan, B., Raimi, K. T., Reifler, J., Saunders, K. L., Skytte, R., & Zhao, X. (2021). Self-Affirmation and Identity-Driven Political Behavior. Journal of Experimental Political Science, 1–16. https://www.cambridge.org/core/journals/journal-of-experimental-political-science/article/selfaffirmation-and-identitydriven-political-behavior/259498FE3F921E731CA8644F607030A2 

Roozenbeek, J., & van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1), 65. https://www.nature.com/articles/s41599-019-0279-9

Roozenbeek, J., & van der Linden, S. (2024). The Psychology of Misinformation. Cambridge University Press. 

Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. H. (2017). Processing political misinformation: comprehending the Trump phenomenon. Royal Society Open Science, 4(3), 160802. https://royalsocietypublishing.org/doi/10.1098/rsos.160802

Walter, N., Cohen, J., Holbert, R. L., & Morag, Y. (2020). Fact-Checking: A Meta-Analysis of What Works and for Whom. Political Communication, 37(3), 350–375. 

Ziemer, C.-T., & Rothmund, T. (2024). Psychological Underpinnings of Disinformation Countermeasures. Journal of Media Psychology. https://econtent.hogrefe.com/doi/10.1027/1864-1105/a000407 

Ziemer, C-T., Schmid, P., Betsch, C., & Rothmund, T. (under review). Identity is key, but Inoculation helps – how to empower Russian Germans against pro-Kremlin disinformation.
