Disinformation, understood as any information that turns out to be false, poses an inevitable challenge to human cognition and social interaction: people are often wrong, and sometimes they lie. The problem is aggravated by the information age, in which an increasingly interconnected world moves information in greater volume and at far higher speed than two centuries ago. This creates a major challenge: counteracting the rise and viral spread of misinformation, and its consequent influence on memory and decision-making.
Beginning in the 1970s, psychologists demonstrated that false beliefs can persist even after the misinformation has been corrected (Anderson et al., 1980). "When we hear new information, we tend to think about what it might mean," asserts Norbert Schwarz (2015). "If we later hear a correction, this does not invalidate our thoughts, and it is our own thoughts that may maintain a bias, even when we accept that the original information was false."
Schwarz (2015) identified five criteria that people use to decide whether information is true: compatibility with other known information, credibility of the source, whether others believe it, whether the information is internally consistent, and whether there is supporting evidence. His studies also show that people are more likely to accept misinformation as fact when it is easy to process, for example, easy to hear or read.
Peter Ditto shows that people deploy skepticism selectively; for example, they are less critical of ideas that align with their political beliefs (Gampa et al., 2019). Others have built on Schwarz's earlier findings, showing that people are more likely to fall for misinformation when they do not deliberate carefully on the material, whether or not it aligns with their political views (Bago et al., 2020). Dr. Gordon Pennycook (2019), lead author of one such analysis, states that this suggests that passive sharers, not malicious actors, may be the biggest problem in the fake news phenomenon.
According to Dr. Sander van der Linden and colleagues (2020), six "degrees of manipulation" (impersonation, conspiracy, emotion, polarization, discrediting, and trolling) are used to spread misinformation and disinformation. For example, a fake news story may quote a false expert, use emotional language, or propose a conspiracy theory to manipulate readers. The research also reveals individual differences in susceptibility to misinformation: people who use an intuitive reasoning style tend to believe fake news more often than those who rely primarily on analytical reasoning. Political ideology also appears to play a role, as those with extreme beliefs, especially on the far right, are more susceptible to disinformation (Baptista & Gradim, 2020).
Taken together, these studies catalogue the reasons behind the behaviors that favor the acceptance and proliferation of disinformation. Being aware of them allows us to switch off the "automatic pilot" that governs much of our behavior and to bring critical thinking to the consumption of information.
To learn more about forms of disinformation and to train your critical thinking skills, register for our free, online, self-paced course on Media Literacy Online:
1. Petty, R. E., & Cacioppo, J. T. (1986). Communication and persuasion. Springer; Anderson, C. A., et al. (1980). Perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology, 39(6), 1037–1049; McGuire, W. J. (1964). Some contemporary approaches. Advances in Experimental Social Psychology, 1, 191–229.
2. Gampa, A., et al. (2019). Social Psychological and Personality Science, Vol. 10, No. 8.
3. Bago, B., et al. (2020). Journal of Experimental Psychology: General, Vol. 149, No. 8.
4. Abrams, Z. (2021). Controlling the spread of misinformation: Psychologists' research on misinformation may help in the fight to debunk myths surrounding COVID-19. Monitor on Psychology, Vol. 52, No. 2.
5. Baptista, J. P., & Gradim, A. (2020). Social Sciences, Vol. 9, No. 10.