In the midst of the complexity of disinformation research, it is crucial to establish a shared understanding of the vocabulary used. It is important to differentiate between terms such as "disinformation" and "misinformation" to ensure accurate analysis and effective communication in the fight against false narratives.

This article clarifies the various terms associated with disinformation, providing a guide on how they should be used in analytical discussions. By standardizing our terminology, we can better navigate the complex information landscape, leading to a more informed and resilient society. Let's explore the lexicon of disinformation to build shared understanding and accountability in an era dominated by information disorder.

From "Fake News" to Information Disorder

In today's world, the term "fake news" is everywhere. It's used by politicians, journalists, and everyday people alike. But here's the thing: everyone seems to use it differently. Some say it when they mean a mistake in the news, while others use it to talk about deliberate lies spread to mislead people.

Experts deliberately avoid the term "fake news." They consider it too vague and too politicized: it is unclear what it really means, and it is often used to attack disliked news sources or political opponents. We therefore recommend using it with caution. If you must use the term, reserve it for cases where mainstream media accidentally spread false information.

There is a better term we can use: "information disorder." It covers everything – not just fake news, but also lies, propaganda, and other ways people try to trick us with information. Information disorder refers to the many ways our information environment is polluted. It is not just about lies – it is about how information is twisted and turned to suit different agendas, making it hard to know what is true and what is not.

So, let's try to use "information disorder" instead of "fake news." It's a simpler and clearer way to talk about the messy world of information we live in. And by understanding it better, we can all be smarter about the news we see and share.

Information Disorder Elements: Understanding Disinformation, Misinformation, and Malinformation

There are three main types of information disorder online: disinformation, misinformation, and malinformation. All three involve information being spread in false or harmful ways, but they differ in intent and consequences. It is important to keep these distinctions in mind when consuming information online.

Disinformation

Disinformation is characterized by its deliberate intent to deceive or manipulate. It involves the dissemination of false or misleading content with the aim of achieving economic, political, or social gain. Disinformation is often crafted to cause public harm, whether by influencing opinions, destabilizing societies, or advancing specific agendas.

Misinformation

In contrast, misinformation refers to false or misleading content that is shared without harmful intent, usually without awareness that it is false. While the effects of misinformation can still be detrimental, it typically arises from unintentional errors or oversights. Common markers of misinformation include contradictions and logical errors, stemming from ideological biases or cognitive blind spots. Examples include sharing inaccurate dates or misattributed quotes out of negligence or ignorance.

Malinformation

Malinformation represents a distinct category characterized by the deliberate publication of private information for personal, corporate, or political motives. Unlike disinformation, malinformation may not involve the fabrication of false content but instead focuses on the intentional manipulation or misuse of authentic information. This can include the unauthorized disclosure of private data, such as revenge porn or leaked emails, aimed at damaging someone's reputation. Additionally, malinformation encompasses deliberate changes in the context, date, or timing of original content to mislead or deceive.

The primary distinction between these forms lies in their underlying intent and potential harm. Disinformation is driven by a deliberate desire to deceive for gain, whereas misinformation often arises from inadvertent errors or ideological biases. Malinformation, on the other hand, involves the intentional misuse or manipulation of information for personal or political ends.

Another useful term when analyzing information disorder is inauthentic online behavior. Originating in the internal policies of social media platforms, it encapsulates actions aimed at misleading users about the identity, purpose, or origin of the entities they encounter online.

Different Approaches in Disinformation Research

Disinformation researchers usually distinguish three approaches:

Fact-checking is an important method used to analyze disinformation. It involves verifying information in a meticulous process to ensure its accuracy and truthfulness. Fact-checking is typically done after content has been disseminated, with the goal of promoting the accuracy of reporting and statements. Major organizations involved in fact-checking efforts include Factcheck.org, Snopes, Africa Check, and those verified by the International Fact-Checking Network.

Debunking is another critical analytical approach in disinformation research, particularly emphasized by organizations such as NATO StratCom. Unlike the broad scope of fact-checking, debunking focuses on correcting specific falsehoods spread by particular actors or on specific topics. It involves a strategic decision-making process to prioritize which content to address, based on an assessment of intent and behavior. Debunking initiatives prioritize falsehoods that pose significant harm or impact, often setting aside less consequential mis- or disinformation.

Pre-bunking is an emerging strategy aimed at countering disinformation before it gains traction. This proactive approach seeks to preemptively debunk or counter disinformation before it is disseminated widely. Pre-bunking messages are designed to enhance audiences' critical thinking skills and discernment, empowering them to recognize and reject information pollution. These messages can take various forms, including interactive games, videos, and posts, tailored to engage and educate diverse audiences.

Authors and Actors of Disinformation

In the realm of disinformation, identifying the individuals and entities responsible for its propagation is crucial. From hostile actors to governmental propaganda machines, a diverse array of actors play a role in shaping the narratives that permeate our information ecosystem.

What should we call those who spread disinformation?

One commonly used term to describe those who spread disinformation is "hostile actors". These individuals or groups intentionally disseminate false or misleading information with the aim of sowing discord, manipulating public opinion, or achieving specific political or ideological goals. Similarly, the term "threat actors" encompasses individuals or organizations that pose a threat to the integrity of information.

In some cases, the only thing we can detect is a channel of communication, not the author behind it. We can then refer to these channels as "unreliable sources." This term emphasizes the lack of credibility or trustworthiness of the information they disseminate and highlights the importance of critically evaluating sources to discern truth from falsehood.

Governmental/Political Propaganda vs. Foreign Interference

Disinformation can also originate from governmental or political propaganda machines, which use public resources to disseminate biased or misleading information. This propaganda often serves to manipulate public opinion, suppress dissent, or advance specific political agendas. Additionally, foreign interference, such as the manipulation of social media by foreign actors, can exacerbate disinformation campaigns and undermine democratic processes. Foreign interference in the information space, often carried out as part of a broader hybrid operation, can be understood as coercive and deceptive efforts to disrupt the free formation and expression of individuals’ political will by a foreign state actor or their agents.

Trolls vs. Bots

In the digital realm, trolls and bots are two distinct types of actors commonly associated with disinformation campaigns. Trolls are individuals who purposefully provoke or inflame online conflicts by spreading controversial or divisive content. Bots, on the other hand, are automated social media accounts programmed to amplify certain messages or manipulate online discourse.

Fake Experts

Lastly, "fake experts" are individuals who pose as authorities in a particular field but lack the expertise or credentials to support their claims. These individuals may contribute to disinformation by spreading false or misleading information under the guise of expertise, further complicating efforts to discern credible sources.

Countering Efforts

In the battle against disinformation and misleading content, narratives play a pivotal role in shaping public perception and discourse. Two key approaches to combating false narratives are positive narratives and counter-narratives.

Positive (Alternative) Narratives

Positive alternative narratives are grounded in constructive and affirmative messaging. Instead of merely responding to false information, they offer proactive proposals and solutions. These narratives focus on what we are for rather than what we are against, presenting a vision of positivity and progress. An illustrative example is the Second Chances campaign in Florida, USA, which advocates for rehabilitation and second chances for individuals with criminal records. By promoting positive values and initiatives, positive alternative narratives offer a compelling alternative to divisive or negative messaging.

Counter-Narratives

Factual counter-narratives aim to debunk false information by providing evidence-based rebuttals and fact-checking. These narratives identify flaws in the logic and credibility of misleading content, presenting accurate information to counteract falsehoods. Counter-narratives are essential for correcting misinformation and preventing its spread. By highlighting inaccuracies and providing verifiable evidence, they empower individuals to make informed decisions and resist manipulation.

Both positive narratives and counter-narratives serve as valuable tools in the fight against disinformation. While positive narratives offer a vision of hope and progress, counter-narratives provide the necessary foundation of truth and accuracy. By employing a combination of these approaches, we can effectively challenge false narratives and promote a more informed and resilient society.

Disinformation Terminology

1. Deepfake vs Cheapfake

A "deepfake" is a video that has been altered through some form of machine learning to hybridize or generate human bodies and faces, whereas a "cheapfake" is an audiovisual manipulation created with cheaper, more accessible software, or none at all. Cheapfakes can be produced with Photoshop, lookalikes, re-contextualized footage, or by speeding up or slowing down video.

2. Information Bubble vs Echo Chamber

An information bubble (or filter bubble) arises when algorithmic personalization narrows the range of content a user sees, while an echo chamber is a social environment in which people mainly encounter opinions that reinforce their existing beliefs.

3. Logical Errors vs Cognitive Bias

Logical errors (fallacies) are flaws in reasoning, whereas cognitive biases are systematic preconceptions (for example, stereotypes) that shape how we approach information.

4. TTPs (Tactics, Techniques, and Procedures)

TTPs refer to the patterns of behavior used by threat actors to manipulate the information environment in order to deceive.

5. Demonetizing disinformation - cutting financial incentives for purveyors of disinformation.

6. ISAC - Information Sharing and Analysis Centres.

7. Hybrid threat - combines military and non-military as well as covert and overt means, including disinformation, cyberattacks, economic pressure, deployment of irregular armed groups and use of regular forces. Hybrid methods are used to blur the lines between war and peace and attempt to sow doubt in the minds of target populations. They aim to destabilize and undermine societies.

8. Information influence operations - refer to coordinated efforts by either domestic or foreign actors to influence a target audience using a range of deceptive means, including suppressing independent information sources in combination with disinformation.

Background illustration by: fotogestoeber

If you want to strengthen your skills in countering disinformation, take our free self-paced course!