Disinformation's Political Weaponization in the U.S. Election Context
In the U.S., recent disinformation tactics have grown increasingly complex, evolving well beyond the Internet Research Agency troll factories of 2016 and the fake advocacy groups of 2020. The approach is sophisticated: foreign influence actors, primarily from Russia, employ AI-generated content, impersonate official sources, and capitalize on existing societal divisions to sow chaos and instigate violence.
One prominent example involves videos falsely depicting illegal voting, such as the widely circulated footage allegedly showing Haitians voting multiple times in Georgia. Such content, which U.S. intelligence agencies traced back to Russian sources, exploits the social unrest and racial sensitivities already present in American society. Alongside this, fake narratives about mail-in ballot destruction and the impersonation of FBI and Justice Department officials add further layers of confusion and distrust. All of it serves one overarching aim: to push a substantial segment of the population to question electoral integrity and to sow the seeds of potential post-election unrest.
Disinformation in the U.S. isn’t merely about distorting facts; it strategically fuels pre-existing narratives that challenge the very foundation of trust in democratic processes. This strategy echoes the playbook Russia and other foreign actors have used in Europe, where disinformation exploits local vulnerabilities to destabilize societies.
The European Parallel: Moldova and Georgia as Case Studies
Most post-communist countries in Europe have long been on a trajectory to join the EU and NATO. The prospect of EU membership and the security guarantees offered by NATO act as an important driver of the necessary societal and economic transformation. Yet these efforts are coming under increasing pressure from Russia, which sees the post-communist and post-Soviet space as its “near abroad” and a zone of influence. In its effort to slow down or even reverse the transition of post-Soviet countries into the Western orbit, Russia increasingly uses disinformation as a key part of a larger hybrid influence toolkit.
Moldova and Georgia, two nations in complex geopolitical situations, have for years faced disinformation campaigns similar to those seen in the U.S. In Moldova, for instance, recent disinformation efforts in the run-up to the constitutional referendum on Moldova's EU future aimed to weaken support for the EU and NATO by promoting narratives that portray Western alliances as threats to Moldovan sovereignty. Similarly, in Georgia, disinformation has been used to incite public unrest, amplify tensions, and portray pro-Western sentiments as a betrayal of national identity.
These cases demonstrate how disinformation can be tailored to exploit specific societal fractures. In Moldova and Georgia, the content focuses on cultural, political, and historical divides that are deeply embedded in public consciousness. By stoking fears and amplifying identity conflicts, these campaigns not only weaken democratic resilience but also make it increasingly challenging for pro-democratic forces to maintain public trust. As our Disinformation and Civil Society Regional Mapping Reports vividly illustrate, civil society organizations (CSOs) from the region are operating in an increasingly hostile environment as a direct consequence of disinformation campaigns.
What European Experiences Reveal About the Possible Impact on American Society
Reflecting on the European experience, we can observe that disinformation thrives where societal cleavages exist. Disinformation campaigns succeed by amplifying divisions that were already present but not, on their own, destabilizing. Disinformation amplifies fears, drives wedges, and fosters an "us vs. them" mentality that makes societies more susceptible to manipulation and less resistant to external interference.
The parallels to the current American context are stark and frightening. The U.S. is experiencing a profound identity struggle, with political, racial, and ideological divides deepening in recent years. One notable example is the divergence in views on racial issues: 70% of Republicans believe that in America people are judged by character rather than skin color, compared with only 28% of Democrats.
Disinformation campaigns targeting the 2024 U.S. election capitalize on these divisions by crafting narratives that resonate with these “tribal” identities. It’s not merely the spread of false information that matters but rather the way disinformation embeds itself in the wider societal discourse, creating “tribes” or virtual communities that resist fact-checking and are immune to logic and facts.
European Approaches to Navigating and Countering Disinformation
European countries have undertaken various countermeasures to curb the spread of disinformation, ranging from media literacy initiatives to regulatory actions on digital platforms. For example, the European Union’s Code of Practice on Disinformation has encouraged platforms to monitor and reduce the reach of harmful content, including disinformation. However, these efforts can only partially address the problem. Disinformation actors are resilient and adapt to regulatory environments. Moreover, some platforms most prone to spreading disinformation (such as Telegram) are often absent from such voluntary regulation.
One critical insight from the European experience is that fact-checking and media literacy, while necessary, are insufficient on their own. Addressing disinformation also requires an understanding of the social and psychological drivers that make people susceptible to manipulation in the first place. In many cases, disinformation is not just information but a vehicle for expressing group identity, shared beliefs, and even protest against perceived injustices. Simply debunking a falsehood will not deter those who see the narrative as an integral part of their worldview. The same applies in the U.S., where disinformation campaigns often serve as expressions of ideological identities rather than factual discourse, limiting the effectiveness of fact-checking and debunking.
Moving Forward: Strategic Responses for Democracies
The experiences from Europe reveal that successful strategies against disinformation must be multifaceted. Regulatory efforts such as the EU Digital Services Act and fact-checking initiatives need to be coupled with support for civil society initiatives and deeper societal interventions that address the underlying conditions making disinformation effective.
As the U.S. faces another wave of election-related disinformation with potentially serious consequences, it is evident that valuable lessons can be drawn from other countries' experiences. Disinformation is no longer a mere tool for spreading falsehoods. It has become an instrument for sowing discord and weakening the very foundations of democratic societies. Recent examples from Europe remind us that countering disinformation is a long-term struggle that requires more than just technological fixes. It demands a commitment to strengthening the social fabric, bolstering democratic norms, and empowering citizens with the resilience to see through manipulative narratives.
Civil society is vital in this effort, as these organizations frequently bridge the gap between citizens and authorities or online platforms, ensuring transparency and accountability. Thanks to their flexibility, expertise, and innovative approaches, civil society organizations (CSOs) often serve as the first line of defense against malicious actors spreading disinformation for harmful purposes. By looking beyond fact-checking and embracing these broader strategies, democracies on both sides of the Atlantic can safeguard their integrity and public trust from hostile actors, foreign or domestic, who seek to profit from sowing discord and division.
Author: Daniel Milo, Countering Disinformation Subject Matter Expert, TechSoup
Background illustration: yavdat