Disinformation is now a worldwide phenomenon, driven by the mass scale at which the Internet allows information to be shared and disseminated quickly and effectively. This phenomenon, largely amplified by information technology, has permeated social and political environments around the world.
The Pew Research Center and Elon University's Imagining the Internet Center conducted a set of surveys of different experts among technologists, academics, practitioners, strategic thinkers and others, asking them to react to the following proposition:
“The rise of ‘fake news’ and the proliferation of doctored narratives that are spread by humans and bots online are challenging publishers and platforms. Those trying to stop the spread of false information are working to design technical and human systems that can weed it out and minimize the ways in which bots and other schemes spread lies and misinformation.”
They then asked: "In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially destabilizing ideas?"
Some experts argue that the outlook regarding misinformation may improve over time, while others foresee a more daunting future for this phenomenon.
51% of these experts believe the situation will not improve, and most of them offer one of two reasons. One group holds that the problem lies in human nature; the other believes things will not improve because technology will create new large-scale challenges that cannot be countered.
Tom Rosenstiel, author, director of the American Press Institute and senior fellow at the Brookings Institution illustrates the first of the reasons by arguing that "whatever changes platform companies make, and whatever innovations fact-checkers and other journalists implement, those who want deception will adapt to them. Disinformation is not like a plumbing problem that you fix."
Scott Spangler, senior data scientist at IBM Watson Health, is among the experts who cite the second reason, arguing that existing technologies already make false information nearly impossible to discern and flag, filter or block. He wrote, "Machine learning and sophisticated statistical techniques will be used to accurately simulate real information and make false information almost indistinguishable from the real thing."
On the other hand, 49% of the experts believe the global misinformation scenario will improve. These experts generally advance two arguments. The first is that technology can help solve these problems. These more hopeful experts said that the increasing speed, reach and efficiency of the Internet, apps and platforms can be leveraged to counter fake news and disinformation campaigns. Some predicted that better methods of creating and promoting reliable, fact-based news sources will emerge.
Larry Diamond, a senior fellow at the Hoover Institution and the Freeman Spogli Institute (FSI) at Stanford University, subscribes to this position: "I am hopeful that the major digital information platforms will take creative initiatives to privilege the most authoritative and credible sources and to denounce and degrade information sources that appear to be engines of propaganda and manipulation, whether human or robot. In fact, companies are already starting to take steps in this direction."
The second argument is that another facet of human nature is to come together and solve problems. The hopeful experts in this survey believe that people have always adapted to change and that this wave of challenges will also be overcome. They argue that misinformation and bad actors have always existed but have eventually been sidelined by smart people and processes, and they expect well-intentioned actors to work together to find ways to improve the information environment. They also believe that better information literacy among citizens will enable them to judge the veracity of content and ultimately raise the tone of discourse.
Irene Wu, associate professor of communication, culture and technology at Georgetown University, subscribes to the latter view, stating that "information will improve because people will learn to better manage the masses of digital information. Right now, many people naively believe what they read on social networks. When television became popular, people also believed that everything that appeared on it was true. What is important is how people choose to react and access information and news, not the mechanisms that distribute it."
In any case, misinformation seems to be a challenge that will continue to spread into the content distributed daily through the media, social networks and digital channels in general. This challenge can lead us to remain optimistic about new solutions to counteract it, or it can lead us to face the reality more starkly, accepting that it is a problem that will always exist and may grow.
Either position will always demand a civic commitment from individuals: to be more critical of the information we consume, produce or share, so as not to fall into the nets of manipulation, misinformation or misunderstanding.
Register for the free, self-paced course on Countering Disinformation today and help us create a safer, disinformation-free future: