
Elections 2020

September 22, 2020

Five big questions as America votes: Disinformation

By DFRLab

As part of the Atlantic Council’s Elections 2020 programming, the New Atlanticist will feature a series of pieces looking at the major questions facing the United States around the world as Americans head to the polls.

The 2020 US elections are on the near horizon, and there is a lot of discussion of mis- and disinformation going into them. Much ink has been spilled about disinformation and foreign interference in the 2016 elections, and much of it was merited, as there is copious evidence that Russia, in particular, used fake social media accounts—among other things—to influence the outcome. While foreign interference (as discussed in three of the questions below) remains a concern in 2020, the story this year must also focus on how much of the mis- and disinformation is domestic in origin. With the rise of particularly virulent and hostile conspiracy theories such as QAnon, alongside an increased willingness of politicians to purposefully mislead or lie as a means of achieving their goals, it is safe to say that the 2020 election cycle will expose US voters to more in-house information pollution than any previous cycle. In the end, whether the mis- or disinformation is foreign or domestic in origin, an information environment rife with confusing, polarizing, and often false narratives can only serve to further divide an already tense nation.

Below are the five major questions facing the United States on disinformation as the US elections approach, answered by top Digital Forensic Research Lab experts:

How should we label this problem: disinformation, misinformation, or "fake news"?

Clarity of language is very important when discussing social media manipulation and foreign influence operations. In general, the biggest challenge is disinformation, which the DFRLab defines as "false or misleading information, spread with the intention to deceive." Disinformation is distinct from misinformation, in which false or misleading information is spread without the intent to deceive.

For instance, you are spreading disinformation if you create a network of fake social media personas in order to mislead Americans about the procedure for casting their absentee ballots. If you share a friend's Facebook status about absentee ballot procedures without double-checking the facts for yourself, you are spreading misinformation (but try not to do that!).

Both phenomena are distinct from "fake news," an unfortunate term coined in late 2016, when journalists were first trying to describe the sudden flood of viral falsehoods. However, "fake news" was quickly co-opted by then-President-elect Donald Trump as a way to attack and delegitimize the news media as a whole. The term has since been used to justify a wave of censorship laws passed by authoritarian governments around the world.

In early 2018, the DFRLab vowed to stop using the term “fake news.” By the end of the year, the DFRLab had helped convince the British government to do the same. You’d be well served to follow the same example.

Emerson Brooking, resident fellow, Atlantic Council's DFRLab

How are the Russians targeting the 2020 election and how have their tactics evolved since 2016?

Since 2016, Russian influence operations have become less a thing of spy movies and more a kitchen table issue in the United States.

Efforts during that year's presidential election by Russian intelligence and the network of oligarchs surrounding President Vladimir Putin to disrupt America's domestic politics met with catastrophic success. Their most effective gambit was Russian intelligence's cyber espionage: the hacking of campaign officials and the Democratic National Committee. Sensitive materials were subsequently leaked through DCLeaks.com and WikiLeaks in a manner intended to maximize media coverage. The leaks were further amplified by the infamous St. Petersburg-based Internet Research Agency, which used botnets and sockpuppet accounts to keep the hacked materials in the national conversation.

The Intelligence Community Assessment, produced by the Office of the Director of National Intelligence on January 7, 2017, and written by the CIA, FBI, and NSA, warned outright: "We assess Moscow will apply lessons learned from its Putin-ordered campaign aimed at the US presidential election to future influence efforts." By mid-2017, the issue of foreign interference was receiving serious attention from government, media, and technology companies alike. The Senate Select Committee on Intelligence launched its own independent investigation. Former FBI Director Robert Mueller assumed the role of special counsel, leading the Department of Justice's investigation of the Russian operations that had targeted the 2016 election. Social media companies made their first data disclosures about foreign influence activities on their platforms, which the DFRLab independently analyzed.

Ahead of the 2018 US midterm elections, 60 percent of Americans (split mostly along party lines) expressed a belief that Russia had interfered in the 2016 elections, showing just how widely the issue had penetrated mainstream political discourse. In response to this scrutiny, the Russians were forced to adopt subtler tactics, relying less on bots and original content creation and more on targeting narrow US demographics and attempting to translate online engagement into real-world protests. As Director of National Intelligence Daniel Coats concluded after the 2018 election, foreign interference by Russia and other actors was limited to "influence activities and messaging campaigns."

Russia's continuous efforts to exploit our internal divisions achieved varying degrees of impact, but they also hung a cloud over democratic discourse in the United States, creating a specter of influence that is, in and of itself, a success for foreign actors seeking to sow discord. Ahead of 2020, we have to ready ourselves for all of the above and, equally important, remain skeptical of spurious or unevidenced claims of foreign influence.

Graham Brookie, director and managing editor, DFRLab

Is China targeting the 2020 election? If so, what should be done about it?

As of now, there is no direct evidence that China is targeting the 2020 election. However, repurposed YouTube accounts traced back to China and paid advertisements carrying anti-Trump narratives in US newspapers show clear signs of intent and production capacity. On the other hand, the People's Liberation Army's novice-level foreign information operations, especially when compared with Russia's, and Beijing's refocusing of resources on promoting its system of governance as meritocratic through "COVID diplomacy" suggest a lack of Chinese interest in US election interference.

A researcher at the Foreign Policy Research Institute found Chinese disinformation campaigns targeting multiple US presidential candidates. Repurposed YouTube channels were found to twist US Secretary of State Mike Pompeo's words, attack former US President Barack Obama for his inability to solve racial discrimination while in office, and promote overall negative sentiment toward Trump. The deteriorating state of US-China relations and this negative disposition toward Trump clearly indicate that the Chinese government does not favor Trump's reelection. However, presidential candidate Joe Biden's history of supporting American labor unions also makes him an unattractive pick for Beijing. Although Chinese hackers were found to be targeting Joe Biden's campaign, the aim appeared to be less to influence the election than to understand how a Biden administration would approach China.

China's method of inserting paid ads favoring a specific candidate into foreign newspapers is not a new phenomenon. Ads called "China Watch" have run since 2016 in The Washington Post, The Wall Street Journal, and many other US-based newspapers; the recent anti-Trump paid ads in the Des Moines Register and other papers follow the same playbook.

Most likely, China will use Twitter accounts linked to the Chinese Communist Party to push negative content created by the state-sponsored Xinhua media outlet. China will likely focus its limited Western social media operations on dividing the transatlantic approach toward China: pushing "COVID diplomacy" to Belt and Road Initiative countries, spinning a positive narrative around the coronavirus, and devoting resources to promoting a pro-Party image around the world. China also understands its disadvantage relative to Russia's sophisticated influence campaigns.

Solutions for countering Chinese interference in US elections include, first, working with social media companies, especially YouTube, to identify and take down repurposed accounts that use fake profiles and images; and, second, blocking paid advertisements with political narratives from foreign nations from appearing in US-based newspapers.

Alicia Fawcett, visiting fellow for East Asia, DFRLab

Is Iran targeting the 2020 election? How should the US respond?

Tensions between Iran and the United States have intensified since President Trump unilaterally withdrew the United States from the Joint Comprehensive Plan of Action (JCPOA), also known as the Iran nuclear deal, in 2018. To Iran, another term of the Trump presidency would mean a continuation of US pressure in an effort to foment regime change. The upcoming elections are thus an opportunity for Iran to undermine US democratic institutions and President Trump, and to drive wedges into pre-existing divisions in American society.

Iran is likely to pursue these objectives by expanding its cyber-enabled activities, including espionage campaigns (e.g., hacking and phishing attacks) and online influence operations run through networks of fake websites and social media accounts. These outlets seek to spread disinformation and influence public opinion in favor of Iran's geopolitical interests. They also circulate anti-US content on issues related to race, religion, police brutality, and the US response to the COVID-19 pandemic. Moreover, Iran has tested techniques like impersonating political figures, including US senators and candidates for the House of Representatives, to sow confusion and doubt. In short, Iran has a high stake in the outcome of the 2020 presidential election and, in preparation, is likely to diversify its information operations.

As social media platforms continue to detect and remove coordinated inauthentic behavior, US intelligence agencies need to work closely with these companies to pinpoint Iran's digital influence operations. As the DFRLab has noted previously, the Office of the Director of National Intelligence should regularly alert US officials and the public to these expansive threats. Moreover, the US government needs to centralize its interagency efforts to study and counter foreign influence. Lastly, the US government and social media companies should continue to invest in resources that boost the digital literacy of American voters. They should also support organizations that track and identify Iran's digital influence networks without editorializing their findings. Only a truly multi-faceted approach can tackle the multi-faceted threat that foreign influence poses to our democracy.

Simin Kargar, nonresident fellow, DFRLab

What should be done about the spread of political dis- and misinformation within the United States?

Some foreign actors repurpose American-sown disinformation to interfere in the United States. This means addressing domestic disinformation is not only a national issue—it is also a foreign policy problem.

We've seen the offline impacts of online dis- and misinformation all too clearly during the COVID-19 pandemic, where online rumors about the efficacy of masks, social distancing, and COVID cures and causes endanger lives. Domestically driven conspiracies like QAnon—an aggregation of far-right theories groundlessly proffering that President Trump is fighting a deep-state cabal—have garnered mainstream political support despite being false. Domestic dis- and misinformation is often particularly pernicious because it spins lies around kernels of truth. For example, conspiracies about a political elite controlling the world capitalize on very real rising wealth inequality in the United States, where the top 1 percent hold more wealth than the bottom 90 percent of American households combined.

The United States' pre-existing social problems, like public distrust of institutions and lack of faith in governance, help domestic disinformation take root and grow; the pernicious spread of false information can then make these problems even worse. That means we need a holistic, whole-of-society approach to fight disinformation effectively. When we address seemingly unrelated issues like our healthcare system, inequality, and disenfranchisement, we help build public faith in our government, media, and institutions—and increasing that trust helps insulate the public from falling for baseless conspiracies. Long term, we must also push for enhanced digital literacy, advocate for algorithmic transparency, and defend user privacy as we imagine what sorts of digital commons the public deserves.

We can start in the short term by reckoning with those who push and spread false information to the public. Elected leaders who repeatedly push groundless conspiracies should be condemned and politically ostracized, not supported. Platforms must be held accountable for amplifying false information with serious offline implications, like lies about how to vote.

Fixing our fractured information environment won't be easy, but our democracy functions best on a set of shared facts, so it is the only way forward.

Alyssa Kann, research assistant, DFRLab

The Atlantic Council’s Digital Forensic Research Lab (DFRLab) has operationalized the study of disinformation by exposing falsehoods and fake news, documenting human rights abuses, and building digital resilience worldwide.


Image: Stock photo of Facebook, Messenger, Instagram, and WhatsApp social media app icons on a smartphone. (via REUTERS)