August 11, 2021
As we search for ways to combat online disinformation and misinformation in the election sphere — an important part of our mandate here at the USC Election Cybersecurity Initiative — we must pay attention to the tactics nefarious actors are using to promote bad information on other issues as well. It’s often the same actors, using similar tools and tactics, who are spreading disinformation on a range of topics. Studying these broader disinformation trends and threats can help us immensely in the fight to keep our democracy safe and secure.
With that in mind, there is increasing public focus on online disinformation related to the COVID-19 vaccines, as the United States grapples with stagnant vaccination numbers, a major outbreak of the Delta variant, and rampant false information proliferating on social media.
YouTube, for example, is confronting a very specific challenge in preventing a certain kind of COVID-19 misinformation. Part of its mandate, according to company representatives, is to give users transparency into how government works. In this case, that means allowing the posting of videos from local government meetings (city council and school board meetings, for example). But in some of these meetings, some members of the public espouse false information about COVID-19 and the vaccines in their comments.
The vexing moderation question for YouTube is whether it should remove these videos to prevent the dissemination of false information about COVID-19. On the one hand, the platform is hesitant to take down entire recordings of these meetings, which are in the public interest. On the other, YouTube has struggled to keep dangerous disinformation from proliferating on its site and is not keen to give these voices a platform. The company has, for example, at times removed videos from prominent political figures in the U.S. and abroad who were promoting unproven treatments.
The YouTube videos of local meetings that were initially removed have now been reinstated, but the challenge here is clear. And consider this alternative scenario: What if the local meeting featured false allegations of voter fraud, as we’ve seen happening in places across the country? Which public interest is more important: publishing the unvarnished views of citizens, even if they’re dangerous and false, or removing them?
This heated debate in the United States — exemplified by these contentious local government meetings — is exactly the political climate Russia is taking advantage of to further divide us with their online disinformation campaigns about COVID-19.
Russia, which we know has undertaken aggressive cyber activities against both our political system and private industry, has been a leading source of online misinformation about the COVID-19 vaccines. Russian groups have been posting offensive cartoons, for example, aimed at stoking fear of the vaccine and promoting the false idea that the Biden administration is forcing Americans to get the shot.
This Russian effort, like its other nefarious online activity, is designed to further inflame the already-tense culture wars here in the United States and drive wedges between Americans — one of Moscow’s key policy objectives.
Just this week, Facebook announced that it had removed several hundred Facebook and Instagram accounts (based mostly in Russia) that promoted disinformation about the COVID-19 vaccines. This campaign targeted audiences not just in the United States but also in India and Latin America. Given the scope of the problem with vaccine disinformation, this step by Facebook is a necessary one but far, far from sufficient — the problem is orders of magnitude larger than just these accounts.
And the Biden administration, which has been pushing Facebook for months to crack down on vaccine disinformation, is reportedly increasingly frustrated with what it sees as an irresponsible and dangerous lack of action from the social media giant. Senior White House officials have complained that Facebook is withholding key information, deflecting blame, and refusing to take aggressive action against repeat violators of its policies — the same criticisms the company has faced over its handling of election-related disinformation.
In reportedly tense meetings and conversations, the administration has grown more and more frustrated with Facebook’s responses to its questions, especially in the face of rising caseloads and deaths. One senior White House health official, Andy Slavitt, warned a Facebook executive in March that, “In eight weeks’ time, Facebook will be the No. 1 story of the pandemic.”
As we work to combat COVID-19 disinformation, we can also read the calendar and know that the next elections here in the United States are not that far away.
President Biden recently visited the headquarters of the Director of National Intelligence, who leads the U.S. Intelligence Community, and said he was briefed just that morning on efforts by Moscow to spread misinformation ahead of the 2022 midterm elections. Biden said that day’s President’s Daily Brief — the most sensitive and important analytic intelligence product — discussed these Russian actions, which he called “a pure violation of our sovereignty.”
Biden attributed Russian President Vladimir Putin’s actions to the fact that his country’s economy is in “real trouble, which makes him even more dangerous.” Because Russia knows it cannot compete with the United States in many conventional economic or military ways, this theory holds, it has increasingly turned to asymmetrical warfare to pursue its agenda. As we know, these cyberattacks cost little, relatively speaking, and often have an outsized impact. David Sanger, who covers intelligence for The New York Times, calls cyberattacks “the perfect weapon.”
In addition to disrupting key infrastructure, these online campaigns exacerbate our political and cultural divides and make it harder for our democracy to function. Here at the USC Election Cybersecurity Initiative, our motto is “Our candidate is democracy,” and that is why it’s so important to pay attention to all of these cyberattacks.
Marie Harf
International Elections Analyst, USC Election Cybersecurity Initiative
Marie Harf is a strategist who has focused her career on promoting American foreign policy to domestic audiences. She has held senior positions at the State Department and the Central Intelligence Agency, worked on political campaigns for President Barack Obama and Congressman Seth Moulton, and served as a cable news commentator. Marie has also been an Instructor at the University of Pennsylvania and a Fellow at Georgetown University’s Institute of Politics and Public Service.