May 11, 2021
As countries around the world rush to anticipate new online threats to their elections and monitor how bad actors’ tactics are evolving, Facebook made an interesting announcement last week in one of its regular “Coordinated Inauthentic Behavior Reports.”
The social media company said it had recently taken down a network of hundreds of fake accounts and pages it had determined were waging a long-running deceptive political influence operation inside Ukraine. Facebook's investigation of these accounts and pages found links to two Ukrainian individuals already sanctioned by the U.S. Treasury Department for election interference in the United States: Andrii Derkach and Petro Zhuravel.
Derkach, a Ukrainian Member of Parliament who is assessed to have been an active Russian agent for over a decade, was sanctioned by the U.S. for conducting “a covert influence campaign centered on cultivating false and unsubstantiated narratives concerning U.S. officials” in the 2020 presidential election. Zhuravel is Derkach’s media manager and website administrator for NabuLeaks, the centerpiece of Derkach’s election influence platform that pushes false narratives, according to the U.S. Treasury Department. Both have been key players in the Russian-directed influence operations designed to hurt U.S. candidates seen as less friendly to Moscow’s point of view, such as now-President Joe Biden.
Facebook began investigating this new activity after receiving a tip from the FBI, which the company says was about Derkach's U.S.-focused activity. But when its investigators dug deeper, they instead discovered a constellation of websites, fake accounts, and pages focused exclusively on Ukraine. It appeared that these internet hooligans had turned their focus back to their own country's politics.
Here’s how these influence operations worked in Ukraine, according to Facebook:
“The network used its fake accounts to boost its own content and comments about it. For example, the first fake account would post a favorable article about a politician. Others would then make supportive comments, more fakes would like the post, and still more would like the comments. With this behavior, the network acted as inauthentic cheerleaders for the politicians they attempted to promote. We found a distinctive pattern, where each team ran its own cluster of fake accounts and pages, but the clusters were connected through technical signals and on-platform links.”
What’s most interesting about this activity, however, is its ideological bent: the Ukraine-focused content was consistently anti-Russia, including in posts about Moscow’s early 2021 military buildup near Ukraine’s border, a move heavily criticized by the U.S. and other global partners. While the content promoted a wide range of Ukrainian political actors without favoring any particular faction, it was fairly consistent in its opposition to a host of Russian activities and was generally supportive of NATO and of Ukrainian membership in the organization, a policy position anathema to the Kremlin.
It’s too early to know why a long-time Russian agent like Derkach would undertake an anti-Russia influence operation in his home country, itself ground zero for Moscow’s expansionist activity abroad. One goal could be stoking general political chaos and internal divisions. Ben Nimmo, Facebook’s global influence operations threat intelligence lead, put forward a theory on a call with reporters: “You can really think of these operators as would-be influence mercenaries, renting out inauthentic online support in Ukrainian political circles.”
In other words, some cyber warriors may be willing to rent out their services to the highest bidder, regardless of ideology. As Republican Senator Marco Rubio warned in 2016, when other members of his party were using information Russia had stolen from Hillary Clinton’s campaign and released through Wikileaks: “Today it is the Democrats. Tomorrow, it could be us.”
This recent report highlights two noteworthy trends regarding these kinds of political influence operations in Ukraine.
First, the bad news: According to Facebook, Ukraine has been among the top sources of the coordinated inauthentic behavior the company has discovered on and removed from its platform over the past few years. It’s a hotbed of bad behavior online. The company writes:
“From a global trend analysis, this signals the burgeoning industry of what we call IO-for-hire that offers media and social media services involving inauthentic assets and deceptive amplification across the internet.”
It’s also worth noting, depressingly, that in this same report Facebook said it had also taken action against actors in the Central African Republic, Mexico, and Peru, among other places, demonstrating how widely these tactics have spread around the globe.
Now, here’s the good news: A number of countries (including Ukraine) have growing civil societies and independent media organizations that are laser-focused on finding and exposing these deceptive online political campaigns and disinformation. Shining a light on this nefarious activity so the public better understands what it is seeing online is a necessary, but not sufficient, step in countering this growing and pervasive threat. And needless to say, we all have a role to play in fighting this battle.


Marie Harf
International Elections Analyst, USC Election Cybersecurity Initiative

Marie Harf is a strategist who has focused her career on promoting American foreign policy to domestic audiences. She has held senior positions at the State Department and the Central Intelligence Agency, worked on political campaigns for President Barack Obama and Congressman Seth Moulton, and served as a cable news commentator. Marie has also been an Instructor at the University of Pennsylvania and a Fellow at Georgetown University’s Institute of Politics and Public Service.