Fabricating Support: Russian Bots Create Illusions of Ukrainian Approval for Occupation

On Russian social media, individuals present themselves as typical residents of occupied Ukrainian towns, lauding Moscow's "reconstruction" initiatives, praising the valor of Russian soldiers in combat, and criticizing the so-called "neo-Nazi" regime in Kyiv. Occasionally, they even engage in disputes with one another.

Take, for instance, the VKontakte profile of Roman Koshelev, which showcases a black-and-white selfie from the gym alongside the slogan “Sport is life.”

This 29-year-old consistently shares posts endorsing the Kremlin’s so-called “special military operation” and frequently condemns Ukraine. He subscribes to groups with titles like “Contract Service in the Russian Defense Ministry” and follows numerous regional news outlets across Russia.

However, Roman does not actually exist. His profile image was taken from a sports-nutrition channel on Telegram, and his entire account—as with countless others—belongs to a vast network of fictitious profiles aimed at generating pro-Kremlin sentiment on the internet.

Utilizing the Botnadzor service, which tracks bot activity linked to Russia, The Moscow Times discovered that fake accounts frequently masquerade as Ukrainians to promote favorable narratives on VKontakte pages operated by Kremlin-supported authorities in occupied areas. Their comments, interactions, and even discussions create a facade of public backing for Moscow in these occupied regions.

Russia's online influence operations long predate the full-scale invasion of Ukraine. In 2013, the Novaya Gazeta newspaper revealed a St. Petersburg office where paid "trolls" lauded Vladimir Putin and Moscow's Mayor Sergei Sobyanin. At the time, each employee was reportedly expected to leave around 100 comments daily.

Just a few years later, the infamous Internet Research Agency, associated with Yevgeny Prigozhin, the deceased founder of the Wagner mercenary group, became widely known after being accused of creating thousands of social media accounts to meddle in the 2016 U.S. presidential election.

Even after Prigozhin's death in a plane crash in August 2023, the bot army continued its activity. For instance, Roman persisted in commenting in Wagner-related communities and on pages managed by Russian-backed officials in occupied eastern Ukraine.

At the end of October, after significant power outages hit the Luhansk region, Roman attempted to present a positive perspective on the blackouts in his online comments: “Having to walk around the house with candles is kind of romantic. I believe the electricity will be restored soon.”

The day before, he had commended the cutting down of trees in Melitopol—a likely unpopular decision framed by the Moscow-affiliated authorities as part of an “urbanization” initiative.

On the VKontakte page of Melitopol’s administration, numerous bots have posted comments pretending to be local citizens. Thousands more have appeared on various other official websites in the occupied territories, sometimes expressing support for the ruling United Russia party and, at other times, celebrating Russian military actions as they advance to seize additional Ukrainian land.

Human rights organizations argue that the Kremlin has exerted nearly total control over the media environment in occupied Ukraine. Only journalists loyal to Moscow are permitted to operate there, while those who oppose this line face risks of arrest, torture, and even death, as exemplified by the case of Ukrainian journalist Viktoriia Roshchyna.

This lack of credible information makes bot activities more impactful, asserts Vincent Berthier, who heads the technology department at Reporters Without Borders. He emphasizes that the occupied regions have become “information black holes where only Kremlin propagandists are allowed to operate.”

“Online bots don’t generate narratives but take advantage of existing controversies to enhance visibility and legitimacy,” Berthier stated in an interview with The Moscow Times. “Their impact relies on the overall information landscape and how receptive the audience is.”

However, he noted that determining how much of the propaganda campaign is driven specifically by bots is challenging. “Disinformation seldom relies on a single mechanism. Bots function within coordinated ecosystems that also involve human-operated accounts and fake news sites.”

On September 30—marking the anniversary of Russia’s claimed annexation of four Ukrainian regions—bot activity surged. Numerous accounts shared congratulatory messages and proposed which Ukrainian areas Russia should “liberate” next.

“It would be great to include Kharkiv, but only if a referendum takes place. No one intends to impose anything, of course,” wrote an account named Vasilisa.

“I’m hoping for a referendum in Odesa,” another fake persona replied.

Bots also engage in confrontations with real users who are skeptical of the full-scale invasion.

After Russia fired a Zircon hypersonic missile near NATO borders this year, a fake user named Dmitry jumped into the comments section of a post to accuse a critic of overlooking what he termed Ukrainian assaults on the self-proclaimed Donetsk and Luhansk People’s Republics—mirroring Vladimir Putin’s rationale for the 2022 invasion, when he claimed Russia needed to safeguard Russians living there from “genocide.”

Shortly afterward, another bot joined the dialogue. Nadezhda, who describes herself online as a “homemaker and devoted wife,” maintained that the Zircon launch was “merely a military exercise, nothing else—certainly not a demonstration of Russia’s strength.”

Yet the discussion quickly fizzled out. One genuine participant in the thread used a browser add-on that detects bots on VKontakte and labels accounts as "bot/promoted." Once that label appeared next to their names, the fake profiles fell silent.