
Disinformation Without Borders: How Belarusian Silovik Content Spreads Across X

The Belarusian Silovik Telegram channel, controlled by Belarus’s GUBOPiK agency, amplifies pro-Kremlin narratives, particularly those related to the war in Ukraine. In addition to the Sprinter network, previously identified as the channel’s primary amplifier on X, our new research reveals a broader network of accounts echoing its disinformation in Japan, Poland, India, and the United States.

Belarusian Silovik, a Telegram news channel, initially posed as a platform for Belarusian opposition voices, but it was in fact operated by GUBOPiK, the Belarusian Main Directorate for Combating Organized Crime and Corruption, an agency known for surveillance and political repression. After establishing credibility under false pretenses, the channel began promoting pro-Kremlin narratives, particularly concerning the war in Ukraine.


In a previous investigation into the Sprinter network on X (formerly Twitter), we identified the network's connection to Belarusian Silovik through aligned and coordinated content. This follow-up expands on those findings, focusing on how the channel's content is being systematically amplified through different accounts on the platform. Our research, conducted in cooperation with Trollrensics, a software platform for tracking disinformation and hybrid warfare campaigns on social media, identified five inauthentic accounts responsible for nearly all posts referencing Belarusian Silovik on X between May 2024 and July 2025.


These accounts exhibit patterns associated with coordinated information operations, including identity impersonation, AI-generated content, and narrative localisation. Some accounts adopt national identities, targeting users in Japan, India, Poland, and the United States, while others mimic OSINT organisations to borrow credibility. The findings suggest a structured effort to extend the reach of Belarusian state-linked disinformation beyond Telegram and across multiple online environments.


ABCDE Framework


A-ACTOR


Belarusian Silovik is a Telegram channel that we examined in the previous part of the investigation targeting the Sprinter network. @belarusian_silovik is a pro-Russian, Belarusian state-controlled outlet that started as a fictitious pro-opposition platform.


Investigations by Nashaniva and reform.news link the channel to Artur Gaiko, head of the Belarusian Main Directorate for Combating Organized Crime and Corruption (GUBOPiK), an agency known for political repression and the infiltration of opposition communities. Although it initially presented itself as a resistance tool, the channel's real aim was surveillance and entrapment. It has been sharing content identical to that of the Sprinter network, which prompted a deeper look into the Telegram channel's activities.


In cooperation with Trollrensics, we identified the most prolific accounts amplifying Belarusian_silovik's content on X (formerly Twitter). Trollrensics software scraped all posts on X linked to the Telegram channel over an approximately fourteen-month period (20 May 2024 to 11 July 2025) and found 2,398 posts, revealing the most prominent distributors of Belarusian_silovik's content.
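As a rough illustration of this kind of collection step, the sketch below filters a set of scraped posts down to those that fall inside the study window and mention the channel, then tallies them per author. The record structure, field names, and example values are hypothetical; this is not Trollrensics' actual pipeline.

```python
from datetime import datetime, timezone

# Hypothetical scraped post records; the schema and values are illustrative only,
# not Trollrensics' actual data model.
posts = [
    {"author": "@example_amplifier",
     "created_at": "2024-06-01T10:15:00+00:00",
     "text": "Forwarded from t.me/belarusian_silovik ..."},
    # ... remaining scraped posts ...
]

WINDOW_START = datetime(2024, 5, 20, tzinfo=timezone.utc)
WINDOW_END = datetime(2025, 7, 11, tzinfo=timezone.utc)
CHANNEL_MARKER = "belarusian_silovik"

def references_channel(post: dict) -> bool:
    """True if the post falls inside the study window and mentions the channel."""
    created = datetime.fromisoformat(post["created_at"])
    return WINDOW_START <= created <= WINDOW_END and CHANNEL_MARKER in post["text"].lower()

# Tally matching posts per author to surface the most prolific amplifiers.
per_author: dict[str, int] = {}
for post in posts:
    if references_channel(post):
        per_author[post["author"]] = per_author.get(post["author"], 0) + 1

for author, count in sorted(per_author.items(), key=lambda kv: kv[1], reverse=True):
    print(author, count)
```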


Five accounts were responsible for most of the interactions with the Telegram channel.


One such account was @1Gg7Dlct8tfwNJL, or Вольвач Юрий (TL: Volvach Jurij). The account was created in October 2020 and is responsible for almost half of all posts that shared Belarusian_silovik's content: 1,104 posts in total.



The account appeared suspicious because of its posting patterns and nonsensical handle. Its content mostly amplified Sprinter-affiliated accounts and others like them.


The only notable online mention of the account name was an obituary for a man named Jurij Volvach (Вольвач Юрий), who was murdered in Riga in January 2000 in connection with his high standing in a Russian organised crime group, ОПГ бригады Харитонова (TL: the Haritonov brigade OCG). It was one of the more prominent crime-related murders in Latvia at the time, when the country was experiencing a rise in organised crime connected to Russia.


Searching for the origin of the profile picture did not help either, as a reverse image search revealed that the image had been stolen from its original owner multiple times, further suggesting the account's inauthenticity. However, one of the results led to an independent Russian opposition initiative (the identity of its owners is redacted for safety purposes) that aims to record all found instances of Kremlin-operated X bots and maintains a database of over 6,000 bot accounts utilised by the Kremlin. The list contains Twitter accounts used to "promote Russian aggression, including propagandists and 'forcers' (those who inflate the number of 'readers' on Kremlin bot accounts), and excluding official media outlets and their journalists". According to the creators of the list, the algorithm used to detect and log these accounts is configured only to identify bot farms, with an error margin of 1%. Sure enough, Jurij Volvach was listed in the registry as one of the Kremlin-linked bot accounts.


Most of the other prominent accounts identified in Trollrensics' data raise questions about their authenticity. They operate with the same modus operandi observable in other Kremlin-operated bots: either farming engagement for existing pro-Russian accounts on X or localising the same content. These localising efforts were present in all five amplifiers identified by Trollrensics, with targets set on Poland, India, Japan, and the U.S.


Aside from the account targeting the U.S., the other accounts present themselves with a strong pro-Russia slant, with profile pictures, banners, or bios carrying heavy pro-Russian messaging, while holding onto their "localised" names.



However, inconsistencies in their local image exist, especially with the @shoft47ym account. The account has existed for quite some time, having been created in 2013, and the vast majority of its posts are written in Japanese. Its profile picture, however, includes the logo of the Wagner Group, a Russian mercenary organisation that has carried out military missions worldwide on behalf of the Russian state, including a heavy presence at the beginning of the war in Ukraine. Even though private military companies are technically illegal under Russian law, Wagner has long been believed to maintain strong connections with Russian military and intelligence agencies; these suspicions were confirmed in June 2023, when President Putin acknowledged that the Russian government had fully funded the group's operations. The account additionally features a Ribbon of Saint George, a widely recognised Russian military and patriotic symbol used in Russia and some post-Soviet countries.

The account differs from the others mentioned above in that it often provides polarising original commentary on political issues. It frequently reposts and comments on content related to the Russia-Ukraine war, and its pinned post is a pro-Kremlin thread featuring graphic war footage, including videos from Russian kamikaze drones and corpses of Ukrainian soldiers, accompanied by positive endorsements from the author. Whilst the vast majority of the account's posts are in Japanese, occasional posts in Russian have also been found.


Another interesting case is @WSIntelMonitor, an account first flagged by Trollrensics in 2023. It appears to impersonate a nonexistent American OSINT organisation, operating under the name WSIntelMonitor at the time of writing. Its modus operandi is eerily similar to that of the Sprinter network: the account publishes news by scraping articles and amplifying them with AI-generated posts. It has changed its name numerous times since its creation in 2023.



Once Trollrensics software encounters an account, it tracks changes to that account, such as its bio, follower count, number of posts, handle, profile picture, and banner. A screenshot from the Trollrensics dashboard shows all the changes made to the account since it was first encountered.
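Conceptually, this kind of change tracking can be reduced to diffing successive snapshots of an account's profile fields. The sketch below is our own minimal illustration of that idea, not Trollrensics' implementation; the snapshot fields mirror those listed above, and the example values are invented.

```python
from dataclasses import dataclass, fields

# Illustrative snapshot of the profile fields listed above; not Trollrensics' actual data model.
@dataclass(frozen=True)
class ProfileSnapshot:
    handle: str
    bio: str
    followers: int
    post_count: int
    profile_picture_url: str
    banner_url: str

def diff_snapshots(old: ProfileSnapshot, new: ProfileSnapshot) -> dict[str, tuple]:
    """Return every profile field that changed between two observations of one account."""
    changes = {}
    for field in fields(ProfileSnapshot):
        before, after = getattr(old, field.name), getattr(new, field.name)
        if before != after:
            changes[field.name] = (before, after)
    return changes

# Usage sketch with invented values, imitating the kind of rebrand described for @WSIntelMonitor.
earlier = ProfileSnapshot("WSIntelMonitor", "OSINT Agency monitoring warfare worldwide",
                          4200, 18000, "pic_v1.jpg", "banner_v1.jpg")
later = ProfileSnapshot("WSIntelMonitor", "Worldwide conflict monitoring",
                        5100, 26500, "pic_v2.jpg", "banner_v2.jpg")
print(diff_snapshots(earlier, later))
```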


The account has long claimed to be an OSINT organisation, with one of its previous bios stating: "OSINT Agency monitoring warfare worldwide. Some posts made by using Artificial Intelligence ( NLP Bot ) - Support us – https://t[.]co/DIjBetQZzb". No records of any OSINT organisation with such a name exist, suggesting a persona fabricated to borrow legitimacy.


The t.co link leads to a Ko-fi page, mirroring the crowdfunding approach of the Sprinter network, whose accounts have likewise urged followers to donate via their Ko-fi or Buy Me a Coffee pages.


B-BEHAVIOR


The five accounts found in the course of this investigation, @WSIntelMonitor, @1Gg7Dlct8tfwNJL, @shoft47ym, @Rajendr67215893, and @MadzGrodno, have been identified as amplifiers, given their extensive referencing of Belarusian_silovik. All of the accounts share a pattern of behaviour in reposting and amplifying pro-Russian content at inauthentic rates and localising Russian disinformation for separate target audiences.


However, the modus operandi differs across some of the accounts uncovered during the investigation. Whilst some accounts (@1Gg7Dlct8tfwNJL, @MadzGrodno, and @Rajendr67215893) behave like typical Kremlin-operated sock puppets, with little to no authentic interaction, no original content creation, and the familiar pattern of nonsensical handles, other accounts resemble different kinds of actors.


@WSIntelMonitor seems to operate similarly to the Sprinter network accounts, which tend to change names and branding every so often to avoid detection, use AI to generate new content from articles scraped from legitimate websites whilst curating a specific bias, and pose as news monitors or even imitate non-existent OSINT organisations.


The @shoft47ym account seems to focus heavily on the localisation aspect of the operation, translating and liaising with other Japanese-speaking users to promote pro-Russian narratives surrounding the Russia-Ukraine war. The account has been known to repost Russian content with Japanese commentary or translate certain materials, suggesting a more sophisticated type of asset.


DISARM Framework

T0101 "Create Localized Content"

Whilst applicable to most accounts on the list, this TTP is most relevant to the @shoft47ym account.

T0049.001 "Trolls amplify and manipulate"

Refers to sock puppets reposting content from the attributed Belarusian state-affiliated Telegram channel.

T0118 "Amplify Existing Narrative"

Applies to the constant amplification of existing pro-Russian narratives surrounding the Russia-Ukraine war.

T0009 "Create fake experts"

Relevant to @WSIntelMonitor, which presents itself as a fake OSINT or news-monitoring agency, borrowing legitimacy from the credentials those titles imply. The persona is fabricated: there are no records of the organisations this account has claimed to represent.

C-CONTENT


The content uncovered in this operation closely mirrors established pro-Russian narratives, with a strong emphasis on international politics and the Russia-Ukraine war. Most of the accounts highlighted in this investigation act primarily as amplifiers, reposting and recycling content from other disinformation actors rather than producing original material. Despite this, the overall messaging consistently aligns with efforts to polarise audiences through emotionally charged, manipulated, and heavily skewed pro-Russia content. This flood of disinformation is disseminated at scale, following recognisable patterns, in both tone and thematic focus, similar to those observed in networks like Sprinter, which we previously investigated in depth. Additionally, AI-generated material has been documented, including AI-generated retellings of articles and news listings, as well as derogatory AI-generated multimedia content designed to further polarise viewers.


D-DEGREE & E-EFFECT


Between 20 May 2024 and 11 July 2025, 2,398 tweets cited belarusian_silovik. Five accounts shared nearly all of this content, and their combined follower count at the end of July 2025 was approximately 11,000.



Of the tweets citing belarusian_silovik, the biggest offender was @1Gg7Dlct8tfwNJL, with 1,104 posts within the given timeframe. The second-highest count belonged to @shoft47ym, totalling 525 posts. @Rajendr67215893, @WSIntelMonitor, and @MadzGrodno cited belarusian_silovik in approximately 500 cases combined.
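For a rough sense of proportion, the per-account shares can be recomputed from the figures above; the ~500 figure for the remaining three accounts is this report's approximation, so the combined share is indicative only.

```python
# Back-of-the-envelope shares computed from the counts reported above.
TOTAL_CITING_POSTS = 2398
reported_counts = {
    "@1Gg7Dlct8tfwNJL": 1104,
    "@shoft47ym": 525,
    "@Rajendr67215893 + @WSIntelMonitor + @MadzGrodno (approx.)": 500,
}

for account, count in reported_counts.items():
    share = count / TOTAL_CITING_POSTS
    print(f"{account}: {count} posts ({share:.0%} of tweets citing the channel)")

combined = sum(reported_counts.values())
print(f"Combined: {combined} of {TOTAL_CITING_POSTS} posts (~{combined / TOTAL_CITING_POSTS:.0%})")
```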


Besides belarusian_silovik, the listed accounts have shared Russia-affiliated content from other sources. Most of the accounts were created between 2020 and 2024, and together they have produced 300,235 tweets in total.


Whilst it is hard to evaluate the full impact of the amplification network, or even to trace all of the ties running through the endless heaps of Kremlin-operated sock puppets, it is important to at least measure the direct impact of the accounts that have been linked to belarusian_silovik.


CONCLUSION


Significantly improving the monitoring and takedown mechanisms that combat disinformation at scale remains one of our goals. Our investigation highlights that threat actors, particularly those linked to state-backed operations such as Russia's, continue to exploit platform vulnerabilities with ease, utilising vast networks of bots to amplify harmful and polarising disinformation narratives.


These campaigns are persistent, coordinated, and effective. However, the counter-disinformation community's ability to identify and remove every sock puppet and amplification mechanism is severely limited by resource constraints and is not viable on its own. For this reason, it is important to call on platforms to take more responsibility, adopt proactive approaches, and act swiftly on takedown notices. Without proactive cooperation from those who operate and moderate the platforms that threat actors abuse, we cannot meaningfully disrupt the reach or influence of these operations.

