
The Sprinter Network: Unmasking the Disinformation Web

The "Sprinter" network on X (formerly Twitter) is a complex web of fake accounts that spread pro-Kremlin disinformation. Our investigation reveals that these automated, inauthentic accounts coordinate to amplify harmful political narratives. By uncovering links to a Telegram channel controlled by a Belarusian state operative, we trace the operations of the Sprinter network to actors linked to government-sponsored disinformation. This report examines how these networks function, their impact on public perception, and the ongoing challenges in combating foreign information manipulation and interference (FIMI).


Maria Voltsichina

Michael Corech


The "Sprinter" network, consisting of seemingly independent accounts on X (formerly Twitter), has been used to spread pro-Kremlin narratives, manipulate public opinion, and amplify politically charged content aimed at undermining democratic values and supporting authoritarian regimes. Our investigation uncovered significant patterns of inauthentic behavior tied to several accounts with similar usernames, such as @SprinterFamily (currently @SprinterObserver) and @SprinterIII (currently @WarzoneObserver). These accounts exhibited high daily post volumes, used identical content, and engaged in coordinated messaging closely aligned with Russian state propaganda. The accounts were linked through shared branding and unusual posting patterns, indicating they were part of a larger foreign information manipulation and interference (FIMI) campaign.

Representation of the initially assumed network operating on X from 2022, based on account handles


Further analysis revealed that the Sprinter network operated using various tactics, including purchasing fake followers, using AI-generated content, and manipulating platform algorithms to amplify their posts. Despite facing multiple suspensions, these accounts evaded detection by regularly renaming themselves and continuing their operations under new identities. The network's ability to avoid long-term suspension allowed it to continue influencing users, spreading misleading narratives, and contributing to the ongoing problem of disinformative content online.


We also revealed connections between the Sprinter network and a pro-Russian Belarusian Telegram channel, "Белорусский силовик" (belarusian_silovik), which was found to share identical content with Sprinter accounts. This connection points to the potential involvement of Belarusian state actors in the network's operations: multiple investigations cited later in this report explicitly establish the link between this Telegram channel and the Belarusian state. This discovery adds a layer of complexity to the issue, as state-sponsored disinformation campaigns have far-reaching implications for national security and global stability.


The findings presented in this report highlight the critical need for continued efforts to raise awareness of and combat FIMI campaigns. They also emphasize the importance of stronger platform moderation, increased transparency, and public awareness to mitigate the influence of networks like Sprinter.


ABCDE FRAMEWORK

The ABCDE framework is designed to help organize reports and conduct analysis and assessment clearly and logically. It breaks down disinformation incidents into five key elements: Actor, Behavior, Content, Degree, and Effect.


ACTOR

In a data analysis conducted in collaboration with Murmur Intelligence, we initially identified eight Sprinter accounts on X (formerly Twitter) as a single network: @SprinterIII, @Sprinter00000, @SpriterMonitor1, @Sprinter00001, @Sprinter99800, @SprinterX99800, @Sprinterfactory, and @SprinterFamily, linked by shared branding, high daily post volumes, inauthentic behavior, and the dissemination of disinformation. However, deeper analysis uncovered the X IDs associated with these accounts. Unlike handles and display names, account IDs remain static, and the data revealed only two unique account IDs behind the different handles and names. This indicates that the "suspended" accounts were merely renamed versions of the same two accounts.
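The deduplication step described above, collapsing many observed handles onto the immutable numeric account IDs behind them, can be sketched as follows. The handle-to-ID pairings below are illustrative placeholders, not the real IDs from the dataset:

```python
from collections import defaultdict

# Observed (handle, numeric account ID) pairs. The handles are from the report;
# the ID values and their pairings are placeholders for illustration only.
observations = [
    ("SprinterFamily", 111), ("SprinterIII", 222),
    ("Sprinter00000", 111), ("Sprinter99800", 222),
    ("SprinterX99800", 222), ("Sprinterfactory", 111),
]

def group_by_account_id(obs):
    """Group handles by their immutable numeric account ID.

    Handles and display names can be changed at will, but the numeric ID
    survives every rename, so grouping by ID exposes the true account count.
    """
    groups = defaultdict(set)
    for handle, account_id in obs:
        groups[account_id].add(handle)
    return dict(groups)

groups = group_by_account_id(observations)
```

With the placeholder data, six handles collapse into two underlying accounts, mirroring the finding above.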

Attribution of account handles to the two X Account ID’s, including the statistics per Account ID (latest data from 05/12/2024)

The drastic changes in follower counts and the time gaps between account renamings suggest that the accounts were suspended, most probably for violating X's posting rate limits, which allow 2,400 posts per day organized in hourly batches. Whilst the renaming can be attributed to an evasion strategy by the owner(s) of the accounts, follower counts may change after a suspension due to the deletion of bot accounts from the follower list.
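As a rough illustration of how such a rate-limit violation could be surfaced in a scraped dataset, the sketch below counts posts in a rolling 24-hour window against the 2,400-post limit. The threshold comes from the report; the timestamps and window logic are our own illustrative assumptions:

```python
from datetime import timedelta

DAILY_POST_LIMIT = 2400  # X's per-day posting limit, as cited in the report

def exceeds_daily_limit(timestamps, limit=DAILY_POST_LIMIT):
    """Return True if any rolling 24-hour window contains more than `limit` posts.

    `timestamps` is an iterable of datetime objects, one per post.
    Uses a two-pointer sweep over the sorted timestamps.
    """
    ts = sorted(timestamps)
    window = timedelta(hours=24)
    left = 0
    for right in range(len(ts)):
        # Shrink the window until it spans strictly less than 24 hours.
        while ts[right] - ts[left] >= window:
            left += 1
        if right - left + 1 > limit:
            return True
    return False
```

An account posting a few hundred times a day stays well under this limit; one firing thousands of automated posts in a burst trips it, which would match the suspension pattern observed here.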


Two of the “Sprinter” accounts investigated in this case (Case 1, Case 2, Case 3) displayed a black-and-white image of a man’s face wearing a helmet, with the letter “Z” embedded on the neck. The image has been used as a profile picture, but appeared more frequently when the account called for donations (Case 1, Case 2) or made other statements (Case 1, Case 2, Case 3).


This image is primarily used by a Telegram channel called “Белорусский силовик” (@belarusian_silovik). The username contains the word “silovik”, which has a longstanding cultural meaning in Russia and Belarus, where it describes members of the military or security services, or politicians with a background in those services. A post on VK, from an account that tagged the Telegram channel, explains that “the warrior's face is looking to the west, and the second logo is a shield.” The @belarusian_silovik Telegram channel uses the logo as a digital watermark for content protection.


Another significant link between the Telegram channel and the X account is the sequential sharing of identical content, with some posts appearing on the @belarusian_silovik Telegram channel and then on the @Sprinterfamily X account. 


For example, on 1 November, a post on the @Sprinterfamily X account featured a picture of graffiti depicting Volodymyr Zelensky begging for international aid. The same graffiti was shared on the Telegram channel on the same date, within an hour's difference. Additionally, both the Telegram channel and the X account published, on the same date, a disinformative video from Kentucky, USA, alleging that a voting machine "prevented" a voter from selecting Trump and automatically registered the vote for Harris. Another post on the Telegram channel, about Hungary's refusal of Ukraine's accession to NATO, was again posted by the X account a couple of hours later. Repeat research on 10 November 2024 produced the same outcome: sequential sharing of identical content, with some posts appearing on the Telegram channel and, a minute later, on the X account. For example, a post on the Telegram channel featuring a comment by Alexander Lukashenko, the president of Belarus, about Syria was copied word-for-word into English on the X account, seemingly translated using Google Translate. The post, originally published on the Telegram channel at 08:44 AM UTC, appeared on the X account one minute later. A post comparing Volodymyr Zelensky to the kid from the "Home Alone" movie was published at 07:20 AM UTC, and an identical post appeared on the @Sprinterfamily account two hours later.
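The sequential-sharing pattern described above can be checked mechanically: given timestamped posts from both platforms, flag X posts whose content matches a Telegram post published shortly before. A minimal sketch, assuming each post has already been reduced to a comparable content fingerprint (e.g. normalized text or a perceptual image hash):

```python
from datetime import timedelta

def sequential_matches(telegram_posts, x_posts, max_delay_hours=3):
    """Find X posts that repeat a Telegram post within `max_delay_hours`.

    Both inputs are lists of (timestamp, fingerprint) tuples. A match requires
    an identical fingerprint AND the X post appearing at or after the Telegram
    post, within the delay window. Returns (fingerprint, delay) pairs.
    """
    matches = []
    for tg_time, tg_fp in telegram_posts:
        for x_time, x_fp in x_posts:
            delay = x_time - tg_time
            if x_fp == tg_fp and timedelta(0) <= delay <= timedelta(hours=max_delay_hours):
                matches.append((tg_fp, delay))
    return matches
```

Applied to the scraped datasets, repeated matches with consistently short Telegram-to-X delays (minutes to a couple of hours, as in the cases above) would be strong evidence of one-directional copying rather than coincidence.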

According to an investigation conducted by Nashaniva (a Belarusian anti-regime newspaper operating in exile after its activities were banned and it was added to the list of extremist organizations), the person behind the Telegram account is “Артур Гайко” (“Artur Gaiko”), head of the Belarusian Main Directorate for Combating Organized Crime and Corruption (GUBOPiK) of the Ministry of Internal Affairs. Over the years, the GUBOPiK has been responsible for political repression, violence, and torture of political opponents. Further, an article from "reform.news" points out that the logo used by the Sprinter network and the @belarusian_silovik Telegram channel is also used by Gaiko as a personal profile image on Telegram. These leaks were part of previous infiltration efforts by Artur Gaiko into anti-regime Telegram communities.

The @belarusian_silovik channel initially posed as a pro-opposition organizational platform, claiming to mobilize people and raise funds for anti-government efforts. However, it was, in fact, one of many government-operated Telegram channels used to infiltrate anti-government communities and gather information for future arrests. The evidence of Gaiko’s involvement is linked to the ID number of the Telegram channel belarusian_silovik (-1396864349), which matches the ID of a state-controlled Telegram channel active during the 2020 Belarus protests. 


Telegram suspended the channel for violating its terms of service. Upon reinstatement, it was renamed from "Объединенный союз революции" ("United Union of the Revolution", @shpsr2) to Belarusian silovik. In its first post under the new name, it referenced the previous account and its ties to Belarusian silovik (security) structures. The contact person listed in the channel’s description was verified using a leaked phone database, which revealed changes in the associated Telegram username. These prior usernames have been linked to earlier infiltration activities.


BEHAVIOR

SocialBlade data and a scraped dataset of the accounts’ posts provide evidence of purchased followers.


The posting activity of the @SprinterFamily account has been demonstrably inauthentic, with daily post counts ranging from 140 to 533 and an average of 7,642 posts per month, according to SocialBlade statistics from 05/12/2024. Such posting habits are highly unlikely to be achieved organically. Analysis conducted using PangramLabs’ AI-detection tool further reveals that portions of the posts are AI-generated (Case 1, Case 2, Case 3, Case 4, Case 5).


PangramLabs, an organization specialising in AI detection, screened the Sprinter network posts and found at least 60 examples of AI-generated posts, most of them sourced from existing news articles and publications. The content generation draws on many sources, but predominantly amplifies legitimate sources that align with pro-Russian narratives. For example, one post from @SprinterFamily appears to have been sourced from a legitimate article published by HEC, a Parisian business school. Another AI-generated post draws on an article published in The American Conservative that aligns with pro-Kremlin narratives. The article states that Putin's stated rationale for the invasion of Ukraine centres on broader international principles, particularly ensuring Ukraine's neutrality and preventing NATO expansion, and that “Putin went to war in Ukraine, not as a step toward war with NATO, but to prevent a war with NATO”. This directly correlates with the disinformation campaigns launched by the Kremlin in 2022, aimed at both domestic and foreign audiences. The post amplifies sentiments already present in the article to further push pro-Russian narratives, using legitimate sources for the operation.


Whilst the use of AI to generate posts does not, in itself, pose an immediate threat or signal the spread of disinformation, this behavior further demonstrates that the network operates inauthentically to propagate pro-Russian, harmful rhetoric. The propagation reaches a problematic degree when the network uses genuine, legitimate media, twisting the true narratives to create content for an organised disinformation campaign. Given the volume of posts generated, the use of AI to automate and sustain the network's operation amplifies the cause for concern.


DISARM FRAMEWORK TTP


PLAN

TA01: Plan Strategy
- T0073: Determine Target Audiences
- T0074: Determine Strategic Ends

TA02: Plan Objectives
- T0002: Facilitate State Propaganda
- T0066: Degrade Adversary
- T0075.001: Discredit Credible Sources
- T0076: Distort
- T0079: Divide

TA13: Target Audience Analysis
- T0072.001: Geographic Segmentation
- T0081.003: Identify Existing Prejudices

TA14: Develop Narratives
- T0022: Leverage Conspiracy Theory Narratives
- T0022.001: Amplify Existing Conspiracy Theory Narratives


PREPARE

TA06: Develop Content
- T0019: Generate Information Pollution
- T0023: Distort Facts
- T0084: Reuse Existing Content
- T0084.002: Plagiarize Content
- T0084.003: Deceptively Labeled or Translated
- T0085.001: Develop AI-Generated Text
- T0086: Develop Image-Based Content
- T0086.001: Develop Memes
- T0086.002: Develop AI-Generated Images (Deepfakes)
- T0087.001: Develop AI-Generated Videos (Deepfakes)

TA15: Establish Social Assets
- T0090: Create Inauthentic Accounts
- T0090.004: Create Sockpuppet Accounts
- T0092: Build Network


EXECUTE

TA09: Deliver Content
- T0114.001: Social Media

TA17: Maximize Exposure
- T0049: Flooding the Information Space
- T0049.003: Bots Amplify via Automated Forwarding and Reposting
- T0121: Manipulate Platform Algorithm

TA10: Drive Offline Activity
- T0017: Conduct Fundraising

TA11: Persist in the Information Environment
- T0128.005: Change Names of Accounts


CONTENT

The X “Sprinter" network has previously been covered for systematically spreading disinformation (Case 1, Case 2), and specifically pro-Russian disinformation (Case 1, Case 2). While comprehensive statistics quantifying the precise total volume (or impact) of disinformation spread by the network across its posting activity are not available, several indicators suggest the scale of the problem is cause for concern. Disinformation originating from the network frequently surfaces on fact-checking websites, and posts shared by the network often attract community notes (crowdsourced annotations aimed at correcting misleading or false claims), further highlighting the prevalence of disinformative content. These patterns strongly suggest that the dissemination of false or misleading information is a defining characteristic of the network's activity, even if precise measurements remain elusive. Debunk has logged more than 30 separate instances of misleading posts over the course of the research process, showing cases of outright information manipulation and disinformation campaigns mirroring Kremlin propaganda.


Manipulated Videos

On June 16, 2024, the @SprinterFamily account published a Russian-dubbed video of Italian Prime Minister Giorgia Meloni giving a speech on the Russia-Ukraine war. The video included a manipulated translation of her remarks, portraying her speech as aggressive and filled with provocative statements, such as:


"Defending Ukraine means defending the system of rules in the world. We can unite all efforts to defend Ukraine. If Russia does not agree to the terms, we will force them to surrender. We need to set the terms for this discussion. Peace in Ukraine does not mean that Ukraine should surrender, as Putin thinks. It will not be so.” 


In reality, Meloni’s original statement differed strikingly from @Sprinterfamily’s Russian dub: she emphasized defending international rules and supporting Ukraine to avoid the severe consequences of allowing its sovereignty to be violated. Verbatim, the original speech stated:


“Defending Ukraine means defending that system of rules that holds the international community together and protects every Nation. If Ukraine had not been able to count on our support and therefore would have been forced to surrender, today we would not be here to discuss the minimum conditions for a negotiation. We would be just discussing the invasion of a sovereign state and we can all imagine with what consequences.”


The manipulated translation was aimed at misrepresenting her position, fueling anti-Western sentiment by portraying her and Italy as a hostile actor to Russia and reinforcing narratives of Western leaders as aggressive and escalation-driven. 


On 31 May 2024, the @Sprinterfamily account published a manipulated deepfake video of U.S. State Department spokesperson Matthew Miller, in which he purportedly told reporters that it was acceptable for the Ukrainian military to attack the Russian border city of Belgorod due to the absence of civilians, stating that the only people remaining in the city were legitimate military targets. Miller never made this statement; however, the manufactured video was used by Kremlin propaganda operatives, state news outlets, and top officials as a massive disinformation attack against the U.S. and Ukraine.


On 25 May 2024, the @Sprinterfamily account published a deepfake video of State Department spokesman Vedant Patel. In the manipulated video, Patel purportedly states that “undermining the role of the dollar and developing alternatives to SWIFT is a direct threat to democracy in the world.” The statement is fabricated and does not correspond to anything Patel said during any official press briefing. The misrepresentation serves to reinforce the narrative that the U.S. prioritizes maintaining its global financial hegemony over democratic values, potentially fueling anti-American sentiment and skepticism about its motives.


Zelensky Graffiti

On 4 November 2024, @SprinterFamily posted an image of a graffiti artwork depicting Volodymyr Zelensky kneeling on the ground, begging for foreign aid. Further investigation made evident that the picture stems from Operation Overload, in which pro-Russian actors flooded newsrooms with fake content to sway public opinion and exert influence in the West. The shared image has been proven to be part of that operation, and the image itself was fabricated: no such graffiti exists. The network thus appears to amplify other disinformation campaigns already in place, investigated by other counter-disinformation organizations and attributed to pro-Russian FIMI operations.



DEGREE

The accounts seemingly amass great amounts of traction on the platform, with Meltwater, a social media tracking tool, reporting 6.06 million mentions and 4.62 million reposts annually.

According to SocialBlade, the @SprinterFamily account posted 7,642 times in the past 30 days (from 05/11/2024 to 05/12/2024), and @sprinterIII posted 2,228 times in two weeks (21/11/2024 - 05/12/2024). Most of the documented accounts were temporarily suspended, with accounts communicating with one another, claiming to be backup versions of the pages or claiming previous ownership of the suspended accounts.


EFFECT

Whilst it is difficult to make an accurate judgement of the network's effect on public perception, it is possible to point out that the Sprinter network spreads large amounts of disinformative content. Much of this content is generated automatically, often includes manipulated or fabricated material, and is actively monetized via publicly accessible PayPal (currently not active) and Buymeacoffee links. The same content propagates harmful anti-Western rhetoric and shares ties with previously proven harmful actors, such as the Belarusian operative.


The inauthentic behavior promoted by the Sprinter network, such as the use of automation to post repetitive content or engaging in behaviors that mislead or deceive others, directly violates the X community guidelines. Accounts employing automation to disseminate such content are in violation of this policy. The posting rates and the instances of publishing deepfake videos manipulating state officials' addresses violate X's “Platform Manipulation and Spam Policy”. Those posts also violate X's “Synthetic and Manipulated Media Policy”, which prohibits sharing deceptive or harmful manipulated media, including deepfakes or edited content that misrepresents reality. Disseminating false or misleading information, especially related to civic processes, undermines public trust and violates X's standards.

Repeat offenders with moderate violations, which would include automated posts and exceeding hourly rate limitations, receive a suspension period of 2 to 4 weeks depending on the severity level. Both the severity level and the detection of the violation are determined by X's automated algorithm. Despite the initial belief that a larger network was present, it is suspected that the renaming patterns are a reaction to suspensions of only two separate accounts, intended to avoid further suspensions by the automated algorithm. The constant change of usernames, creating the appearance of a larger network in which the previously mentioned eight handles are separate accounts, appears to be a ploy designed to go unnoticed and continue operations within the current climate on the X platform.


The time frames between renaming and resuming activity are also consistent with the usual suspension times. Given that repeat offenders receive 2 to 4 week suspensions, activity lapses in the dataset obtained by Debunk.org have correlated with these timeframes.
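A simple way to test this correlation on a scraped posting history is to look for activity gaps that fall inside the 2-to-4-week suspension window. A minimal sketch, assuming the dataset provides per-post timestamps:

```python
from datetime import timedelta

def suspension_like_gaps(post_times, min_weeks=2, max_weeks=4):
    """Return posting gaps consistent with a 2-to-4-week suspension.

    `post_times` is an iterable of datetime objects, one per post. For each
    pair of consecutive posts, the gap between them is compared against the
    suspension window; qualifying gaps are returned as (gap_start, gap_length).
    """
    times = sorted(post_times)
    gaps = []
    for prev, cur in zip(times, times[1:]):
        gap = cur - prev
        if timedelta(weeks=min_weeks) <= gap <= timedelta(weeks=max_weeks):
            gaps.append((prev, gap))
    return gaps
```

For a normally active account such gaps should be rare; an account whose silences cluster inside this window, and whose handle changes right afterward, fits the suspend-rename-resume pattern described above.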


To further solidify the link between the two main accounts, they have been observed communicating by tagging one another and sharing the same donation links (example 1, example 2), claiming either to be a previous account that had been suspended or referring to one another as backup accounts. For example, the @SprinterIII account recounted the temporary restriction of @Sprinter0000 (the predecessor of @Sprinterfamily), claiming that the account had remained temporarily restricted for over two weeks.


In at least two posts, the @Sprinterfamily account describes how its ad revenue has been reinstated or blocked. In summary, we deduce that X frequently suspended the accounts for exceeding hourly posting rate limits, leading their operators to repeatedly change usernames to stay active.


Additionally, over the years the accounts belonging to the Sprinter network have been collecting donations through various accounts advertised on sites including PayPal and Buymeacoffee. Although the PayPal account is no longer active, the perpetrator behind the Sprinter network still offers subscriptions and donations through the Buymeacoffee platform. During our investigation, we observed advertisements appearing in the replies on the Sprinter accounts, from which X likely profited.


Conclusion

This investigation reveals a network of X (Twitter) accounts, identified as an Advanced Persistent Threat (APT), distinguished by the frequent use of "Sprinter" in nearly all usernames. The Sprinter network is known for disseminating pro-Kremlin disinformation narratives on a massive scale. Using collected datasets and further analysis, the study reveals that the accounts employ automated mechanisms to evade detection. Further investigation connects this network to a pro-Russian Belarusian Telegram channel with connections to a Belarusian state agent.


Furthermore, the network’s rise to prominence tracks with a dramatic decrease in moderation of hateful content on the platform, which dropped from 1 million moderated accounts in 2021 to only 2,361 accounts in the most recent 2024 X transparency report. The inauthentic behavior of the two accounts has been noted and sanctioned several times by the X platform moderators; despite the sanctions, the network has found ways to deflect suspicion and continue exposing users to disinformative content produced by pro-Russia affiliated actors. The drop in moderation activities on X allows not only the spread but also the active monetization of disinformative content on the platform.


Further investigation is required to confirm the links between the Sprinter network and Belarusian state actors, as the potential connection represents a significant concern that demands immediate attention. The volume and nature of politically charged content disseminated and amplified by this network, coupled with the substantial engagement these posts receive, underscore the urgency of dismantling its operations. It is also important to maximise efforts in digital media literacy, so that the general public is inoculated and less susceptible to the influence of networks like Sprinter.


For more information about the AI-CODE Project, click here. The AI-CODE project is funded by the HORIZON Europe Programme of the European Union.



