Automating Terror: The Role and Impact of Telegram Bots in the Islamic State’s Online Ecosystem
7 February 2023
Introduction
To understand and ultimately mitigate the threat from terrorism today, which increasingly relies on the massive, multivariate use of internet-based technologies, researchers, policymakers, and practitioners require new approaches grounded in complexity science.[1] Over the past decade, network theory has contributed tremendously to our understanding of how and why violent extremist communities form and thrive.[2] Among other things, network science has demonstrated that the topological structure of these illicit networks shares common features with that of other complex systems and social phenomena,[3] even though extremist communities are often treated as a social aberration at the policy level.[4] At the same time, network science has revealed features specific to terrorist networks, such as the counter-intuitive role women play in making them robust[5] and the relationship between structural network characteristics and the severity of the attacks carried out by the actor in question.[6]
Today, one of the most important but under-researched aspects of terrorist networking online is the use of bots, something that we explore in this article using network science methods. Specifically, we investigate the role of bots as they appear in the context of Islamic State supporter communities on Telegram.
Bots on social media, including but not limited to Telegram, are best understood as automated accounts that execute specific tasks such as publishing, sharing, and resharing content.[11] As this study demonstrates, in the context of the Islamic State and its supporters’ activities on Telegram, bots generally perform one of three key functions: publishing content, moderating discussions, and acting as gatekeepers. In this capacity, they play a central, lubricating role: amplifying the movement’s ideology, cultivating its community of sympathizers, and automating administrative tasks such as blocking users who violate group policies and admitting new members.
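The gatekeeping and moderation functions described above require very little engineering effort on Telegram. As an illustration only (this is not code recovered from any actual Islamic State bot), the sketch below builds, without sending, the request payloads for two real Telegram Bot API methods, `approveChatJoinRequest` and `banChatMember`; the token, chat ID, and policy check are hypothetical placeholders.

```python
# Illustrative sketch of automated gatekeeping via Telegram's public Bot API.
# No network request is made; we only construct the (url, payload) pairs.
import json

BOT_TOKEN = "0000000000:PLACEHOLDER"  # hypothetical token, not a real credential
API_BASE = f"https://api.telegram.org/bot{BOT_TOKEN}"

def build_call(method: str, **params) -> tuple[str, str]:
    """Return the (url, json_payload) pair for one Bot API method call."""
    return f"{API_BASE}/{method}", json.dumps(params)

def moderate_join_request(chat_id: int, user_id: int, passes_policy: bool):
    """Gatekeeping: admit users who pass an admin-defined check, block the rest."""
    if passes_policy:
        # approveChatJoinRequest lets a pending user into the group
        return build_call("approveChatJoinRequest", chat_id=chat_id, user_id=user_id)
    # banChatMember removes and blocks the user
    return build_call("banChatMember", chat_id=chat_id, user_id=user_id)

url, payload = moderate_join_request(-100123456789, 42, passes_policy=False)
print(url.rsplit("/", 1)[1])  # -> banChatMember
```

The point is not the specific methods but how trivially such gatekeeping logic can be wired to any policy check an administrator chooses.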
Below, drawing on 1,215,850 data points collected from Telegram between February 1 and September 30, 2021 via an ingest program, we study the activities of the Islamic State’s “terrorist bots” within the community dynamics in which they operate. We map their interaction network and, in addition to presenting a schema of their activities and impacts, sequentially apply community detection algorithms to the data to determine the extent to which they operate in a structured or unstructured manner. These methodologically distinct approaches parse the network’s underlying structure by dividing its nodes into communities. All of the applied methods show that the structure is by no means randomly distributed but, rather, made up of clusters (or modules) of closely interconnected nodes.

Our analysis of the network’s modular structure and clustered activities implies the existence of a hybrid system: functional groupings that have been proactively, and collectively, developed to augment the Islamic State’s presence in Telegram channels and groups, alongside a spontaneous process of unorganized, supporter-generated community formation. Based on these findings (which speak to the flexibility, ease, and effectiveness with which bots can be deployed to further the interests of bad actors), we contend that the allowances Telegram makes for bot development are a central factor driving the Islamic State’s years-long preference for it over other platforms that are demonstrably more secure. This is in spite of the (valid and widely implemented) assertion by Telegram in its FAQs that “we do block terrorist (e.g., ISIS-related) bots and channels.”[18]