Open Access Publisher and Free Library

TERRORISM

Terrorism-Domestic-International-Radicalization-War-Weapons-Trafficking-Crime-Mass Shootings

Posts in Radicalism
TRANSPARENCY REPORTING ON TERRORIST AND VIOLENT EXTREMIST CONTENT ONLINE, 4TH EDITION

By Nora Beauvais

This is the OECD’s fourth benchmarking report examining the policies and procedures related to terrorist and violent extremist content (TVEC) online, with a focus on transparency reporting, of the world’s top 50 most popular online content-sharing services (the “popular services”). Like the third edition, this report also covers the 50 online content-sharing services that terrorist and violent extremist groups and their supporters exploit or rely upon the most (the “intensive services”). The first three reports provided a benchmark against which this fourth report assesses relevant developments. Terrorist and violent extremist actors continually adapt their methods to technological developments. As governments and online platforms increasingly take measures to curb the dissemination of TVEC, terrorists and violent extremists make adjustments to avoid content moderation. On mainstream online platforms, for example, they have been developing tactics to evade automated detection tools. Meanwhile, sustained efforts by large platforms to combat TVEC have also caused a “displacement effect” whereby terrorists and violent extremists turn to alternatives (e.g. cloud platform websites, decentralised web technology, niche alt-platforms, and terrorist-operated websites). Transparency reporting on TVEC online is crucial to assess the evolution and magnitude of the threat, evaluate the effectiveness and efficiency of online platforms’ policies and actions to tackle this problem, as well as their impact on human rights, and build an evidence base to support policymaking and regulatory frameworks. The key findings of this report are:

1. The popular and intensive services are more diverse, both ideologically and geographically. The TVEC landscape is multi-faceted, encompassing a wide range of ideologies, from terrorist groups to violent extremist political movements and lone actors, and it is spreading across different types of content-sharing services and geographical regions.
For the first time in this report series, the popular services’ list includes a gaming service. This is noteworthy because gaming services are increasingly used by terrorist and violent extremist actors. In addition, three Indian platforms have joined this ranking. As for the intensive services’ list, it features a self-proclaimed anarchist website for the first time and covers a wider spectrum of geographic regions and languages.

2. Overlap between the popular and intensive services remains low, highlighting the need to look at the TVEC landscape more comprehensively. Only ten services appear on both the popular and intensive lists, compared to 11 in the third benchmarking report. However, many policy discussions and responses still tend to focus on the largest platforms. Paired with the finding that the intensive services tend to be less transparent than the popular services (see below), the takeaway is that neglecting smaller but intensive services risks under-scrutinising or even turning a blind eye to a core part of the problem.

3. The evidence shows mixed results regarding the clarity of the popular services’ definitions of TVEC, while most of the intensive services still do not define or even expressly prohibit TVEC. On the one hand, the definitions related to TVEC in the popular services’ policies and procedures are, overall, clearer than in the previous report. Services are using more comprehensive descriptions of TVEC and related concepts, but new gaps among the services’ approaches have emerged, with a proportion of them still using vague terminology (18%) or having become less precise. On the other hand, 60% of the intensive services still do not define or explicitly prohibit TVEC, or they simply have not established any governing documents.

4. Transparency reporting on TVEC reveals new gaps among popular services and remains rare among intensive services.
Seventeen of the popular services now issue transparency reports with specific information on TVEC, compared to just five in the first edition, 11 in the second, and 15 in the third of this series. This represents the slowest year-to-year growth rate to date. For the first time in the series, one of the services (present on both the popular and the intensive services lists) that previously issued transparency reports with TVEC-specific information ceased this practice. In addition, three of the four newest services to issue transparency reports on TVEC provide very limited information, both quantitatively and qualitatively. Furthermore, there is still significant heterogeneity among the popular services’ reporting approaches, which continues to make data aggregation and cross-platform comparisons difficult, if not impossible. Among the intensive services, only six issue transparency reports on their policies and actions concerning TVEC, against eight previously, and the vast majority (five of six) also appear in the popular services list. The scarcity of transparency reporting on TVEC among the intensive services may be explained by the fact that many of them are operated by terrorist and violent extremist groups and supporters, or by free speech “absolutists” who deliberately let TVEC flourish on their platforms.

5. Content moderation approaches continue to pose risks for privacy, freedom of expression and due process. Continuing a trend that began during the COVID-19 pandemic, popular services rely more heavily on automated tools to detect and remove TVEC, which has generally increased the removal of lawful content and unjustified censorship. Furthermore, half of the intensive services remain opaque regarding their approaches to content moderation; and most of them either have no notification and appeal mechanisms in place, or do not provide any information in this regard.
This raises questions regarding their efforts to ensure respect for privacy, freedom of expression and due process.

6. New online safety laws and regulations are creating an increasingly fragmented transparency reporting landscape. As new online safety laws and regulations come into force, content-sharing services are facing new obligations to issue transparency reports in multiple jurisdictions, and they face different reporting requirements in each of them.

To conclude, this report highlights the need for more precision in the services’ governing documents; more consistency in the metrics and methodologies used to prepare transparency reports; more transparency in their content moderation approaches; and more efforts to ensure due process and to safeguard human rights and fundamental freedoms.

PROTOCOL: Understanding the Content, Context, and Impact of Far-Right Extremist Propaganda Disseminated Online: A Systematic Review

By Mia Doolan, Katie Cox and Kiran M. Sarma

This is the protocol for a Campbell Systematic Review. The review will address two aims: (1) a qualitative synthesis of literature on the composition of online far-right propaganda, and (2) a quantitative synthesis of literature examining the impact of exposure to online far-right propaganda on audiences. These syntheses will be guided by the following specific objectives: (i) What is the content (i.e., themes) of online far-right propaganda, and how does this differ across ideological subgroups? (ii) What is the structure of online far-right propaganda, and how does this differ across ideological subgroups? (iii) What is the context of these messages (i.e., where, when and by whom were they posted)? (iv) What impact does exposure to online far-right propaganda have on audiences with reference to the radicalisation of opinion and/or action?

Campbell Systematic Reviews, Volume 21, Issue 4, December 2025

Blurred Boundaries: Legal, Ethical, and Practical Limits in Detecting and Moderating Terrorist, Illegal and Implicit Extremist Content Online while Respecting Freedom of Expression

By Bibi van Ginkel, Tanya Mehra, Merlina Herbach, Julian Lanchès, and Yael Boerma

This study examines a pressing and highly topical challenge: how to assess online content that may undermine democracy, threaten national security and public safety, or infringe upon the rights of others—while safeguarding freedom of expression. The central question it explores, the specific challenges identified, and the recommendations it puts forward should not be viewed in a vacuum. Rather, they are situated within a broader and increasingly complex societal and political context. A range of systemic developments shapes the environment in which this work takes place: the rise of online radicalisation, particularly among children and young adults; the expanding influence of large technology platforms and the tensions this creates with rule-of-law-based democratic societies, leading to a global trend toward both techno-libertarianism and techno-authoritarianism; and the evolving role of governments as they seek to reconcile the imperatives of security, safety, and national interest with those of privacy, human rights, and minority protection. These challenges are compounded by the unprecedented speed and scale of online information dissemination, growing concerns about disinformation and foreign influence, and the urgent need to strengthen societal resilience and media literacy. While this study does not address each of these systemic issues in depth, they form the essential backdrop against which its findings and proposals should be understood.

The Hague: The International Centre for Counter-Terrorism (ICCT), 2025. 208p.

Global Terrorism Forecast 2026

By Rohan Gunaratna

SYNOPSIS
In 2026, intensified geopolitical competition and rivalries will influence and shape the global threat environment. In parallel, non-state armed groups driven by religious, ethnic, and hard-line ideologies will threaten both governments and social harmony in various countries around the world.

S. Rajaratnam School of International Studies, NTU Singapore. 2025.

Report On The Emerging Patterns Of Misuse Of Technology By Terrorist Actors

By The Council of Europe

Although the misuse of new technologies by terrorist actors has been a major concern for some time, the capabilities offered by (and the availability of) a range of new and emerging technologies – including gaming platforms, unmanned aerial systems (UAS), artificial intelligence (AI) and 3D-printed weapons – have heightened these fears even further. An analysis of how and why terrorists adopt new technologies suggests that it remains highly context-specific, with the extent and speed of innovation affected by internal factors (for example, strategic, structural and individual factors) and external factors, particularly relationships, resources and the effects of counter-terrorism. In combination, these factors can encourage or inhibit the adoption of new technologies by terrorist actors, resulting in significant variations in the adoption and use of key technologies of concern. Terrorist actors in or affecting Europe have adopted (or are beginning to adopt) many of these technologies. Social media platforms, small or micro platforms, terrorist-hosted websites and gaming or gaming-adjacent platforms are all playing critical roles in the radicalisation and recruitment process. Emerging technologies used in this process include the decentralised web, the dark web and, most recently, generative AI. Although many terrorist attacks in Europe use a low-tech modus operandi, technology plays a key role in their preparation, planning and subsequent promotion. Propaganda and instructional material – typically stored and shared online – play a prominent role in shaping attack targets and methodology. For example, the emergence of 3D-printed weapon usage by terrorist actors in Europe has been fuelled by instructional materials developed by an active online subculture. Other far-right online subcultures have also encouraged the live-streaming of attacks and sharing of manifestos online.
Terrorist actors in Europe use a range of licit and illicit activities to fund their attacks and radicalisation and recruitment activities, some of which (but not all) require the use of new technologies. These include mobile payment systems, online exchanges and wallets, crowdfunding, peer-to-peer online funds transfers and the solicitation of donations on social media platforms. Simultaneously, terrorist actors outside Europe, notably ISIL (Islamic State of Iraq and the Levant)/Daesh, are increasingly encouraging donations via virtual assets, driving a rise in the presence of virtual assets in European terrorist financing arrests and prosecutions. Interviews with national, regional and international experts identified lessons learned and good practices when responding to terrorist misuse of new technologies. These include reducing the lag between terrorist exploitation of new technologies and counter-terrorism responses to it (through horizon scanning exercises and greater information sharing), the criticality of multistakeholder approaches, the importance of identifying and managing human rights-related risks, and the benefits of greater strategic clarity, which can lead to a focus on desired outcomes, rather than the steps required to reach them.

Battling Extremism: What Counts as Knowledge

By Mohamed Bin Ali, Sabariah Hussin and Muhammad Haziq Bin Jani

Recent years have shown that extremist worldviews are no longer limited to specific ideologies, regions, or grievances. Whether driven by Islamist militancy, far-right conspiracies, or historical grievances, the dissemination of radical beliefs today is shaped by a deeper and more fundamental issue: extremist epistemology, especially regarding how individuals come to know, filter, and reinforce what they believe to be true.

COMMENTARY

At the heart of radicalisation lies an epistemology that rigidly filters information, dismisses contradictory evidence, and resists alternative perspectives. Quassim Cassam and Olivia Bailey have described this as a “closed-minded worldview” that replaces open inquiry with ideological purity. In this view, epistemology refers not to formal theories of knowledge, but to the everyday frameworks and practices by which individuals justify their beliefs and decide what to believe or reject.

When a person becomes epistemically closed off – often through social media echo chambers or ideological networks – their epistemic autonomy is compromised. They no longer analyse evidence critically, instead relying on trusted sources or in-groups to determine what qualifies as “truth.” This vulnerability is what extremist groups exploit, online and offline.

Self-radicalised individuals – including those in Singapore detained under the Internal Security Act for plotting attacks or attempting to travel to conflict zones – often fell into these epistemic traps. Although the content they consumed may have varied, their radicalisation process was similar: they entered epistemic environments that made violence appear not only justified but also necessary.

These environments often revolve around radical ideologies that reinforce extremist epistemology by offering binary moral frameworks that simplify complex realities and by providing emotionally resonant certainties about the future – such as promises of martyrdom or apocalyptic triumph. These approaches help define individuals as they seek clarity, purpose, or control amid uncertainty.

S. Rajaratnam School of International Studies, NTU Singapore, 2025. 5p.

Undue Influence by Criminal and Extremist Groups

Attempts to influence elected officials and municipal administrations.

By David Andersson, Anna Horgby and Albin Östervall

This is a study of undue influence exerted by actors and groups constituting a systemic threat in Sweden, including undue influence against elected officials, political parties and decision-making bodies. The study also sheds light on what forms undue influence can take against municipalities. Particular focus is on actors and groups connected to organised crime.

English summary of Brå report 2025:4. Stockholm: The Swedish National Council for Crime Prevention (Brå), 2025. 14p.

The Silent Rise of the Left-Wing Militia

By The Program on Extremism at The George Washington University

At a time when violent left-wing extremism is seeing a massive surge, this report sheds light on how armed, organized left-wing militias have quietly emerged across the U.S., often overlooked or mischaracterized by law enforcement, policymakers, and the media.

Drawing on thousands of court records, open-source videos, social media pages, manifestos, and more, the report profiles four prominent groups:

  • Redneck Revolt / John Brown Gun Clubs

  • The Socialist Rifle Association

  • The Not Fucking Around Coalition (NFAC)

  • The Huey P. Newton Gun Club / Guerilla Mainframe / Geronimo Tactical

These groups are armed, ideologically driven, and increasingly well-organized, championing causes from anti-fascism and Black nationalism to anti-capitalism. Many boast high levels of veteran involvement and adopt military-style recruitment tactics aimed at active-duty service members and former personnel.

Key findings include:

  • Left-wing militias are largely absent from federal extremism frameworks, allowing them to operate with less scrutiny than their right-wing counterparts.

  • Their rise correlates with high-profile flashpoints like the Ferguson protests and the 2016 election of Donald Trump.

  • They are highly active online, often with little-to-no content moderation, cultivating large digital followings across platforms like TikTok, Reddit, Instagram, and X.

  • While less violent overall, these groups glorify attacks by ideological allies and exhibit many of the same behaviors seen in right-wing militia ecosystems.

The Silent Rise of the Left-Wing Militia examines an underexplored and rapidly growing element of the domestic extremist landscape at a time of rampant political violence across the ideological spectrum, and urges policymakers, analysts, and the public to confront a rapidly evolving militia landscape without ideological blinders.

Washington, DC: The Program on Extremism at The George Washington University, 2025. 78p.

Refugee Protection Crises and Transit Europe: Immediate Responses, Selective Memory, and the Self-Serving Politics of Diversity

By Julija Sardelić

This open-access book presents a socio-legal analysis of immediate responses to large-scale refugee displacement in Europe after the 1951 Refugee Convention came into force, focusing on the countries to which refugees initially fled or through which they passed (namely Austria and, initially, Yugoslavia, followed by several of the former Yugoslav countries). First, it investigates the immediate responses to refugee movements following the suppression of the 1956 Hungarian Revolution by Soviet troops. Second, it examines the responses to individuals seeking asylum after being displaced during the post-Yugoslav wars of the 1990s. Third, it analyses the responses of the same countries to refugees fleeing Global South countries (predominantly Syria, Iraq, and Afghanistan) in 2015 and 2016. Finally, it explores how these countries responded to the mass displacement of refugees from Ukraine. The book argues that these countries have positioned themselves as “transit” or temporary protection countries in order to avoid assuming long-term responsibility for a larger number of refugees. As a consequence, they granted various forms of temporary legal status to refugees that differed from the refugee status defined in the 1951 Refugee Convention. These legal statuses were hierarchical (in terms of the rights attached to them) and racialized, with the fewest rights granted to refugees from the Global South and other negatively racialized groups. The book traces the usage of self-serving politics of diversity and selective memory to legitimise why refugees could not be protected long-term in these countries, and also why there were such differences in treatment of refugees.

Cham: Springer Nature, 2025.

Returning Nuance to Nostalgic Group Studies: Understanding White Supremacy as a Hegemonic Force

By Amy Cooter

A dominant analytical frame has emerged in extremism studies that attributes nearly all right-wing, far-right, or nostalgic group ideology[1] and action to white supremacy. Some versions of this narrative further posit that these extremist groups intentionally and consciously effect white supremacy through a “cohesive social network based on commonly held beliefs,” a “white power movement.”[2] However, these conceptions sometimes lack definitions of social movements, white supremacy, and other key concepts that are central to their arguments.[3] This has led to over-generalizations about nostalgic group actors’ motives and goals in a way that downplays both the power of white supremacy as a hegemonic system and the specific harms caused by overtly supremacist actors. This paper clarifies a social science understanding of the key, but sometimes taken-for-granted, terms necessary for understanding these dynamics and demonstrates how faulty or unclear usage of this terminology leads to both analytical problems and the perpetuation of power structures that the field of extremism studies hopes to address. Specifically, I argue that improper conceptualization of white supremacy and related terms creates risks that fall into three categories: analytic accuracy and predictive capacity, preventing near-term harm, and perpetuating white supremacy’s power structure and radicalization.

Monterey, CA: Center on Terrorism, Extremism and Counterterrorism, Middlebury Institute of International Studies. 2024. 29p.

Protecting Minors from Online Radicalisation in Indonesia

By Noor Huda Ismail and Putri Kusuma Amanda

The rise of JAD Nusantara, an ISIS-linked online network drawing in large numbers of minors, exposes serious gaps in Indonesia’s child protection systems. Vulnerable adolescents, often grappling with bullying, isolation, or absent parents, are being recruited without showing clear outward signs of radicalisation. In line with UN child rights standards, Indonesia must adopt an approach that prioritises rehabilitative, child-centred responses, safeguarding children’s rights while tackling the vulnerabilities and special needs that extremists exploit.

COMMENTARY
The case of a 12-year-old boy in Pemalang, Central Java, who joined the terrorist group JAD Nusantara, underscores a worrisome trend: radicalisation is increasingly happening entirely online, beyond parental or authority awareness. 

Social media platforms and messaging apps serve as conduits, enabling extremist content to reach vulnerable youth undetected. Research analyses show that extremism thrives on platforms that offer anonymity, rapid dissemination, and emotional appeal – qualities that make virtual spaces ideal for radical recruitment. 

Detecting online-driven radicalisation through traditional community surveillance is extremely difficult. Therefore, child protection systems need to adopt digital literacy and monitoring capabilities so that educators and social workers, not just security personnel, can recognize warning signs and intervene early.

A comprehensive society-wide strategy is needed – one that identifies young people at risk and engages them through pastoral, not punitive, channels.

S. Rajaratnam School of International Studies, NTU Singapore, 2025. 6p.

The Right Fit: How Active Club Propaganda Attracts Women to the Far-Right

By Robin O'Luanaigh, Hannah Ritchey and Frances Breidenstein

One image shows two young women sparring with each other, donning boxing gloves and athletic wear. A second image shows a young woman wrapping her hands and wrists, presumably preparing for a fight. On her arm is a tattoo of an Othala rune, a symbol common in neo-Nazi and white supremacist communities. 

These images, identified in online Active Club spaces, diverge from more traditional portrayals of women in right-wing extremist movements and communities. Instead of quaint cottagecore aesthetics and traditional ‘tradwives’ tending to the family and home, these images present women as activists, ideologues and warriors. While the Active Club network’s portrayals of women still promote traditional gender roles, especially within romantic relationships, the invocation of ‘warrior women’ tropes opens the door to a more palatable form of right-wing extremist activism: one that is less overtly misogynistic and ostensibly more ‘gender equal’.

This Insight serves as a first look into the hypermasculine extremist spaces and communities of the Active Club network and how they co-opt and utilise images of women in their propaganda. We first introduce the Active Club network before reviewing existing literature on representations of women in right-wing extremist content. Next, we identify and discuss distinctly gendered tropes regarding the representation of women and couples in Active Club content. We conclude with a cautionary analysis of how such content can make Active Clubs and similar organisations palatable to women who may view these groups as gender-equal or empowering. 

Global Network on Extremism & Technology, 2023. 

Dangerous Organizations and Bad Actors: The Active Club Network

By Middlebury Institute of International Studies

Active Clubs make up a decentralized network of individually formed organizations centered around the premise of a white supremacist fraternal brotherhood. First introduced in December 2020 by Robert Rundo, the leader of the white supremacist Rise Above Movement (R.A.M.), Active Clubs are intended to preserve and defend the white population and traditional European culture from a perceived global genocide by non-white ethnic and racial groups.

Rundo was inspired to create the Active Club network—something he referred to as “white nationalism 3.0”—in response to the numerous arrests of R.A.M. members made in 2018. He wanted to create an organization that would be less perceptible to law enforcement, and thus less susceptible to disruption or destruction. From this, Active Clubs were born—small, decentralized organizations that would focus recruitment efforts on localized areas and thus garner less attention than traditional white nationalist organizations. This structure would also ensure that Active Clubs were not reliant on a particular physical entity or leadership figure for survival.

Active Clubs provide like-minded white men with physical spaces where they can train in mixed martial arts in preparation for war against their perceived enemies. Ideologically, Active Clubs adhere to neofascist and accelerationist principles, with the promotion of violence comprising a key theme in Active Club communication and propaganda. Located across the United States and in several countries transnationally, the Active Club network ensures that groups of men devoted to training for battle are available for mobilization in multiple locations across Western countries. 

Monterey, CA: Middlebury Institute of International Studies and Center on Terrorism, Extremism, and Counterterrorism, 2024.

Veteran Perspectives on Extremist Exploitation of the Military: Sources and Solutions

By Amy Cooter

There has been increasing attention to how military service members and veterans may be recruited or exploited by extremists, yet there is little research on precisely how this may happen or on how such ties may, in turn, influence military cohesion. It is important to emphasize that the vast majority of service members are not extremist, but a growing number of domestic extremists have military connections and may then have an outsized ability to enact harm, including by training others in military techniques. Given the potential for veterans’ knowledge and experiences to be exploited by extremist groups, understanding these connections is pressing. This paper shares findings from an in-depth interview study with 42 veterans from all military branches who collectively shed light on how extremism influences various aspects of military life, from recruitment to readiness, and who offer concrete steps the military could pursue at every stage of service to limit extremists’ exploitation of the institution and those who serve.

Monterey, CA: Center on Terrorism, Extremism, and Counterterrorism (CTEC) at the Middlebury Institute of International Studies, 2025. 31p.

Active Clubs: The Growing Threat of ‘White Nationalism 3.0’ across the United States

By Ciarán O’Connor, Laurie Wood, Katherine Keneally and Kevin D. Reyes

The number of Active Clubs in the United States, Canada and Europe is increasing, posing a threat to public safety. Active Clubs (“ACs”) are white nationalist extremist groups that emphasize physical fitness and hand-to-hand combat skills and have a history of violence. Though each “club” is autonomous, the groups frequently engage in coordinated activities offline, such as mixed martial arts (MMA) tournaments, protests and physical training. In recent years, these clubs have used their social media profiles to encourage likeminded individuals to establish their own clubs in their respective locations. ISD’s research shows this strategy has been highly effective throughout the US.

This report identifies and analyzes the network of Active Clubs operating within the US along three themes: ideology, tactics and targets. The research predominantly focuses on the use of the messaging platform Telegram by ACs, and includes detailed data analysis exploring how this network uses Telegram to produce and promote white nationalist propaganda, expand the network of clubs, and facilitate on- and offline collaboration between members and groups.

Amman | Berlin | London | Paris | Washington DC: Institute for Strategic Dialogue, 2023. 17p.

The “Chanification” of White Supremacist Extremism

By Michael Miller Yoder, David West Brown and Kathleen M. Carley

Much research has focused on the role of the alt-right in pushing far-right narratives into mainstream discourse. In this work, we focus on the alt-right’s effects on extremist narratives themselves. From 2012 to 2017, we find a rise in alt-right, 4chan-like discourse styles across multiple communication platforms known for white supremacist extremism, such as Stormfront. This discourse style incorporates inflammatory insults, irreverent comments, and talk about memes and online “chan” culture itself. A network analysis of one far-right extremist platform suggests that central users adopt and spread this alt-right style. This analysis has implications for understanding influence and change in online white supremacist extremism, as well as the role of style in white supremacist communications. Warning: This paper contains examples of hateful and offensive language.

Computational and Mathematical Organization Theory, Volume 31, pages 222–235 (2025)

Discord and the Pentagon's Watchdog: Countering Extremism in the U.S. Military

By Amy C. Gaudion

In his 2022 book, Ward Farnsworth crafts a metaphor from the lead-pipe theory for the fall of Rome to consider how rage and misinformation traveling through today’s technology-enabled pipes are poisoning our civic engagement and threatening our governmental structures: “We have built networks for the delivery of information––the internet, and especially social media. These networks too, are a marvel. But they also carry a kind of poison with them. The mind fed from those sources learns to subsist happily on quick reactions, easy certainties, one-liners, and rage.” This Article carries the metaphor into a new context and considers what should be done when the poison being transported through the digital pipes is directed at members of the U.S. military. While extremism in the U.S. military is not a new threat, the events of January 6, 2021, brought the threat into much sharper focus. It exposed three preexisting trends, each sitting in plain sight but not yet woven together. These trends include a growing acceptance of extremist views and ideologies in U.S. military and veteran communities, an increase in violent extremist acts committed by individuals with military backgrounds, and the enhanced use of digital platforms by extremist groups to target their messaging to and strengthen their recruitment of individuals with military experience. To return to the metaphor, the extremist poison is teeming through the pipes at an alarming rate, and the number of pipes has increased to include social media platforms, encrypted chat tools, gaming platforms, podcasts, and music streaming apps, including YouTube, Discord, Gab, Telegram, and WhatsApp, among many others. In offering these observations, the author is mindful of not overstating the threat and takes seriously warnings as to the adverse consequences that follow from hyperbole and exaggeration. Indeed, a fundamental difficulty is the lack of understanding as to scope and scale of the extremism threat in the U.S. 
military. This Article attempts to draw the contours of that threat, exposes the structural and legal obstacles that make countering extremism in the military such a fraught exercise, and identifies actors, tools, and mechanisms beyond the conventional options that are able to overcome these long-standing structural and institutional obstacles.

Indiana Law Journal | Vol. 100:1743 | 2025, Penn State Dickinson Law Research Paper 10-2025

Exploring Youth Radicalisation within the Almajiri System in Northern Nigeria

By Oge Samuel Okonkwo

Boko Haram’s emergence in Northern Nigeria is closely tied to systemic vulnerabilities within the Almajiri system, a traditional Islamic educational framework mainly for boys. Founded by Mohammed Yusuf, an Almajiri graduate himself, the terror group exploited socio-cultural fractures, leveraging identity-based grievances, economic deprivation, and governance failures to recruit marginalised Almajirai. While the Almajiri system itself does not inherently radicalise individuals, it produces a large, unemployed youth demographic with a strong collective identity, creating fertile ground for extremist exploitation. Addressing systemic marginalisation and poverty within Nigeria’s Almajiri educational system is crucial for preventing youth radicalisation, requiring integrated reforms across education, governance, and community engagement spheres.

The Hague: The International Centre for Counter-Terrorism (ICCT), 2025. 18p.

Addressing Online Self-Radicalisation in Singapore

By Sabariah Hussin

SYNOPSIS
The evolving nature of online self-radicalisation in Singapore raises pressing concerns that go beyond traditional counterterrorism frameworks. While Singapore’s preventive strategies are largely effective, emerging digital dynamics and psychosocial vulnerabilities call for more spiritually grounded, trauma-informed, and community-empowered approaches.

COMMENTARY
The issue of youth radicalisation is gaining attention in Singapore. In a speech at the Religious Rehabilitation Group (RRG) retreat on 24 June 2025, Acting Minister for Muslim Affairs Faishal Ibrahim noted that the availability of extremist content and the emergence of ideologically themed online communities have contributed to a gradual increase in radicalisation among young people. Given that many of these individuals are still developing their identities and critical thinking skills, they may be more susceptible to these influences.
It is concerning that a 17-year-old supporter of far-right ideology was reportedly planning a mass shooting of worshippers attending Friday prayers, while a 15-year-old girl expressed a desire to marry an ISIS fighter and engage in combat overseas. Both cases illustrate the phenomenon of self-radicalisation occurring entirely through online platforms.
These developments highlight a significant and rapid evolution in the patterns and scope of radicalisation, necessitating a thorough reassessment of Singapore’s countering violent extremism (CVE) strategies.

S. Rajaratnam School of International Studies, NTU Singapore, 2025. 6p.

EXTREME WEATHER: How a storm of false and misleading claims about extreme weather events spread unchecked on social media putting lives at risk

Social media companies are profiting from lies about extreme weather events.

  • On X, 88% of misleading extreme weather posts were from verified accounts. The platform enables paid subscriptions for five of these accounts, which combined have 14 million followers.

  • On YouTube, 73% of posts were from verified accounts. YouTube displayed ads next to 29% of misleading extreme weather videos. 

  • On Facebook and Instagram, 64% of posts were from verified accounts. Meta is sharing ad revenue with three content creators pushing misleading claims, enabling them to share in Meta’s revenue from ads near their posts.