Open Access Publisher and Free Library

TERRORISM


Radicalisation and Gender – What Do We Know?

By Joana Cook, Eva Herschinger, Seran de Leede, and Anna-Maria Andreeva

The literature focusing on gender and radicalisation has steadily increased over the last ten years. This growth reflects the rise of extremism across the globe and has been driven in particular by researchers seeking to better understand the experiences of individuals throughout all stages of the radicalisation process. However, research on the topic has largely focused on the experiences of women, especially those associated with Islamist forms of extremism. Such a narrow focus has resulted in several gaps in the literature, which in turn have translated into gaps in practice. This report identifies the key trends in research on gender and radicalisation between 2014 and 2024 and makes explicit the areas that remain underexplored. Focusing on tangible recommendations aligned with the needs of practitioners, the report seeks to advance the state of the art of research on gender and radicalisation.

The Hague: The International Centre for Counter-Terrorism (ICCT), 2024. 21p.

Far-Right Online Radicalization: A Review of the Literature

By Alice Marwick, Benjamin Clancy, Katherine Furl

This literature review examines cross-disciplinary work on radicalization to situate, historicize, frame, and better understand the present concerns around online radicalization and far-right extremist and fringe movements. We find that research on radicalization is inextricably linked to the post-9/11 context in which it emerged, and as a result is overly focused on studying the other. Applying this research to the spread of far-right ideas online does not account for the ways in which the far-right’s endorsement of white supremacy and racism holds historical, normative precedent in the United States. Further, radicalization research is rife with uncertainties, ranging from definitional ambiguity to an inability to identify any simplistic, causal models capable of fully explaining the conditions under which radicalization occurs. Instead, there are multiple possible pathways to radicalization, and while the internet does not cause individuals to adopt far-right extremist or fringe beliefs, some technological affordances may aid adoption of these beliefs through gradual processes of socialization. We conclude that the term “radicalization” does not serve as a useful analytical frame for studying the spread of far-right and fringe ideas online. Instead, potential analytical frameworks better suited to studying these phenomena include theories prominent in the study of online communities, conversion, mainstreaming, and sociotechnical theories of media effects.

A summary of key take-aways includes:

The adoption of extremist, far-right, and fringe beliefs is often referred to as “radicalization,” a term formulated post-9/11 to understand jihadi terrorism, a very different context from the far-right.

Radicalization research is full of uncertainty.

  • No specific type of person is vulnerable to radicalization, and most people who commit political violence are not mentally ill or alienated from society.

  • Radicalization is not caused by poverty, oppression, or marginalization.

  • There is no one way in which people are “radicalized.”

  • Viewing extremist media does not necessarily lead to adopting extremist beliefs or committing political violence.

In contrast to the “red pill” model, radicalization is gradual. Recruits slowly adopt the identities, emotions, and interpretations shared by a community. They conceptualize their problems as injustices caused by others, and justify using political violence against them.

The internet does not cause radicalization, but it helps spread extremist ideas, enables people interested in these ideas to form communities, and mainstreams conspiracy theories and distrust in institutions.

“Radicalization” is not a useful frame for understanding the spread of far-right and fringe ideas online.

  • It is analytically imprecise and morally judgmental.

  • It doesn’t help us understand the role of media and digital technologies.

  • It is inextricably tied to a global security infrastructure targeting Islam.

  • It doesn’t account for the fact that fringe or far-right beliefs may change what people think is “true” and “false,” making it hard to find common ground.

  • The focus on violence ignores other worrying effects of mainstreaming far-right and fringe ideas.

Publisher: Bulletin of Technology & Public Life, 2022. 83p.

White Supremacist and Anti‐government Extremist Groups in the US

By Katherine Keneally and Jacob Davey

From the document: "This project by the Global Network on Extremism and Technology (GNET) looks at the user journeys of individuals who enter and participate in the online spaces of extremist communities. A user journey here refers to the distinct path a user may follow to reach their goals when accessing and using an online space belonging to extremist communities. User journeys are particularly important in offering insights into the rationale and motivations of users on the one hand, and to the inner workings of extremist online communities on the other. This is vital for understanding their goals and objectives. In selecting the ideologies for this project, we drew upon extremist communities - rather than extremist and terrorist organisations or groups - including those actors that participate in the extremist milieu and share ideas but do not necessarily operate in concert. These ideologies include those of formal and well-defined extremist organisations of White supremacist and anti-government extremist groups in the United States, supporter networks of Islamic State (IS), and looser communities of extremist actors including accelerationists, incels and chan site members who operate on shared platforms, congregating around common beliefs but without the connection of formal membership. This project is a response to the growing interest in understanding how individuals enter and participate in online spaces of extremist communities."

Global Network On Extremism And Technology (GNET). 24 JUL, 2024.

“You are doomed!” Crisis-specific and Dynamic Use of Fear Speech in Protest and Extremist Radical Social Movements

By Simon Greipl, Julian Hohner, Heidi Schulze, Patrick Schwabl, Diana Rieger

Social media messages can elicit emotional reactions and mobilize users. Strategic utilization of emotionally charged messages, particularly those inducing fear, potentially nurtures a climate of threat and hostility online. Coined fear speech (FS), such communication deliberately portrays certain entities as imminently harmful and drives the perception of a threat, especially when the topic is already crisis-laden. Despite the notion that FS and the resulting climate of threat can serve as a justification for radical attitudes and behavior toward outgroups, research on the prevalence, nature, and context of FS is still scarce. The current paper aims to close this gap and provides a definition of FS, its theoretical foundations, and a starting point for (automatically) detecting FS on social media. The paper presents the results of a manual as well as an automated content analysis of three broadly categorized actor types within a larger radical German Telegram messaging sphere (2.9 million posts). With a rather conservative classification approach, we analyzed the prevalence and distribution of FS for more than five years in relation to six crisis-specific topics. A substantial proportion between 21% and 34% within the observed communication of radical/extremist actors was classified as FS. Additionally, the relative amount of FS was found to increase with the overall posting frequency. This underscores FS's potential as an indicator for radicalization dynamics and crisis escalation.

Journal of Quantitative Description: Digital Media. Vol. 4, 2024.

Introducing SHIFT Analysis and Understanding Intra-Actions Within QAnon: Co-Presence of Conspiracy Beliefs and Extremism, Full Report

CENTRE FOR RESEARCH AND EVIDENCE ON SECURITY THREATS

From the document: "Eruptions of violence during the events of January 6th 2021 exemplified the need to understand how conspiracy theories and extremism interact to create security threats. Social media presents as a key player in this exchange, and this project sought to respond to that dynamic by exploring and seeking to understand the intra-actions in groups which discuss both conspiracy beliefs and extreme ideas. To achieve this, it synthesises techniques from computer science and social science to analyse data from platforms which enable and promote unguarded speech. [...] This report seeks to understand how conspiracy theories and extreme ideologies impact one another. [...] This report will introduce a new method analysis of conspiracy and politically motivated groups named the SHIFT analysis. This analysis combines data science and social research techniques within a broadly abductive approach to develop new understanding of dynamic events. SHIFT analysis, which is mutable and robust to examine a singular group or between group interactions in this context, is applied to examine the QAnon movement, in the context of the lead up to and aftermath of January 6th, which espoused both extreme political ideology and conspiracy beliefs." "SHIFT" comes from the beginning letters of the following five analysis procedures: 1. 'S'ocial network analysis and identification; 2. 'H'one extracted sample and coding; 3. 'I'nvestigative netnography; 4. 'F'rame analysis; and 5. 'T'ext analytics.

CENTRE FOR RESEARCH AND EVIDENCE ON SECURITY THREATS. JUN, 2024. 36p.

'Substitution': Extremists' New Form of Implicit Hate Speech to Avoid Detection

By Marten Risius, Morteza Namvar, Saeed Akhlaghpour, and Hetiao (Slim) Xie

The following excerpt from the document contains multiple links embedded in the original text: "'Content Warning: This insight contains antisemitic, racist, and hateful imagery.' [...] Extremists exploit social media platforms to spread hate against minority groups based on protected attributes such as gender, religion, and ethnicity. Platforms and researchers have been actively developing AI tools to detect and remove such hate speech. However, extremists employ various forms of implicit hate speech (IHS) to evade AI detection systems. IHS spreads hateful messages using subtle expressions and complex contextual semantic relationships instead of explicit abusive words, bringing challenges to automatic detection algorithms. Common forms of IHS include dog whistles, coded language, humorous hate speech, and implicit dehumanisation. Moreover, the forms and expressions of IHS evolve rapidly with societal controversies (e.g., regional wars). Identifying and tracking such changes in IHS is crucial for platforms trying to counter them. In this Insight, we report and analyse 'Substitution' as a new form of IHS. Recently, we observed extremists using 'Substitution' by propagating hateful rhetoric against a target group (e.g., Jews) while explicitly referencing another label group (e.g., Chinese). We show that Substitution not only effectively spreads hate but also exacerbates engagement and obscures detection."

GLOBAL NETWORK ON EXTREMISM AND TECHNOLOGY (GNET). 24 JUN, 2024. 8p.

Grievance-fuelled violence: Modelling the process of grievance development

By Emily Corner and Helen Taylor

Acts of extreme or mass violence perpetrated by lone offenders have become increasingly common in liberal democracies over the past 20 years. Some describe these acts as politically motivated, while others attribute them to mental disorder or criminal intent. This has led to the development of distinct research and practice areas focusing on either violent extremism, mass murder, fixation, stalking, or familial and intimate partner homicide. However, there is increasing understanding that the distinction between political ideology, criminal intent and personal motivation is blurred, and that the violence carried out by these individuals is better understood using the broader concept of grievance-fuelled violence. This work is the first to empirically consolidate the existing research in these distinct areas, employing a multifaceted analytical approach to develop a holistic model of the processes of grievance development among those who commit grievance-fuelled violence.

Research Report no. 27. Canberra: Australian Institute of Criminology. 2023. 95p.

Murder & Extremism in the United States in 2023

By Anti-Defamation League

Every year, individuals with ties to different extreme causes and movements kill people in the United States; the ADL Center on Extremism (COE) tracks these murders. Extremists regularly commit murders in the service of their ideology, to further a group or gang they may belong to, or even while engaging in traditional, non-ideological criminal activities. In 2023, domestic extremists killed at least 17 people in the U.S., in seven separate incidents. This represents a sharp decrease from the 27 extremist-related murders ADL has documented for 2022—which itself was a decrease from the 35 identified in 2021. It continues a trend of fewer extremist-related killings after a five-year span of 47–79 extremist-related murders per year (2015–2019). One reason for the trend is the decrease in recent years of extremist-related killings by domestic Islamist extremists and left-wing extremists. The 2023 murder totals include two extremist-related shooting sprees, both by white supremacists, which together accounted for 11 of the 17 deaths. A third shooting spree, also by an apparent white supremacist, wounded several people but luckily did not result in fatalities. All the extremist-related murders in 2023 were committed by right-wing extremists of various kinds, with 15 of the 17 killings involving perpetrators or accomplices with white supremacist connections. This is the second year in a row that right-wing extremists have been connected to all identified extremist-related killings. Two of the incidents from 2023 involved women playing some role in the killing or its aftermath. This report includes a special section that examines the role played by women in deadly extremist violence in the United States by analyzing 50 incidents from the past 20 years in which women were involved in some fashion in extremist-related killings.

New York: Anti-Defamation League, 2024. 36p.

Good Lives in Right-Wing Extremist Autobiographies

By Hanna Paalgard Munden, Sarah Marsden, MD Kamruzzaman Bhuiyan, Lotta Rahlf, Hanna Rigault Arkhis, Aimee Taylor

This report sets out the findings of research to understand the potential of the Good Lives Model (GLM) to interpret trajectories into and out of violent extremism and considers the implications for policy and practice. The GLM is a well-developed manifestation of a strength-based approach to rehabilitation. The model argues that focusing on developing strengths and enhancing protections, rather than solely managing and controlling risk factors, offers a more fruitful route to preventing (re)offending. The GLM has become an increasingly prominent part of efforts to rehabilitate criminal offenders; however, its potential with respect to the violent extremist population has not been fully exploited. This report is informed by a review of research on protective factors set out in an earlier report, Conceptualising Protective Factors: Strength-Based Approaches (Marsden and Lee, 2022), which established the theoretical foundation for the empirical research set out here.

Scotland: Centre for Research and Evidence on Security Threats, 2023, 30p.

In the Blind Spot – Right-wing Extremists on Alternative and Established Platforms

By Hanna Börgmann

This report provides a summary of the expert conference “In the Blind Spot – Right-Wing Extremists on Alternative and Established Platforms“, which took place in Berlin in September 2023 as part of the “Countering Radicalisation in Right-Wing Extremist Online Subcultures” programme funded by the Federal Ministry of Justice (BMJ). The third conference of the programme served to present current research projects from digital right-wing extremism research and to discuss various perspectives from research, regulation and law enforcement.

The report emphasises the social and political relevance of the research field and current trends in right-wing extremism research, underpinned by welcoming addresses from Benjamin Strasser, State Secretary of the Federal Ministry of Justice (BMJ), and Huberta von Voss, Executive Director of ISD Germany, among others. Dr Julia Ebner, extremism expert and ISD Senior Research Fellow, and Dr Matthias Becker, project manager of the interdisciplinary research project “Decoding Antisemitism”, also provided keynote speeches.

The conference was divided into several panels focusing primarily on discursive and strategic developments, far-right financing, research into right-wing online activities, deradicalisation and online regulation. A concluding discussion panel of experts examined the effects of the EU Digital Services Act (DSA).

Berlin: ISD - Institute for Strategic Dialogue, 2023. 19p.

Continuity and Change: Extremist-used Arms in Mali

By: Holger Anders

Mali has faced more than a decade of armed violence perpetrated by extremists, resulting in thousands of victims among national and international armed forces, UN peacekeepers, and civilians.

Continuity and Change: Extremist-used Arms in Mali—a new Briefing Paper from the Small Arms Survey’s Security Assessment in North Africa (SANA) project—investigates the arms, ammunition, explosives, and other material used in extremist attacks in Mali from 2015 to 2022, and the sources and pathways through which they were obtained.

Geneva: Small Arms Survey, 2024. 16p.

Prohibited Extremist Activities in the U.S. Department of Defense

By Peter K. Levine, Joseph F. Adams, Amy A. Alrich, Rachel G. Augustine, Margaret D.M. Barber, Sujeeta B. Bhatt Kathleen M. Conley, Dave I. Cotting, Alan B. Gelder, Jeffery M. Jaworski, Mark F. Kaye, Carrington A. Metts, Neil V. Mithal, and Matthew J. Reed.

From the document: "The objectives of the IDA [Institute for Defense Analyses] study are to gain greater fidelity on the scope and nature of extremist ideologies and behaviors in the [DOD]; identify the sources of such ideologies and behavior; assess their impact; and develop strategies for preventing, countering, and neutralizing that impact. To that end, the project description calls for IDA to: 1. Document the range of known extremist ideologies and behaviors that are contrary to U.S. law and policy; 2. Identify existing definitions of extremism and prohibited extremist activities; 3. Identify pathways of extremist ideology and behavior broadly and within the Department in particular; 4. Assess why the DOD workforce and others in the military community (including veterans, DOD civilians, and contractor employees) might be susceptible to extremist recruiting efforts; 5. Survey DOD approaches to the prevention of other forms of violence (including suicide, domestic violence, assault, sexual assault, and hate crimes) to identify strategies that might be adopted; 6. Assess policies and initiatives of other federal agencies that might be helpful to the Department; 7. Identify existing legal frameworks for addressing prohibited extremist activities in the Total Force; 8. Evaluate current DOD efforts to counter extremist ideologies and behaviors in the ranks, identifying gaps and strengths; and 9. Review and evaluate current DOD information collection, tracking, and data sharing systems (including through the military justice, equal employment opportunity, command discipline, hotline response systems, insider threat, and law enforcement/security systems)."

INSTITUTE FOR DEFENSE ANALYSES. 2023. 282p.

The Online Extremist Ecosystem: Its Evolution and a Framework for Separating Extreme from Mainstream

by Heather J. Williams, Alexandra T. Evans, Jamie Ryan, Erik E. Mueller, Bryce Downing

In this Perspective, the authors introduce a framework for internet users to categorize the virtual platforms they use and to understand the likelihood that they may encounter extreme content online.

The authors first provide a landscape of the online extremist "ecosystem," describing how the proliferation of messaging forums, social media networks, and other virtual community platforms has coincided with an increase in extremist online activity. Next, they present a framework to describe and categorize the platforms that host varying amounts of extreme content as mainstream, fringe, or niche. Mainstream platforms are those for which only a small portion of the content would be considered inappropriate or extreme speech. Fringe platforms are those that host a mix of mainstream and extreme content—and where a user might readily come across extreme content that is coded or obscured to disguise its violent or racist underpinning. Niche platforms are those that openly and purposefully cater to an extreme audience.

Santa Monica, CA: RAND, 2021. 44p.

Countering Radicalization to Violence in Ontario and Quebec: Canada's First Online-Offline Interventions Model

By Moonshot

Over a one-year period, from April 2021 to March 2022, Moonshot partnered with three violence prevention organizations to deliver an online interventions pilot in two Canadian provinces. The pilot advertised psychosocial support services to individuals engaging with extremist content online. Access to these services was voluntary, confidential, and anonymous by design. Our goal was to offer a secure pathway for at-risk individuals to contact a trained therapist or social worker. We built this approach around offering integrated care. Together with our intervention partners, we crafted our advertising messages and service websites to emphasize the confidential, non-judgemental support that callers would receive. Individuals who reached out were connected to an interdisciplinary team, which included a therapist, youth engagement workers, a psychiatrist, and other intervention staff who could offer services like counseling, employment support, addiction support, or simply a space to talk. Our partners were the Estimated Time of Arrival (ETA) team in Ontario, and Recherche et Action sur les Polarisations Sociales (RAPS) in Quebec. The Canadian Practitioners Network for the Prevention of Radicalization and Extremist Violence (CPN-PREV) acted as a convening and best practice provider, and supported our pilot evaluation. A description of each organization is at the end of this report. Moonshot’s intervention campaigns ran for a total of six months, and reached individuals consuming incel and violent far-right extremist content on Google Search and YouTube. Our online interventions focused on meeting individuals’ psychosocial needs, and appealed to vulnerabilities and grievances, such as anger, frustration, exhaustion, and isolation.

Key outcomes: Moonshot redirected 786 at-risk individuals to our intervention partners’ websites, of whom 22 initiated a conversation with a counselor. Four individuals formally registered and engaged with a service provider for several months, in addition to those who accessed virtual counselling without going through the registration process. At least one person who initially shared violent impulses has been able to find positive, hopeful alternatives for the future. Moonshot’s ads reached users engaging with harmful content on Google and YouTube 44,508 times. Among the hundreds of users redirected to ETA and RAPS’ websites, 26 were watching influential incel YouTube channels and 39 had searched Google for high-risk keywords related to incel and violent far-right ideology (“looksmax org”; “1488 tattoos”). Moonshot, ETA, RAPS, and CPN-PREV established an effective multi-sectoral partnership. During our pilot program, we co-designed support pathways and risk escalation procedures for each service area, built teams’ capacity to deliver online interventions safely and effectively, and engaged at-risk audiences online. This pilot provides a blueprint for future interventions to reach and engage at-risk internet users. New iterations of this work can reach larger audiences by expanding advertising beyond the pilot platforms, strengthening and expanding cross-sectoral partnerships, and testing new ways to reach often-isolated internet users.

Washington, DC: Moonshot, 2023. 13p.

White Supremacy Search Trends in the United States

By Moonshot and the Anti-Defamation League

Moonshot partnered with the Anti-Defamation League (ADL) to analyze US search traffic in July 2020 in response to the threats posed by white supremacist narratives and ideology in the US this past year. The dominant socio-political events of 2020-2021—the COVID-19 pandemic, the widespread BLM protests and counter-protests, and the presidential election—coalesced to create fertile ground for white supremacists and other violent extremist movements to mobilize and recruit. In 2020, racism and systemic racial inequality took center stage in the American public eye, with nationwide mass protests against recent police killings of Black people and historic evidence of racial injustice.1 In a nationwide reactionary mobilization, members of armed extremist groups made frequent appearances at BLM protests as self-appointed “protection” for property and counter-protesters.2 This high-profile direct action, combined with tacit and explicit support from local and national political figures, contributed to an increased interest in white supremacist and racist ideas by segments of the country.3 Protests and opposition to state lockdowns and other measures introduced in response to COVID-19 also provided opportunities for extremist movements to mobilize and engage with wider swathes of the public around shared grievances. While anti-lockdown protests were not related to white supremacy on the surface, these movements began to overlap in their joint opposition to the BLM movement, the defense of Confederate monuments, and general opposition to perceived government tyranny.4 Similarly, national protests alleging election rigging in the wake of Joe Biden’s presidential election victory were repeatedly co-opted and reinforced by white supremacist groups, culminating in the 6 January siege on the US Capitol. Extremist groups and individuals expressing support for white supremacist ideas were well-documented participants in the insurrection. 
White supremacist groups and other extremist organizations seized on the tensions and uncertainty in American life to promote racist beliefs and anti-Semitic conspiracy theories in order to increase their recruitment. Extremist narratives related to the pandemic promoted the conspiracy theory alleging that COVID-19 is a hoax created by a Jewish-led cabal. This and related anti-Semitic tropes and conspiracies are mainstays of many QAnon narratives.5 Other groups, such as the Patriot Front, have used the past year’s societal upheavals to recruit new members by promoting an impending race war and the perceived persecution of white people—as indicated by conspiracy theories such as “white genocide” and “the great replacement”.6 Extremist groups also exploited wider tensions, perceived grievances and disinformation against the BLM movement, as well as popular disinformation alleging the election was rigged. The findings from this project provide valuable insights on the types of harmful narratives and content that appeal to individuals potentially at-risk of radicalization, including those first searching for extremist slogans and conspiracies out of curiosity. This report presents an overview of the search traffic data collected during the project, between 17 July 2020 - 7 March 2021, and our main findings on online white supremacist narrative trends during this time.

Washington, DC: Moonshot, 2021. 21p.

Delegitimising Counter-Terrorism: The Activist Campaign to Demonise Prevent

By John Jenkins, Damon L. Perry and Paul Stott

The Prevent counter-terrorism strategy is perhaps the most controversial government policy most people have never heard of. Public recognition of it is generally low, but opposition from Britain’s raucous Islamist scene is near total. From there, opposition has spread to sections of the far left, and to those parts of academia where Islamism and the revolutionary left intersect. This report, written by three experts on Islamism, outlines the campaign against Prevent and argues that this is not an exceptional campaign against a uniquely flawed policy: the groups opposing Prevent have tended to criticise virtually any counter-terrorism policy, in some cases for a generation. The same names and campaign groups appear time after time, regardless of the colour of the government of the day.

Disappointingly, ministers and officials have tended to shy away from some of these debates, allowing misinformation, and even conspiracy theory, to flourish. The forthcoming Prevent review by William Shawcross risks being dead on arrival if this continues. The authors call for a Centre for the Study of Extremism to give Ministers the tools to properly push back against campaigners, with a separate communications unit to disseminate rebuttal, and a due diligence unit. The latter is needed to ensure that government departments and the public sector are choosing their friends wisely. Too often anti-Prevent campaigners are able to grandstand against government counter-terrorism policies, whilst at the same time receiving government patronage and engagement. It should no longer be possible to run with the fox, and hunt with the hounds.

London: Policy Exchange, 2022. 89p.

Behind the Mask: Uncovering the Extremist Messages of a 3D-Printed Gun Designer

By The International Centre for the Study of Radicalisation and Political Violence

Within the world of 3D-printed guns, one pseudonymous figure has emerged as a symbol for the cause of universal access to firearms: “JStark1809”. He created the world’s most popular 3D-printed gun and established an influential network of 3D-printed gun designers. Since his death in July 2021, he has been memorialised as a martyr for the right to bear arms. In this report, ICSR Senior Research Fellow Dr Rajan Basra identifies “JStark1809” as Jacob Duygu, a German national born to Kurdish parents who arrived as refugees from Southeast Turkey in the 1990s. Using a combination of authorship attribution techniques, JStark can be identified as the author of over 700 seemingly “anonymous” comments on 4chan’s /pol/ board. He disclosed hitherto unknown details about his life, broader political views, and extremist attitudes.

Chapter one lays out the findings and structure of the report. Chapter two details the open-source methodology used in finding JStark’s digital footprint, including how he was identified as the author of “anonymous” comments. It also summarises JStark’s biographical details.

The subsequent chapters analyse JStark’s life according to three themes: (1) his journey to designing 3D-printed firearms; (2) his political beliefs, including his xenophobic online behaviour and threats of violence as expressed on 4chan’s /pol/; and (3) his life as a self-identified incel, attitudes to misogynistic violence, and his related suicidal ideation. The report concludes with implications for the broader 3D-printed gun movement.

London: ICSR, Department of War Studies, King’s College London, 2023. 52p.

Layers of Lies: A First Look at Irish Far-Right Activity on Telegram

By Institute for Strategic Dialogue (ISD)

This report aims to provide a first look into Irish far-right activity on the messaging app Telegram, where the movement operates both through identifiable groups and influencers and through anonymously run channels and groups. The report examines activity across 34 such Telegram channels through the lens of a series of case studies in which content posted on these channels resulted in real-life consequences. Using both quantitative and qualitative methods, the report examines the tactics, language and trends within these channels, providing much-needed detail on the activity of the Irish far right online. This report was produced in conjunction with TheJournal.ie and its investigative platform Noteworthy.ie as part of their Eyes Right series, examining the growth of far-right ideology on Irish online networks and its influence on wider public opinion.

Beirut; London: Institute for Strategic Dialogue, 2021. 30p.

The Domestic Extremist Next Door: How Digital Platforms Enable the War Against American Government

By The Digital Citizens Alliance

Digital platforms enabled the disturbing rise of domestic extremism, culminating in the January 6 attack on the U.S. Capitol. Militia groups use social media networks to plan operations, recruit new members and spread anti-democracy propaganda, a new Digital Citizens Alliance (Digital Citizens) and Coalition for a Safer Web (CSW) investigation has found.

Taking a page from jihadists, these extremist groups operate along the fringes of what platforms such as YouTube, Twitter, Facebook, and Instagram will permit. Federal prosecutors investigating the Capitol riot revealed how militia groups used social media platforms to coordinate and prepare for possible conflict with Antifa. But the joint Digital Citizens / CSW investigation found that the use of platforms goes well beyond tactical planning. Militias rely on the platforms to share their beliefs and ideology and to recruit new members. The militias get a boost from their ideological simpatico with mis/disinformation groups like QAnon, which provides oxygen that militias use to fan the flames.

The anti-government militia movement first emerged after the 1992 Ruby Ridge standoff, the 1993 Waco siege, and the Oklahoma City Bombing on April 19, 1995. After Oklahoma City, U.S. law enforcement cracked down on domestic terrorism and the militia movement. In 1996, the Southern Poverty Law Center (SPLC) reported 858 militia groups with up to 50,000 active members. The 9/11 terrorist attacks shifted focus to global threats and led to a dormant period for militias. But domestic extremists such as the Proud Boys, the Boogaloo Bois, the Three Percenters, and the Oath Keepers have reinvigorated the movement – aided in large part by digital platforms. In 2020, according to research by The Washington Post, the number of domestic terrorism incidents in the United States had doubled from what it was in 1995.

Washington, DC: Digital Citizens Alliance, 2021. 56p.

Buying and Selling Extremism: New funding opportunities in the right-wing extremist online ecosystem

By Ariel Bogle

As mainstream social media companies have increased their scrutiny and moderation of right-wing extremist (RWE) content and groups, there’s been a move to alternative online content platforms. There’s also growing concern about right-wing extremism in Australia, and about how this shift has diversified the mechanisms used to fundraise by RWE entities. This phenomenon isn’t well understood in Australia, despite the Australian Security Intelligence Organisation (ASIO) advising in March 2021 that ‘ideological extremism’ now makes up around 40% of its priority counterterrorism caseload. Research by ASPI’s International Cyber Policy Centre (ICPC) has found that nine Australian Telegram channels that share RWE content used at least 22 different funding platforms, including online monetisation tools and cryptocurrencies, to solicit, process and earn funds between 1 January 2021 and 15 July 2021. Due to the opaque nature of many online financial platforms, it’s difficult to obtain a complete picture of online fundraising, so this sample is necessarily limited. However, in this report we aim to provide a preliminary map of the online financial platforms and services that may both support and incentivise an RWE content ecosystem in Australia. Most funding platforms found in our sample have policies that explicitly prohibit the use of their services for hate speech, but we found that those policies were often unclear and not uniformly enforced.

Of course, there’s debate about how to balance civil liberties with the risks posed by online communities that promote RWE ideology (and much of that activity isn’t illegal), but a better understanding of online funding mechanisms is necessary, given the growing concern about the role online propaganda may play in inspiring acts of violence, as well as the risk that, like other social divisions, such channels and movements could be exploited by adversaries. The fundraising facilitated by these platforms not only has the potential to grow the resources of groups and individuals linked to right-wing extremism, but it’s also likely to be a means of building the RWE community both within Australia and with overseas groups, and a vector for spreading RWE propaganda through the engagement inherent in fundraising efforts. The funding platforms mirror those used by RWE figures overseas, and funding requests were boosted by foreign actors, continuing Australian RWEs’ history of ‘meaningful international exchange’ with overseas counterparts.

Barton, ACT: Australian Strategic Policy Institute, International Cyber Policy Centre, 2021. 36p.