The Open Access Publisher and Free Library

TERRORISM


Posts tagged online extremism
ONLINE EXTREMISM AND TERRORISM RESEARCHERS’ SECURITY, SAFETY, AND RESILIENCE: FINDINGS FROM THE FIELD

Elizabeth Pearson, Joe Whittaker, Till Baaken, Sara Zeiger, Farangiz Atamuradova, and Maura Conway

“This report presents findings from the REASSURE (Researcher, Security, Safety, and Resilience) project’s in-depth interviews with 39 online extremism and terrorism researchers. Based at universities, research institutes, and think tanks in Europe and North America, the interviewees studied mainly, albeit not exclusively, far-right and violent jihadist online activity. The report catalogues for the first time the range of harms they have experienced, the lack of formalised systems of care or training, and their reliance therefore on informal support networks to mitigate those harms.”

VOX-Pol, 2023. 138p.

Online Radicalisation: A Rapid Review of the Literature

By Rosamund Mutton, James Lewis, and Sarah Marsden

This guide sets out the evidence base for ‘online radicalisation’, examining how individual use of the Internet, in conjunction with offline influences, can facilitate radicalisation processes. The UK is the main context of concern; however, comparable evidence is found in studies with samples from the USA, Canada, Belgium, Germany, Austria, and Israel. Radicalisation remains a contentious concept, and few studies explicitly define ‘online radicalisation’. For the purposes of this guide, ‘radicalisation’ is understood as leading to cognitive outcomes reflected in changes in beliefs and ideas, and/or behavioural outcomes which manifest in changes in behaviour. Two systematic literature reviews (Hassan et al., 2018; Carthy et al., 2020) directed initial searches for relevant research. Further literature was identified through forward and backward citation searching, and through narrower keyword searches conducted in Google Scholar. Literature searches were completed between June and August 2022. The guide primarily examines literature published between January 2017 and July 2022. Although the evidence base remains modest in size, the research underpinning this guide is assessed to be of good quality. There is a growing body of evidence that uses qualitative and quantitative methods to examine a range of factors relevant to online radicalisation.

Scotland: Centre for Research and Evidence on Security Threats, 2023. 42p.

“DIGITAL SOLDIERS” QAnon Extremists Exploit U.S. Military, Threaten Democracy

By Elizabeth Yates and Erin E. Wilson

This report examines how the QAnon movement exploits the U.S. military’s credibility in society to further its aim of undermining American democracy. It shows how 25 U.S. military veterans – all of whom have been engaged in the QAnon movement since the failed insurrection on January 6, 2021 – spread disinformation and build support for the movement. Analysis of their social media, along with a broader review of QAnon content, reveals how Q influencers portray the U.S. military as a heroic protagonist in their conspiratorial propaganda, and how they exploit military veterans to legitimate such claims and even recruit. It concludes with policy recommendations that focus on protecting members of the armed services, those who have served and their families, and the communities the movement targets with its conspiracies.

The mainstreaming of the QAnon extremist movement presents a growing threat to the American system of government. QAnon’s effort to create the perception that they are allied with the U.S. military has particularly alarming implications for our democracy:

- It strengthens the QAnon movement by facilitating recruitment from both military and civilian communities and encourages active participation among adherents;
- It lends legitimacy to discriminatory and anti-democratic conspiracies that are integrated into the Q movement, such as antisemitism and election denial;
- It distorts the public’s understanding of the primary responsibilities of the military and, importantly, the legal boundaries of domestic military intervention;
- It undermines public faith in democratic institutions by regularly encouraging the acceptance of authoritarian actions; and
- It threatens the communities the movement targets with its conspiracies and maligns the reputation of U.S. servicemembers, veterans, and their families.

New York: Human Rights First, 2022. 20p.

A Dangerous Web: Mapping Racially and Ethnically Motivated Violent Extremism

by Heather J. Williams, Luke J. Matthews, Pauline Moore, Matthew A. DeNardo, James V. Marrone, Brian A. Jackson, William Marcellino, Todd C. Helmus

Racially and Ethnically Motivated Violent Extremism: The Basics

Racially and ethnically motivated violent extremism (REMVE) refers to a loosely organized movement of individuals and groups that espouse some combination of racist, anti-Semitic, xenophobic, Islamophobic, misogynistic, and homophobic ideology. REMVE actors see their race or ethnicity under threat and promote the use of or engage in violence against a given population group. The majority of REMVE actors are motivated by cultural nationalism or White supremacy—beliefs that Caucasian or "Aryan" peoples represent superior races, and that "White culture" is superior to other cultures. Many REMVE actors also are motivated by White nationalism, which overlaps with White supremacy: Adherents espouse the belief that the White race is superior to others, and White nationalism emphasizes defining a country or region by White racial identity and promoting the interests of White people exclusively and at the expense of non-White populations.

More-common terms related to REMVE include far-right extremism, right-wing terrorism, radical right, and extreme right, which are used more frequently in the literature and by other countries. Although these terms are not synonymous, they are used somewhat interchangeably and often without precise definitions. These terms also can be applied to political parties and movements that participate in political systems and do not engage in violence directly, particularly in Europe, where many parliamentary systems have formal far-right parties that participate in elections.

The U.S. State Department commissioned the RAND Corporation to produce a comprehensive network analysis of the White Identity Terrorist Movement (WITM) and REMVE in response to a congressional requirement from the 2021 National Defense Authorization Act. The analysis—which sought to identify key actors, organizations, and supporting infrastructure and the relationships and interactions between them—is intended to inform a U.S. government strategy to counter REMVE.

Santa Monica, CA: RAND, 2022. 8p.

The Online Extremist Ecosystem: Its Evolution and a Framework for Separating Extreme from Mainstream

by Heather J. Williams, Alexandra T. Evans, Jamie Ryan, Erik E. Mueller, Bryce Downing

In this Perspective, the authors introduce a framework for internet users to categorize the virtual platforms they use and to understand the likelihood that they may encounter extreme content online.

The authors first provide a landscape of the online extremist "ecosystem," describing how the proliferation of messaging forums, social media networks, and other virtual community platforms has coincided with an increase in extremist online activity. Next, they present a framework to describe and categorize the platforms that host varying amounts of extreme content as mainstream, fringe, or niche. Mainstream platforms are those for which only a small portion of the content would be considered inappropriate or extreme speech. Fringe platforms are those that host a mix of mainstream and extreme content—and where a user might readily come across extreme content that is coded or obscured to disguise its violent or racist underpinning. Niche platforms are those that openly and purposefully cater to an extreme audience.

Santa Monica, CA: RAND, 2021. 44p.

Countering Radicalization to Violence in Ontario and Quebec: Canada's First Online-Offline Interventions Model

By Moonshot

Over a one-year period, from April 2021 to March 2022, Moonshot partnered with three violence prevention organizations to deliver an online interventions pilot in two Canadian provinces. The pilot advertised psychosocial support services to individuals engaging with extremist content online. Access to these services was voluntary, confidential, and anonymous by design. Our goal was to offer a secure pathway for at-risk individuals to contact a trained therapist or social worker. We built this approach around offering integrated care. Together with our intervention partners, we crafted our advertising messages and service websites to emphasize the confidential, non-judgemental support that callers would receive. Individuals who reached out were connected to an interdisciplinary team, which included a therapist, youth engagement workers, a psychiatrist, and other intervention staff who could offer services like counseling, employment support, addiction support, or simply a space to talk. Our partners were the Estimated Time of Arrival (ETA) team in Ontario, and Recherche et Action sur les Polarisations Sociales (RAPS) in Quebec. The Canadian Practitioners Network for the Prevention of Radicalization and Extremist Violence (CPN-PREV) acted as a convening and best-practice provider, and supported our pilot evaluation. A description of each organization is at the end of this report. Moonshot’s intervention campaigns ran for a total of six months, and reached individuals consuming incel and violent far-right extremist content on Google Search and YouTube. Our online interventions focused on meeting individuals’ psychosocial needs, and appealed to vulnerabilities and grievances such as anger, frustration, exhaustion, and isolation.

Key outcomes: Moonshot redirected 786 at-risk individuals to our intervention partners’ websites. Of these, 22 initiated a conversation with a counselor. Four individuals formally registered and engaged with a service provider for several months, in addition to those who accessed virtual counselling without going through the registration process. At least one person who initially shared violent impulses has been able to find positive, hopeful alternatives for the future. Moonshot’s ads reached users engaging with harmful content on Google and YouTube 44,508 times. Among the hundreds of users redirected to ETA and RAPS’ websites, 26 were watching influential incel YouTube channels and 39 had searched Google for high-risk keywords related to incel and violent far-right ideology (“looksmax org”; “1488 tattoos”). Moonshot, ETA, RAPS, and CPN-PREV established an effective multi-sectoral partnership. During our pilot program, we co-designed support pathways and risk escalation procedures for each service area, built teams’ capacity to deliver online interventions safely and effectively, and engaged at-risk audiences online. This pilot provides a blueprint for future interventions to reach and engage at-risk internet users. New iterations of this work can reach larger audiences by expanding advertising beyond the pilot platforms, strengthening and expanding cross-sectoral partnerships, and testing new ways to reach often-isolated internet users.

Washington, DC: Moonshot, 2023. 13p.

SCREEN Hate: National Findings Report

By The McCain Institute, Moonshot, and Ketchum

In September 2022, the McCain Institute, in collaboration with Moonshot and Ketchum, launched SCREEN Hate, an initiative that provides caregivers and concerned adults with the knowledge, tools, and resources needed to keep youth safe from online messages that could incite acts of hate-based violence. SCREEN Hate is the first nationwide campaign aimed at equipping bystanders to prevent acts of hate-based violence perpetrated by youth. SCREEN Hate was created as part of a two-year project for the Center for Prevention Programs and Partnerships (CP3) at the Department of Homeland Security (DHS), funded through the 2021 Targeted Violence and Terrorism Prevention Grant Program. The SCREEN Hate resource hub and accompanying online campaigns launched on 15 September 2022, having been announced at the “United We Stand Summit” at the White House. The campaigns—designed and implemented by Moonshot, the McCain Institute, and Ketchum—included behavior-based bystander engagement campaigns on Google Search and YouTube, as well as wider community outreach campaigns on Reddit, Facebook, and Instagram. This report details the national findings from the SCREEN Hate online campaigns, conducted between September 2022 and July 2023. It includes geographic and behavioral insights into user engagement, as well as the results of comparative testing on user engagement with our YouTube campaign ads. The final section includes recommendations for future programming for practitioners working to engage with concerned bystanders through online campaigns.

Washington, DC: Moonshot, 2023, 28p.

The Role of Translation in ISIS Propaganda: International Online Radicalization Methods and Its Effect on Extremism in Indonesia

By Hanny Purnama Sari and Muhammad Syauqillah

This research aims to compile data and information that will contribute to understanding the phenomenon of online radicalization through translation. There are many studies on the use of the internet and propaganda in a terrorism context; however, only a handful have studied the correlation between translation and terrorism propaganda, especially in Indonesia. There has been little discussion of the role of translation in bridging communication between different nations, cultures, and languages, and of its use in propagating radical/propaganda narratives worldwide and amplifying those messages for a target audience. The research method is descriptive qualitative, using primary and secondary data; the sample is taken from the book Nadharat Fi Al Ijma' Al Qath'i and from previous findings and news. This research revealed at least ten roles of translation in the online radicalization phenomenon; among others, translation into the target language can be used to identify the propagandist's target audience, and many terrorist sympathizers were willing to volunteer to translate propaganda. Although translation is used to leverage the spread of propaganda, it can also assist law enforcement officers in combating terrorist/propaganda narratives. Indonesian law enforcement officers may use translation in counter-terrorism, as Indonesia has hundreds of vernacular languages that can be used to 'encrypt' and disseminate extremist narratives.

International Journal of Science and Society, 4(4), 319-336.

Buying and Selling Extremism: New funding opportunities in the right-wing extremist online ecosystem

By Ariel Bogle

As mainstream social media companies have increased their scrutiny and moderation of right-wing extremist (RWE) content and groups, there’s been a move to alternative online content platforms. There’s also growing concern about right-wing extremism in Australia, and about how this shift has diversified the fundraising mechanisms used by RWE entities. This phenomenon isn’t well understood in Australia, despite the Australian Security Intelligence Organisation (ASIO) advising in March 2021 that ‘ideological extremism’ now makes up around 40% of its priority counterterrorism caseload. Research by ASPI’s International Cyber Policy Centre (ICPC) has found that nine Australian Telegram channels that share RWE content used at least 22 different funding platforms, including online monetisation tools and cryptocurrencies, to solicit, process and earn funds between 1 January 2021 and 15 July 2021. Due to the opaque nature of many online financial platforms, it’s difficult to obtain a complete picture of online fundraising, so this sample is necessarily limited. However, in this report we aim to provide a preliminary map of the online financial platforms and services that may both support and incentivise an RWE content ecosystem in Australia. Most funding platforms found in our sample have policies that explicitly prohibit the use of their services for hate speech, but we found that those policies were often unclear and not uniformly enforced.
Of course, there’s debate about how to balance civil liberties with the risks posed by online communities that promote RWE ideology (and much of that activity isn’t illegal), but a better understanding of online funding mechanisms is necessary, given the growing concern about the role online propaganda may play in inspiring acts of violence, as well as the risk that, like other social divisions, such channels and movements could be exploited by adversaries. The fundraising facilitated by these platforms not only has the potential to grow the resources of groups and individuals linked to right-wing extremism, but it’s also likely to be a means of building the RWE community both within Australia and with overseas groups, and a vector for spreading RWE propaganda through the engagement inherent in fundraising efforts. The funding platforms mirror those used by RWE figures overseas, and funding requests were boosted by foreign actors, continuing Australian RWEs’ history of ‘meaningful international exchange’ with overseas counterparts.

Barton, ACT: Australian Strategic Policy Institute, International Cyber Policy Centre, 2021. 36p.

Cults and Online Violent Extremism

By Suzanne Newcombe, Sarah Harvey, Jane Cooper, Ruby Forrester, Jo Banks, and Shanon Shah

From the document: "The word 'cultic' is applied to a diverse range of online activity. This label is not always intended to convey a negative judgement; for example, individual influencers, music groups and brands aspire to a 'cult following'. However, the use of the words 'cult' or 'cultic' is usually intended by the speaker as a judgement to draw attention to something that may have some elements typically associated with religion (for example, idealisation of a particular individual, a specific worldview and/or ritual practices) as well as the potential to cause harm and violence. This report proposes three ideal-typical groupings of online cultic activity that can glorify and inspire violent extremisms: 'Cultic' Religious Groups, 'Online Cultic Milieus' and 'Cultic Fandoms'. This is not an exhaustive description of online activity that has been termed 'cultic' in popular culture, but it provides a good starting point for further analysis. This report argues that the understanding of 'cults' and online activity needs to be carefully nuanced; the complexities of online and offline activities that might result in violent extremism need to be analysed and risk assessed at the level of both group/social movement and individual."

How Extremism Operates Online: A Primer

By Alexandra T. Evans, Heather J. Williams

Recent demonstrations and violent attacks have highlighted the need for an improved understanding of the role of internet-based technologies in aiding and amplifying the spread of extremist ideologies. Since the early days of the internet, radical groups and movements across the ideological spectrum have demonstrated their intent and ability to harness virtual platforms to perform critical functions.

This Perspective, the second in a RAND Corporation series on online white-supremacist and violent misogynist material, provides a primer on how the internet influences the activities of radical groups and movements and how exposure to or consumption of extremist content online influences the behavior of internet users. After briefly discussing relevant terminology, the authors describe the role of the internet in facilitating five operational functions for radical groups and movements: (1) group financing; (2) networking and coordination; (3) recruitment and radicalization; (4) inter- and intra-group knowledge transfer; and (5) planning, coordination, and execution of harmful online and offline operations. The authors then examine how virtual interactions can facilitate or encourage users' adoption of extremist ideas and inspire or alter offline behavior. The Perspective concludes with a discussion of how the internet can be leveraged as a tool to counter extremism, and the authors provide suggestions for further research.

Santa Monica, CA: RAND Corporation, 2022. 48p.

Countering Online Radicalization in America

By Peter Neumann

The Internet has revolutionized the way all of us communicate and do business. Its benefits to people everywhere have been enormous and will continue to drive progress in practically every area of life. At the same time, it should be recognized that, while being a force for good, the Internet has also come to play an important—and, in many ways, unique—role in radicalizing homegrown and domestic terrorists. Supporters of Al Qaeda, Sovereign Citizens, white supremacists and neo-Nazis, environmental and animal liberationists, and other violent extremist groups all have embraced the Internet with great enthusiasm and vigor. They are using it as a platform to spread their ideas, connect with each other, make new recruits, and incite illegal and violent actions. We believe that this trend will continue and that future terrorist attacks against the United States and its interests will involve individuals who have been radicalized—at least in part—on the Internet. As a result, countering online radicalization should continue to be a major priority for the government and its Countering Violent Extremism (CVE) efforts. The purpose of this report is to equip policy makers with a better understanding of how the Internet facilitates radicalization, in particular within the United States; an appreciation of the dilemmas and trade-offs that are involved in countering online radicalization within the United States; and ideas and best practices for making the emerging approach and strategy richer and more effective.

Washington, DC: Bipartisan Policy Center’s National Security Preparedness Group (NSPG). 2012. 56p.

White Crusade: How to Prevent Right-Wing Extremists from Exploiting the Internet

By Christina Schori Liang and Matthew John Cross

Right-wing extremists (RWEs) are using the current protests over police brutality in the United States as a cover to commit acts of terrorism and to grow their numbers. They present a significant danger to public safety and security and are a growing threat in the West. Despite this, the rise of right-wing extremism (an umbrella term for white ethnonationalists, the alt-right, white supremacist groups, male supremacist groups, and right-wing anti-government extremists) has not been afforded the priority and attention it justly deserves. There are three reasons for this. First, the global narrative maintains that terrorism rests almost exclusively in the hands of a balaclava-clad Salafi-jihadist holding a Kalashnikov. Second, Western right-wing media has largely pushed back against covering the rise of right-wing extremism, and the media as a whole has failed to contextualize the systematic threat RWEs present. Third, the global pandemic has forced governments to focus their attention on maintaining public health and socioeconomic order, and they have consequently failed to see how RWEs are subversively using the pandemic to support and expand their own agenda. RWEs have utilized the lawless and unmoderated internet to reach broader audiences, disseminate literature, and target vulnerable people. They have done so quietly, pushing an ideological campaign that manifests itself under the surface of popular internet discourse, rather than through the aggressive proselytizing of Salafi-jihadist groups like the Islamic State. These efforts can be understood as a kind of subversive exposure, where memes and fake news dominate discourse. This paper will analyse the scope of the RWE threat, describe their latest modus operandi, and explore how the pandemic is being instrumentalized by such groups and how the internet has become their principal tool and battleground.
The paper will then provide theory and evidence for how counter-narrative programs, especially through digital disruption, can help neutralise the threat.

Geneva: Geneva Centre for Security Policy, 2020. 27p.