Open Access Publisher and Free Library

TERRORISM

Terrorism-Domestic-International-Radicalization-War-Weapons-Trafficking-Crime-Mass Shootings

Posts tagged decentralized platforms
Dangerous Organizations and Bad Actors: The Active Club Network

By Middlebury Institute of International Studies

Active Clubs make up a decentralized network of individually formed organizations centered on the premise of a white supremacist fraternal brotherhood. First introduced in December 2020 by Robert Rundo, the leader of the white supremacist Rise Above Movement (R.A.M.), Active Clubs are intended to preserve and defend the white population and traditional European culture from a perceived global genocide by non-white ethnic and racial groups.

Rundo was inspired to create the Active Club network, which he referred to as “white nationalism 3.0”, in response to the numerous arrests of R.A.M. members in 2018. He wanted to create an organization that would be less visible to law enforcement, and thus less susceptible to disruption or destruction. From this, Active Clubs were born: small, decentralized organizations that would focus their recruitment efforts on local areas and thus attract less attention than traditional white nationalist organizations. This structure would also ensure that Active Clubs were not reliant on a particular physical entity or leadership figure for survival.

Active Clubs provide like-minded white men with physical spaces where they can train in mixed martial arts in preparation for war against their perceived enemies. Ideologically, Active Clubs adhere to neofascist and accelerationist principles, and the promotion of violence is a key theme in Active Club communication and propaganda. With clubs located across the United States and in several other countries, the network ensures that groups of men devoted to training for battle are available for mobilization in multiple locations across Western countries.

Monterey, CA: Middlebury Institute of International Studies and Center on Terrorism, Extremism, and Counterterrorism, 2024.

The Hydra on the Web: Challenges Associated with Extremist Use of the Fediverse – A Case Study of PeerTube

By Lea Gerster, Francesca Arcostanzo, Nestor Prieto-Chavana, Dominik Hammer and Christian Schwieter

As part of its project on “Combating Radicalisation in Right-Wing Extremist Online Subcultures”, ISD is investigating the smaller platforms to which the German-speaking far-right online scene is retreating. The scene's aim is to circumvent the regulation and moderation applied to large platforms, for example as required by the German Network Enforcement Act (NetzDG, or 'Facebook Act'). Analyses contained in the report “Telegram as a Buttress” had already made clear the importance of investigating PeerTube: while writing that report, the ISD research team came across multiple video platforms with almost identical layouts and functionalities, and found that eight of the 19 video platforms identified were set up using the free software PeerTube.

PeerTube is an example of a growing socio-technological movement that is turning away from large, centralised platforms towards decentralised and mostly community-managed websites. Instead of a single platform with a monopoly on content, this movement is building a network of servers that are maintained independently of one another. The result is a “hydra effect”: even if connected servers are cut off, the network itself survives and new servers can be added at will. While far-right extremists are not driving this phenomenon, they do appear to be exploiting its possibilities. Various far-right figureheads have, for example, established platforms via PeerTube where users can in some cases create their own accounts, and some of these platforms record millions of visits per month.

It is not just its reach but also its structure that makes PeerTube worth investigating. With PeerTube, individuals or organisations can create their own video platforms where they set the rules for content, moderation and user registration. This is essential for far-right and conspiracy groups and individuals who, according to previous ISD research, prefer audiovisual platforms to other types, such as micro-blogging services. PeerTube is a particularly valuable tool since hosting and accessing audiovisual material is more technologically demanding than handling text files. In contrast to centralised platforms such as YouTube, PeerTube content is managed separately by so-called instances: the small-scale video platforms created with the PeerTube software. Different instances can network with each other and form federations, which allows videos uploaded on one instance to be played on another without changing websites. PeerTube thus belongs to the so-called Fediverse, which the report discusses in more detail. Another difference from centralised video platforms is that the software uses peer-to-peer (P2P) technology, which presumably explains the name. The fact that instances are managed not by large companies but by individuals or groups at their own expense, using free software, also has implications for their regulation.

Key Findings

• There is no central moderating authority for managing content. PeerTube offers individuals or groups whose content has been blocked on centrally managed video platforms for violating their terms of service an attractive way of continuing to share that content online. Where these groups or individuals control moderation themselves, content can only be removed from the network by switching off the server.

• It is difficult to accurately map the size and connections of the Fediverse. The network is constantly in flux, as relationships between instances can change rapidly through blocking and follow requests, and instances can go offline from one day to the next (a sketch of how these follow relationships can be crawled appears after this list).

• The instances used by far-right and conspiracy actors make up only a small fraction of the Fediverse network, and they primarily network among themselves. However, some of the instances investigated are connected to the wider Fediverse through their own highly networked servers.

• The deletion of extremist YouTube channels is not necessarily reflected in the number of account registrations on the corresponding PeerTube instances. PeerTube is instead used as a back-up option in case of deplatforming, and the frequently observed phenomenon that not all users migrate to the new platform can be seen here as well.

• Instances are customised in different ways; for example, they vary in whether they permit third parties to register accounts and upload videos. There is no clear correlation between the numbers of accounts, videos and views. However, the most watched videos on relevant instances were mostly created by prominent members of the milieu, which suggests that persons with a pre-established audience are particularly successful on PeerTube.

• The instances selected for the five case studies hosted a large amount of content focused on the COVID-19 pandemic. Another frequent narrative was an alleged conspiracy perpetrated by elites who, according to conspiracy theorists, use events such as the pandemic or Putin's war against Ukraine to further their secret agenda. These findings suggest that PeerTube instances provide a safe refuge for disinformation.

• Because PeerTube is a piece of software that anyone can access and use in a variety of ways, state regulation will do little to limit its use by far-right extremists. While state agencies can enforce individual aspects of the NetzDG against PeerTube instances, many instances do not have the number of users required to trigger reporting obligations or requirements to delete content. Moreover, most content is not hosted with the aim of generating profit, which limits the applicability of both the Facebook Act and the EU's Digital Services Act.

• However, PeerTube's community moderation functions do allow server operators and users to curb the promotion of harmful content, for example by isolating extremist instances. Efforts should be undertaken to work with the Fediverse community, i.e. with server operators and their users, to develop best practices for identifying and combating extremist activity. This could include further training on how to spot hate speech, or setting up a body for reporting extremist instances.
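The key finding on mapping the Fediverse points to how such mapping is typically attempted in practice: each PeerTube instance publicly exposes the list of instances it follows, so a researcher can crawl those follow relationships to approximate the federation graph at a given moment. The minimal sketch below illustrates the idea in Python; the endpoint path, response fields and the example hostname are assumptions based on PeerTube's public REST API rather than details taken from the report, and should be checked against the documentation of the instance being studied.

```python
# Minimal sketch: enumerate the remote instances a PeerTube server follows,
# i.e. one edge set of the federation graph described in the report.
# Assumptions: the /api/v1/server/following endpoint, its start/count
# pagination and the {"total": ..., "data": [...]} response shape follow
# PeerTube's public REST API; "peertube.example.org" is a hypothetical host.
import requests


def list_followed_instances(host: str, page_size: int = 100) -> set[str]:
    """Return the set of remote hosts that `host` federates with (follows)."""
    followed: set[str] = set()
    start = 0
    while True:
        resp = requests.get(
            f"https://{host}/api/v1/server/following",
            params={"start": start, "count": page_size},
            timeout=10,
        )
        resp.raise_for_status()
        payload = resp.json()
        for follow in payload.get("data", []):
            remote = follow.get("following", {}).get("host")
            if remote:
                followed.add(remote)
        start += page_size
        if start >= payload.get("total", 0):
            break
    return followed


if __name__ == "__main__":
    # Starting from a single (hypothetical) instance; repeating the call for
    # each newly discovered host yields a snapshot of the federation graph.
    print(sorted(list_followed_instances("peertube.example.org")))
```

Because instances block one another and go offline frequently, any such crawl is only a snapshot; repeated crawls over time would be needed to capture the churn the report describes.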

London: Institute for Strategic Dialogue, 2023. 44p.
