
SOCIAL SCIENCES


Transmisogyny, Colonialism and Online Anti‐Trans Activism Following Violent Extremist Attacks in the US and EU

By Anne Craanen, Charley Gleeson and Anna Meier

This report investigates the rise of online anti-trans activism following two prominent attacks involving LGBTQ+ communities: the October 2022 attack on a gay bar in Bratislava, Slovakia, and the March 2023 shooting at a school in Nashville, Tennessee, perpetrated by a trans man.

Using a postcolonial approach, we find that the transphobia espoused online following the attacks was predominantly transmisogynistic, a consequence of colonial logics around gender that assign the monopoly on violence to white cisgender men. The main themes identified were the erasure of trans identities, particularly transmasculinity; the overlap between transmisogyny and other forms of discrimination; and the demonization of trans people.

The most important conclusion from our research is that everyone – technology companies, policymakers and other stakeholders – must take transphobia and transmisogyny seriously. Too often, transmisogyny is treated as a side problem, or as a complement to another set of more radical ideas, including but not limited to white nationalism or anti-government sentiment. Transphobia, alongside misogyny, hate speech, or other forms of discrimination, is frequently deemed “harmful but lawful” or described as “borderline content”, and therefore not in need of online moderation. While simply removing such material from platforms may be neither appropriate nor advisable in all cases, there are other forms of content moderation that platforms can consider, depending on how online transphobia manifests itself.

In the conclusion of our work, we provide practical recommendations for technology companies of all sizes to tackle transphobia more effectively. Key among these are knowledge-sharing between platforms and subject-matter experts, defining transphobia and transmisogyny in platforms’ terms of service, and employing content moderation practices such as disinformation tags and algorithmic deprioritization.

Recommendations for technology companies:

  1. Increase online monitoring following attacks directly relevant to the LGBTQ+ community, as transphobic content is likely to increase, including material that violates terms of service, incites violence or is otherwise illegal.

  2. Collaborate with experts to understand and classify transphobic rhetoric, producing a taxonomy alongside subject-matter specialists, technology representatives, civil society, and government partners.

  3. Consider diverse moderation methods: remove illegal content, but also use alternatives to removal such as fact-checking and algorithmic adjustments that mitigate exposure to transphobic channels and content (a minimal sketch of such deprioritization follows this list).

  4. Define transphobia in terms of service to guide users as to what is allowed on platforms and enable user reporting. 

  5. Design clear reporting and appeal mechanisms for moderated content, including online transphobia, to protect digital and human rights.
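To make recommendation 3 concrete, the sketch below shows one way “algorithmic deprioritization” of borderline content could work inside a feed-ranking step. It is a minimal illustration under stated assumptions, not any platform’s actual system: the `Post` structure, label names and weights are all hypothetical.

```python
# Minimal sketch of algorithmic deprioritization: labeled content is
# downranked rather than removed. All names and weights are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    base_score: float               # relevance score from an upstream ranker
    policy_labels: set[str] = field(default_factory=set)

# Hypothetical multipliers for policy labels ("harmful but lawful" content).
DEPRIORITIZATION_WEIGHTS = {
    "transphobic_borderline": 0.2,  # strongly downranked, still accessible
    "disinformation_tagged": 0.5,   # surfaced with a fact-check label
}

def adjusted_score(post: Post) -> float:
    """Apply the strongest applicable downranking multiplier."""
    multiplier = min(
        (DEPRIORITIZATION_WEIGHTS[label]
         for label in post.policy_labels
         if label in DEPRIORITIZATION_WEIGHTS),
        default=1.0,                # unlabeled content is untouched
    )
    return post.base_score * multiplier

def rank_feed(posts: list[Post]) -> list[Post]:
    # Content stays on the platform but surfaces less often.
    return sorted(posts, key=adjusted_score, reverse=True)

feed = rank_feed([
    Post("a", 0.9, {"transphobic_borderline"}),  # 0.9 * 0.2 = 0.18
    Post("b", 0.6),                              # unlabeled: 0.6
])
assert [p.post_id for p in feed] == ["b", "a"]
```

The design point matches the report’s preference for options short of removal: flagged content remains reachable, avoiding over-removal of lawful speech, while its amplification is curtailed.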

London: Global Network on Extremism and Technology (GNET), May 2024. 26p.

30 Years of Trends in Terrorist and Extremist Games 

By Emily Thompson and Galen Lamphere-Englund

Violent extremist, terrorist, and targeted hate actors have been actively exploiting video games to propagandize, recruit, and fundraise for more than 30 years. This report presents an analysis of that history using a unique dataset, the Extremist and Terrorist Games Database (ETGD), developed by the authors. It contains 155 reviewed entries of standalone games, modifications for existing games (mods), and browser‑based games dating from 1982 to 2024. The titles analyzed appear across the ideological spectrum: far right (101 titles), jihadist (24), far left (1), and other forms of extremism and targeted hate (29), including school‑massacre ideation (12). They span platforms ranging from simple standalone games for Atari in the 1980s to sophisticated mods for some of today’s most popular games. The number of titles has increased year on year, in line with global conflict and extremist ideological trends, revealing a continued push by malicious actors to exploit gaming. Meanwhile, the means of distribution have shifted from violent extremist organizations and marketplaces – white supremacist, neo‑Nazi, and jihadist outlets – to distributed repositories of extremist games hosted on internet archives, Ethereum‑hosted file‑sharing, Telegram, and, under subtly coded titles, mainstream platforms like Steam.

While most of the titles in the ETGD are available for free, several that have been sold (often at symbolic prices like $14.88 or $17.76) appear to have generated revenue for groups ranging from Hezbollah to the National Alliance, an American neo‑Nazi group. Through new analysis of Steam data, we also show that a small number of extremist and targeted hate titles have generated nearly $600,000 in estimated revenue for small publishers on the platform. Far from being a comprehensive analysis of the ETGD, this preliminary launch report is intended to form a basis for future research on the dataset and a framework for continued contributions to the ETGD from Extremism and Gaming Research Network (EGRN) members. Above all, we seek to contribute to sensible policymaking to prevent violent extremism – policymaking that situates games as part of a wider contested and exploited information space, which deserves far more attention from those working towards peaceful ends.

Complete recommendations are provided in the conclusion section of this report, but include the following:

  1. Prohibit and prevent violent extremist exploitation: Gaming platforms should explicitly prohibit violent extremist and terrorist behaviors and content. Leadership exists here from Twitch, Discord, Microsoft/Xbox, and the affiliated Activision‑Blizzard.

     a. Audio and video platforms, such as Spotify, Apple Music, and YouTube, should seek to identify extremist gaming content currently available under misleading titles and tags.

     b. Flag and remove extremist titles across platforms: hashing ETGD games and preventing outlinking to them should be a priority across platforms (a minimal sketch of hash-based matching follows this list).

  2. Improve reporting mechanisms: Platforms must improve reporting mechanisms to make it easier for players to report violative content found in games and in‑game conduct.

  3. Understand and take down distributed repositories: Larger repositories of extremist gaming content readily available on the surface web accelerate user exposure.

  4. Collaborate across sectors: Addressing the spread of extremist games requires a collaborative effort between tech companies, government agencies, and civil society organizations.

  5. Educate across sectors: Programmes supporting educators and frontline community moderators should be developed.

  6. Support research and innovation: Support should include cross‑sector initiatives like the Global Network on Extremism and Technology (GNET) and EGRN, which produced this database.

  7. Enhance regulatory frameworks: Governments should update regulatory frameworks applying to digital platforms, recognizing the nuances of gaming platforms and complying with human rights.

  8. Encourage positive community engagement: Thoughtful, well‑designed community guidelines, moderation policies, and reporting mechanisms can support community‑building.
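As a companion to recommendation 1b, the sketch below shows hash-based matching of uploaded files and outlinks against a shared blocklist. The hash set, domain list and function names are hypothetical stand-ins for a shared industry database; production systems often add perceptual hashing to catch re-encoded variants of the same title.

```python
# Minimal sketch of hash-and-block matching for known extremist game files
# and outlinks. The example hash below is the SHA-256 of the empty string,
# used purely so the demo runs; a real list would hold hashes of flagged
# binaries, and the blocked domain is a hypothetical placeholder.
import hashlib
from urllib.parse import urlparse

KNOWN_EXTREMIST_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}
BLOCKED_OUTLINK_DOMAINS = {"extremist-archive.example"}

def file_is_flagged(payload: bytes) -> bool:
    """Exact-match check of an uploaded file against the shared hash list."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_EXTREMIST_SHA256

def outlink_is_blocked(url: str) -> bool:
    """Block outlinks whose host is (or sits under) a listed repository domain."""
    host = (urlparse(url).hostname or "").lower()
    return host in BLOCKED_OUTLINK_DOMAINS or any(
        host.endswith("." + domain) for domain in BLOCKED_OUTLINK_DOMAINS
    )

assert file_is_flagged(b"")  # empty file matches the demo hash above
assert outlink_is_blocked("https://extremist-archive.example/mod.zip")
assert not outlink_is_blocked("https://store.steampowered.com/app/123")
```

Exact cryptographic hashes only catch byte-identical copies, which is why the report’s emphasis on shared, continually updated databases such as the ETGD matters: every re-upload or re-encode needs a fresh entry unless perceptual hashing is layered on top.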

London: The Global Network on Extremism and Technology (GNET), 2024. 40p.