Gaming and Extremism: The Extreme Right on Twitch

By Ciarán O’Connor

This briefing is part of ISD’s Gaming and Extremism series exploring the role online gaming plays in the strategy of far-right extremists in the UK and globally. It is part of a broader programme on the ‘Future of Extremism’ being delivered by ISD in the second half of 2021, charting the transformational shifts in the extremist threat landscape two decades on from 9/11, and the policy strategies required to counter the next generation of extremist threats. It provides a snapshot overview of the extreme right’s use of Twitch.

Twitch launched as a livestreaming service in 2011 focused on gaming and eSports, and was acquired by Amazon in August 2014 for $970 million. According to Twitch, the platform has over 30 million average daily visitors; almost half of all Twitch users are between 18 and 34 years old, while 21% are between 13 and 17. In the UK, based on the most recent Ofcom figures from 2019, Twitch accounts were held by 8% of 16-24 year olds, 3% of 25-34 year olds and 2% of 35-44 year olds.

Users typically stream themselves playing a game, and others can tune in to watch or interact with the gamer through the in-app chat function: the gamer responds by microphone to text questions, or to voice comments sent via a connected chat channel set up by the host gamer on another messaging platform such as Discord.

There are several ways for Twitch users to monetise their content, most of which are supported and facilitated by the platform. These include donations sent using the platform’s digital currency, Bits, via a third-party donations tool like Streamlabs, or via a payment platform like PayPal. Users can also earn revenue by running ads on their content or channel, through paid subscriptions from other Twitch users (followers), or through sponsorships and merchandise sales.

Extremist activists have used Twitch in the past to livestream.
The platform hosted numerous streams, primarily rebroadcasts or livestreams from other platforms, showing events inside the US Capitol in Washington DC on 6 January as protesters stormed the building. In response to past extremist threats, Twitch has instituted an in-house moderation team, which suspends or removes channels that breach its rules.

Twitch has also been used to promote extremist ideologies and broadcast terrorist attacks. In October 2019, a man killed two people during an attempted attack on a synagogue in Halle. The attack was livestreamed for 25 minutes on Twitch. According to the platform, only five viewers watched the video while it was live, while a recording generated automatically after the stream ended was viewed by 2,200 people in the 30 minutes it was available before it was flagged and removed. The Twitch account used to broadcast the attack was created about two months prior to the attack and had attempted to stream only once before.

In October 2020, Twitch updated its community guidelines to clarify and broaden its ban on terrorist and extremist content. Twitch does not allow content “that depicts, glorifies, encourages, or supports terrorism, or violent extremist actors or acts”; additionally, users may not display footage of terrorist or extremist violence “even for the purposes of denouncing such content.” In March 2021, Twitch released its first-ever transparency report, detailing its safety initiatives and efforts to protect users on the platform.

To better understand the current use of Twitch by the extreme right, and to analyse the overlap with gaming, we performed a scoping analysis of the platform, searching for keywords associated with extremist activity with the aim of identifying extremist accounts and content. In total we analysed 73 videos and 91 channels on the platform.
Key findings

• Content which expresses support for extreme right-wing ideologies can be discovered on Twitch with relative ease. These videos are probably better considered as sporadic examples of support for these ideologies on the platform, rather than representative of systemic use of Twitch by the extreme right for radicalisation and coordination. Nevertheless, this demonstrates that the platform still has a problem with extremist activity. ISD also discovered that there are, and have recently been, prominent extreme right-wing content creators active on the platform, but these appear to be low in number.

• Twitch is one of many livestreaming platforms favoured by extremists in the practice of “Omegle Redpilling.” This practice involves extremists using the live video chat platform Omegle to troll and spew racism towards others, while simultaneously livestreaming themselves to their own followers on another livestreaming platform. ISD found evidence of at least two such online extremists who have used Twitch for these purposes.

• Extreme right-wing activists are platform agnostic. Based on findings in this and other reports in this ISD series, there is growing evidence that extreme-right activists online adopt a multi-platform approach, using as many platforms as possible as part of a strategy to avoid moderation efforts.

• A Twitch account belonging to jailed white supremacist Paul Miller is still live. ISD discovered that a Twitch account run by Miller, a white supremacist who used multiple Twitch accounts to simultaneously broadcast hate on multiple video platforms, remains live. Though it features no content, it continues to grow in subscribers and serves as a promotional page for Miller and his hateful ideology.

• Streams of gaming did not appear to be used systemically to target, groom or recruit individuals on the platform. ISD did not find evidence that gaming content or communities on Twitch are routinely used by extremists to target, groom or recruit others.

• Counter-speech content which pushes back against the extreme right is widely accessible on Twitch. Counter-speech is a term for a tactic used by individuals and groups online to counter hate speech, extremism or misinformation by presenting critical responses, debates or alternative narratives in reaction to offensive narratives. ISD discovered an active anti-extremist, progressive community of counter-speech channels on the platform.

• Compared to other online platforms analysed in other reports in this series, Twitch does not appear to be a major hub for extreme right-wing communities, content creators or organisations. Notwithstanding some high-profile examples of extremist trolling, these appear to be isolated rather than evidence of systemic extremist mobilisation on the platform to reach large audiences, incite violence or recruit others.

Beirut | Berlin | London | Paris | Washington DC — Copyright © Institute for Strategic Dialogue, 2021. 13p.