

TRANSPARENCY REPORTING ON TERRORIST AND VIOLENT EXTREMIST CONTENT ONLINE, 4TH EDITION

By Nora Beauvais

This is the OECD’s fourth benchmarking report examining the policies and procedures related to terrorist and violent extremist content (TVEC) online, with a focus on transparency reporting, of the world’s top 50 most popular online content-sharing services (the “popular services”). Like the third edition, this report also covers the 50 online content-sharing services that terrorist and violent extremist groups and their supporters exploit or rely upon the most (the “intensive services”). The first three reports provide the benchmark against which this fourth report assesses relevant developments.

Terrorist and violent extremist actors continually adapt their methods to technological developments. As governments and online platforms increasingly take measures to curb the dissemination of TVEC, terrorists and violent extremists adjust their tactics to avoid content moderation. On mainstream online platforms, for example, they have developed techniques to evade automated detection tools. Meanwhile, sustained efforts by large platforms to combat TVEC have also caused a “displacement effect” whereby terrorists and violent extremists turn to alternatives (e.g. cloud platform websites, decentralised web technology, niche alt-platforms, and terrorist-operated websites).

Transparency reporting on TVEC online is crucial to assess the evolution and magnitude of the threat; to evaluate the effectiveness and efficiency of online platforms’ policies and actions to tackle this problem, as well as their impact on human rights; and to build an evidence base to support policymaking and regulatory frameworks.

The key findings of this report are:

1. The popular and intensive services are more diverse, both ideologically and geographically. The TVEC landscape is multi-faceted, encompassing a wide range of ideologies, from terrorist groups to violent extremist political movements and lone actors, and it is spreading across different types of content-sharing services and geographical regions. For the first time in this report series, the popular services’ list includes a gaming service. This is noteworthy because gaming services are increasingly used by terrorist and violent extremist actors. In addition, three Indian platforms have joined this ranking. As for the intensive services’ list, it features a self-proclaimed anarchist website for the first time and covers a wider spectrum of geographic regions and languages.

2. Overlap between the popular and intensive services remains low, highlighting the need to look at the TVEC landscape more comprehensively. Only ten services appear on both the popular and intensive lists, compared to 11 in the third benchmarking report. However, many policy discussions and responses still tend to focus on the largest platforms. Paired with the finding that the intensive services tend to be less transparent than the popular services (see below), the takeaway is that neglecting smaller but intensively exploited services risks under-scrutinising, or even turning a blind eye to, a core part of the problem.

3. The evidence shows mixed results regarding the clarity of the popular services’ definitions of TVEC, while most of the intensive services still do not define or even expressly prohibit TVEC. On the one hand, the definitions related to TVEC in the popular services’ policies and procedures are, overall, clearer than in the previous report. Services are using more comprehensive descriptions of TVEC and related concepts, but new gaps among the services’ approaches have emerged, with a proportion of them still using vague terminology (18%) or having become less precise. On the other hand, 60% of the intensive services still do not define or explicitly prohibit TVEC, or they simply have not established any governing documents.

4. Transparency reporting on TVEC reveals new gaps among popular services and remains rare among intensive services. Seventeen of the popular services now issue transparency reports with specific information on TVEC, compared to just five in the first edition, 11 in the second, and 15 in the third of this series. This represents the slowest year-to-year growth rate to date. For the first time in the series, one of the services (present on both the popular and the intensive services lists) that previously issued transparency reports with TVEC-specific information ceased this practice. In addition, three of the four newest services to issue transparency reports on TVEC provide very limited information, both quantitatively and qualitatively. Furthermore, there is still significant heterogeneity among the popular services’ reporting approaches, which continues to make data aggregation and cross-platform comparison difficult, if not impossible. Among the intensive services, only six issue transparency reports on their policies and actions concerning TVEC, down from eight previously, and the vast majority (five of six) also appear on the popular services’ list. The scarcity of transparency reporting on TVEC among the intensive services may be explained by the fact that many of them are operated by terrorist and violent extremist groups and their supporters, or by free speech “absolutists” who deliberately let TVEC flourish on their platforms.

5. Content moderation approaches continue to pose risks for privacy, freedom of expression and due process. Continuing a trend that began during the COVID-19 pandemic, the popular services rely increasingly on automated tools to detect and remove TVEC, which has generally increased the removal of lawful content and unjustified censorship. Furthermore, half of the intensive services remain opaque regarding their approaches to content moderation, and most of them either have no notification and appeal mechanisms in place or do not provide any information in this regard. This raises questions regarding their efforts to ensure respect for privacy, freedom of expression and due process.

6. New online safety laws and regulations are creating an increasingly fragmented transparency reporting landscape. As new online safety laws and regulations come into force, content-sharing services face new obligations to issue transparency reports in multiple jurisdictions, with different reporting requirements in each of them.

To conclude, this report highlights the need for more precision in the services’ governing documents; more consistency in the metrics and methodologies used to prepare transparency reports; more transparency in their content moderation approaches; and more efforts to ensure due process and to safeguard human rights and fundamental freedoms.