Misogynistic Pathways to Radicalisation: Recommended Measures for Platforms to Assess and Mitigate Online Gender-Based Violence
By Sara Bundtzen
This paper reviews online gender-based violence (OGBV) as existing within a continuum of on- and offline violence, emphasising its connections with different extremist ideologies, including the dissemination of terrorist and violent extremist content (TVEC). It aims to prioritise a gender perspective in responding to TVEC so that social media platforms can better intervene in and mitigate misogynistic pathways to radicalisation that can begin, or be reinforced, online. The discussion recognises that mitigating OGBV and online pathways to radicalisation requires a whole-of-society and whole-of-government approach. Whilst there are steps that governments and civil society can and should take, such as overseeing and enforcing emerging regulatory frameworks and voluntary commitments, this paper and its recommendations emphasise the role and actions of platforms.
Outlining the impact of OGBV at the micro (individual) and macro (societal) levels, the paper considers the role platforms can play in exacerbating the risks of OGBV, evaluating platform policies, content moderation practices, user interface design and algorithmic recommender systems. In this context, the paper asserts that researching and mitigating the risks of OGBV can enable earlier warning of, and intervention in, misogynistic pathways to different forms of violent extremism. Reiterating that any mitigation of risks must support users’ fundamental rights, including the rights to privacy and freedom of expression, the paper proposes and elaborates on the following key recommendations:
Enable API access to publicly available data for public interest research;
Develop gender-disaggregated and standardised transparency reporting;
Apply a victim-survivor-centred Safety and Privacy by Design approach;
Enhance cross-platform cooperation and information sharing of OGBV incidents (including actors and tactics);
Review content moderation policies, processes, and systems to acknowledge the continuum of violence and misogyny as a vector for violent extremism;
Apply intersectional feminist knowledge in risk assessments of AI-based systems;
Strengthen and encourage multi-stakeholder dialogue and collaboration.
Berlin: Institute for Strategic Dialogue, 2023. 31 p.