The Open Access Publisher and Free Library

SOCIAL SCIENCES


Hacking Diversity: The Politics of Inclusion in Open Technology Cultures

Christina Dunbar-Hester

"Hacking Diversity: The Politics of Inclusion in Open Technology Cultures" delves into the complex dynamics of diversity within the realm of open technology. The book critically examines the challenges and opportunities surrounding inclusivity, offering insights into how diversity can be effectively navigated and embraced in these fast-paced, innovation-driven environments. Through a blend of research, analysis, and real-world case studies, this book serves as a valuable resource for individuals and organizations seeking to cultivate more diverse and inclusive open technology cultures.

Princeton, NJ: Princeton University Press, 2020. 283p.

Pro-Palestine US Student Protests Nearly Triple in April

By Bianca Ho and Kieran Doyle

From the document: "Pro-Palestine demonstrations involving students in the United States have nearly tripled from 1 to 26 April compared with all of March, ACLED [Armed Conflict Location and Event Data] data show [...]. New York has been one of the main student protest battlegrounds since the Israel-Palestine conflict flared up in and around Gaza last October, and the arrest of more than 100 students at Columbia University in New York around 18 April heralded a new wave of campus demonstrations."

Armed Conflict Location & Event Data Project, 2 May 2024. 5p.

Grievance and Conspiracy Theories as Motivators of Anti-Authority Protests

By Timothy Cubitt, Anthony Morgan and Isabella Voce

Recent protest activity in Australia has related to a range of political and social causes, including climate change, women’s rights, pandemic-related government policies, and a range of ideological movements. While peaceful protests were held in parts of the country, some resulted in arrests, fines and violence (ABC News 2021; Bavas & Nguyen 2021). Over time, fringe and conspiratorial rhetoric increased across social media (De Coninck 2021) and began featuring more prominently in anti-authority ‘freedom’ protests (Khalil & Roose 2023). While the public health measures have ceased, these freedom protests—and related social movements—have persisted, and conspiratorial and far-right actors have become increasingly prominent in anti-government and anti-authority protests.

Trends & issues in crime and criminal justice no. 693. Canberra: Australian Institute of Criminology. 2024. 16p.

The Misperception of Organizational Racial Progress Toward Diversity, Equity, and Inclusion

By Brittany Torrez, LaStarr Hollie, Jennifer Richeson, and Michael Kraus

Despite a checkered racial history, people in the US generally believe the nation has made steady, incremental progress toward achieving racial equality. In this paper, the researchers investigate whether this US racial progress narrative extends to how the workforce views the effectiveness of organizational efforts surrounding diversity, equity, and inclusion (DEI). Across three studies (N = 1,776), they test whether Black and White US workers overestimate organizational racial progress in executive representation. Torrez, Hollie, Richeson, and Kraus also examine whether these misperceptions of organizational progress drive misunderstandings about the relative ineffectiveness of common organizational diversity policies. Overall, they find evidence that US workers largely overestimate organizational racial progress, believe that organizational progress will naturally improve over time, and that these misperceptions may drive beliefs in the effectiveness of DEI policies.

Evanston, IL: Northwestern University, Institute for Policy Research, 2024. 49p.

The interaction between online and offline Islamophobia and anti-mosque campaigns

By Gabriel Ahmanideen

In the aftermath of the war on terror, mosques have become targets for hate groups, which leverage online platforms to amplify global anti-mosque campaigns. These groups link local protestors with international hate networks, fuelling both online and offline (i.e., onsite) anti-mosque campaigns. Reviewing the literature on the interaction between online and offline Islamophobia, and examining an anti-mosque social media page that feeds the public both online and offline anti-mosque hate, this article suggests a strong interaction between the two. In the case study of the Stop Mosque Bendigo (SMB) page, purposeful sampling was used to collect postings before and after the Christchurch Mosque attacks to analyse the evolution of online anti-mosque campaigns in tandem with real-life hate cases. The literature and the case study reveal the interaction between the local and the global, the digital and the physical, as well as the convergence of everyday racism with extremist far-right ideologies like the Great Replacement theory. Relying on the present literature and indicative findings, the article advocates for systematic investigations to uncover the direct connection between online hate and physical attacks, and urges closer monitoring of, and accountability for, those online platforms and social media pages apparently contributing to onsite hate-driven actions.

Sociology Compass, 2023. 14p.

Dear Stephen: Race and Belonging 30 Years On

By Runnymede Trust

Racism has always been a matter of life and death. This was never more true than for Stephen Lawrence, a bright young man who dreamed of becoming an architect. Stephen was murdered by racist strangers as he made his way home with a friend in South East London, 30 years ago. It was not only his killers who targeted Stephen with racism. The behaviour of the police - from those first on the scene, to those who handled the disastrous investigation into his murder and dealt closely with his family - was characterised at every stage by racist treatment and bias in the system. Significant questions were raised on accountability in the criminal justice system and whether Black and minority ethnic communities and families were treated fairly. The fight for justice that followed, led by Stephen’s grieving parents, has brought us all to know Stephen’s name, and carry forward his legacy. The seminal 1999 Macpherson Report, published in direct response to the manner in which the police handled Stephen’s case, recognised unequivocally that the Metropolitan Police Force was ‘institutionally racist,’ an unprecedented finding at the time. Many events in the wake of Stephen’s murder, including race equality legislation, still inform and influence racial justice work today.

London: Runnymede Trust, 2023. 80p.

Fifth National Climate Assessment: Report-In-Brief

By Crimmins, Allison R.; Avery, Christopher W.; Easterling, David R.; Kunkel, Kenneth E.; Stewart, Brooke C.; Maycock, Thomas K.

From the document: "The Global Change Research Act of 1990 mandates that the US Global Change Research Program (USGCRP) deliver a report to Congress and the President not less frequently than every four years that 'integrates, evaluates, and interprets the findings of the Program and discusses the scientific uncertainties associated with such findings; analyzes the effects of global change on the natural environment, agriculture, energy production and use, land and water resources, transportation, human health and welfare, human social systems, and biological diversity; and analyzes current trends in global change, both human-induced and natural, and projects major trends for the subsequent 25 to 100 years.' The Fifth National Climate Assessment (NCA5) fulfills that mandate by delivery of this Assessment and provides the scientific foundation to support informed decision-making across the United States. By design, much of the development of NCA5 built upon the approaches and processes used to create the Fourth National Climate Assessment (NCA4), with a goal of continuously advancing an inclusive, diverse, and sustained process for assessing and communicating scientific knowledge on the impacts, risks, and vulnerabilities associated with a changing global climate (App. 1)."

U.S. Global Change Research Program. 2023. 144p.


Social Protest and Corporate Diversity

By Victor Viruena

The global economy has driven companies to develop strategies that adopt and promote diversity as a core value in their organizations. The blend of ethnicity, gender, and age strengthens internal ties and boosts productivity, creativity, and innovation. According to Esvary (2015), sharing best practices in managing and promoting workplace diversity is intended to strengthen diversity policies further. All around the world, society rejects discrimination, yet businesses remain reluctant to appoint women, LGBTQ+ people, young people, and people of different races and cultures to boards and top management positions. Lately, the rise of nationalism, racism, and political polarization has polluted the environment, making it more challenging to integrate minorities as decision-makers in organizations. The national protests against police brutality and racism have opened a new chapter in the U.S. civil rights movement; for the first time in history, Fortune 500 corporations were obligated to publicly stand against racism and take concrete actions to boost management diversity in medium-level positions in their organizations.

Academia Letters, Article 430. https://doi.org/10.20935/AL430. 5p.

2023-2024 CISA Roadmap for Artificial Intelligence

By United States. Cybersecurity & Infrastructure Security Agency

From the document: "As noted in the landmark Executive Order 14110, 'Safe, Secure, And Trustworthy Development and Use of Artificial Intelligence (AI),' [hyperlink] signed by the President on October 30, 2023, 'AI must be safe and secure.' As the nation's cyber defense agency and the national coordinator for critical infrastructure security and resilience, CISA [Cybersecurity & Infrastructure Security Agency] will play a key role in addressing and managing risks at the nexus of AI, cybersecurity, and critical infrastructure. This '2023-2024 CISA Roadmap for Artificial Intelligence' serves as a guide for CISA's AI-related efforts, ensuring both internal coherence as well as alignment with the whole-of-government AI strategy. [...] The security challenges associated with AI parallel cybersecurity challenges associated with previous generations of software that manufacturers did not build to be secure by design, putting the burden of security on the customer. Although AI software systems might differ from traditional forms of software, fundamental security practices still apply. Thus, CISA's AI roadmap builds on the agency's cybersecurity and risk management programs. Critically, manufacturers of AI systems must follow secure by design [hyperlink] principles: taking ownership of security outcomes for customers, leading product development with radical transparency and accountability, and making secure by design a top business priority. As the use of AI grows and becomes increasingly incorporated into critical systems, security must be a core requirement and integral to AI system development from the outset and throughout its lifecycle."

United States. Cybersecurity & Infrastructure Security Agency. November 2023. 21p.

Teaching 'Proper' Drinking? Clubs and Pubs in Indigenous Australia

By Maggie Brady

In Teaching ‘Proper’ Drinking?, the author brings together three fields of scholarship: socio-historical studies of alcohol, Australian Indigenous policy history and social enterprise studies. The case studies in the book offer the first detailed surveys of efforts to teach responsible drinking practices to Aboriginal people by installing canteens in remote communities, and of the purchase of public hotels by Indigenous groups in attempts both to control sales of alcohol and to create social enterprises by redistributing profits for the community good. Ethnographies of the hotels are examined through the analytical lens of the Swedish ‘Gothenburg’ system of municipal hotel ownership.

The research reveals that the community governance of such social enterprises is not purely a matter of good administration or compliance with the relevant liquor legislation. Their administration is imbued with the additional challenges posed by political contestation, both within and beyond the communities concerned.

Canberra: ANU Press, 2017. 344p.

When Protest Makes Policy: How Social Movements Represent Disadvantaged Groups

By Sirje Laurel Weldon

"A must-read for scholars across a broad sweep of disciplines. Laurel Weldon weaves together skillfully the theoretical strands of gender equality policy, intersectionality, social movements, and representation in a multimethod/level comparative study that unequivocally places women's movements at the center of our understanding of democracy and social change." ---Amy G. Mazur, Washington State University

"Laurel Weldon's When Protest Makes Policy expands and enriches our understanding of representation by stressing social movements as a primary avenue for the representation of marginalized groups. With powerful theory backed by persuasive analysis, it is a must-read for anyone interested in democracy and the representation of marginalized groups." ---Pamela Paxton, University of Texas at Austin

"This is a bold and exciting book. There are many fine scholars who look at women's movements, political theorists who make claims about democracy, and policy analysts who do longitudinal treatments or cross-sectional evaluations of various policies. I know of no one, aside from Weldon, who is comfortable with all three of these roles." ---David Meyer, University of California, Irvine

What role do social movements play in a democracy? Political theorist S. Laurel Weldon demonstrates that social movements provide a hitherto unrecognized form of democratic representation, and thus offer a significant potential for deepening democracy and overcoming social conflict. Through a series of case studies of movements conducted by women, women of color, and workers in the United States and other member nations of the Organisation for Economic Co-operation and Development (OECD), Weldon examines processes of representation at the local, state, and national levels. She concludes that, for systematically disadvantaged groups, social movements can be as important---sometimes more important---for the effective articulation of a group perspective as political parties, interest groups, or the physical presence of group members in legislatures. When Protest Makes Policy contributes to the emerging scholarship on civil society as well as the traditional scholarship on representation. It will be of interest to anyone concerned with advancing social cohesion and deepening democracy and inclusion, as well as to those concerned with advancing equality for women, ethnic and racial minorities, the working class, and poor people.

Ann Arbor: University of Michigan Press, 2011. 244p.

The Prospect of a Humanitarian Artificial Intelligence: Agency and Value Alignment

By Carlos Montemayor

In this open access book, Carlos Montemayor illuminates the development of artificial intelligence (AI) by examining our drive to live a dignified life. He uses the notions of agency and attention to consider our pursuit of what is important. His method shows how the best way to guarantee value alignment between humans and potentially intelligent machines is through attention routines that satisfy similar needs. Setting out a theoretical framework for AI, Montemayor acknowledges its legal, moral, and political implications and takes into account how epistemic agency differs from moral agency. Through his insightful comparisons between human and animal intelligence, Montemayor makes it clear why adopting a need-based attention approach justifies a humanitarian framework. This is an urgent, timely argument for developing AI technologies based on international human rights agreements.

London: Bloomsbury Academic, 2023. 297p.

From Bad To Worse: Amplification and Auto-Generation of Hate

By The Anti-Defamation League, Center for Technology and Society

The question of who is accountable for the proliferation of antisemitism, hate, and extremism online has been hotly debated for years. Are our digital feeds really a reflection of society, or do social media platforms and tech companies actually exacerbate virulent content themselves? The companies argue that users are primarily responsible for the corrosive content soaring to the top of news feeds and reverberating between platforms. This argument serves to absolve these multi-billion-dollar companies from responsibility for any role their own products play in exacerbating hate.

A new pair of studies from ADL and TTP (Tech Transparency Project) show how some of the biggest social media platforms and search engines at times directly contribute to the proliferation of online antisemitism, hate, and extremism through their own tools and, in some cases, by creating content themselves. While there are many variables contributing to online hate, including individual users’ own behavior, our research demonstrates how these companies are taking things from bad to worse.

For these studies, we created male, female, and teen personas (without a specified gender) who searched for a basket of terms related to conspiracy theories as well as popular internet personalities, commentators, and video games across four of the biggest social media platforms, to test how these companies’ algorithms would work. In the first study, three of four platforms recommended even more extreme, contemptuously antisemitic, and hateful content. One platform, YouTube, did not take the bait. It was responsive to the persona but resisted recommending antisemitic and extremist content, proving that it is not just a problem of scale or capability.

In our second study, we tested search functions at three companies, all of which made finding hateful content and groups a frictionless experience, by autocompleting terms and, in some cases, even auto-generating content to fill in hate data voids. Notably, the companies didn’t autocomplete terms or auto-generate content for other forms of offensive content, such as pornography, proving, again, that this is not just a problem of scale or capability.

What these investigations ultimately revealed is that tech companies’ hands aren’t tied. Companies have a choice in what to prioritize, including when it comes to tuning algorithms and refining design features to either exacerbate or help curb antisemitism and extremism.

As debates rage between legislators, regulators, and judges on AI, platform transparency, and intermediary liability, these investigations underscore the urgency for both platforms and governments to do more. Based on our findings, here are three recommendations for industry and government:

Tech companies need to fix the product features that currently escalate antisemitism and auto-generate hate and extremism. Tech companies should tune their algorithms and recommendation engines to ensure they are not leading users down paths riddled with hate and antisemitism. They should also improve predictive autocomplete features and stop auto-generation of hate and antisemitism altogether.

Congress must update Section 230 of the Communications Decency Act to fit the reality of today’s internet. Section 230 was enacted before social media and search platforms as we know them existed, yet it continues to be interpreted to provide those platforms with near-blanket legal immunity for online content, even when their own tools are exacerbating hate, harassment and extremism. We believe that by updating Section 230 to better define what type of online activity should remain covered and what type of platform behavior should not, we can help ensure that social media platforms more proactively address how recommendation engines and surveillance advertising practices are exacerbating hate and extremism, which leads to online harms and potential offline violence. With the advent of social media, the use of algorithms, and the surge of artificial intelligence, tech companies are more than merely static hosting services. When there is a legitimate claim that a tech company played a role in enabling hate crimes, civil rights violations, or acts of terror, victims deserve their day in court.

We need more transparency. Users deserve to know how platform recommendation engines work. This does not need to be a trade secret-revealing exercise, but tech companies should be transparent with users about what they are seeing and why. The government also has a role to play. We’ve seen some success on this front in California, where transparency legislation was passed in 2022. Still, there’s more to do. Congress must pass federal transparency legislation so that stakeholders (the public, researchers, and civil society) have access to the information necessary to truly evaluate how tech companies’ own tools, design practices, and business decisions impact society.

Hate is on the rise. Antisemitism both online and offline is becoming normalized. A politically charged U.S. presidential election is already under way. This is a pressure cooker we cannot afford to ignore, and tech companies need to take accountability for their role in the ecosystem.

Whether you work in government or industry, are a concerned digital citizen, or a tech advocate, we hope you find this pair of reports to be informative. There is no single fix to the scourge of online hate and antisemitism, but we can and must do more to create a safer and less hate-filled internet.

New York: ADL, 2023. 18p.

Countering the Challenge of Youth Radicalisation

By Kumar Ramakrishna

One significant highlight of the recent Singapore Terrorism Threat Assessment Report (STTAR) 2023 was that since 2015, 11 self-radicalised Singaporean youths aged 20 or below have been detained under the Internal Security Act (ISA). In addition, three of the four cases dealt with since the previous STTAR in 2022 involved youths. STTAR 2023 noted that the youngest detainee was only 15 years old.

The three Singaporean youths referenced in STTAR 2023 were all supporters of the Islamic State of Iraq and Syria (ISIS) and had been radicalised by Islamist extremist narratives online. The 15-year-old mentioned earlier was also a staunch supporter of the rival Al Qaeda (AQ) global terror network. An 18-year-old detainee apparently went so far as to have planned to declare Coney Island (about 133 hectares in size and lying very close to the main island of Singapore) to be an ISIS wilayat (province). He had also planned to travel to overseas conflict zones to fight alongside ISIS’s affiliates.

Such concern with youth radicalisation is not new. In 2018, Singaporean authorities had already observed that youth aged between 17 and 19 were “falling prey to extremist ideologies” through “heavy reliance” on “social media and the Internet” for information.

There are two observations that can be made in this regard.

Global and Regional Trends

First, youth radicalisation is not just a Singaporean, but a global and regional issue. Terrorism researchers J. M. Berger and Jessica Stern in their publication ISIS: The State of Terror (2015) affirmed that ISIS “actively recruits children” to engage in “combat, including suicide missions”. AQ is hardly different. US intelligence has long warned that AQ sought to radicalise western youth for the purpose of mounting terror strikes in the West – including suicide attacks.

The Malaysian government noted in 2017 that “around 80 per cent of the arrests that the Malaysian police” had made since September 2016 were of people “under the age of 40”. The same year, the Indonesian government estimated that about 101 youths had joined ISIS in Iraq and Syria. Analysts since 2018 have worried that the use of youth in terrorist attacks in Southeast Asia may well be a “future trend”.

Low-Tech, Lone-Actor Attacks

Second, STTAR 2023 observed that self-radicalised youth, rather than mounting complicated attacks using firearms and explosive materials that are difficult to procure in Singapore, could nevertheless “pivot towards other available means for conducting terrorist attacks, such as knives” in conducting so-called “lone actor” attacks.

This low-tech, lone-actor attack modality has been actively promoted by ISIS for years. A 2016 article in the online ISIS magazine Rumiyah enjoined supporters around the world to “stage knife attacks in public places”, as knives were easy to obtain and “effective weapons of terror”. In fact, it has been observed that the “use of knives by single jihadists is gaining popularity around the world”.

Understanding Youth Radicalisation

Youth radicalisation is a complex and multifaceted phenomenon. Three key explanatory factors can be outlined briefly.

First, at a psychological level, during the teenage years the executive reasoning centres of the brain develop more slowly than the emotional parts. This helps explain why teenagers between 18 and 20 years of age often appear impulsive and rash. Additionally, such emotional immaturity frequently expresses itself as a quest for absolute, black-and-white, intellectual and moral certainty.

Hence STTAR 2023 observes that the essentially “structured and dichotomous” extremist worldview appears as “more appealing to the young”. Emotionally vulnerable youth are also relatively susceptible to false extremist promises of excitement and thrills – all for an ostensibly righteous cause. In essence, because youth are in the midst of a “tumultuous biological, cognitive, social and emotional transition to adulthood”, they are relatively ripe targets for terrorist cultivation.

Second, experts have observed that youth coming from unstable family contexts with weak or no father figures tend to possess fragile egos and identities, ill-prepared to endure life’s ordinary challenges. Such youth, as James W. Jones in Blood That Cries Out from the Earth (2012) notes, tend to seek “external objects that claim to be perfect and ideal” and that supposedly offer “that necessary sense of connection to something of value” that can “buttress” their “self-esteem”.

This is precisely where ISIS and AQ propaganda strike home. The importance of stable families cannot be overstated. In Saudi Arabia, for instance, it was found that youth who had grown up “without their parents present” were at risk, as their “personal and social problems” appeared to “contribute to radicalisation”.

Third, youths growing up in subcultures that are relatively insulated from the wider community are also at risk. In particular, subcultures that promote exclusionary attitudes that are “self-righteous, prejudicial and condemnatory toward people outside their groups” may inadvertently soften the ground for future exploitation by extremists.

Meanwhile, subcultures that even passively promote retrograde norms of masculinity, tend to also pave the way for extremist ideologues to later persuade male youth that taking up violence against one’s putative enemies – including up-close-and-personal knife attacks – is to be a “real man” and “heroic”.

Policies Needed to Counter Youth Radicalisation

The foregoing analysis suggests that a suite of integrated policy interventions is needed in three broad areas to counter youth radicalisation.

First, policies are needed to directly foster strong and stable family contexts in Singapore. Ameliorating the societal economic and competitive pressures that generate stress levels negatively affecting parenting is important. Fundamentally, fostering a healthy family unit anchored by strong father figures and role models helps encourage normal ego and intellectual development in youth. This also strengthens their emotional and intellectual resilience against false extremist promises of absolute intellectual and moral certainties.

Second, cultural and other community institutions have a role in actively promoting inclusiveness. Such institutions could assist parents and communities in socialising their young into appropriate prosocial behaviour as they grow up in a secular, diverse and globalised multicultural society like Singapore. The community-building elements of the ongoing SG Secure campaign in Singapore have much relevance in the socialisation process.

Third, a central piece of the policy puzzle must be education. Ideally, whether secular or religious, the education of our youth should aim to broaden intellectual horizons. The core idea is to develop in youth intellectual resilience against the “simplified monocausal interpretation of the world” offered by ISIS and AQ – and other extremists – “where you are either with them or against them”.

Another key aspect of the educational space – religious and secular – is to promote healthy and balanced societal norms about masculinity. The aim is to create mental firewalls against attempts by online extremists to encourage more toxic and violent expressions of what it means to be male. In this context, as STTAR 2023 states, rather than travelling to conflict zones to fight, Singaporean youth should know that there are peaceful, legitimate and more effective ways to support good causes around the world, such as “the cause of helping Palestine”.

Conclusion

The United Nations Plan of Action to Prevent Violent Extremism urges that in the struggle against violent extremism, the world simply must “harness the idealism, creativity and energy of young people”. In this regard, the hearts and minds of Singaporean youth are absolutely a strategic battlespace that we must not ignore.

Singapore: S. Rajaratnam School of International Studies, NTU Singapore, 2023. 4p.

Bad Gateway: How Deplatforming Affects Extremist Websites

By Megan Squire

Deplatforming websites—removing infrastructure services they need to operate, such as website hosting—can reduce the spread and reach of extremism and hate online, but when does deplatforming succeed? This report shows that deplatforming can decrease the popularity of extremist websites, especially when done without warning. We present four case studies of English-language, U.S.-based extremist websites that were deplatformed: the Daily Stormer, 8chan/8kun, TheDonald.win/Patriots.win, and Nicholas Fuentes/America First. In all of these cases, the infrastructure service providers considered deplatforming only after highly publicized or violent events, indicating that at the infrastructure level, the bar to deplatforming is high. All of the site administrators in these four cases also elected to take measures to remain online after they were deplatformed. To understand how deplatforming affected these sites, we collected and analyzed publicly available data that measures website-popularity rankings over time.

We learned four important lessons about how deplatforming affects extremist websites:

  • It can cause popularity rankings to decrease immediately.

  • It may take users a long time to return to the website. Sometimes, the website never regains its previous popularity.

  • Unexpected deplatforming makes it take longer for the website to regain its previous popularity levels.

  • Replicating deplatformed services such as discussion forums or live-streaming video products on a stand-alone website presents significant challenges, including higher costs and smaller audiences.

Our findings show that fighting extremism online requires not only better content moderation and more transparency from social media companies but also cooperation from infrastructure providers like Cloudflare, GoDaddy, and Google, which have avoided attention and critique.

New York: Anti-Defamation League, Center for Technology and Society, 2023. 37p.

Making #BlackLivesMatter in the Shadow of Selma: Collective Memory and Racial Justice Activism in the U.S.

By Sarah J Jackson

It is clear in news coverage of recent uprisings for Black life that journalists and media organizations struggle to reconcile the fact of ongoing racism with narratives of U.S. progress. Bound up in this struggle is how collective memory, or rather whose collective memory, shapes the practices of news-making. Here I interrogate how television news shapes collective memory of Black activism through analysis of a unique moment when protests over police abuse of Black people became newsworthy simultaneously with widespread commemorations of the civil rights movement. I detail the complex terrain of nostalgia and misremembering that provides cover for moderate and conservative delegitimization of contemporary Black activism. At the same time, counter-memories, introduced most often by members of the Black public sphere, offer alternative, actionable, and comprehensive interpretations of Black protest.

Communication, Culture and Critique, Volume 14, Issue 3, September 2021, pp. 385–404.

Hate in the Lone Star State: Extremism & Antisemitism in Texas

By The Anti-Defamation League, Center on Extremism

Since the start of 2021, Texas has experienced a significant amount of extremist activity. One driver of this phenomenon is Patriot Front, a white supremacist group that has distributed propaganda across Texas – and the rest of the U.S. – with alarming frequency, using the state as a base of operations. Two other factors are extremists who continue to target the LGBTQ+ community and QAnon supporters who have gathered for conferences and rallies across the state.

Texas has also seen a significant increase in antisemitic incidents over the last two years. It recorded the country’s fifth-highest number of antisemitic incidents in 2022, at a time when ADL has tracked the highest-ever number of antisemitic incidents nationwide.

This report explores a range of extremist groups and movements operating in Texas and highlights the key extremist and antisemitic trends and incidents in the state in 2021 and 2022. It also includes noteworthy events and incidents from the first half of 2023.

Key Statistics

  • Antisemitic Incidents: According to the ADL’s annual Audit of Antisemitic Incidents, Texas has seen a dramatic rise in antisemitic incidents in recent years. In 2022, the number of incidents increased by 89% from 2021 levels, rising from 112 to 212 incidents. Since 2021, ADL has tracked a total of 365 incidents in the state.

  • Extremist Plots and Murders: In 2021 and 2022, ADL documented two extremist murders in Texas and six terrorist plots. In 2023, a gunman who embraced antisemitism, misogyny and white supremacy opened fire in a mall parking lot in Allen, killing eight people and wounding seven more before police shot and killed him.

  • Extremist Events: Since 2021, ADL has documented 28 extremist events in Texas, including banner drops, flash demonstrations, training events, fight nights, protests, rallies and meetings.

  • White Supremacist Propaganda: In 2022, ADL documented 526 instances of white supremacist propaganda distributions across Texas, a 60% increase from 2021 (329). There have been 1,073 propaganda incidents since 2021. The groups responsible for the majority of the incidents were Patriot Front and the Goyim Defense League (GDL).

  • Hate Crimes Statistics: According to the latest FBI hate crimes statistics from 2021, there were 542 reported hate crimes in Texas in that year, an increase of 33% from the 406 incidents recorded in 2020. Hate and bias crime data in Texas and nationally highlights how hate crimes disproportionately impact the Black community.

  • Insurrection Statistics: Seventy-four of the 968 individuals logged by the George Washington University Program on Extremism who have been charged in relation to the January 6, 2021 attack on the U.S. Capitol are Texas residents, the second most in the nation.

  • ADL and Princeton’s Bridging Divides Initiative Threats and Harassment Dataset: The Threats and Harassment Dataset (THD) tracks unique incidents of threats and harassment against local U.S. officials between January 1, 2020, and September 23, 2022, in three policy areas (election, education and health). Texas recorded seven incidents of threats and harassment against local officials.

New York: ADL, Center on Extremism, 2023. 23p.

Hate in the Prairie State: Extremism & Antisemitism in Illinois

By The Anti-Defamation League

In May 2023, a man outraged over abortion rights set his sights on a building in Danville, Illinois, that was slated to become a clinic offering women’s health services, including abortions. The man, Philip Buyno of Prophetstown, allegedly filled containers with gasoline and loaded them into his car. His alleged efforts to destroy the clinic – by ramming his car into the building and throwing a gas can into the space – failed, and he was arrested. He later told the FBI he’d “finish the job” if given the chance.

Buyno was an extremist, intent on attacking his perceived enemy no matter the cost. Over the past several years, Americans have witnessed a barrage of extremist activity: attacks on our democratic institutions, antisemitic incidents, white supremacist propaganda efforts, vicious, racially motivated attacks, bias crimes against the LGBTQ+ community and violent threats to women’s healthcare providers.

Illinoisans have watched these same hatreds – and more – manifest in their own state.

This report explores a range of extremist groups and movements operating in Illinois and highlights the key extremist and antisemitic trends and incidents in the state in 2021 and 2022. It also includes noteworthy events and incidents from the first half of 2023.

There is no single narrative that tells the story of extremism and hate in Illinois. Instead, the impact is widespread and touches many communities. As in the rest of the country, both white supremacist and antisemitic activity have increased significantly over the last two years, but that’s not the whole story.

The Prairie State is also home to a sizeable number of current and former law enforcement officers who have at one point belonged to or associated with extremist organizations or movements. Our research additionally shows a continued threat to Illinois’s women’s health facilities, which have been targeted with arson and other violent plots by anti-abortion extremists. This reflects the broader, national threat to reproductive rights.

Key Statistics

Antisemitic Incidents: According to ADL’s annual Audit of Antisemitic Incidents, Illinois has seen a dramatic rise in antisemitic incidents in recent years. In 2022, the number of incidents increased by 128% from 2021 levels, rising from 53 to 121. The state’s total was the seventh-highest number of incidents in the country in a year when ADL tracked the highest-ever number of antisemitic incidents nationwide. This is a dramatic increase from 2016, when there were 10 incidents. Preliminary numbers through June 2023 indicate that there have been at least 33 additional antisemitic incidents in the state.

Extremist Plots and Murders: In 2021 and 2022, ADL documented one extremist murder in Illinois. In November 2022, a man allegedly intentionally drove the wrong way on an interstate highway and crashed into another car, killing the driver. The man said he wanted to kill himself after being convicted for crimes committed while participating in the January 6 insurrection, and he has been charged with additional crimes, including first-degree murder.

Extremist Events: Since 2021, ADL has documented four white supremacist extremist events in Illinois, predominantly marches and protests.

White Supremacist Propaganda: In 2022, ADL documented 198 instances of white supremacist propaganda distributions across Illinois, an increase of 111% from 2021 (94). Through May 2023, there have been an additional 64 white supremacist propaganda incidents. Patriot Front was responsible for a large majority of white supremacist propaganda throughout Illinois.

Hate Crimes Statistics: According to the latest FBI hate crimes statistics available, there were 101 reported hate crimes in Illinois that targeted a variety of communities, including Jewish, Black and Asian American and Pacific Islander. This total was an increase of 80% from the 56 incidents recorded in 2020.

Insurrection Statistics: Thirty-six of the 968 individuals logged by the George Washington University Program on Extremism who have been charged in relation to the January 6, 2021 attack on the U.S. Capitol are Illinois residents.

ADL and Princeton University’s Bridging Divides Initiative Threats and Harassment Dataset: The Threats and Harassment Dataset (THD) tracks unique incidents of threats and harassment against local U.S. officials between January 1, 2020, and September 23, 2022, in three policy areas (election, education and health). Illinois recorded six incidents of threats and harassment against local officials.

New York: ADL, Center on Extremism, 2023. 24p.

From Bad to Worse: Auto-generating & Autocompleting Hate

By The Anti-Defamation League, Center for Technology and Society

Executive Summary

Do social media and search companies exacerbate antisemitism and hate through their own design and system functions? In this joint study by the ADL Center for Technology and Society (CTS) and the Tech Transparency Project (TTP), we investigated search functions on both social media platforms and Google. Our results show how these companies’ own tools, such as autocomplete and auto-generation of content, made finding and engaging with antisemitism easier and faster. In some cases, the companies even helped create the content themselves.

Key Findings:

  • Facebook, Instagram, and YouTube are each hosting dozens of hate groups and movements on their platforms, many of which violate the companies’ own policies but were easy to find via search. Facebook and Instagram, in fact, continue hosting some hate groups that parent company Meta has previously banned as “dangerous organizations.”

  • All of the platforms made it easier to find hate groups by predicting searches for the groups as researchers began typing them in the search bar.

  • Facebook automatically generated business Pages for some hate groups and movements, including neo-Nazis. Facebook does this when a user lists an employer, school, or location in their profile that does not have an existing Page, regardless of whether it promotes hate.

Our researchers compiled a list of 130 hate groups and movements from ADL’s Glossary of Extremism, picking terms that were tagged in the glossary with all three of the following categories: “groups/movements,” “white supremacist,” and “antisemitism.” The researchers then typed each term into the respective search bars of Facebook, Instagram and YouTube, and recorded the results.

The study also found that YouTube auto-generated channels and videos for neo-Nazi and white supremacist bands, including one with a song called “Zyklon Army,” referring to the poisonous gas used by the Nazis for mass murder in concentration camps.

  • In a final test, researchers examined the “knowledge panels” that Google displays on search results for hate groups and found that Google in some cases provides a direct link to official hate group websites and social media accounts, increasing their visibility and ability to recruit new members.

New York: Anti-Defamation League, Center for Technology and Society, 2023. 18p.

From Bad to Worse: Algorithmic Amplification of Antisemitism and Extremism

By The Anti-Defamation League, Center for Technology and Society

The question of who is accountable for the proliferation of antisemitism, hate, and extremism online has been hotly debated for years. Are our digital feeds really a reflection of society, or do social media platforms and tech companies actually exacerbate virulent content themselves? The companies argue that users are primarily responsible for the corrosive content soaring to the top of news feeds and reverberating between platforms. This argument serves to absolve these multi-billion-dollar companies from responsibility for any role their own products play in exacerbating hate. A new pair of studies from ADL (the Anti-Defamation League) and TTP (Tech Transparency Project) show how some of the biggest social media platforms and search engines at times directly contribute to the proliferation of online antisemitism, hate, and extremism through their own tools and, in some cases, by creating content themselves. While there are many variables contributing to online hate, including individual users’ own behavior, our research demonstrates how these companies are taking things from bad to worse. For these studies, we created male, female, and teen personas (without a specified gender) who searched for a basket of terms related to conspiracy theories as well as popular internet personalities, commentators, and video games across four of the biggest social media platforms, to test how these companies’ algorithms would work. In the first study, three of four platforms recommended even more extreme, contemptuously antisemitic, and hateful content. One platform, YouTube, did not take the bait. It was responsive to the persona but resisted recommending antisemitic and extremist content, proving that it is not just a problem of scale or capability. 
In our second study, we tested search functions at three companies, all of which made finding hateful content and groups a frictionless experience, by autocompleting terms and, in some cases, even auto-generating content to fill in hate data voids. Notably, the companies didn’t autocomplete terms or auto-generate content for other forms of offensive content, such as pornography, proving, again, that this is not just a problem of scale or capability. What these investigations ultimately revealed is that tech companies’ hands aren’t tied. Companies have a choice in what to prioritize, including when it comes to tuning algorithms and refining design features to either exacerbate or help curb antisemitism and extremism. As debates rage between legislators, regulators, and judges on AI, platform transparency, and intermediary liability, these investigations underscore the urgency for both platforms and governments to do more.

New York: The Anti-Defamation League, Center for Technology and Society, 2023. 36p.