Open Access Publisher and Free Library

CRIMINAL JUSTICE

CRIMINAL JUSTICE-CRIMINAL LAW-PROCEDURE-SENTENCING-COURTS

Posts in Criminal Justice
SherlockAI and the Sentencing Review: AI-Assisted Radical Help

By: Dave Nicholson and Helen Codd

The Independent Sentencing Review highlights the potential of AI for supporting behavioural change in criminal justice-involved people and identifies SherlockAI as deserving further exploration and evaluation to realise that potential (p.135). SherlockAI was co-founded by criminal justice-involved people in partnership with the authors, and in this article, we explain how SherlockAI offers a distinctive and innovative resource. Both authors are part of the SherlockAI team, and this short article offers insights into why the innovative approach of this particular app resonates with the findings of the review at a time when there are ongoing emergent developments in mobile-based technologies to encourage and support desistance (Bartels, 2023; Knight et al., 2024; McGreevy, 2017; Morris and Graham, 2019) and ongoing interest in Hilary Cottam’s work on radical help and radical care (Cottam, 2018; Cottam, 2021).

WOMEN, SENTENCING, AND SYSTEMIC CHANGE: IMPLEMENTING THE REVIEW IN A GENDERED CJS

By: Phoebe Lil, Advance Charity

The publication of the Independent Sentencing Review (ISR) in May 2025 provoked a diverse range of reactions from the specialist women’s sector. For some, particularly those delivering services responding to Violence Against Women and Girls (VAWG), there was trepidation about the impact on victim/survivors of measures designed to ease the prison capacity crisis. By contrast, specialist organisations working with justice-experienced women welcomed the range of measures that would have an overall positive impact on criminalised women.

But what do the women affected by these recommendations think? This article explores several thematic ISR recommendations, subsequently accepted by the Government, in the context of women’s experiences of existing interventions. Drawing on the experience of Advance – a leading women’s charity that supports women in contact with the criminal justice system and those who have experienced, or are at risk of, domestic abuse and other forms of gender-based violence – we will examine how the recommended measures can be implemented to best meet the needs of women who have been victimised, criminalised or – as is often the case – both.

The article will begin by demonstrating how a lack of adequate victim response can result in women committing offences, including examples from Advance’s services for criminalised women. As is well documented, women who offend are much more likely than the general population to have experienced some form of abuse, domestic or other.

Drawing on insights from services and best practice by Advance and partners, we then highlight how ISR recommendations should be implemented to ensure women’s safety. Finally, we consider where gaps in the recommendations remain, and where the Government must invest to deliver a truly whole-system reform of a CJS that works for women, enabling both the successful implementation of the ISR and other government ambitions, including the Women’s Justice Board and halving VAWG in a decade.

Racial and Ethnic Disparities in Felony Case Processing in New York State

By New York State Division of Criminal Justice Services Justice Lab

This report describes an analysis of racial and ethnic disparities in felony case processing in New York State at three processing points: arrests made in 2019, disposition of those arrests, and prison sentences imposed after convictions resulting from those arrests. The year 2019 was chosen as the benchmark because arrests made that year occurred prior to the implementation of landmark changes to the state’s bail law and its evidence and information disclosure (discovery) laws. As a result, this analysis provides an overview of how the system functioned prior to those reforms and before the full impact of the COVID-19 pandemic, which disrupted all facets of the state’s criminal justice system.


Albany, NY: New York State Division of Criminal Justice Services, Justice Lab, 2025. 22p.

The Scam Economy: The True Cost of Online Scams and Crimes in America

By Consumer Federation of America

Federal agencies, third parties, and other groups report on scam losses each year, but these numbers are only the tip of the iceberg in measuring the size and devastation experienced by those who are targeted. Behind these reports and big spreadsheets describing reported losses are shattered families, rent money lost, and grandmothers exploited. Newer technology is leading to a rise in these scams – in both severity and number: AI is supercharging scams, social media platforms are enabling their spread, and data brokers facilitate the targeting of victims, allowing criminals to reach consumers at massive scale while exploiting highly precise profiling to victimize vulnerable people. One of the biggest obstacles to fully understanding the scope of these scams is underreporting. Because of fragmented reporting channels and the understandable devastation, embarrassment, and confusion that victims often feel, conservative government estimates suggest that only a small fraction of victims – often in the single digits as a percentage of the actual number – ever report their losses. CFA is proud to publish this report, which takes the most conservative estimate of underreporting and uses it to estimate The True Cost of Scams. While this issue is complicated to solve completely, there are significant unrealized opportunities for legislators, enforcement agencies, and industry to step up and address it.

Washington, DC: Consumer Federation of America, 2026, 31p.

CHICAGO POLICE TRAINING TEACHES OFFICERS THAT THEIR LIVES MATTER MORE THAN COMMUNITY LIVES

Public Report on Chicago Police Training on the Use of Force

From the introduction: This Report from community representatives of Chicago’s Use of Force Community Working Group offers our feedback on the Chicago Police Department’s (CPD) training on de-escalation and the use of force. The Working Group was first convened in the summer of 2020 in response to the requirements of the federal civil rights Consent Decree designed to bring an end to the CPD’s pattern of police brutality and racial discrimination. Over the course of two years, the Working Group persuaded the CPD to make transformative changes to its policies governing police use of force.1 Last fall, we issued a Public Report on CPD’s new policies, including areas still in need of change.2 The new policies, if implemented and enforced on the ground, have the potential to dramatically reduce unnecessary CPD violence and improve public safety.

Second Report of the Community Representatives of Chicago’s Use of Force Working Group. March 2023. 24p.

Prevention Beyond Deterrence

By Benjamin A. Barsky

This Article reconceptualizes preventive justice—the public safety paradigm that seeks to prevent harm before it occurs. Scholars have long documented how cities have advanced this paradigm through largely punitive measures, notably variants of broken windows policing, which posit that aggressive misdemeanor enforcement deters more serious crime. Yet in the aftermath of the 2020 George Floyd protests, and as underscored recently in City of Grants Pass v. Johnson, these measures have faced a legitimacy crisis—prompting calls for nonpunitive responses to nonviolent incidents. This Article establishes a preventive justice approach that advances health and safety without emphasizing crime deterrence. It draws on fieldwork research on alternative emergency response programs (“Alternative Responses”) that proliferated after the 2020 protests to replace police in health crises and other nonviolent incidents. Data include interviews with fifty individuals and over two hundred hours of observations in Oakland, California; Dayton, Ohio; and Madison, Wisconsin.

Chatbots in the Criminal Justice System: An Overview of Chatbots and Their Underlying Technologies and Applications

By Camello, M. L., Houston-Kolnik, J. D., & Planty, M.

This technology brief explores the use of chatbots within the criminal justice system. The goal of this brief is to orient the reader to chatbots, present foundational insights from real-world examples of chatbot use, highlight considerations for implementation, and discuss the future of chatbots in the criminal justice system.

Key Takeaways

  • There are numerous benefits to implementing chatbots, including:
    - improved efficiency for users accessing information,
    - enhanced community engagement by creating a 24/7 communications channel,
    - expanded access to justice through multilingual chatbot capabilities,
    - reduced costs by automating FAQ support traditionally done through live chat, and
    - reduced staff workloads.

  • Chatbots carry inherent risks that decision-makers need to consider before implementation, including:
    - misinterpretation of user input leading to incorrect responses,
    - biased training data, and
    - vulnerability to hacking.

  • Advancements in AI have enhanced and will continue to enhance chatbot capabilities and applications; however, despite these advancements, deploying AI-driven chatbots is not a “plug-and-play” opportunity for criminal justice applications.

Research Triangle Park, NC: RTI International, 2021. 15p.

Landscape Study of Generative Artificial Intelligence in the Criminal Justice System

By Smith, J., Camello, M., & Planty, M.

  Generative artificial intelligence (AI) refers to AI1 used to create content, such as text, images, music, audio, and videos.2 Generative AI offers many potential benefits, enabling users to automate, augment, and accelerate a wide range of workflows, from simple administrative tasks like transcription and translation to more-complex functions such as investigation and decision support. In the criminal justice system, generative AI offers promising solutions to address human resource and budget challenges, allowing practitioners to focus on more-impactful work. Generative AI–integrated tools may enhance data analysis, improve detection and objective assessment of evidence, and streamline administrative processes. However, its integration, particularly in the criminal justice domain, raises some concerns, including potential biases, privacy issues, and the need for rigorous oversight to ensure effective implementation. It is unclear whether these tools can deliver on their promised efficiencies in practice, as evidenced by early research evaluating time savings of implementing AI-assisted report writing software.3 These concerns highlight the necessity for addressing bias and accuracy, maintaining strict data privacy and security protocols, and promoting transparency and accountability in AI-driven decisions and processes. 
This report is intended to help criminal justice decision-makers do the following:

  • Understand what generative AI is and how it relates to the criminal justice system
  • Identify how generative AI may be applied to tasks and jobs within the criminal justice system and the potential benefits, realities, and limitations
  • Consider the technical, operational, and governance factors that may influence adoption and implementation
  • Understand what makes up the generative AI technology stack and how models can be trained

Key Takeaways

  • Generative AI represents an acceleration and advancement in technological innovation that already impacts the criminal justice system and will continue to do so—it is no longer a question of if or when, but how and to what extent.

  • Generative AI–powered software tools may offer many potential benefits, such as improving efficiency and augmenting capabilities across an extremely broad set of applications for criminal justice system stakeholders. Although these products hold promise, little empirical evidence currently supports or refutes promised benefits from these products.

  • Generative AI models can be deployed in various forms, including cloud-based models that centralize data processing and federated models that enable decentralized training across multiple locations, preserving data privacy and enhancing security for sensitive criminal justice applications.

  • Decision-makers should be aware of the substantial technical, operational, and governance risks associated with generative AI–powered software tools prior to implementation.

  • Responsible use of generative AI requires addressing bias and accuracy concerns, maintaining strict data privacy and security protocols, adhering to ethics and legal standards, and promoting transparency and accountability in AI-driven decisions and processes.

  • Generative AI technology is evolving faster than the legal or policy environment for AI—the criminal justice community must be proactive and must implement robust internal training and policy frameworks rather than relying solely on external legal or regulatory guidance.

Research Triangle Park, NC: RTI International, 2025. 28p.

Above the Law? NYPD Violations of the Public Oversight of Surveillance Technology (POST) Act

By Eleni Manis, PhD, and Albert Fox Cahn, Esq.

In this report, S.T.O.P. documents the New York City Police Department’s (NYPD’s) repeated failures to comply with New York City’s Public Oversight of Surveillance Technology Act (POST Act). Enacted in 2020, the POST Act is the first law to oversee the NYPD’s use of surveillance technology; as a first attempt at regulation, it requires only that the NYPD disclose its surveillance tools. As this report establishes, the NYPD falls far short of the reporting norms set by other police departments subject to similar surveillance technology oversight laws. The report concludes by calling on the New York City Council to use its oversight authority to ensure that the law is not ignored. S.T.O.P. also recommends that individual lawmakers and civil society organizations continue to evaluate potential litigation, seeking judicial intervention to compel the NYPD to comply with the POST Act.

Scarily Precise: Location Tracking with Ultra-Wideband

By Mahima Arya, Juilee Shivalker, Maxwell Votey, Jackie Singh, Albert Fox Cahn, Esq., Eleni Manis, PhD, MPA

Ultra-Wideband (“UWB”) is a short-range wireless technology somewhat like Bluetooth or Wi-Fi, but with superior locating abilities, enabling the highly accurate identification of an object’s position in three-dimensional space. UWB capabilities are now standard on many newer-model smartphones, allowing users to track UWB-enabled beacons from their smartphones and allowing vendors to leverage inbuilt UWB capabilities to create massive sensor networks using unwitting users’ mobile devices. This report focuses on privacy and antitrust concerns surrounding UWB beacons, the tip of the iceberg of planned UWB applications. Because UWB’s locating abilities are so precise, beacons provide an easy way to track and stalk people. Beacons pass detailed device location data through neighboring devices’ networks, introducing the twin risks of malicious hacking and commercial exploitation by vendors. Apple and Amazon have acted to shut competitors out of the UWB beacon space, demonstrating a clear linkage between the story of these beacons and the larger story of Big Tech antitrust concerns.

The Spy Next Door: The Danger of Neighborhood Surveillance Apps

By Paula Garcia-Salazar, Nina Loshkajian, Albert Fox Cahn, Esq., Eleni Manis, PhD, MPA

Everyone sees them: signs “welcoming” you to a neighborhood with the warning that “All Suspicious Persons and Activities Reported to Law Enforcement.” In the 1960s, Neighborhood Watch groups proliferated in supposed response to increased burglaries.1 In fact, the groups appear to have been a direct response to increased residential integration.2 A brainchild of the National Sheriffs Association, Neighborhood Watch groups were touted as a way to increase community involvement in crime prevention by encouraging residents to patrol their own streets and act as the eyes and ears of the local police.3 But too often, local residents have interpreted this as a chance to become vigilantes, in many cases acting purely on bias to raise false alarms and profile fellow community members, endangering the very people these groups are allegedly designed to protect.4 The groups have proliferated across the country even though they have been shown to promote profiling and distract from actual public safety, with little evidence that Neighborhood Watch programs reduce crime.5 Now, a new form of Neighborhood Watch is here: smartphone-based apps that supplant the classic program with online forums for local neighborhoods. Nextdoor (27 million regular users),6 Neighbors by Ring (10 million users),7 Citizen (7 million users),8 and the recently piloted Facebook Neighborhoods deliver “hyperlocal” updates to geographic “neighbors.” Nextdoor and Facebook Neighborhoods invite users to post a range of local-interest content: upcoming events, business reviews, goods for sale, and so on. But the backbone of neighborhood surveillance apps is crime, both real and imagined. Apps encourage users to upload video footage, photos, and descriptions of suspected crimes and supposedly “suspicious” people near their homes, producing reports riddled with dog whistles and overt racism.
Apps court a police audience for these posts and enable police to request app users’ video footage, photos, and input. Apps even drive user engagement by inviting bystanders to join in on the crime-oriented conversation: as on Facebook, users can comment, “like,” and otherwise interact with posts at the click of a button.

NYC Internet Remastered: A Privacy & Equity Analysis of the New York City Internet Master Plan

By Albert Fox Cahn, Esq., and Caroline Magee

On January 7, 2020, New York City released its Internet Master Plan. The document identified how many New Yorkers lacked access to broadband and what the City intended to do about it. The numbers were staggering: 46% of New York households in poverty lack a home broadband subscription.[1]

But what had been a problem evolved into a crisis when the COVID-19 pandemic descended on New York City in March. Suddenly, New Yorkers had to stay home: as New York’s 1.1 million public school students logged into Zoom for the first time, and their parents tried to take phone meetings in the same rooms, it became clear that the internet, once a luxury, was now a necessity.

As the fall semester loomed, the de Blasio administration tried to close the gap in July 2020, investing $157 million to provide low- or no-cost internet to 600,000 New Yorkers, one-third of whom live in New York City Housing Authority housing.[2] The City is scrambling. In this light, a plan to expand internet access for residents of New York City is much needed and reflects the modern reality that reliable, affordable internet access is a prerequisite for reaching public services and economic opportunities. What is missing from the City’s Internet Master Plan, however, is a needed degree of specificity about the privacy and cybersecurity protections built into this planned internet expansion.

2021 NYC Hikvision Camera Census

By S.T.O.P.

In this first annual surveillance census, S.T.O.P. sought to map all of the internet-enabled cameras operating in New York City. Even as many companies hide the location of their surveillance equipment, the Chinese-based firm Hikvision still allows its devices to be located, and the results are shocking. We identified 16,692 cameras in New York City alone. This page details where Hikvision cameras are located across the five boroughs. While the numbers are extraordinarily high, please remember that for every Hikvision camera we have mapped, there are dozens, possibly hundreds, of other camera systems whose locations remain hidden.

January 6th: A Surveillance Review

By S.T.O.P.

Our review of the 146 individuals who pleaded guilty in connection with the January 6th insurrection shows that facial recognition and other surveillance technologies are not needed to properly identify suspects. Department of Justice data documents only 3 cases that used facial recognition. In contrast, the vast majority of cases used low-tech and less-invasive techniques, with 104 cases relying on tips from the public.

ShotSpotter and the Misfires of Gunshot Detection Technology

By Helen Webley-Brown, Anna Sipek, Katie Buoymaster, Juliee Shivalker, Will Owen, Eleni Manis, PhD, MPA

U.S. cities are squandering money on ShotSpotter’s unproven gunshot surveillance technology. 

  • ShotSpotter surveillance increases police activity, but it wastes officers’ time. One major study of the technology showed that ShotSpotter fails as an investigative tool, providing no evidence of a gun-related crime more than 90% of the time and producing exceedingly few arrests (less than 1 per 200 stops) and recovered guns (less than 1 per 300 stops).

  • ShotSpotter fails the Black and Latinx communities where it appears to be disproportionately deployed. The tool increases police activity and the risk of police violence without producing any significant effect on firearm offenses or on shooting victims’ medical outcomes.

Wiretaps on Wheels

By Evan Enzer, Anna Sipek, Mahima Arya, Nina Loshkajian, David Siffert, Eleni Manis, PhD, MPA

New cars are surveillance on wheels, sending sensitive passenger data to carmakers and police. Cars also store enormous amounts of passenger data onboard, where police can extract it using specialized tools. We estimate that law enforcement agencies could have accessed car data hundreds of thousands of times in 2020.

  • Constitutional loopholes allow access to most data on cars without a warrant. Police can access information from car-connected phones and online accounts without the warrant typically required.

  • U.S. immigration agencies weaponize car data. Other law enforcement agencies are poised to follow suit if they are not already doing so.

  • New legislation, enforcement of existing data protection laws, and responsible car design and data storage policies can shift car data surveillance into reverse.

Privatizing The Surveillance State: How Police Foundations Undermine Rule of Law

Police foundations allow police departments to secretly fund controversial programs and equipment.

  • Foundations invest in dangerous surveillance tools like predictive policing software, digital surveillance platforms, cellphone hacking devices, and robotic spy dogs. 

  • Foundations allow departments and officers to accept gifts from contractors in a way that would normally be illegal for city employees.

  • Foundations violate good-government standards for city agencies and transparency standards for nonprofit organizations. Ideally, they should be abolished, but at a minimum, cities must end untraceable donations and corporate influence peddling.

The Trojan Horse

By Evan Enzer, Arjun Ravi, Julian Melendi, Sohini Upadhyay, Eleni Manis, PhD, MPA

“Smart home” devices record audio and video in the home—and even collect daily schedules and health details. Once collected, this data is less than a warrant or data breach away from police and hacker access. Across the board, smart home devices have superior, privacy-protecting alternatives that perform the same key functions. The law doesn’t protect smart home users. “Do not buy” is the best advice until it does.

Obstructed Justice: NYC's Biased License Plate Enforcement

By Eleni Manis, PhD, MPA, and Alexander Hughes, PhD

As NYC relies increasingly on traffic cameras, the NYPD has pulled over more and more drivers for a minor traffic infraction—license plate obstruction—particularly in precincts with the most BIPOC residents.

  • This problem is only getting worse: NYPD’s racist enforcement gap doubled between 2016 and 2021.

  • Automated traffic enforcement shouldn’t lead to more in-person traffic stops. Cities should study cameras' effects and adjust policing policies to ensure that cameras don't contribute to the over-policing of BIPOC communities.

Guilt By Association: How Police Databases Punish Black and Latinx

By Andy Ratto, Nina Loshkajian, Eleni Manis, PhD, MPA

  • Police increasingly replace stop-and-frisk practices with databases that crudely profile Black and Latinx youth based on their neighborhoods, peer groups, and clothing.

  • These databases ruin lives: police typecast minority youths as gang members without evidence, putting them at risk of false arrest and wrongful deportation.

  • Many police departments refuse to implement due process safeguards despite clear evidence that their databases are based on racial profiling, not evidence.

  • Even the most rigorous safeguards would be insufficient to mitigate the full range of harms that these databases pose. They must be eliminated in their entirety.