Open Access Publisher and Free Library

HUMAN RIGHTS

Human Rights-Migration-Trafficking-Slavery-History-Memoirs-Philosophy

The New Jim Crow: Unmasking Racial Bias in AI Facial Recognition Technology within the Canadian Immigration System

By Gideon Christian

Facial recognition technology (FRT) is an artificial intelligence (AI)-based biometric technology that utilizes computer vision to analyze facial images and identify individuals by their unique facial features. This sophisticated AI technology uses advanced computer algorithms to generate a biometric template from a facial image. The biometric template contains unique facial characteristics represented by dots, which can be used to match identical or similar images in a database for identification purposes. The biometric template is often likened to a unique facial signature for each individual.

A significant rise in the deployment of AI-based FRT has occurred in recent years across the public and private sectors of Canadian society. Within the public sector, its application encompasses law enforcement in criminal and immigration contexts, among many others. In the private sector, it has been used for tasks such as exam proctoring in educational settings, fraud prevention in the retail industry, unlocking mobile devices, sorting and tagging of digital photos, and more. The widespread use of AI facial recognition in both the public and private sectors has generated concerns regarding its potential to perpetuate and reflect historical racial biases and injustices. The emergence of terms like “the new Jim Crow” and “the new Jim Code” draws a parallel between the racial inequalities of the post-US Civil War Jim Crow era and the racial biases present in modern AI technologies. These comparisons underscore the need for a critical examination of how AI technologies, including FRT, might replicate or exacerbate systemic racial inequities and injustices of the past.

This research paper seeks to examine critical issues arising from the adoption and use of FRT by the public sector, particularly within the framework of immigration enforcement in the Canadian immigration system. It delves into recent Federal Court of Canada litigation relating to the use of the technology in refugee revocation proceedings by agencies of the Canadian government. By delving into these legal cases, the paper will explore the implications of FRT on the fairness and integrity of immigration processes, highlighting the broader ethical and legal issues associated with its use in administrative processes.

The paper begins with a concise overview of the Canadian immigration system and the administrative law principles applicable to its decision-making process. This is followed by an examination of the history of integrating AI technologies into the immigration process more broadly. Focusing specifically on AI-based FRT, the paper will then explore the issues of racial bias associated with its use and discuss why addressing these issues is crucial for ensuring fairness in the Canadian immigration process. This discussion will lead to a critical analysis of Federal Court litigation relating to the use of FRT in refugee status revocation, further spotlighting the evidence of racial bias in the technology's deployment within the immigration system.

The paper will then proceed to develop the parallels between racial bias evident in contemporary AI-based FRT (the “new” Jim Crow) and racial bias of the past (the “old” Jim Crow). By focusing on the Canadian immigration context, the paper seeks to uncover the subtle, yet profound ways in which AI-based FRT, despite its purported neutrality and objectivity, can reinforce racial biases of the past. Through a comprehensive analysis of current practices, judicial decisions, and the technology's deployment, this paper aims to contribute to the ongoing dialogue about technology and race. It challenges the assumption that technological advancements are inherently equitable, urging a re-evaluation of how these tools are designed, developed, and deployed, especially in sensitive areas such as refugee status revocation, where the stakes for fairness and equity are particularly high.

69 McGill Law Journal 441 (October 2024)

A Hazard to Human Rights

By Human Rights Watch

The 61-page report, “A Hazard to Human Rights: Autonomous Weapons Systems and Digital Decision-Making,” finds that autonomous weapons, which select and apply force to targets based on sensor inputs rather than human inputs, would contravene the rights to life, peaceful assembly, privacy, and remedy, as well as the principles of human dignity and non-discrimination. Technological advances and military investments are now spurring the rapid development of autonomous weapons systems that would operate without meaningful human control.

Human Rights Watch, April 28, 2025. 61p.

Unzipping Detention From Deportation

By Mary Holper

Alleged mandatory immigration detainees are unable to access federal court review of whether they are illegally detained without a bond hearing. The conviction that causes a detainee to be deportable also causes mandatory detention, so that the substantive findings in the deportation litigation path and detention litigation path overlap, even though their consequences differ. In this situation, habeas courts invoke 8 U.S.C. § 1252(b)(9), the “zipper clause,” a 1996 statute barring habeas petitions. With § 1252(b)(9), Congress intended to “zip” all claims “arising from any action taken or proceeding brought to remove” a noncitizen into a single circuit court petition for review of a final removal order. But the detention and deportation litigation paths are two sides of an unmatched zipper. One path leads to deportation while the other leads to detention without a bond hearing pending the decision on deportation.

This article exposes a problem that, while under-litigated in immigration detention law, is robbing alleged mandatory detainees of their right to access habeas corpus in order to challenge their illegal detention. As a solution, this article proposes the “Great Writ,” habeas corpus, as a remedy. Because the alleged mandatory detainees do not seek review of their removal orders, and seek only release from custody, they invoke the “core” of habeas corpus. Although a federal appellate court will ultimately review the substantive legal question that causes both their deportation and detention, that review comes too little and too late. Thus, it provides no adequate substitute for habeas corpus because there is no meaningful opportunity to demonstrate, to a politically independent adjudicator, that the noncitizen is illegally detained. For these detainees, § 1252(b)(9) has proven to be an ill-fitted zipper that allows illegal executive detention to continue for months and years. If the core of the Suspension Clause is to mean anything, it must guard these detainees’ liberty interests.

Boston College Law School, Boston College Law School Legal Studies Research Paper No. 634, 2024. 53p.

AI Executive Order and Considerations for Federal Privacy Policy [January 25, 2024]

By Meghan M. Stuessy

Several links are embedded in the original text. From the document: "On October 30, 2023, President Biden issued Executive Order (E.O.) 14110 on 'Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.' This E.O. advances a coordinated approach to the responsible development and use of artificial intelligence (AI) and directs agencies to mitigate privacy risks and bias potentially exacerbated by AI, including 'by AI's facilitation of the collection or use of information about individuals, or the making of inferences about individuals.' [...] The E.O. focuses on three priorities relating to privacy: 1. Identifying and evaluating agency use of commercially available information (CAI); 2. Revising existing privacy requirements for the adoption of AI, including privacy impact assessments (PIAs); and 3. Encouraging agency use of PETs [privacy-enhancing technologies]."

Library of Congress, Congressional Research Service, 2024.