The Open Access Publisher and Free Library

CRIME

CRIME-VIOLENT & NON-VIOLENT-FINANCIAL-CYBER

Posts tagged Evidence verification
Deepfakes on Trial: A Call To Expand the Trial Judge’s Gatekeeping Role To Protect Legal Proceedings from Technological Fakery

By Rebecca A. Delfino

Deepfakes—audiovisual recordings created using artificial intelligence (AI) technology to believably map one person’s movements and words onto another—are ubiquitous. They have permeated societal and civic spaces from entertainment, news, and social media to politics. And now deepfakes are invading the courts, threatening our justice system’s truth-seeking function. Ways deepfakes could infect a court proceeding run the gamut and include parties fabricating evidence to win a civil action, government actors wrongfully securing criminal convictions, and lawyers purposely exploiting a lay jury’s suspicions about evidence. As deepfake technology improves and it becomes harder to tell what is real, juries may start questioning the authenticity of properly admitted evidence, which in turn may have a corrosive effect on the justice system. No evidentiary procedure explicitly governs the presentation of deepfake evidence in court. The existing legal standards governing the authentication of evidence are inadequate because they were developed before the advent of deepfake technology. As a result, they do not solve the urgent problem of how to determine when an audiovisual image is fake and when it is not. Although legal scholarship and the popular media have addressed certain facets of deepfakes in the last several years, there has been no commentary on the procedural aspects of deepfake evidence in court. Absent from the discussion is who gets to decide whether a deepfake is authentic. This Article addresses the matters that prior academic scholarship on deepfakes overlooks. It is the first to propose a new addition to the Federal Rules of Evidence reflecting a novel reallocation of fact-determining responsibilities from the jury to the judge, treating the question of deepfake authenticity as one for the court to decide as an expanded gatekeeping function under the Rules.
The challenges of deepfakes—problems of proof, the “deepfake defense,” and juror skepticism—can be best addressed by amending the Rules for authenticating digital audiovisual evidence, instructing the jury on its use of that evidence, and limiting counsel’s efforts to exploit the existence of deepfakes.

Hastings Law Journal, 2023. 57p.

Challenges Trial Judges Face When Authenticating Video Evidence in the Age of Deepfakes

By Taurus Myhand

The proliferation of deepfake videos has resulted in rapid improvements in the technology used to create them. Although the use of fake videos and images is not new, advances in artificial intelligence have made deepfakes easier to make and harder to detect. Basic human perception is no longer sufficient to detect deepfakes. Yet, under the current construction of the Federal Rules of Evidence, trial judges are expected to do just that. Trial judges face a daunting challenge when applying the current evidence authentication standards to video evidence in this new reality of widely available deepfake videos. This article examines the gatekeeping role trial judges must perform in light of the unique challenges posed by deepfake video evidence. This article further examines why the jury instruction approach and the rule change approaches proposed by other scholars are insufficient to combat the grave threat of false video evidence. This article concludes with a discussion of the affidavit of forensic analysis (AFA) approach, a robust response to the authentication challenges posed by deepfakes. The AFA approach preserves most of the current construction of the Federal Rules of Evidence while reviving the gatekeeping role of the trial judge in determining the admissibility of video evidence. The AFA will provide trial judges with the tools necessary to detect and exclude deepfake videos without leaving an everlasting taint on the juries that would have otherwise seen the falsified videos.

Widener Law Review, 2023. 19p.