The Open Access Publisher and Free Library

CRIME

CRIME - VIOLENT & NON-VIOLENT - FINANCIAL - CYBER

Posts tagged Technology
Future Crimes: Inside The Digital Underground And The Battle For Our Connected World


Marc Goodman

In "Future Crimes: Inside The Digital Underground And The Battle For Our Connected World," author Marc Goodman delves into the dark and complex world of cybercrime. He explores the ways in which technology has transformed criminal activities, from hacking and identity theft to cyberterrorism and digital espionage. Goodman sheds light on the threats that the digital age poses to individuals, organizations, and governments, urging readers to become more vigilant and informed about cybersecurity. Through detailed research and gripping real-life stories, "Future Crimes" offers a compelling and sobering look at the vulnerabilities of our interconnected world.

New York: Anchor Books, a division of Penguin Random House LLC, 2016. 601p.

Going Dark: The Inverse Relationship between Online and On-the-Ground Pre-offence Behaviours in Targeted Attackers

By Julia Kupper and Reid Meloy

This pilot study examines the correlation of online and on-the-ground behaviours of three lone-actor terrorists prior to their intended and planned attacks on soft targets in North America and Europe: the Pittsburgh synagogue shooter, the Buffalo supermarket shooter and the Bratislava bar shooter. The activities were examined against the proximal warning indicator of energy burst from the Terrorist Radicalization Assessment Protocol (TRAP-18), originally defined as an acceleration in the frequency or variety of preparatory behaviours related to the target. An extensive quantitative and qualitative assessment of primary and secondary sources was conducted, including raw data from different tech platforms (Gab, Discord and Twitter, now X) and open-source materials such as criminal complaints, superseding indictments and court trial transcripts. Preliminary findings from this small sample suggest an inverse relationship between the online and offline behaviours across all three perpetrators. The average interval between the decision to attack and the actual attack was five months, with an elevation of digital activities in the three months leading up to the incident, along with some indications of offline planning. In the week prior to the event, social media activity decreased (specifically on the day before the acts of violence, with two subjects going completely dark) while terrestrial preparations increased. On the day of the incident, all assailants accelerated their tactical on-the-ground actions and resurfaced in the online sphere to publish their final messages in the minutes or hours before the attack. It appears that energy burst behaviours in the digital sphere and the offline actions can be measured in both frequency and variety. Operational implications of this negative correlation are suggested for intelligence analysts, counter-terrorism investigators and threat assessors.

London: The Global Network on Extremism and Technology (GNET), 2023. 36p.
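
The inverse relationship the authors describe can be illustrated with a rough numerical sketch. The example below is not drawn from the study's data or methods; it uses entirely hypothetical weekly counts of online posts and offline preparatory acts for a single case, and simply shows how a negative correlation between the two series could be quantified.

```python
# Purely illustrative sketch: hypothetical weekly counts, not data from the study.
from statistics import correlation  # Python 3.10+

# Weeks 1-20 before a (hypothetical) attack.
online_posts = [3, 4, 2, 5, 6, 7, 9, 11, 10, 12, 14, 13, 15, 16, 18, 17, 12, 8, 4, 1]
offline_prep = [0, 0, 1, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 5, 6, 7, 9, 11]

# Over the final weeks digital activity falls while terrestrial preparation rises,
# so the Pearson coefficient comes out strongly negative.
last_eight = slice(-8, None)
r = correlation(online_posts[last_eight], offline_prep[last_eight])
print(f"Pearson r over the last eight weeks: {r:.2f}")
```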

Deepfakes on Trial: A Call To Expand the Trial Judge’s Gatekeeping Role To Protect Legal Proceedings from Technological Fakery

By Rebecca A. Delfino

Deepfakes—audiovisual recordings created using artificial intelligence (AI) technology to believably map one person’s movements and words onto another—are ubiquitous. They have permeated societal and civic spaces from entertainment, news, and social media to politics. And now deepfakes are invading the courts, threatening our justice system’s truth-seeking function. Ways deepfakes could infect a court proceeding run the gamut and include parties fabricating evidence to win a civil action, government actors wrongfully securing criminal convictions, and lawyers purposely exploiting a lay jury’s suspicions about evidence. As deepfake technology improves and it becomes harder to tell what is real, juries may start questioning the authenticity of properly admitted evidence, which in turn may have a corrosive effect on the justice system. No evidentiary procedure explicitly governs the presentation of deepfake evidence in court. The existing legal standards governing the authentication of evidence are inadequate because they were developed before the advent of deepfake technology. As a result, they do not solve the urgent problem of how to determine when an audiovisual image is fake and when it is not. Although legal scholarship and the popular media have addressed certain facets of deepfakes in the last several years, there has been no commentary on the procedural aspects of deepfake evidence in court. Absent from the discussion is who gets to decide whether a deepfake is authentic. This Article addresses the matters that prior academic scholarship on deepfakes obscures. It is the first to propose a new addition to the Federal Rules of Evidence reflecting a novel reallocation of fact-determining responsibilities from the jury to the judge, treating the question of deepfake authenticity as one for the court to decide as an expanded gatekeeping function under the Rules. The challenges of deepfakes—problems of proof, the “deepfake defense,” and juror skepticism—can be best addressed by amending the Rules for authenticating digital audiovisual evidence, instructing the jury on its use of that evidence, and limiting counsel’s efforts to exploit the existence of deepfakes.

Hastings Law Journal, 2023. 57p.

Challenges Trial Judges Face When Authenticating Video Evidence in the Age of Deepfakes

By Taurus Myhand

The proliferation of deepfake videos has resulted in rapid improvements in the technology used to create them. Although the use of fake videos and images is not new, advances in artificial intelligence have made deepfakes easier to make and harder to detect. Basic human perception is no longer sufficient to detect deepfakes. Yet, under the current construction of the Federal Rules of Evidence, trial judges are expected to do just that. Trial judges face a daunting challenge when applying the current evidence authentication standards to video evidence in this new reality of widely available deepfake videos. This article examines the gatekeeping role trial judges must perform in light of the unique challenges posed by deepfake video evidence. This article further examines why the jury instruction and rule change approaches proposed by other scholars are insufficient to combat the grave threat of false video evidence. This article concludes with a discussion of the affidavit of forensic analysis (AFA) approach, a robust response to the authentication challenges posed by deepfakes. The AFA approach preserves most of the current construction of the Federal Rules of Evidence while reviving the gatekeeping role of the trial judge in determining the admissibility of video evidence. The AFA will provide trial judges with the tools necessary to detect and exclude deepfake videos without leaving an everlasting taint on the juries that would have otherwise seen the falsified videos.

Widener Law Review, 2023. 19p.

Spaceless violence: Women’s experiences of technology-facilitated domestic violence in regional, rural and remote areas

By Bridget Harris and Delaine Woodlock

This project explored the impact of technology on victim–survivors of intimate partner violence in regional, rural or remote areas who are socially or geographically isolated. Specifically, it considered the ways that perpetrators use technology to abuse and stalk women, and how technology is used by victim–survivors to seek information, support and safety. Interviews and focus groups with 13 women were conducted in regional, rural and remote Victoria, New South Wales and Queensland. The findings showed that perpetrators used technology to control and intimidate women and their children. While this impacted women and children’s lives in significant ways, causing fear and isolation, the use of technology was often not viewed as a serious form of abuse by justice agents. 

Canberra: Australian Institute of Criminology, 2022. 81p.