Discrimination, Fairness and Prediction in Policing: Fare Evasion in New York City
By Nicolas S. Rothbacher
Predictive policing has quickly become widespread in the United States. Practitioners claim it can greatly increase police efficiency and ground decisions in objective statistics. Critics counter that these algorithms reproduce the discriminatory outcomes of a biased justice system. In this thesis, I investigate fare enforcement in New York City and what might happen if predictive policing were applied to it. First, I analyze legal precedents in discrimination law to create a framework for understanding whether a policy is legally discriminatory. In this framework, the fairness of a government policy is judged by how different groups are treated in the process of carrying out the policy. Three elements must be examined: a comparison group that is treated fairly, a discriminatory burden on the disadvantaged group, and government negligence or intent. Next, using this framework, I analyze data on fare evasion arrests in New York City and find evidence of discrimination. Finally, I examine predictive policing to determine what its effect on fare enforcement might be. I conclude that predictive policing algorithms trained on these arrest data would be ineffective and seen as unfair because of the institutional practices that shape the data. This examination sheds light on how machine learning fairness can be analyzed in terms of societal expectations of fairness.
Cambridge, MA: Massachusetts Institute of Technology, Institute for Data, Systems, and Society, 2020. 54p.