Algorithmic Profiling: The Use of Risk Assessment Technology and Predictive Policing
Kostas A. Hinkle
LITERATURE REVIEW
In its news article The Good, Bad and Ugly of New Risk-Assessment Tech in Criminal Justice, published on February 16, 2020, the American Bar Association discusses the controversial use of computer technology and algorithms that offer advice to judges and parole officers on decisions about individuals’ sentencing. The article draws on a National Judicial College poll of 369 judges and their opinions on the matter, and it includes excerpts from interviews with several other judges throughout, allowing one to view the issue from their perspective. The article itself accurately depicts both sides of the argument.
The article by Pamela Ugwudike titled Digital Prediction Technologies in the Justice System: The Implications of a “Race-Neutral” Agenda appeared this year, 2020, in the journal Theoretical Criminology. The article takes an analytical look at the correlation between racial bias and the risk assessment and prediction technologies used in justice systems across Western nations. Ugwudike argues that the evidence that these technologies harm and overpredict minorities, particularly Black people, is alarming. Although the technologies in use all comply with the laws in place that are meant to guarantee racial equality, the outcomes are still greatly skewed. She proposes that the idea of “race neutrality” ends up doing more damage than good.
Keith Kirkpatrick’s article It’s Not the Algorithm, It’s the Data was published in February 2017 in the society section of the Communications of the ACM journal. This article likewise examines the data these criminal justice technologies are using and claims that while the issue may not lie in the patterns themselves, the programs are built on prejudiced data. The article describes and analyzes specific tools that are being used
across the U.S. such as COMPAS, also known as Correctional Offender Management Profiling
for Alternative Sanctions, and the companies which sell them to Criminal Justice offices.
MIT Technology Review featured an article by Will Douglas Heaven, published July 17, 2020 on its online news feed. The article, titled Predictive Policing Algorithms Are Racist. They Need to Be Dismantled., states exactly what the title suggests. Heaven presents a substantial amount of quantitative and qualitative evidence, all pointing to these technologies being filled with racial biases, whether intentional or not. The article also discusses the problems with allowing these particular issues to continue working their way through the United States justice system. The lack of transparency surrounding these technologies is called into question, as are proposals for how to move forward while still utilizing these artificial intelligence tools.
Algorithmic Profiling: The Use of Risk Assessment Technology and Predictive Policing
The technology of risk assessment and predictive policing has been in use within the
Criminal Justice System for years, yet it is something that is rarely audited or overseen. The
purpose behind the initial creation was to alleviate possible bias in sentencing, allow for judges
to oversee more cases at a time, assist probation officers in their decisions, and help police
properly intercept crime before it happens. While these are all well-intentioned motives, critics claim that the actual operation of this technology has only worsened bias in the Criminal
Justice System. These technologies use data sets accrued from what some say are decades of
prejudiced policing and statistics from a system that has a reputation for its racial biases.
Contrarily, several police departments state that these technologies allow them to do their jobs better and judges to sentence more fairly. The main issue lies with the third-party
companies that design this technology, and the criminal justice data they utilize in the creation of
the algorithms. There is plenty of evidence that minority groups, particularly people of color and even more so Black Americans, continue to be hurt most by these systems. If
the technology was truly created to eliminate all racial bias, one would expect a change in the
overall system, which has not occurred. Therefore, the question remains: do the technologies
involved in predictive policing and risk assessment perpetuate systemic racism within the Criminal Justice System?
The United States has a tainted history when it comes to structural inequality. Sociologists refer to the ranking of groups of people based on their differences as social stratification. A
society’s stratification is proven in the social sciences to have “significant consequences for its
members’ attitudes, behavior, and, perhaps most important of all, life chances,” (Barkan, 2014).
Stratification arises from several factors and is perhaps most evident within the Criminal Justice System. While there are many arguments to be made regarding why the prison and court systems are disproportionately filled with the Black population, the arbitrariness of the system becomes apparent once comparisons are made. For example, Black people are 30% more likely to be pulled over by the
police (Langton, 2013). Black defendants receive 20% longer prison sentences than their White counterparts, even after accounting for criminal records and violent criminal histories, where applicable (Demographic Differences in Sentencing, 2018). Black men are 2.5 times more likely
to be killed by a police officer than a white man. Black women are 1.4 times more likely than a
white woman (PNAS, 2019). Notably, violent crime rates do not appear to be related to police
shootings or fatalities (Mapping Police Violence, 2019). Disparities throughout the justice
system begin at arrest and persist through sentencing and imprisonment. The use of technologies such as AI in predictive policing, and of algorithms that assess recidivism risk, has not cured these ills the justice system faces, but only made them more convenient for Criminal Justice officials.
The technology behind predictive policing is an algorithm described in the article It’s Not the
Algorithm, it’s the Data as “using data analytics and algorithms to better pinpoint where and
when a crime might occur,” (Kirkpatrick, 2017). The most popular software used amongst police
departments in the U.S. is PredPol, created by mathematicians at the University of California and
officers and crime analysts from the Los Angeles and Santa Cruz police departments. This
technology uses data to provide a prediction for police on when and where crime is most likely
to occur, based on past crime reports (Kirkpatrick, 2017). The chief scientist at Santa Cruz’s
PredPol, Inc. stated they are “using algorithms that go through historical crime reports… for the
officers to allocate their resources,” (Kirkpatrick, 2017). This specific tool has been called into
question on multiple occasions for its racial implications and unfair distribution of police officers
in communities that are mostly non-white. An activist in Los Angeles named Hamid Khan fought
for years to get the LAPD to stop using predictive policing. He demanded an audit of the tool, but in March of 2019 he was told nothing could be done because the tool itself
is “too complicated” (Heaven, 2020). The issue being ignored is that those in power remain the
most powerful if those being affected by their decisions and actions are unable to even challenge
the ways in which they are being discriminated against. Conflict theory, a sociological framework, holds that society is in a perpetual state of conflict because of an unfair imbalance in the allocation of resources and fairness amongst groups in society. One could determine conflict theory is at play in this scenario because social order is being maintained through dominance and power rather than consensus.
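The allocation mechanism Kirkpatrick describes can be illustrated with a deliberately minimal sketch. The data, area names, and function are hypothetical, and real systems such as PredPol use far more elaborate statistical models; the point is only that a predictor ranking areas by counts in historical crime reports will keep sending patrols wherever past reports concentrated, whether those reports reflect actual crime or merely past policing intensity.

```python
from collections import Counter

# Hypothetical historical crime reports: each entry is the area where an
# incident was recorded. Recorded incidents reflect where police patrolled
# in the past, not necessarily where crime actually occurred.
historical_reports = ["Area A"] * 60 + ["Area B"] * 25 + ["Area C"] * 15

def predict_hotspots(reports, n_patrols):
    """Rank areas by past report counts and assign patrols to the top areas."""
    counts = Counter(reports)
    ranked = [area for area, _ in counts.most_common()]
    return ranked[:n_patrols]

# Patrols are directed to wherever reports were densest in the past.
print(predict_hotspots(historical_reports, 2))  # ['Area A', 'Area B']
```

Nothing in this sketch asks whether the 60 reports from Area A reflect more crime or simply more officers there to record it, which is precisely the criticism raised above.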
The Biased Results from Predictive Policing Technologies, in turn, make it more likely that a person of color will have an altercation with a police officer. Evidence shows that these predictive policing tools result in the over-policing of communities made up mostly of minority groups, particularly Black people (Ugwudike, 2020). This is likely due to
the crime reports in use originating from a history of racial bias within American policing. If these technologies base their predictions on patterns from the past, they will only halt progress in a society fighting to move beyond racial division.
Researchers have even found these tools being used to create new ways of ensnaring citizens. For example, Rashida Richardson, the director of policy research at the AI Now Institute, has discovered
that in some states: “police were warning people on lists that they were at high risk of being
involved in gang related crime and asking them to ‘take actions to avoid this’. If they were later
arrested for any type of crime, prosecutors used the prior warning to seek higher charges.”
(Heaven, 2020). This is not the only instance of a situation like this occurring, and the practice has been accused of
being a “digital form of entrapment” (Heaven, 2020). Classifying a specific area or demographic as more prone to crime is far too similar to profiling, which has major negative implications for Black Americans most of all. These types of tools seemingly make it more likely that a police officer will stop or arrest individuals based on prejudice rather than need. This further supports conflict theory by maintaining the class divisions within society.
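The feedback loop described above can be sketched in a toy simulation (all numbers are hypothetical): two areas share the same true offense rate, but one starts with more recorded incidents because it was patrolled more heavily in the past, and since incidents are only recorded where officers are sent, the initial disparity sustains itself.

```python
import random

random.seed(0)

# Two hypothetical areas with the SAME underlying offense rate; Area A simply
# starts with more recorded incidents due to heavier past patrolling.
TRUE_RATE = 0.3
recorded = {"Area A": 40, "Area B": 10}

def patrol_share(recorded):
    """Allocate patrol effort in proportion to recorded incident counts."""
    total = sum(recorded.values())
    return {area: count / total for area, count in recorded.items()}

for year in range(10):
    for area, share in patrol_share(recorded).items():
        patrols = int(share * 100)  # 100 patrol-shifts split by recorded history
        # Incidents are only recorded where officers are present, so detections
        # track patrol intensity, not the (equal) underlying offense rates.
        detections = sum(1 for _ in range(patrols) if random.random() < TRUE_RATE)
        recorded[area] += detections

# Area A's patrol share stays inflated even though both areas offend equally.
print(patrol_share(recorded))
```

Even with identical behavior in both areas, the area that was over-policed at the start remains over-policed at the end, because the system is fed its own output as evidence.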
Risk Assessment Algorithms are largely based on encounters with the police and arrest records. This is an unfair foundation, as we have now seen just how prejudiced the predictive policing tools can be, which makes people of color more likely to end up having an interaction with the police in the first place. The risk assessment technologies themselves do not mention the color of a person’s skin, owing to the laws put in place to attempt to “guarantee” racial equality amongst these tools. Unfortunately, that lack of acknowledgment creates additional damage. While the technology does not outwardly state or refer to a person’s race, it remains racially divisive. The structural conditions of these algorithms generate racialized
outcomes. The perspectives of the vendors who create these technologies, and of those who collaborate with them, can include not only human error but also biased opinions. The choices that determine which data sets will set the patterns for deciding someone’s
recidivism are those of a human being. Symbolic interactionism, another sociological theory, explains that society has a large effect on the way one thinks and behaves, whether or not one realizes it. To
neglect the fact that society has racial divides and could subconsciously constitute a bias within
us, is to slight those whom it continues to oppress. The most commonly used technologies themselves provide substantial evidence that they overpredict the recidivism risks of racialized groups of people. This is seemingly obvious when one considers that the data draws on having an “arrest record” or having “been stopped by the police” without differentiating between the two, both of which are well known to occur more commonly within Black communities (Ugwudike, 2020). The algorithms are unable to account for the overall effect of systemic issues, such as unwarranted police involvement versus actual crime involvement. The logic of these standardization processes obscures these racialized outcomes. Functionalism can be recognized here as maintaining the “status quo” by not recognizing “[t]he reality that structural deficiencies such as blocked access to resources (not cultural and biological deficiencies) can constrain the agency of affected populations and explain their …” (Ugwudike, 2020).
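Ugwudike’s point about undifferentiated contact data can be made concrete with a tiny hypothetical sketch: a score that counts police “contacts” without distinguishing a mere stop from an arrest will rate a heavily stopped resident of an over-policed neighborhood as far riskier than someone with an identical record of actual arrests.

```python
# Hypothetical risk score: every recorded police contact adds one point,
# whether it was a mere stop or an actual arrest.
def risk_score(contacts):
    return len(contacts)

# Two hypothetical people with the same number of arrests; person_a lives in
# a heavily patrolled area and has accumulated stops that led to nothing.
person_a = ["stop", "stop", "stop", "stop", "arrest"]
person_b = ["arrest"]

print(risk_score(person_a), risk_score(person_b))  # 5 1
```

A score that weighted only arrests, or distinguished stops from arrests, would not inflate person_a’s risk this way; the point is that the choice of input data, made by human designers, is where the bias enters.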
CONCLUSION
The use of algorithms and predictive technologies within the criminal justice system is a
concept that could have a positive impact if done properly and responsibly. Unfortunately, that is not presently the case. The systems already in place uphold a racial bias throughout the United States, as one can easily bear witness to, and these patterns and predictions only perpetuate it further. If our society continues to disregard the implications these tools have for people of color, there is no hope for a truly unbiased justice system; continuing down this path will only cause greater harm.
References
ABA News. (2020, February 16). The good, bad and ugly of new risk-assessment tech in criminal justice. https://www.americanbar.org/news/abanews/aba-news-archives/2020/02/the-good--bad-and-ugly-of-new-risk-assessment-tech-in-criminal-j/
Barkan, S. (2014). Sociology: understanding and changing the social world. publisher not
identified.
Berk, R., Heidari, H., Jabbari, S., Kearns, M., & Roth, A. (2018). Fairness in criminal justice risk assessments: The state of the art. Sociological Methods & Research. https://doi.org/10.1177/0049124118782533
Hao, K. (2019, February 13). Police across the US are training crime-predicting AIs on falsified data. MIT Technology Review. https://www.technologyreview.com/2019/02/13/137444/predictive-policing-algorithms-ai-crime-dirty-data/
Heaven, W. D. (2020, July 17). Predictive policing algorithms are racist. They need to be dismantled. MIT Technology Review. https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/
Kirkpatrick, K. (2017, February). It's not the algorithm, it's the data. Communications of the ACM. …not-the-algorithm-its-the-data/fulltext
Proceedings of the National Academy of Sciences of the United States of America. (2019). PNAS. https://www.pnas.org/
The Sentencing Project: Racial Disparity. The Sentencing Project. (2020, July 20).
https://www.sentencingproject.org/issues/racial-disparity/.
Ugwudike, P. (2020). Digital prediction technologies in the justice system: The implications of a “race-neutral” agenda. Theoretical Criminology. https://doi.org/10.1177/1362480619896006