Theoretical Criminology, 2020, Vol. 24(3), 482–501. DOI: 10.1177/1362480619896006. © The Author(s) 2020.
Digital prediction technologies in the justice system: The implications of a 'race-neutral' agenda
Pamela Ugwudike
University of Southampton, UK
Corresponding author: Pamela Ugwudike, Department of Sociology, Social Policy and Criminology, University of Southampton, University Road, Southampton, SO17 1BJ, UK. Email: p.ugwudike@soton.ac.uk
Abstract
This article critically analyses the nexus of race and risk prediction technologies applied
in justice systems across western jurisdictions. There is mounting evidence that the
technologies are overpredicting the risk of recidivism posed by racialized groups,
particularly black people. Yet the technologies are ostensibly race neutral in the sense
that they do not refer explicitly to race. They are also compliant with race equality
laws. To investigate how apparently race-neutral technologies can nevertheless yield racially disparate outcomes, the article draws on insights distilled from the sociology of race and the sociological scholarship on standardization. It uses themes from these two bodies of scholarship to unravel the intersecting structural and creational dynamics capable of fomenting the digital racialization of risk.
Keywords
Algorithmic justice, algorithmic risk prediction, computational criminology, digital
criminology, digital justice, risk technologies
Introduction
This article addresses the racial dynamics of prediction technologies applied by probation and prison services across western jurisdictions such as the UK and the USA. Despite the proliferation of these technologies, criminological scholarship has remained relatively indifferent to the ways in which their structural and creational dynamics intersect, with implications for racialized groups.1 This represents a missed opportunity
for the nascent field of computational or digital criminology in particular, to contribute
to growing intellectual debates about the need to curb the harms of digital technological
advances including their potential for reproducing inequalities (see, for example, Halford
and Savage, 2010).
Although there is a dearth of criminological scholarship in this area, some scholars have addressed the topic of racial bias in forensic risk assessments (see, generally, Case, 2007; Goddard and Myers, 2017; Hannah-Moffat and Maurutto, 2010; Harcourt, 2015). But this article contributes to existing knowledge by providing a theoretical analysis of the racial dynamics inherent in the structural and creational dimensions of prediction technologies. In doing so, it demonstrates that the structural conditions in which the technologies are created can give rise to racializing effects such as the overprediction of risks ascribed to racialized people (e.g. Angwin and Larson, 2016). This rebuts the presumption that technologies reflecting the liberal race-neutral ideology and the logics of standardization (such as scientific neutrality) are free of racial bias.
The article begins with an overview of the emergence of risk-focused penality and
risk prediction technologies. It notes that the technologies are apparently race neutral.
But it draws on themes from the sociology of race to argue that the liberal race neutrality
frame invoked to legitimize such technologies obfuscates the structural conditions that
can racialize their outcomes (Bonilla-Silva, 2015; Feagin, 2013; Golash-Boza, 2016;
Goldberg, 2015; Vickerman, 2013). Structural analysis of the devices is therefore
required to unravel their racializing effects.
The digital prediction technologies are also standardized. There is therefore a presumption that the technologies reflect the logics of standardization. Central to this is the belief that excluding social categories such as race and using complex numerical analysis eliminates bias, ensures scientific objectivity and improves efficiency (Hirschman and Bosk, 2019; Reskin, 2012; Timmermans and Epstein, 2010). But this article uses themes from the sociology of standardization to critique these logics. In doing so, it demonstrates that, similar to liberal race neutrality, they obscure the structural dynamics of prediction technologies and their racializing effects. Thus, the insights from the scholarship on standardization and the sociology of race share a key theme. They emphasize that, to unravel the racial dynamics of the ostensibly race-neutral prediction technologies applied in the justice system, their structural dynamics have to be analysed.
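The core of this argument can be illustrated with a brief, purely hypothetical sketch (written here in Python, with invented figures, group labels and variable names that do not correspond to any actual tool or dataset discussed in this article). It shows how a score computed solely from an ostensibly race-neutral input, such as the number of recorded prior arrests, can still diverge across racialized groups when that input is itself produced under unequal structural conditions such as differential policing.

# Hypothetical illustration only: a risk score that never "sees" race can still
# produce racially disparate outputs when its inputs are shaped by unequal
# structural conditions. All numbers are invented for the sketch.
import random
from statistics import mean

random.seed(0)

def prior_arrests(group: str) -> int:
    # Structural assumption of the illustration: group B is policed more
    # heavily, so the same underlying behaviour yields more recorded arrests.
    base_rate = 1.0 if group == "A" else 2.5
    return max(0, int(random.gauss(base_rate, 1.0)))

def risk_score(arrests: int) -> float:
    # "Race-neutral" actuarial rule: the score depends only on recorded arrests.
    return min(10.0, 2.0 + 1.5 * arrests)

population = [{"group": g, "arrests": prior_arrests(g)}
              for g in ["A"] * 5000 + ["B"] * 5000]

for g in ("A", "B"):
    scores = [risk_score(p["arrests"]) for p in population if p["group"] == g]
    print(g, round(mean(scores), 2))

# The scoring rule excludes group membership entirely, yet the mean scores
# diverge because the input data already encode structural inequality.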
To this end, this article examines the structural dynamics of the prediction technologies to uncover their racial dynamics. It conceptualizes these as infrastructural and constitutional dynamics and argues that they bolster the exercise of power with which the technologies are created, although non-whites do not have access to this creational power. The structures also sustain what the article describes as the digitized epistemic dominance of those who create and develop the technologies, some of whom are commercial vendors (see Angwin and Larson, 2016; Dressel and Farid, 2018).
Following its analysis of the structural conditions in which digital prediction technologies are created, the article uses insights from the sociological scholarship on standardization and the example of the commonly used prediction technologies applied in cases of general offending to demonstrate how standardized prediction technologies can nevertheless generate racializing effects. The racializing effects are enabled by structural conditions
