Directly Discriminatory Algorithms
| Author | Jeremias Adams-Prassl, Reuben Binns and Aislinn Kelly-Lyth |
| Published date | 01 January 2023 |
| DOI | http://doi.org/10.1111/1468-2230.12759 |
Modern Law Review
Directly Discriminatory Algorithms
Jeremias Adams-Prassl,∗ Reuben Binns† and Aislinn Kelly-Lyth‡
Discriminatory bias in algorithmic systems is widely documented. How should the law respond? A broad consensus suggests approaching the issue principally through the lens of indirect discrimination, focusing on algorithmic systems' impact. In this article, we set out to challenge this analysis, arguing that while indirect discrimination law has an important role to play, a narrow focus on this regime in the context of machine learning algorithms is both normatively undesirable and legally flawed. We illustrate how certain forms of algorithmic bias in frequently deployed algorithms might constitute direct discrimination, and explore the ramifications, both in practical terms and for the broader challenges automated decision-making systems pose to the conceptual apparatus of anti-discrimination law.
INTRODUCTION
Algorithmic decision-making systems (ADMS) discriminate: no automated system is completely free of bias. The discriminatory impact of ADMS has been documented in areas ranging from grade allocation and benefits decisions to the criminal justice system.1 Private operators have, if anything, been even more enthusiastic in their embrace of ADMS, with predictably dire consequences: from banks persistently and systematically rejecting credit applications made by customers from certain ethnic groups to hiring systems automatically rejecting female candidates for engineering positions.2
∗ Professor of Law, Magdalen College, University of Oxford.
† Associate Professor of Human Centred Computing, Department of Computer Science, University of Oxford.
‡ Researcher, Bonavero Institute of Human Rights, University of Oxford. The authors acknowledge funding from the European Research Council under the European Union's Horizon 2020 research and innovation programme (grant agreement No 947806), and are grateful to Robin Allen QC, Shreya Atrey, Catherine Barnard, Mark Bell, Hugh Collins, Jinghe Fan, Sandy Fredman, Philipp Hacker, Deborah Hellman, Tarun Khaitan, Dee Masters, Dinah Rose QC, Sandra Wachter and Raphaële Xenidis, as well as participants at the Oxford Algorithms at Work Reading Group, the UT Austin iSchool Research Colloquium, the Oxford Ethics in AI Research Seminar, the Lorentz Centre Workshop on Fairness in Algorithmic Decision Making and the Oxford Business Law Workshop, as well as the anonymous reviewers, for feedback and discussion. The usual disclaimers apply.
1 ‘A-levels and GCSEs: How did the exam algorithm work?’ BBC News 20 August 2020 at https://perma.cc/RCF7-4A9L; Amnesty International, Xenophobic Machines (2021); and J. Larson, S. Mattu, L. Kirchner and J. Angwin, ‘How We Analyzed the COMPAS Recidivism Algorithm’ ProPublica 23 May 2016 at https://perma.cc/4QR7-485S.
2 See N. Campisi, ‘From Inherent Racial Bias to Incorrect Data – The Problems with Current Credit Scoring Models’ Forbes Advisor 26 February 2021 at https://perma.cc/U6K5-UGR7;
© 2022 The Authors. The Modern Law Review published by John Wiley & Sons Ltd on behalf of Modern Law Review Limited.
(2023) 86(1) MLR 144–175
This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
The technical causes of machine bias are varied and complex. A vast literature explores different facets of the problem, from proxy discrimination to tainted training data; potential technical solutions to de-bias ADMS; and the inherent trade-offs (if any) between accuracy and bias.3
In addition to this technical scrutiny, litigants are increasingly turning to the courts to challenge algorithmic discrimination across a wide range of regulatory domains, from judicial review of public-sector ADMS to employment law.4 A growing academic literature suggests that most cases of algorithmic bias will best be addressed through the lens of indirect discrimination.5 Even a facially neutral provision, criterion, or practice (PCP) will be prima facie unlawful where it puts people with a protected characteristic at a ‘particular disadvantage’. By characterising algorithms as PCPs, the focus shifts from the operation of an ADMS to its impact: are there disparities in its effects on groups sharing a protected characteristic?
Under EU and UK anti-discrimination law, this neatly sidesteps difficult questions of causation and avoids the need for technical explanations of ADMS’ underlying mechanisms – but at significant cost. Indirect discrimination will only be unlawful if use of the PCP is not a proportionate means of achieving a legitimate aim.6 In other words, if the use of a biased ADMS can be justified, then in legal terms no indirect discrimination has occurred.
Financial Conduct Authority, ‘Pricing practices in the retail general insurance sector: Household Insurance’ Thematic Review TR18/4 (October 2018) para 4.21; T.B. Gillis and J.L. Spiess, ‘Big Data and Discrimination’ (2019) 86 The University of Chicago Law Review 459; and J. Dastin, ‘Amazon scraps secret AI recruiting tool that showed bias against women’ Reuters 11 October 2018 at https://perma.cc/328A-UJFM.
3 See for example D. Pedreshi, S. Ruggieri and F. Turini, ‘Discrimination-aware data mining’ (2008) Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 560; C. Dwork and others, ‘Fairness through Awareness’ (2012) Proceedings of the 3rd Innovations in Theoretical Computer Science Conference 214; J. Buolamwini and T. Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ (2018) 81 Proceedings of Machine Learning Research 77; J. Kleinberg, S. Mullainathan and M. Raghavan, ‘Inherent Trade-Offs in the Fair Determination of Risk Scores’ (2017) 67 Innovations in Theoretical Computer Science 1, 23. We define ‘proxy’ here to refer to a feature which is correlated with a protected characteristic.
4 R (on the application of Bridges) v Chief Constable of South Wales [2020] EWCA Civ 1058, [2020] 1 WLR 5037; C. Vallance, ‘Legal action over alleged Uber facial verification bias’ BBC News 8 October 2021 at https://perma.cc/TE4M-AMRH; ‘Home Office drops “racist” algorithm from visa decisions’ BBC News 4 August 2020 at https://perma.cc/EG22-4SAT. On the potential for system-level challenges, see A. Adams-Prassl and J. Adams-Prassl, ‘Systemic Unfairness, Access to Justice and Futility: A Framework’ (2020) 40 OJLS 561.
5 While our discussion in this paper responds to the legal literature on algorithmic bias as a general phenomenon, we recognise that practitioners’ approaches to specific cases will be fact-sensitive. See, for example, Joint Opinion of R. Allen QC and D. Masters in the Matter of Automated Data Processing in Government Decision Making 7 September 2019 at https://perma.cc/M2GU-D8HS, considering a number of case studies.
6 See Council Directive 2000/43/EC of 29 June 2000 implementing the equal treatment of persons irrespective of racial or ethnic origin [2000] OJ L180/22 (the Racial Equality Directive), art 2(2)(b); Directive 2006/54/EC of the European Parliament and of the Council of 5 July 2006 on the implementation of the principle of equal opportunities and equal treatment of men and women in matters of employment and occupation (recast) [2006] OJ L204/23 (the Recast Directive), art 2(1)(b); Council Directive 2004/113/EC of 13 December 2004 implementing the principle of equal treatment between men and women in the access to and supply of goods and services [2004] OJ L373/37 (the Gender Access Directive), art 2(b); and Council Directive
Where similarly situated people with different protected characteristics receive different treatment, on the other hand, the law’s approach is (in theory) more straightforward: the use of an ADMS which treats individuals less favourably on grounds of, or because of, a protected characteristic will constitute direct discrimination. In many cases, this renders deployment of the ADMS unlawful.7
In this paper, we set out to challenge the persistent assumption that algorithmic decision-making systems will only be caught by the prohibition on direct discrimination in a small set of cases, such as the deployment of an automated system to camouflage intentional discrimination, or where protected characteristics are explicitly coded into an ADMS.8 In scrutinising two paradigmatic cases of algorithmic discrimination, we demonstrate how a much broader range of ADMS may well treat individuals differently on grounds of a protected characteristic – and should thus fall into the scope of direct discrimination. This is not to say that all forms of algorithmic bias should be understood as unlawful direct discrimination. Just as there is no one technical cause of algorithmic bias, there cannot be a uniform legal answer.
Discussion proceeds as follows. The next section dissects the default assumption that algorithmic discrimination will usually fall within the scope of indirect discrimination. Originating in the US doctrine of disparate impact, a near-exclusive focus on indirect discrimination raises the practical problem of self-justifying feedback loops, and runs counter to the principles underpinning the distinction between direct and indirect discrimination in UK and EU law.
2000/78/EC of 27 November 2000 establishing a general framework for equal treatment in employment and occupation, OJ L 303 (2000) (the Framework Directive), art 2(2)(b)(i). Note that under the European Convention on Human Rights, objective justification applies to both direct and indirect discrimination: Burden v the United Kingdom (2008) 47 EHRR 38 at [60].
7 UK discrimination law is largely standardised across domains including employment, provision of services, and education. In all cases, direct discrimination is generally not objectively justifiable, save in the case of age: Equality Act 2010, s 13(2). In the employment context, there is a narrow opportunity to justify direct discrimination for a ‘genuine occupational requirement’, ie where satisfaction of the criterion is strictly necessary to perform the role: Equality Act 2010, Sched 9. At EU level, the approach is less harmonised. Objective justification for direct discrimination remains possible in some specific contexts: see, for example, Gender Access Directive, art 4(5). Nonetheless, recourse to the objective justification framework is still barred in many cases of direct discrimination, including where discrimination is on grounds of sex in the employment context (Framework Directive, art 4(1); Recast Directive, art 14(2)) or on racial grounds in any regulated domain (Racial Equality Directive, art 4). Following the United Kingdom’s exit from the European Union, UK courts should continue to have regard to developments in the EU equality acquis: European Union (Withdrawal) Act 2018, s 6.
8 See, for example, F. Zuiderveen Borgesius, ‘Price Discrimination, Algorithmic Decision-Making, and European Non-Discrimination Law’ (2020) 31 European Business Law Review 401, 409-411; P. Hacker, ‘Teaching Fairness to Artificial Intelligence: Existing and Novel Strategies against Algorithmic Discrimination under EU Law’ (2018) 55 Common Market Law Review 1143, 1151-1152; S. Wachter, B. Mittelstadt and C. Russell, ‘Why fairness cannot be automated: Bridging the gap between EU non-discrimination law and AI’ (2021) 41 Computer Law & Security Review 105567, 19-20; J. Gerards and R. Xenidis, Algorithmic discrimination in Europe: Challenges and opportunities for gender equality and non-discrimination law (Brussels: European Commission, 2020), 67-73; A. Kelly-Lyth, ‘Challenging Biased Hiring Algorithms’ (2021) 41 OJLS 899, 906. See also Decision no 216/2017 of the National Non-Discrimination and Equality Tribunal of Finland, issued 21 March 2018 at https://perma.cc/ZKS8-SFNJ, where factors including gender and age had been labelled as inputs in a credit scoring system.