The fundamental rights challenges of algorithms

Janneke Gerards
Utrecht University, the Netherlands

Netherlands Quarterly of Human Rights, 2019, Vol. 37(3), 205–209
DOI: 10.1177/0924051919861773
Published: 01 September 2019
Subject matter: Column
Abstract
Algorithms form an increasingly important part of our daily lives, even if we are often unaware of it.
They are enormously useful in many different ways. They facilitate the sharing economy, help
detect diseases, assist government agencies in crime control, and help us choose what series or
film to watch. Yet there is also a darker side to algorithms: they (and their applications) can
easily interfere with our fundamental rights. This column explores some of the main fundamental
rights challenges posed by the pervasiveness of algorithms and presents a brief outlook for the
future.
Keywords
Algorithms, Artificial Intelligence (AI), private life, freedom, equality, non-discrimination,
procedural fundamental rights
Algorithms are everywhere. They have become instrumental in our everyday lives. They enable us
to search the internet effectively, they allow us to find interesting new books, movies and music,
and they can help detect certain diseases with great precision. Algorithms assist government bodies
in making tax decisions, in crowd control, in police investigations and in the detection of social
security fraud. Companies use algorithms in price determination and in making employment
decisions. Algorithms, in short, are wonderful tools and in our present-day world, we cannot do
without them.
Nevertheless, over time, we have come to recognise that there is a dark side to algorithms. As is
often the case, it is this dark side that makes them highly relevant to lawyers – fundamental rights
lawyers in particular. Their relevance for the law has to do with three main, interrelated characteristics of algorithms. Put briefly, algorithms are non-transparent, non-neutral, human constructs.
Algorithms are human constructs in that they are created, programmed, trained and applied by
human beings. As such, that is not problematic; it merely means that, in many ways, we have
Corresponding author:
Janneke Gerards, Utrecht University, Newtonlaan 231, 3584 BH Utrecht, the Netherlands.
E-mail: j.h.gerards@uu.nl
