Preventing Machines From Lying: Why Interdisciplinary Collaboration is Essential for Understanding Artefactual or Artefactually Dependent Expert Evidence

Published date: 01 April 2024
Authors: Tim J Wilson, Jesper Bergman, Adam Jackson, Oliver B Popov
Subject Matter: Articles
Tim J Wilson
Northumbria University, UK
Jesper Bergman
Stockholm University, Sweden
Adam Jackson
Northumbria University, UK
Oliver B Popov
Stockholm University, Sweden
This article demonstrates a significantly different approach to managing probative risks arising from the complex and fast-changing relationship between law and computer science. Law's historical problem in adapting to scientific and technologically dependent evidence production is seen less as a socio-techno issue than as an ethical failure within criminal justice. This often arises because of an acceptance of epistemological incomprehension between lawyers and scientists, something compounded by the political economy of criminal justice and safeguard evasion within state institutions. What is required is an exceptionally broad interdisciplinary collaboration to enable criminal justice decision-makers to understand and manage the risk of further ethical failure. If academic studies of law and technology are to address practitioner concerns, however, it is often necessary to step down the doctrinal analysis to a specific jurisdictional level.

Keywords
Explaining/understanding AI/ML-assisted decisions, interdisciplinary methodology in law and technology studies, neoliberalism, ethics and criminal justice systems
Corresponding author:
Tim J Wilson, Law School, Northumbria University, Newcastle upon Tyne, UK
The Journal of Criminal Law
2024, Vol. 88(2) 105–129
© The Author(s) 2024
Article reuse guidelines:
DOI: 10.1177/00220183231226087
Of course, machines cannot lie any more than, as Alan Turing observed, anything could be gained by asking whether they could think. Hence, the term 'Artificial Intelligence' (AI). Turing reformulated the latter question as whether human interrogators could be taken in by 'cunningly designed but quite unintelligent programs'.
Likewise, expert evidence that is unreliable because of error in, or misunderstanding about, data processing by computer programmes trained by Machine Learning (ML) (hereafter artefactual/artefactually dependent evidence) clearly cannot be termed lies. Allowing verdicts or sentencing decisions to turn on unsound, potentially unsound or misunderstood artefactual/artefactually dependent evidence, or being complicit in the evasion of probative safeguards, however, is normatively equivalent to negligently or knowingly colluding with perjury. This analogy, reflecting the legal, moral and social foundations of criminal law, highlights the importance of ensuring that the trial of fact is not invalidated because of avoidable errors or misunderstandings in AI-assisted decision making. The responsibility to get this right applies to all individuals who, as expert witnesses, investigators, lawyers or factfinders, use such evidence. Those who should have the expertise to prevent non-expert decision-makers being misled by or misunderstanding opinion evidence, however, bear the greatest professional responsibility in this respect. Such an ethos prevails in evidence-based medicine, where the correlation (not causality, as in criminal law) of data, the driving force of AI/ML processing, is normally sufficient, but best practice still mandates critical assessment of artefactual outputs by professionals: doctors must understand the limitations of an AI/ML tool before using it.
This article conjoins socio-legal research in England and Wales with computer science research in Sweden. Both were part of a research project, with other international partners, into the problems of police detective work on the Tor protocol, an anonymous communication network (ACN) with hidden services that can be used for digital marketplaces, etc., as part of the Dark Web. The English research had earlier resulted in a paper that looked at aspects (personnel, organisational, cultural and ethical issues) of functional adaptation by the police in response to crime involving digital communication and services. The Swedish contribution is based on 'proof of concept' research into the design of an AI/ML-encoding tool to improve the effectiveness and forensic soundness of dark web cybercrime investigations. This article goes wider than police dark web investigations because the English co-authors have also drawn on insights from earlier research into forensic DNA and fingerprint comparisons. Our interdisciplinary co-authorship was essential for insight into what is required to adapt to increasing
1. G Oppy and D Dowe, 'The Turing Test' in EN Zalta (ed), The Stanford Encyclopaedia of Philosophy (2021) <https://plato.> accessed 26 June 2022.
2. Perhaps not wholly analogous, American scholarship notes 'perjury entrapment' and government-induced perjury. See, respectively, BL Gershman, 'Perjury Trap' (1981) 129 Univ Pennsylvania Law Rev 624; A Stein, 'Constitutional Evidence Law' (2019) 61 Vanderbilt Law Rev 65.
3. P Roberts and A Zuckerman, Criminal Evidence 2nd edn (OUP: Oxford, 2010) 11.
4. The US Agency for Healthcare Research and Quality (AHRQ) definition of evidence-based practice: 'A way of providing health care that is guided by a thoughtful integration of the best available scientific knowledge with clinical expertise. This approach allows the practitioner to critically assess research data, clinical guidelines, and other information resources in order to correctly identify the clinical problem, apply the most high-quality intervention, and re-evaluate the outcome for future improvement.' <> accessed 25 September 2022 (emphasis added).
5. D Schönberger, 'Artificial intelligence in healthcare: a critical analysis of the legal and ethical implications' (2019) 27 Int J Law Inform Technol 171.
6. Other colleagues were affiliated to the Dutch Open University, the Amsterdam University of Applied Sciences (HvA), the NHL Stenden University of Applied Sciences, the Dutch Police Academy and the Politihøgskolen (Norwegian Police University College).
7. D Johnson et al., 'Police Functional Adaptation to the Digital or Post Digital Age: Discussions with Cybercrime Experts' (2020) 84 J Crim Law 427; B O'Shea et al., 'Mapping Cyber-enabled Crime: Understanding Police Investigations and Prosecutions of Cyberstalking' (2022) Int J Crime Justice Soc Democr Advance online publication.
8. J Bergman and OB Popov, 'The Digital Detective's Discourse: A Toolset for Forensically Sound Collaborative Dark Web Content Annotation and Collection' (2022) 17 J Digital Forensics Security Law Article 5.
