The Network of Law Reviews: Citation Cartels, Scientific Communities, and Journal Rankings

Oren Perez, Judit Bar-Ilan, Reuven Cohen and Nir Schreiber

DOI: http://doi.org/10.1111/1468-2230.12405
Published: 1 March 2019
Research evaluation is increasingly influenced by quantitative data. We focus on the influential Web of Science Journal Citation Reports (JCR) ranking of law journals and critically assess its methodology. In particular, we consider the existence and impact of a tacit citation cartel between US law reviews. A study of 45 US student-edited (SE) and 45 peer-reviewed (PR) journals included in the category of Law in the JCR revealed that PR and SE journals are more inclined to cite members of their own class, and that this phenomenon is more pronounced in SE generalist journals, reflecting tacit cartelistic behavior generated by deeply entrenched institutional practices. Because US SE journals produce more citations than PR journals, and direct those citations almost exclusively to other SE journals, SE scores are inflated; this distorts the journals’ ranking and can undermine the flow and creation of ideas. We discuss policy measures that can counter these adverse effects.
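To give a rough sense of the within-class citation measure underlying this finding, the sketch below computes the share of citations each class of journal directs to its own class from a two-by-two citation matrix. The counts are invented for illustration; they are not the study’s data.

```python
# Hypothetical citation matrix between student-edited (SE) and
# peer-reviewed (PR) journals: cites[(src, dst)] is the number of
# citations flowing from class src to class dst. Invented numbers.
cites = {
    ("SE", "SE"): 9000, ("SE", "PR"): 500,
    ("PR", "SE"): 1500, ("PR", "PR"): 2500,
}

for src in ("SE", "PR"):
    total = sum(cites[(src, dst)] for dst in ("SE", "PR"))
    own_share = cites[(src, src)] / total
    print(f"{src} journals direct {own_share:.0%} of their citations to {src} journals")
```

On these invented figures, SE journals direct roughly 95 per cent of their citations inward; an asymmetry of this kind is what the study measures.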
INTRODUCTION: THE ‘IMPACT FACTOR OBSESSION’, ACADEMIC PUBLISHING PRACTICES, AND CITATION CARTELS
Research evaluation is increasingly being influenced by quantitative data. Impact factor (the mean citation count of items published in a journal in the preceding two years) has become particularly salient in this context, leading to what some authors have described as ‘impact factor obsession’.1 Impact factor and other similar metrics are influencing promotion decisions, grant allocations, and project appraisals.2 These ‘research-evaluation’ metrics constitute just one manifestation of a vast body of quantitative measures that are used to assess academic institutions.3 The legal field has not escaped the influence of the metrics wave. There are several global rankings for law schools and departments: the Times Higher Education World University Subject Ranking, the Shanghai University Subject Ranking, and the SSRN Ranking for US and International law schools (the influential US News Ranking of Law Schools focuses only on US schools). Law journals are measured by four different rankings: Journal Citation Reports (JCR) by Clarivate Analytics, CiteScore from Elsevier, Scimago, and Washington and Lee.
Oren Perez, Dean of Bar-Ilan Faculty of Law; Judit Bar-Ilan, Department of Information Science,
Bar-Ilan University; Reuven Cohen, Head of the Mathematics Department, Bar-Ilan University;
Nir Schreiber, PhD student, Mathematics Department, Bar-Ilan University.
1 D. Hicks et al, ‘The Leiden Manifesto for Research Metrics’ (2015) 520 Nature 429.
2 P. Stephan, R. Veugelers and J. Wang, ‘Blinkered by Bibliometrics’ (2017) 544 Nature 411; D. Adam, ‘Citation Analysis: The Counting House’ (2002) 415 Nature 726.
3 E. Hazelkorn, Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence (London: Springer, 2015); W. Espeland and M. Sauder, Engines of Anxiety: Academic Rankings, Reputation, and Accountability (New York, NY: Russell Sage Foundation, 2016); M. M. Vernon, E. A. Balas and S. Momani, ‘Are University Rankings Useful to Improve Research? A Systematic Review’ (2018) 13 PLoS One e0193762.
While the scientific community has warned on various occasions (eg, the DORA declaration on research assessment and the Leiden Manifesto) against the adverse effects of using metrics such as the impact factor as a surrogate measure for evaluating the quality of research, these warnings have done little to avert the metrics tide.4
In the UK, the debate about the use of metrics has revolved around the Research Excellence Framework (REF), the system for assessing the quality of research in UK higher education institutions. The REF has primarily relied on peer assessment in its evaluation of academic output. However, a 2016 report on the REF system headed by Nicholas Stern may signal a change in the REF approach. Although the report states that it agrees with the majority of respondents to the call for evidence that REF panels should continue to assess outputs through peer review, it also notes its support for the use of metrics:
We support the appropriate use of bibliometric data in helping panels in their peer
review assessment, and recommend that all panels should be provided with the
comparable data required to inform their judgements.5
The more recent REF guidelines have generally adopted Stern’s view that quantitative data may be used to inform the assessment of outputs, but have limited the use of such data to a sub-group of panels.6 While the law sub-panel was not among the panels which have been guided to use citation data,7 it is doubtful whether the distinction made by the REF in that context can be sustained given global developments. Thus, for example, the 2017 Shanghai subject rankings for law are based to a large extent on citation data.8
The increasing influence of metrics9 has created an incentive for individual scholars, research institutions and journals alike to manipulate data in order to elevate their scores.10 Examples of manipulation strategies include the publication of editorials with many journal self-citations, coercive journal self-citation, and citation cartels.11 There have been several conspicuous cases of citation
4 B. Brembs, K. Button and M. Munafò, ‘Deep Impact: Unintended Consequences of Journal Rank’ (2013) 7 Frontiers in Human Neuroscience 291; American Society for Cell Biology, ‘San Francisco Declaration on Research Assessment’ (2012) at https://sfdora.org/read/ (all URLs were last accessed 10 December 2018); B. Alberts, ‘Impact Factor Distortions’ (2013) 340 Science 787; J. Wilsdon et al, The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management (2015) 136.
5 N. Stern, Building on Success and Learning from Experience: An Independent Review of the Research Excellence Framework (2016) 21.
6 REF report, Initial decisions on the Research Excellence Framework 2021 (REF 2017/01) 6, para 17.
7 REF report, Consultation on the panel criteria and working methods (2018/02) para 269.
8 See http://www.shanghairanking.com/Shanghairanking-Subject-Rankings/law.html.
9 Wilsdon, n 4 above.
10 J. A. Oravec, ‘The Manipulation of Scholarly Rating and Measurement Systems: Constructing Excellence in an Era of Academic Stardom’ (2017) 22 Teaching in Higher Education 423.
11 P. Heneberg, ‘From Excessive Journal Self-Cites to Citation Stacking: Analysis of Journal Self-Citation Kinetics in Search for Journals, Which Boost Their Scientometric Indicators’ (2016) 11 PLoS One e0153730; C. Chorus and L. Waltman, ‘A Large-Scale Analysis of Impact Factor Biased Journal Self-Citations’ (2016) 11 PLoS One e0161021; B. R. Martin, ‘Editors’ JIF-Boosting Stratagems – Which Are Appropriate and Which Not?’ (2016) 45 Research Policy 1.