Exploratory analysis of Publons metrics and their relationship with bibliometric and altmetric impact

DOI: https://doi.org/10.1108/AJIM-06-2018-0153
Pages: 124-136
Published date: 21 January 2019
Author: Jose Luis Ortega
Subject matter: Library & information science, Information behaviour & retrieval, Information & knowledge management, Information management & governance, Information management
Exploratory analysis of Publons metrics and their relationship with bibliometric and altmetric impact
Jose Luis Ortega
Spanish National Research Council, Córdoba, Spain
Abstract
Purpose The purpose of this paper is to analyse the metrics provided by Publons about the scoring of publications and their relationship with impact measurements (bibliometric and altmetric indicators).
Design/methodology/approach In January 2018, 45,819 research articles were extracted from Publons, including all their metrics (scores, number of pre and post reviews, reviewers, etc.). Using the DOI identifier, other metrics from altmetric providers were gathered to compare the scores of those publications in Publons with their bibliometric and altmetric impact in PlumX, Altmetric.com and Crossref Event Data.
Findings The results show that: there are important biases in the coverage of Publons according to
disciplines and publishers; metrics from Publons present several problems as research evaluation indicators;
and correlations between bibliometric and altmetric counts and the Publons metrics are very weak (ρ < 0.2) and not significant.
Originality/value This is the first study about the Publons metrics at article level and their relationship
with other quantitative measures such as bibliometric and altmetric indicators.
Keywords Peer-review, Bibliometrics, Altmetrics, PlumX, Publons, Publons score
Paper type Research paper
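The matching workflow described in the abstract (linking Publons records to altmetric providers by DOI, then correlating the two sets of scores with Spearman's ρ) can be sketched in a few lines. The DOIs, field layout and values below are synthetic placeholders, not the actual Publons or PlumX data schemas; only the rank-correlation computation itself is standard.

```python
# Sketch of DOI-based matching plus Spearman's rho, assuming each source
# is reduced to a mapping of DOI -> score. All data here is synthetic.

def ranks(values):
    """Assign average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical per-DOI scores from two providers (placeholder values).
publons_scores = {"10.1000/a": 8.5, "10.1000/b": 6.0, "10.1000/c": 9.1}
altmetric_counts = {"10.1000/a": 12, "10.1000/b": 3, "10.1000/c": 5}

# Match on the DOIs both sources cover, then correlate.
shared = sorted(set(publons_scores) & set(altmetric_counts))
rho = spearman([publons_scores[d] for d in shared],
               [altmetric_counts[d] for d in shared])
```

In the paper's setting, a ρ below 0.2 over the matched set would indicate essentially no monotonic association between Publons scores and the external impact counts.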
1. Introduction
Traditionally, peer-review has been the most widely accepted way to validate scientific
advances. Since the beginning of the scientific revolution, scientific theories and
discoveries have been discussed and agreed upon by the research community as a way to
confirm and accept new knowledge. This validation process has survived to the present day
as a suitable tool for accepting the most relevant manuscripts in academic journals,
allocating research funds or selecting and promoting scientific staff. However, this system
has two important limitations: it is expensive and subjective. Peer-review requires the
involvement of two or more scholars who study and analyse each research unit (publication,
institution, researcher, etc.) and then present an assessment report. This process consumes
large amounts of money and time. Equally, peer-review suffers from subjective judgements,
which can lead to arbitrary and biased decisions that undermine the evaluation system.
Likewise, the professionalisation of science in the nineteenth century (Beer and Lewis,
1963) caused a rapid increase in economic and human resources and, in consequence, an
exponential growth of scholarly publications (Price, 1961). Bibliometric indicators emerged
as a complementary way, less expensive and more objective, of assessing the complex academic
scenarios resulting from this growing professionalisation. Based on production and impact
indicators, bibliometrics provides measures that make it possible to benchmark and assess
the performance of different research units within disciplinary or institutional
environments. Like peer-review, bibliometrics also has important limitations, such as
manipulation and misuse (Narin et al., 1994). Practices such as salami publishing, abusive
self-citation or the use of journal metrics to evaluate articles or authors question the
suitability of bibliometrics for research evaluation.
Aslib Journal of Information Management, Vol. 71 No. 1, 2019, pp. 124-136. © Emerald Publishing Limited. ISSN 2050-3806. DOI 10.1108/AJIM-06-2018-0153.
Received 18 June 2018; revised 14 September 2018 and 6 November 2018; accepted 12 November 2018.
