Altmetrics and their relationship with citation counts: a case of journal articles in physics
DOI | https://doi.org/10.1108/GKMC-07-2021-0122 |
Published date | 17 January 2022 |
Pages | 391-407 |
Author | Rishabh Shrivastava, Preeti Mahajan
Rishabh Shrivastava
Department of Library and Information Science, Central University of Punjab, Bathinda, India
Preeti Mahajan
Department of Library and Information Science, Panjab University, Chandigarh, India
Abstract
Purpose – The first purpose of the present study is to investigate the coverage of journal articles in Physics in various sources of altmetrics. Secondly, the study investigates the relationship between altmetrics and citations. Finally, the study also investigates whether the relationship between citations and altmetrics was stronger or weaker for those articles that had been mentioned at least once in the sources of altmetrics.
Design/methodology/approach – Journal articles in Physics having at least one author from an Indian institution and published during 2014–2018 have been investigated in the sources of altmetrics. Altmetric.com was used for collecting altmetrics data. Spearman's rank correlation coefficient (r) has been used, as the data were found to be skewed (a brief illustrative sketch of this analysis follows the abstract).
Findings – The highest coverage was found on Twitter (22.68%), followed by Facebook (3.62%) and blogs (2.18%). The coverage in the rest of the sources was less than 1%. The average number of Twitter mentions for journal articles tweeted at least once was found to be 4 (3.99), and the average number of Facebook mentions was found to be 1.48. Correlations between Twitter mentions and citations and between Facebook mentions and citations were found to be statistically significant but low to weak positive.
Research limitations/implications – The study concludes that, due to the low coverage of journal articles, altmetrics should be used cautiously for research evaluation, keeping in mind the disciplinary differences. The study also suggests that altmetrics can function as complementary to citation-based metrics.
Originality/value – The study is one of the first large-scale altmetrics studies dealing with research in Physics. Also, Indian research has received little attention in the altmetrics literature, and the present study shall fill that void.
Keywords Scholarly communication, Research evaluation, Informetrics, Altmetrics,
Citation-based metrics, Science libraries
Paper type Research paper
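As a minimal illustration of the correlation analysis described in the abstract, the sketch below shows how Spearman's rank correlation might be computed for skewed count data. It is not the authors' code; the per-article Twitter-mention and citation counts are hypothetical, and the use of Python's SciPy library is an assumption for illustration only.

# Illustrative sketch only (not the authors' code): Spearman's rank correlation
# between altmetric mentions and citation counts, using hypothetical data.
from scipy.stats import spearmanr

# Hypothetical per-article counts; real values would come from Altmetric.com
# exports and a citation database, and are typically highly skewed.
twitter_mentions = [0, 1, 0, 4, 12, 0, 2, 35, 1, 0]
citations = [2, 5, 1, 9, 20, 0, 3, 41, 6, 1]

# Spearman's rho compares ranks rather than raw values, so it tolerates the
# skewness that makes Pearson's r a poor fit for such count data.
rho, p_value = spearmanr(twitter_mentions, citations)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")

In a study of this kind, the same calculation would presumably be repeated for each altmetric source against citations, both for all articles and for the subset mentioned at least once.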
Acknowledgments: The authors are thankful to Altmetric.com for providing the data through their Altmetric Researcher Data Access Program. Special thanks to Ms Stacy Konkiel, Director of Research Relations, for her help and assistance from time to time.
1. Introduction
The past few decades have seen a struggle to develop proper mechanisms for research evaluation. The topic has long remained a matter of debate and parley, and continues to be so. The lack of a commonly accepted set of standards for the evaluation of research has led different organizations, funding agencies, research reviewers, etc. to decide on their own
set of parameters, which have been elaborate and convoluted, to assess the quality of research. Many research evaluation activities have been exercised from time to time for the improvement of the quality of research, focusing on the formal characteristics of mechanisms adopted at national and local levels (Fealing, 2011; Reale and Seeber, 2013). The difference in the nature of organizations is also one of the major factors that needs to be kept in mind while conducting research evaluation so that more inter-disciplinary research, better research data management, etc. may be inculcated.
One of the significant but often neglected aspects of research impact has been the social impact of research. A study conducted on the perceptions of researchers who had applied for a research grant in The Netherlands found that the researchers expected that at least 20% of the total assessment should consist of evaluating the social impact of the proposed research (Nieuwlaat and Zwegers, 2019). Citation-based metrics, useful in understanding the underlying concept and the historical context, have been in use since the 1960s and have appreciably influenced the task of research evaluation since the inception of the Science Citation Index by the Institute for Scientific Information, Philadelphia (Mohammadi and Thelwall, 2013).
Bibliometricians, however, have argued that citations do suffer from some flaws, including the time they take to aggregate for measuring the impact of research (Wang, 2013). Using citations for research evaluation completely ignores research outputs other than peer-reviewed articles and research papers (Mohammadi and Thelwall, 2013). The research impact of scholarly outputs other than journal articles cannot be measured by citations, as such outputs are seldom indexed by citation databases. Evaluation on the basis of citations completely overlooks the educational and professional usage of a research work (Aksnes et al., 2019). The impact of a research work on the day-to-day functioning of working professionals in a field cannot be measured through citations. A course instructor or a trainer may use a research work for classroom instruction without citing the article in his/her own publication (Tenopir and King, 2000). Unethical practices adopted by authors as well as publishers to increase the recognition of journals have been pointed out by Falagas and Alexiou (2008). Friends/colleagues have been found to cite each other to mutually escalate their citation counts (known as cronyism) (Meho, 2006). An erroneous paper, if published, may result in many critical responses, thereby being cited each time, and the citation databases would reflect the work as a highly cited one (Cole and Cole, 1971). Some authors argue that not all citations should be treated as equal: research works cited by "first-rank" scientists and journals should be considered above, or better than, those cited by authors not recognized as "first-rank" in their respective fields (Bergstrom and West, 2008). Another demerit of using citations for research evaluation is citation latency, which may sometimes, and for some disciplines, be more than one year or even longer (Brody et al., 2006). Moreover, the availability of a research work is also crucial in getting citations. With the open access movement, articles that are in open access or in the public domain have been found to receive more citations (Davis and Fromerth, 2007; Gargouri et al., 2010; Kousha and Abdoli, 2010). Therefore, authors try to publish their articles in open access journals to increase their visibility and gain citations. Self-archived articles tend to get more citations than non-open access articles (Kousha and Abdoli, 2010).
Researchers and academicians have started sharing their research on various social media platforms. These platforms not only help in the faster promotion and propagation of research but are also a rich source of alternative metrics called "altmetrics". Altmetrics have emerged as complementary to traditional metrics for research evaluation. The term was first coined by a PhD student of the University of North Carolina–Chapel Hill, Jason Priem, in a tweet in 2010. It has been defined as "the study and use of scholarly impact measures based on activity in online tools and environments" (Priem et al., 2012). As the most recent subfield of informetrics, the definition of altmetrics has remained in constant