Evaluation of citations: a proposition to supplement the corresponding rule book in Serbia

Date: 05 June 2017
DOI: https://doi.org/10.1108/EL-09-2015-0182
Pages: 598-614
Published date: 05 June 2017
Authors: Bojana Dimic Surla, Dusan Ilija Surla, Dragan Ivanovic
Subject matter: Information & knowledge management, Information & communications technology, Internet
Bojana Dimic Surla and Dusan Ilija Surla
Department of Mathematics and Informatics, Faculty of Sciences,
University of Novi Sad, Novi Sad, Serbia, and
Dragan Ivanovic
Faculty of Technical Sciences, Institute for Computing and Automatics,
University of Novi Sad, Novi Sad, Serbia
Abstract
Purpose: The purpose of this article is to describe a proposition for the evaluation of citations of scientific papers, which could serve as a supplement to the existing Rule Book of the Ministry of the Republic of Serbia used in the procedure of electing candidates to particular academic and research titles. The evaluation of citations and the quantitative presentation of the results were carried out on data taken from the database of the Current Research Information System of the University of Novi Sad (CRIS UNS), which is harmonized with the Rule Book of the Ministry with respect to the evaluation of researchers' published scientific results.
Design/methodology/approach: There are different criteria for evaluating the quality of scientific papers based on their citations. The pertinent parameters can be the total number of citations, the number of citations in a defined time period, or weighting values assigned to the citations. This work proposes a procedure for assigning citation weighting values based on the evaluation, according to the Rule Book in the Republic of Serbia, of the scientific results in which the citations appeared. On this basis, the authors introduce the impact factor of a researcher as the ratio of the number of points of the researcher's evaluated citations to the number of points of the researcher's evaluated papers.
Findings: The results showed that the research information system CRIS UNS can be extended to the evaluation of citations for a single researcher, for groups of researchers and for institutions.
Practical implications: The proposed solution enables the evaluation of citations in the process of election and promotion of academic staff. It thereby provides a means of measuring the scientific influence of a researcher in the relevant scientific area.
Social implications: The evaluation of citations may be included in national strategies of scientific development; in the funding and evaluation of research projects; in promotions of academic staff at universities and other academic institutions; and in the ranking of researchers and research organizations.
Originality/value: The main idea presented in the paper is the definition of a rule book (or several rule books) for the evaluation of citations. Based on the evaluation of citations, the authors propose the term "impact factor of a researcher".
Keywords: Academic libraries, Data analysis, Bibliometric analysis, Universities, Academic personnel, Research results
Paper type: Case study
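The weighting-and-ratio scheme described in the abstract can be sketched in code. The category labels, weights and function names below are illustrative assumptions for the sake of the example, not values taken from the Rule Book itself:

```python
# Illustrative sketch of the proposed "impact factor of a researcher":
# each citation is weighted according to the category of the publication
# in which it appeared, and the total is divided by the points of the
# researcher's own evaluated papers. The weights below are hypothetical.

CATEGORY_WEIGHTS = {
    "M21": 8.0,   # hypothetical weight: top-ranked international journal
    "M23": 3.0,   # hypothetical weight: international journal
    "M33": 1.0,   # hypothetical weight: conference proceedings paper
}

def citation_points(citing_categories):
    """Sum the weighting values of all citations, each weighted by the
    category of the publication in which the citation appeared."""
    return sum(CATEGORY_WEIGHTS[c] for c in citing_categories)

def researcher_impact_factor(citing_categories, paper_points):
    """Ratio of evaluated citation points to evaluated paper points."""
    total_paper_points = sum(paper_points)
    if total_paper_points == 0:
        raise ValueError("researcher has no evaluated papers")
    return citation_points(citing_categories) / total_paper_points
```

For example, a researcher cited by two M21 publications and one M33 publication, whose own papers were evaluated at 20 points in total, would receive (8 + 8 + 1) / 20 = 0.85 under these hypothetical weights.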
This paper is part of the research project "Infrastructure for Technology Enhanced Learning in Serbia" supported by the Ministry of Education and Science of the Republic of Serbia (Project No. 47003).
Received 23 September 2015
Revised 23 February 2016, 11 May 2016, 21 June 2016
Accepted 5 August 2016
The Electronic Library, Vol. 35 No. 3, 2017, pp. 598-614
© Emerald Publishing Limited, 0264-0473
DOI 10.1108/EL-09-2015-0182
Introduction
The evaluation of scientific work is an unavoidable step in defining national strategies for scientific development. The evaluation results are used by research organizations and public agencies to prove that their investments in science have produced valuable results. Evaluations also serve the purposes of recruitment and selection of research staff; election and promotion of candidates to academic and/or research titles/positions within research organizations or academic institutions; ranking of researchers and/or research organizations; making financial decisions (financing of research projects); awarding individuals for their achievements; and so forth (Koskinen et al., 2008). The evaluation of scientific work need not necessarily have as its goal the allocation of financial means; it can also promote competitiveness among research institutions or individual researchers and thus increase their efficiency (Weingart, 2005). Published research results are products of the research activities that are suitable for evaluation.
The evaluation of published scientific papers can be expressed by quantitative indicators of their quality. Bar-Ilan (2008) pointed out that the quality of publications can be determined by relying on three approaches: peer review, evaluation based on bibliometric indicators and a combination of these two approaches (experts' opinion supported by bibliometric data).
Experts, or a group of experts (a commission), evaluate the results of the researchers on
the basis of adopted evaluation criteria or on the basis of their subjective opinions. Examples
of the evaluation based solely on the experts’ opinions are the Research Assessment Exercise
in Great Britain (HEFCE, 2015) and Excellence in Research for Australia (Australian
Research Council, 2016). Theoretically, this evaluation principle should be considered as the
best approach (Willcocks et al., 2008). However, in practice, this principle may have some
shortcomings, as pointed out in Weingart’s (2005) paper.
Evaluation based on the principle of bibliometric assessment relies on numeric indicators, such as the number of citations, the impact factor (IF) and the h-index. These indicators are the results of analysing qualitative and quantitative properties of scientific publications by mathematical-statistical methods. Most often, they are based on the number of published papers and their citations in a given period. Accessibility, transparency and objectivity are characteristics of the bibliometric indicators that portray the evaluation results, as pointed out by Adler et al. (2009). These authors consider bibliometric evaluation to be advantageous over expert evaluation: the results of bibliometric evaluation are more transparent and clearer to the public because the resulting numerical mark/category assigned to a publication can be verified using the corresponding formula. However, this approach, too, has certain shortcomings. Weingart (2005) pointed out numerous technical and methodological problems related to the formation and collection of bibliometric indicators.
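As a concrete example of such a numeric indicator, the h-index can be computed from a researcher's citation counts alone. A minimal sketch (the function name is ours, not from the paper):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the researcher has
    at least h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank   # at least `rank` papers have >= `rank` citations
        else:
            break
    return h
```

For instance, a researcher whose five papers have 10, 8, 5, 4 and 3 citations has an h-index of 4: four papers with at least four citations each, but not five papers with at least five.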
The third way, that is, the combination of experts' opinion and bibliometric indicators, is considered to be the best evaluation approach. Iivari (2008) carried out a comprehensive comparison of the first and second approaches and indicated their positive and negative characteristics. In their research, Durieux and Gevenois (2010) pointed out that the work of experts would be more efficient if they used the corresponding bibliometric indicators. The advantages of this evaluation approach were also pointed out in a number of papers (Heiss et al., 2013; Moravcová, 2012; Vanclay and Bornmann, 2012).
The third approach, in its different variants, is used in the majority of countries. For example, in the Republic of Serbia, the evaluation of works published in journals from the Thomson Institute for Scientific Information (ISI) list is based only on the IF values, as described in Section 2.1 of this paper. The other journals are valued individually by the corresponding commission. The works from scientific meetings are valued in accordance
