Evaluating a threefold intervention framework for assisting researchers in literature review and manuscript preparatory tasks
Publication date: 8 May 2017
Pages: 555-580
DOI: https://doi.org/10.1108/JD-06-2016-0072
Authors: Aravind Sesagiri Raamkumar, Schubert Foo, Natalie Pang
Journal of Documentation, Vol. 73 No. 3, 2017. © Emerald Publishing Limited. ISSN 0022-0418.
Received 2 June 2016; revised 31 October 2016 and 16 December 2016; accepted 16 December 2016.
Subjects: Library & information science; Records management & preservation; Document management; Classification & cataloguing; Information behaviour & retrieval; Collection building & management; Scholarly communications/publishing; Information & knowledge management; Information management & governance; Information management; Information & communications technology; Internet
Evaluating a threefold intervention framework for assisting researchers in literature review and manuscript preparatory tasks
Aravind Sesagiri Raamkumar, Schubert Foo and Natalie Pang
Wee Kim Wee School of Communication and Information,
Nanyang Technological University, Singapore, Singapore
Abstract
Purpose – Systems that support literature review (LR) and manuscript preparation tend to focus on only one or two of the tasks involved. The purpose of this paper is to describe an intervention framework that redesigns a particular set of tasks, allowing for interconnectivity between the tasks and providing appropriate user interface display features for each task in a prototype system.
Design/methodology/approach – A user evaluation study was conducted on the prototype system. The system supports three tasks: building a reading list (RL) of research papers, finding similar papers based on a set of papers, and shortlisting papers from the final RL for inclusion in a manuscript based on article type. A total of 119 researchers with experience in authoring research papers participated in the evaluation study. Each participant selected one of 43 provided topics and executed the tasks offered by the system. Three questionnaires were used to evaluate the tasks and the system. Both quantitative and qualitative analyses were performed on the collected evaluation data.
Findings – The task redesign aspects had a positive impact on user evaluation for the second task, finding similar papers, while the first and third tasks were found to need improvement. The task interconnectivity features, the seed basket and the RL, helped participants search for papers conveniently within the system. Two of the four proposed informational display features, namely information cue labels and shared co-relations, were the most preferred features of the system. The student user group found the task recommendations and the overall system more useful and effective than the staff group did.
Originality/value – This study validates the importance of interconnected task design and novel informational display features in accentuating task-based recommendations for LR and manuscript preparatory tasks. The potential for improving recommendations was shown through the task redesign exercise, in which new requirements for the tasks were identified. The resultant prototype system helps bridge the gap between novices and experts in terms of LR skills.
Keywords: Literature review, Digital library, Manuscript writing, Scientific paper information retrieval, Scientific paper recommender system, Task interconnectivity, Task redesign
Paper type: Research paper
Introduction
The research lifecycle (Nicholas and Rowlands, 2011) encompasses the activities performed by researchers, ranging from the identification of research opportunities to the management of the overall research process. Researchers engage in information seeking throughout this lifecycle to acquire information objects such as research topics, scientific papers, books, publication venues and collaborators. Models of scientific information seeking such as the Ellis model (Ellis and Haugan, 1997) describe the different macro-level stages of information seeking carried out by researchers. During these stages, researchers primarily search for scientific papers to meet the corresponding information needs, since papers are among the most important and core information objects for researchers. To facilitate easier information seeking, information retrieval (IR) systems provide the necessary formal channels for querying sources where information is structurally organized. IR-based
systems such as academic search systems, academic databases and citation indices do not provide direct support for all scientific information seeking stages, since these systems are not designed for specific LR search tasks. Instead, a free-text search interface is provided in these systems to support ad hoc information needs. In addition to these systems, certain reference management systems such as Mendeley (Vargas et al., 2016) and Docear (Beel et al., 2013) provide paper recommendations based on researchers' paper collections. However, these recommendations are mainly based on topical similarity with recently viewed papers. Researchers' task-based relevance factors have not been considered when formulating recommendations in these systems.
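To make the contrast concrete, a topical-similarity recommender of the kind described above can be sketched as TF-IDF weighting plus cosine similarity over paper texts. The following is a minimal illustrative sketch, not the actual implementation of Mendeley or Docear; the toy corpus and function names are our own assumptions.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute simple TF-IDF vectors (as sparse dicts) for tokenized documents."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))                      # document frequency per term
    idf = {t: math.log(n / df[t]) for t in df}   # terms in every doc get weight 0
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: tf[t] * idf[t] for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse vectors represented as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(seed_idx, docs, k=2):
    """Rank all other papers by topical similarity to the seed paper."""
    vecs = tfidf_vectors(docs)
    scores = [(i, cosine(vecs[seed_idx], vecs[i]))
              for i in range(len(docs)) if i != seed_idx]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:k]

# Toy corpus: paper abstracts reduced to token lists (hypothetical data).
papers = [
    "citation recommendation for literature review".split(),
    "collaborative filtering for movie recommendation".split(),
    "literature review support systems for researchers".split(),
]
print(recommend(0, papers))  # paper 2 ranks above paper 1 for seed paper 0
```

Note what the sketch cannot express: the ranking depends only on term overlap with the seed paper, so the stage of the LR process and the researcher's task play no role, which is precisely the gap the present framework addresses.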
Research has shown that task-based IR systems are more apt for users as they address
specific task requirements (Vakkari, 2001). Since information seeking and IR are intertwined
in the broad context of information behavior (Wilson, 1999), the integrated information
seeking and retrieval framework (Ingwersen and Järvelin, 2006) was proposed to consider
task characteristics in system design. Nevertheless, there is a scarcity of such systems, particularly for assisting researchers in multiple literature review (LR) and manuscript preparatory (MP) search tasks. In addition to free-text search engines, studies in IR and recommender systems (RS) have put forth algorithms and systems for supporting these tasks. Such user tasks include building a reading list (RL) for LR (Bae et al., 2014; Ekstrand et al., 2010; Jardine, 2014), finding similar papers for a given paper (Küçüktunç et al., 2015; Liang et al., 2011), recommending citations for particular placeholders in manuscripts (He et al., 2011; Livne et al., 2014), recommending papers based on activity logs (Liu et al., 2012; Yang and Lin, 2013) and recommending papers based on author publication history (Lee et al., 2013; Sugiyama and Kan, 2013), to name a few.
These approaches/techniques generate paper recommendations for the corresponding tasks.
Support for these aforementioned tasks is generally provided in separate systems. Support for specific LR tasks such as "building a reading list" and "finding similar papers" needs to be provided within a single system, since these tasks are interconnected and incremental in nature: the papers from the former task are inputs to the latter. In addition, the user evaluations of the techniques proposed for task recommendations in earlier IR and RS studies have not been performed in the context of a system where multiple tasks were supported. Therefore, the integration of these approaches toward building a task-based LR and MP assistive system remains untested. Practical implementation of these techniques in digital libraries might be a complex activity, as they employ disparate data pre-processing and retrieval/recommendation techniques based on preconceived requirements about the nature of the tasks.
LR search tasks have different requirements depending on the stage of LR. As in general-purpose information seeking, researchers transition through phases such as pre-focus, problem formulation and post-focus (Vakkari, 2001) while working on a research problem. Different types of papers are required for the series of search tasks in the LR process. Previous RS studies show that recommendation techniques are often conceptualized for LR tasks without analyzing the complexity of those tasks. There is a need to re-examine the LR tasks' requirements in light of earlier RS studies, in order to verify whether these tasks provide the best set of papers for researchers. This redesign of the LR tasks' requirements can help researchers obtain papers of different types using the required paper discovery mechanisms, in the digital libraries context.
task corresponding to the actual user task. Task redesign is the first intervention (I1) in our
current research on scientific paper retrieval/recommendation systems. Through this
intervention, the intent is to improve the recommendations of scientific papers for LR tasks.
The case for a single system supporting multiple LR tasks has been raised in the past. Scienstein (Gipp et al., 2009) and Papyres (Naak et al., 2009) are systems designed to support LR tasks, while systems such as CiteSight (Livne et al., 2014) help in
