Identifying “best bets” for searching in chemical engineering
Comparing database content and performance for information retrieval
Giovanna Badia
Schulich Library of Physical Sciences, Life Sciences, and Engineering,
McGill University, Montreal, Canada
Abstract
Purpose – Performing efficient literature searches and subscribing to the most comprehensive databases for
interdisciplinary fields can be challenging since the literature is typically indexed in numerous databases to
different extents. Comparing databases will help information professionals make appropriate choices when
teaching, literature searching, creating online subject guides, and deciding which databases to renew when
faced with fiscal challenges. The purpose of this paper is to compare databases for searching the chemical
engineering literature.
Design/methodology/approach – This paper compares journal indexing and search recall across seven
databases that cover the chemical engineering literature in order to determine which database and
database pair provide the most comprehensive coverage in this area. It also summarizes published
database comparison methods to aid information professionals in undertaking their own comparative assessments.
Findings – SciFinder, Scopus, and Web of Science, listed alphabetically, were the leading databases for
searching the chemical engineering literature. SciFinder-Scopus and SciFinder-Web of Science were the top
two database pairs. No single database or pair provided 100 percent complete coverage of the literature
examined. Searching a second database increased the recall of results by an average of 17.6 percent.
Practical implications – The findings are useful since they identify “best bets” for performing an efficient
search of the chemical engineering literature. Information professionals can also use the methods discussed to
compare databases for any discipline or search topic.
Originality/value – This paper builds on the previous literature by using a dual approach to compare the
coverage of the chemical engineering literature across multiple databases. To the author's knowledge,
comparing databases in the field of chemical engineering has not been reported in the literature thus far.
Keywords Information retrieval, Searching, Comparative tests, Citation analysis, Online databases,
Search recall
Paper type Research paper
1. Introduction
No single source exists that covers all the literature in a discipline or about a topic.
Performing efficient literature searches and subscribing to the most comprehensive
databases for interdisciplinary fields can be challenging since the literature in these
areas is typically indexed in numerous databases to different extents. Why should
information professionals concern themselves with selecting the best databases when any
resource chosen will be incomplete? Comparing databases will help these practitioners
make appropriate choices when answering reference questions, teaching others
how to search the literature, and deciding whether to acquire a new resource or cancel
an existing subscription.
This paper compares journal indexing and search recall across seven databases that
cover the chemical engineering literature (i.e. Compendex, Environmental Sciences &
Pollution Management, Inspec, PubMed, SciFinder, Scopus, and Web of Science) in order to
determine which database(s) and database pair(s) provide the most comprehensive coverage
in chemical engineering. It also summarizes published database comparison methods to aid
information professionals in undertaking their own comparative assessments, and provides
an analysis of the search results at the journal level. Note that Web of Science in this paper
refers to the database, also known as the Web of Science Core Collection, and not the Web of
Science platform (formerly called Web of Knowledge) that includes other databases.
In addition, SciFinder includes the CAplus (Chemical Abstracts) and Medline databases;
duplicates from Medline were removed for all searches conducted in SciFinder.
2. Literature review
The literature reports different methods suitable for comparing database content and
performance. The purpose of the comparison will determine which method or
combination of methods to employ. For instance, a librarian interested in comparing full-text
databases to identify duplication between resources for cancellation purposes might prioritize
comparing content across the different databases rather than contrasting performance.
Therefore, defining the purpose is important before undertaking any database comparison.
Indexed journal lists, sample references, and citation analysis are three methods
that can be used for comparing database content (see Jacso, 1997, 2001 for comprehensive
overviews of different content evaluation methods). Comparing lists of indexed serials,
created by database producers, is the quickest of these methods. Lists of indexed periodicals
are usually found on the database platform or vendor's website. One can copy and paste
different lists into one Excel file, highlight in different colors the titles from different
databases, and then sort all the titles alphabetically. Comparing indexed journal
lists does not, however, take into account depth of coverage. One database might provide
cover-to-cover indexing of a particular serial while another might just index articles on a
specific topic. Utilizing indexed periodical lists to perform a comparison is quick to
accomplish and provides a sense of which database has more or less content, even though it
does not show the complete picture. This method has been used to compare databases in
different disciplines, such as in earth and atmospheric sciences (Barnett and Lascar, 2012),
fine arts (Veeder, 2011), and nursing (Hill, 2009).
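As a rough illustration, the Excel workflow described above can also be scripted. The sketch below is a hypothetical Python example, not part of the study: it assumes each database's indexed-title list has been exported to a plain-text file with one journal title per line (the file names are invented), normalizes the titles for comparison, and reports pairwise overlap and the titles unique to each database.

```python
# Hypothetical sketch: compare exported indexed-journal-title lists.
# Assumes one journal title per line in each file; file names are invented.

def load_titles(path):
    """Read a title list, normalizing case and whitespace for comparison."""
    with open(path, encoding="utf-8") as f:
        return {" ".join(line.split()).lower() for line in f if line.strip()}

lists = {
    "Compendex": load_titles("compendex_titles.txt"),
    "Scopus": load_titles("scopus_titles.txt"),
    "Web of Science": load_titles("wos_titles.txt"),
}

# Pairwise overlap: titles indexed by both databases in a pair.
names = list(lists)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(f"{a} / {b}: {len(lists[a] & lists[b])} titles in common")

# Titles unique to each database (the equivalent of color-coding in Excel).
for name, titles in lists.items():
    others = set().union(*(t for n, t in lists.items() if n != name))
    print(f"{name}: {len(titles - others)} titles indexed nowhere else")
```

As the paragraph above notes, such a comparison still says nothing about depth of indexing within each title, so its output is only a first approximation of relative coverage.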
Searching for sample references across different databases is another way of comparing
coverage. Researchers have checked the indexing of cited journal articles in published
systematic reviews on a topic (e.g. Beyer and Wright, 2013; Brettle and Long, 2001;
Michaleff et al., 2011), looked up articles in a particular discipline that were written by staff
at a specific university (Cavacini, 2014), searched for a pre-defined set of articles on a topic
(Walters and Wilder, 2003), and checked the indexing of individual periodical titles on a list.
These journal title lists can be created from an existing bibliography of serials commonly
accepted in the discipline (Grindlay et al., 2012; Kawasaki, 2002; Sutton and Foulke, 1999),
by consensus of a committee (Ingold, 2007), by consulting the most frequently cited journals
in a subject area in Web of Science, Scopus, and Google Scholar for a given year
(Grabowsky, 2015), and from serial titles assigned the same subject heading in Ulrich's
International Periodicals Directory (McDonald et al., 1999). The selection of the sample
references searched is crucial to this method in order to ensure that any findings can be
generalized to searching the entire literature in the subject area under investigation.
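To make the recall arithmetic of this method concrete, here is a hypothetical Python sketch (the sample-reference data are invented): given a record of which databases retrieved each sample reference, it computes recall for each database alone and for each database pair, where a pair counts a reference as found if either member retrieved it.

```python
# Hypothetical sketch: recall per database and per database pair from a
# record of which databases retrieved each sample reference (data invented).
from itertools import combinations

found_in = {
    "ref1": {"SciFinder", "Scopus"},
    "ref2": {"Scopus", "Web of Science"},
    "ref3": {"SciFinder"},
    "ref4": set(),  # retrieved by none of the databases searched
}
databases = ["SciFinder", "Scopus", "Web of Science"]
total = len(found_in)

# Recall of a single database: fraction of the sample it retrieved.
for db in databases:
    hits = sum(1 for dbs in found_in.values() if db in dbs)
    print(f"{db}: recall = {hits / total:.0%}")

# Recall of a database pair: a reference counts if either database has it.
for a, b in combinations(databases, 2):
    hits = sum(1 for dbs in found_in.values() if dbs & {a, b})
    print(f"{a} + {b}: recall = {hits / total:.0%}")
```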
Similar to using sample references, the source of data in citation analysis affects the
generalizability of the results when comparing databases. Citation analysis consists of
determining whether cited journal titles are indexed in the databases being contrasted.
These periodical titles are identified from references cited in the bibliographies
of publications, such as articles published in the primary journals of a discipline
(Bergman, 2011; Speare, 2010) or theses by graduate students in a particular subject area
(Newton and Tellman, 2010). Members of the Medical Library Association's Nursing and
Allied Health Resources Section (NAHRS) created a protocol for undertaking citation
analysis studies, later called “mapping studies”, of specific disciplines in nursing and allied
health (e.g. Alpi, 2006) to enhance librarians' understanding of publications and accessing
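A citation analysis along these lines can likewise be scripted. The following is a minimal, hypothetical Python sketch (the cited titles and indexed-title sets are invented): it tallies how often each journal is cited in the source bibliographies and reports the share of citations each database covers, weighted by citation frequency.

```python
# Hypothetical sketch: share of cited journal titles each database indexes,
# weighted by citation frequency (all data below are invented).
from collections import Counter

cited_journals = [
    "Chemical Engineering Science",
    "AIChE Journal",
    "Chemical Engineering Science",
    "Industrial & Engineering Chemistry Research",
]
indexed = {
    "Compendex": {"Chemical Engineering Science", "AIChE Journal"},
    "SciFinder": {"Chemical Engineering Science",
                  "Industrial & Engineering Chemistry Research"},
}

citation_counts = Counter(cited_journals)
total_citations = sum(citation_counts.values())

# Weighting by frequency means heavily cited journals count for more.
for db, titles in indexed.items():
    covered = sum(n for title, n in citation_counts.items() if title in titles)
    print(f"{db}: covers {covered / total_citations:.0%} of citations")
```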