Library Hi Tech, Vol. 37 No. 3, 2019, pp. 566-590
© Emerald Publishing Limited, ISSN 0737-8831
DOI: https://doi.org/10.1108/LHT-07-2017-0151
Received 27 July 2017; Revised 3 March 2018, 25 July 2018, 13 October 2018; Accepted 22 October 2018; Published 16 September 2019
Evaluating the usability of the information architecture of academic library websites
Isabel Mariann Silvis, Theo J.D. Bothma and Koos J.W. de Beer
University of Pretoria, Pretoria, South Africa
Abstract
Purpose – The purpose of this paper is to provide an integrated list of heuristics and an information architecture (IA) framework for the heuristic evaluation of the IA of academic library websites, as well as an evaluation framework with practical steps on how to conduct the evaluation.
Design/methodology/approach – A set of 14 heuristics resulted from an integration of existing usability principles from authorities in the field of usability. A review of IA literature resulted in a framework for dividing academic library websites into six dialogue elements. The resulting heuristics were made applicable to academic library websites through the addition of recommendations based on a review of 20 related studies.
Findings – This study provides heuristics, a framework and workflow guidelines that can be used by the various evaluators of academic library websites, i.e. library staff, web developers and usability experts, to provide recommendations for improving their usability.
Research limitations/implications – The focus of the usability principles is the evaluation of the IA aspects of websites and therefore does not provide insights into accessibility or visual design aspects.
Originality/value – The main problem addressed by this study is that there are no clear guidelines on how to apply existing usability principles to the evaluation of the IA of academic library websites.
Keywords Website usability, Information architecture, Academic library websites, Heuristic evaluation,
Usability inspection, User-centred design
Paper type Literature review
1. Introduction
Academic institutions need to be aware of the usability problems on their websites so that these can be addressed to gain the advantages of usable educational websites (Hasan, 2013, p. 231). A usability problem is something that is confusing, misleading or sub-optimal in an interface (Lazar et al., 2010, p. 252). The aim of a usability inspection is to find usability problems in an existing user interface design and to make recommendations for fixing the problems and improving the usability of the design (Nielsen and Mack, 1994, p. 3). According
to Sherwin (2016), universities that prioritize a good user experience leverage the website to
contribute to larger institutional goals and see a clear return on investment.
This paper provides an integrated list of heuristics and an information architecture (IA)
framework for the usability evaluation of academic library websites as well as an evaluation
workflow with practical steps on how to conduct an evaluation using these resources.
1.1 History of academic library website usability
The evaluation of library websites has attracted a large amount of attention from
researchers in the field of library and information sciences (Chase et al., 2016; Pant, 2015;
Silvis, 2017; Wu and Brown, 2016).
A library website is an important gateway to a university's library services, including
electronic resource access, online catalogues and online referencing services. Library
websites were initially developed to meet the needs of the people who work in the library
(Dominguez et al., 2015, p. 100) and were designed and maintained by library employees,
rather than web developers (King and Jannik, 2005, p. 236). This resulted in academic library
websites that were filled with too much library jargon, including endless acronyms for
names of databases and tools, too much text, and confusing navigation options (Dougan and Fulton, 2009, p. 218). According to Tidal (2012, p. 94), confusion over terminology is one of the biggest obstacles on library websites.
A large amount of research was done on usability and IA in the 1980s and 1990s
(Nielsen and Mack, 1994; Rosenfeld and Morville, 1998; Shneiderman, 1986). Early research
in the field of usability and user-centred design (UCD) demonstrates how principles of
cognitive psychology apply to the design of almost anything (Norman, 1988). Norman (1988)
covers topics such as discoverability, feedback and affordance (of everyday things). These
principles are now widely used in website usability (Krug, 2014, p. 190). However, according
to the findings of early studies on academic library websites, some of these websites may
have been developed without consideration for the vast amount of research that already
existed in the field at the time. This may have been a result of the rapid growth of resources on these websites, according to Duncan and Holliday (2008, p. 301).
Library websites have evolved dramatically from simple pages with a few links to complex sites that provide direct access to hundreds of different resources (Duncan and Holliday, 2008, p. 301). The library resources that were made available on library websites were growing at a rapid rate, and at that time the focus was on quantity rather than quality (Duncan and Holliday, 2008, p. 301; King and Jannik, 2005, p. 236). This resulted in websites that were cluttered and crowded with information (Dominguez et al., 2015, p. 100).
The rapid growth of library websites resulted in the existence of certain websites that
were not robust enough to support a large number of resources, according to Duncan and
Holliday (2008, p. 301). One of the major problems with early academic library websites was
that they may have been designed and redesigned without much consideration for the
website IA – the organisation and underlying structure of the website (Duncan and Holliday, 2008, p. 302). Therefore, Duncan and Holliday (2008, p. 301) concluded that some early academic library websites may not have been designed with the users in mind or with a consideration for how the design would affect the findability of information.
Academic library websites are large, information-rich systems, which are only useful if the available resources are findable. Therefore, IA is essential for the success of an academic library website (Gullikson et al., 1999, p. 293). One of the main focuses in the field of IA is supporting the findability of information (Wodtke and Govella, 2009, p. xvii), and it is regarded by Rosenfeld et al. (2015, p. 5) as a critical success factor for the overall usability of a website.
1.2 The value of heuristic evaluation as a usability evaluation method
Usability evaluation is a term that is used to describe a process or activity that aims to improve the ease of use of an interface (Lazar et al., 2010, p. 256). There are three broad categories under usability evaluation that define how usability problems are identified: expert-based evaluation (by expert evaluators), automated testing (with tools) and user-based testing (with end-users) (Zahran et al., 2014, p. 26).
Expert-based evaluations, as opposed to user-based tests, are ideal for the evaluation of
IA because users are good at performing tasks within an interface, but they are not interface
experts (Lazar et al., 2010, p. 256). Although user testing is a valuable method for
getting users' opinions on the usability of a product, users might not necessarily be able to
comment on the usability of its IA. The IA of a website can effectively be evaluated using
the expert-based method called heuristic evaluation. Heuristic evaluation is also classified as
a usability inspection method (Nielsen and Mack, 1994, p. 2).
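To illustrate how findings from such an expert-based inspection are typically recorded, the short Python sketch below captures individual findings (the heuristic violated, its location, a severity rating and a recommendation) and prints them most severe first. This is a minimal, hypothetical illustration: the heuristic wording, severity scale and field names are assumptions made for the example and are not the 14 heuristics or the framework proposed in this paper.

# A minimal, hypothetical sketch of recording heuristic-evaluation findings.
# The heuristic wording and severity scale below are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Finding:
    heuristic: str       # usability principle that is violated
    location: str        # page or dialogue element where the problem occurs
    severity: int        # e.g. 1 = cosmetic ... 4 = blocks the user's task
    recommendation: str  # suggested fix for the design

@dataclass
class HeuristicEvaluation:
    website: str
    evaluator: str
    findings: List[Finding] = field(default_factory=list)

    def report(self) -> str:
        # Summarise findings as a plain-text report, most severe first.
        ordered = sorted(self.findings, key=lambda f: f.severity, reverse=True)
        lines = [f"{self.website} (evaluated by {self.evaluator})"]
        lines += [f"[{f.severity}] {f.heuristic} @ {f.location}: {f.recommendation}"
                  for f in ordered]
        return "\n".join(lines)

# Illustrative usage with made-up data:
ev = HeuristicEvaluation("example library site", "Expert A")
ev.findings.append(Finding(
    heuristic="Speak the users' language",
    location="Databases landing page",
    severity=3,
    recommendation="Replace unexplained database acronyms with descriptive labels.",
))
print(ev.report())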
According to Nielsen and Mack (1994, p. 2), "several studies of usability inspection methods have discovered that many usability problems are overlooked by user testing, but that user testing also finds problems that are overlooked by inspection". Therefore, the best results can be achieved by combining user-based and expert-based methods. It is not
uncommon for expert evaluators to be able to identify many of the usability problems that