Back to basics
DOI: https://doi.org/10.1108/DLP-07-2017-0021
Published: 13 November 2017
Pages: 288-293
Author: Robert Fox
Subject matter: Library & information science, Librarianship/library management, Library technology, Records management & preservation, Information repositories
Robert Fox
University of Notre Dame, Notre Dame, Indiana, USA
Abstract
Purpose – This paper aims to explore several methods for developing an effective content model for library websites.
Design/methodology/approach – This paper is conceptual.
Originality/value – Websites can take a couple of approaches to design. One is "top down", which focuses on the user interface (UI) and graphic design. Another is "bottom up", which focuses on the organization of content. This paper encourages website designers to take a more "bottom up" approach to web design.
Keywords Information architecture, Web design, User experience, Usability, Mental models, Content modeling
Paper type Conceptual paper
The modern world has ushered in a plethora of almost miraculous discoveries and
inventions. Science has penetrated mysteries as large as the universe and as small as
subatomic particles. At both ends of the spectrum, questions are pondered about the nature
and origin of reality. As our understanding expands, the mental models that help define our
overall understanding of the world change and morph to accommodate that deeper
understanding.
One example of this is the field of nanotechnology. In 1959, physicist Richard Feynman
gave a talk entitled "There's Plenty of Room at the Bottom" to the American Physical
Society. In that lecture, Feynman proposed a number of ideas that touched upon the
development of machines that are infinitesimally small. A nanometer (nm) is 10^-9 meters,
or 0.000000001 meters. At that scale, a machine could manipulate individual cells, for
example assisting T-cells in fighting an infection or mimicking antigens to test for
allergies. When one considers that a flagellum could function as a "propulsion system" for a
nanobot, we obviously need a new way of conceiving models for processes such as
manufacturing, power, data transmission and other technical systems that typically operate
at a much larger scale. To put this in perspective, cloth fibers are normally measured in
microns (μm), a micron being 1/1000th of a millimeter. A nanometer, in turn, is 1/1000th of
a micron and is the common measurement used to describe modern semiconductor feature sizes.
Atoms are measured at a mere 1/10th of that scale. Conceiving of machines this small opens
up a universe of possibilities "at the bottom" which would otherwise have been impossible
to conceive.
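The scale relationships above can be checked with a few lines of arithmetic. A minimal sketch (the constants simply restate the conversions given in the text):

```python
# Scale ladder from the text: meter -> millimeter -> micron -> nanometer
MM_PER_M = 1_000    # millimeters in a meter
UM_PER_MM = 1_000   # a micron is 1/1000th of a millimeter
NM_PER_UM = 1_000   # a nanometer is 1/1000th of a micron

# One nanometer expressed in meters
nm_in_meters = 1 / (MM_PER_M * UM_PER_MM * NM_PER_UM)
print(nm_in_meters)  # 1e-09, i.e. 0.000000001 m, as stated above

# An atom sits at roughly 1/10th of a nanometer
atom_in_meters = nm_in_meters / 10
```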
Feynman, while being a theoretical physicist, was also a pragmatist and proposed several
practical implications that follow from the development of nanotechnology. Even at the
early stage in the evolution of computer technology in which Feynman found himself in 1959,
he posited that one day processing speeds could only continue to be increased if
computational density were considered at the atomic level. The density of integrated
circuits could theoretically be increased exponentially over standard silicon-based
circuitry following Moore's law, which stipulates that the number of transistors in a
densely packed integrated circuit doubles roughly every two years owing to the decrease in
the average size of the technology nodes that make up the transistors. The current standard
holds at the 10 nm node, and the International Technology Roadmap for Semiconductors
predicts that
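The doubling that Moore's law describes compounds quickly, which is why atomic-scale density matters so much for future growth. A minimal sketch of the arithmetic (the 1971 baseline of 2,300 transistors, the Intel 4004, is an illustrative figure not taken from the article):

```python
def transistor_count(year, base_year=1971, base_count=2300, doubling_years=2):
    """Moore's-law estimate: the count doubles every `doubling_years` years.

    The 2,300-transistor starting point (Intel 4004, 1971) is used purely
    as an illustrative baseline.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Ten doublings between 1971 and 1991: 2,300 * 2**10 = 2,355,200
print(int(transistor_count(1991)))
```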
Received 10 July 2017
Revised 10 July 2017
Accepted 12 July 2017
Digital Library Perspectives, Vol. 33 No. 4, 2017, pp. 288-293
© Emerald Publishing Limited, 2059-5816