
Published: 1 January 2006
DOI: https://doi.org/10.1108/07378830610652121
Journal: Library Hi Tech, Vol. 24 No. 1, 2006, pp. 69-101
Authors: Rick Wiggins, Jeph Remley, Tom Klingler
Subject Matter: Information & knowledge management, Library & information science
Received 1 September 2005; revised 1 November 2005; accepted 15 November 2005
THEME ARTICLE
Building a local CMS at Kent State
Rick Wiggins, Jeph Remley and Tom Klingler
Libraries & Media Services, Kent State University, Kent, Ohio, USA
Abstract
Purpose – The purpose of this paper is to describe the creation of a content management system
(CMS) for the Kent State University Libraries & Media Services web site. It describes the requirements
for the site and for the CMS, as well as the CMS architecture and components.
Design/methodology/approach – This paper describes the genesis and architecture of a
locally-written CMS that is strongly focused on metadata.
Findings – A review of local, library-specific needs combined with a review of the product universe
resulted in the decision to write a local CMS.
Practical implications – Includes enumerated goals and requirements for a database-driven and
metadata-focused web site.
Originality/value – Describes the creation of tools for data management in a locally-written CMS.
Keywords Content management, Information management, Libraries
Paper type Technical Paper
Background
The first generation Kent State University Libraries & Media Services (LMS) web site
was a small accumulation of static pages. The second generation site, in place from the
late 1990s through the early 2000s, responded annually to millions of page requests. In
the early 2000s, the web team, a subset of the systems department staff, began to
realize that the site had a long way to go in terms of data consistency, presentation
consistency, and automation. The team began to hatch the idea to build a new,
fully-automated site. A better site needed better control over presentation to achieve a
more professional, consistent look. Better control over staff data input would eliminate
poor quality HTML. A better site required optimized management of data that tended
to appear in multiple places in various stages of currency. And, a top need was better
content description for more flexible retrieval and presentation. Feeling that metadata
was driving the future of the web, the team knew that it needed to build a site that
exploited metadata for the organization and retrieval of both research data and basic
operational library content.
The second generation site was made up primarily of static HTML pages coded in
Macromedia’s Dreamweaver. Content providers for the site were a
small (but growing) number of LMS staff members with varying levels of HTML
experience. To create a new page, a staff member would typically copy an existing
page, save it as a new page, delete the content of the former page, and then create their
new content. Staff who did not know how to edit web pages would send their edits to
staff members with more experience (typically the site’s webmaster).
As the number of staff content providers grew, most of them with only a basic
understanding of HTML, the integrity of the site’s overall look and feel
started to degrade. The site was built as a series of nested tables, with many
layers of tables for overall page layout, navigation options, and other components like
page headers, content areas, and footer. The problem with the practice of copying
existing pages was that, in deleting areas of the former page’s content before entering
the new content, the author would often delete too much code, or not enough, which
would result in broken tables and a broken page layout. Often staff did not
know how to repair the damage, or did not notice it, and so the errors were left in place.
Over time the pages on the site that were supposed to have a consistent structure
actually started to look more and more different. For example, the footers began to
stretch over different areas of the bottom of the pages, and the site logo in the upper
right hand corner of the pages began to shift position from one page to the next. The
team clearly had a fundamental problem with the structure and the editing process for
the site.
In addition to accidentally breaking a page’s layout, some staff started to go their
own way with their pages. Some altered page layouts by removing or changing the
navigation bar that was supposed to be consistent on all pages; others changed font
faces and colors, bringing a whole new level of undesired “creativity” to the site. With
different staff straying from the original design, the LMS web site started to look less
consistent and less professional.
Another problem inherent to a site full of static web pages was the need to make
site-wide updates. There were times when a new item needed to be added to the
navigation sidebar, or the contents of the footer needed to be updated. There were
times when an update was needed to the overall look and layout of the site to bring it
back to a more consistent structure. Attempts to perform site-wide find-and-replace
code corrections were difficult or impossible because staff had made changes to the
code, sometimes intentionally, sometimes accidentally. An attempt at a global update
to the code would reveal that many pages would still have to be individually updated.
And, even this approach assumed that these pages could be located by clicking
through the site and randomly coming across the errors. Sometimes a page’s code did
not get updated as intended, continuing to break the integrity of the site. All of these
problems provided more evidence of the site’s fundamental structural problems.
The team also discovered hundreds of stray and duplicate pages; pages that were
considered “deleted” just because there were no known links to them; pages that were
older versions of a current page; entire directories with no known links into them;
duplication of data on different pages, sometimes with one instance of the data being
current and the duplicate out of date. It was clearly time to plan a new site that would
be built and managed by a new method.
Beginning to plan
At the start of planning for the new site, a web committee of LMS employees was
formed, composed of interested staff members from various departments. The
committee decided the site should be recreated from scratch, content should be easier to
publish, and the site should be easier to maintain and update. It quickly became
apparent that the new site also had to offer improved features. The committee very
much wanted the new site to be based on the needs and feedback of our users: the
students, faculty and staff. It was widely felt that the old site was built by librarians for
users who, to navigate the site successfully, would have to think like librarians.
One of the first important steps for the new web committee was to meet with focus
groups and conduct usability studies to find out what users did and did not like about
the existing site, how they used it, how difficult it was for them to use, and what
they wanted in a new site. Rather than restricting what types of users should be
listened to, the committee ensured that the focus groups were a mix of students, faculty
and student library employees. After completing release forms, the participants were
asked questions about the current site, were asked to compare it to other sites, and
were allowed to offer any concerns and requests that had not already been considered.
Participants in the focus groups were fed pizza and soda, and were given $15 each on
their university ID spending account. Usability studies were conducted as well, with
individual participants completing assigned tasks using the current site, and speaking
aloud whatever thoughts or problems they might be experiencing. Usability study
participants were also each given $15 on their accounts. The pizza and money proved
to be the right enticement: enough participants were involved, and they were
genuinely interested in helping the committee work towards a new and better site.
General goals and requirements
In order to improve the usability and ease-of-maintenance of the new web site, the team
established a set of general goals for the new site. This list of goals became the
foundation on which were built the more detailed requirements for the site and for a
CMS that would be used to manage it.
Edit content using web forms
Instead of using web content editing software like Dreamweaver or BBEdit, the staff
would do all editing in the CMS using web forms. This approach would reduce the need
for special software purchases, installation, and training. Staff would be able to edit
web content from anywhere they could access the CMS with a browser. In order to
allow for what-you-see-is-what-you-get (WYSIWYG) editing, special software would
be purchased and installed into the CMS to provide this functionality in a web browser.
But, in order to maintain a clean separation of the content from the presentation of the
pages, many of the typical HTML features that people would expect in such a tool
would not be made available. For example, the selection of fonts, font colors, and font
sizes would not be made available. More radically, the typical HTML table editing tool
available in web content editing software packages would not be made available. This
decision would prevent the use of tables for layout, making the web site more
ADA-compliant.
Staff would not need to know how to write HTML to edit content and would not be
given the ability to write raw HTML. The CMS and WYSIWYG editor would allow
them to create content without ever seeing the underlying HTML being generated. The
creation of the HTML would be done dynamically and would rely heavily on the use of
Cascading Style Sheets (CSS) for the description of the presentation.
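The paper does not include the editor's filtering code, but the restriction described above can be sketched as a server-side whitelist filter: any tag or attribute outside an approved set (notably font and table markup) is silently dropped before content is stored. The following is a minimal illustration using Python's standard library; the tag whitelist, attribute whitelist, and function names are assumptions for the sake of example, not the actual CMS implementation:

```python
from html.parser import HTMLParser

# Hypothetical whitelist in the spirit of the approach described: structural
# and inline-semantic tags are kept; fonts, colors, and tables are not.
ALLOWED_TAGS = {"p", "br", "em", "strong", "ul", "ol", "li", "a", "h2", "h3"}
ALLOWED_ATTRS = {"a": {"href"}}

class Sanitizer(HTMLParser):
    """Drops any tag or attribute not on the whitelist, keeping text content."""
    def __init__(self):
        super().__init__(convert_charrefs=False)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED_TAGS:
            return  # disallowed tag is dropped; its inner text still passes through
        kept = [(k, v) for k, v in attrs if k in ALLOWED_ATTRS.get(tag, set())]
        attr_str = "".join(f' {k}="{v}"' for k, v in kept)
        self.out.append(f"<{tag}{attr_str}>")

    def handle_endtag(self, tag):
        if tag in ALLOWED_TAGS:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

    def handle_entityref(self, name):
        self.out.append(f"&{name};")

def sanitize(html: str) -> str:
    """Return the input HTML with only whitelisted tags and attributes kept."""
    s = Sanitizer()
    s.feed(html)
    s.close()
    return "".join(s.out)
```

With a filter like this, font and table markup submitted by a content provider disappears, leaving only the text inside it, so the stored content picks up its fonts and layout from the site's CSS rather than from author-supplied HTML.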
Organize pages using metadata
The traditional hierarchical organization of the earlier sites now seemed too arbitrary
and inflexible. In time, any newly-invented hierarchy would no longer make sense, but
would be deeply embedded into the structure of the web site, making widespread
