RAMP – the Repository Analytics and Metrics Portal: a prototype web service that accurately counts item downloads from institutional repositories

Library Hi Tech, Vol. 35 No. 1, 2017, pp. 144-158
Emerald Publishing Limited, ISSN 0737-8831
DOI: https://doi.org/10.1108/LHT-11-2016-0122
Received 4 November 2016; revised 4 November 2016; accepted 26 November 2016; published 20 March 2017
Patrick OBrien and Kenning Arlitsch
Library, Montana State University, Bozeman, Montana, USA
Jeff Mixter
OCLC Online Computer Library Center Inc, Dublin, Ohio, USA
Jonathan Wheeler
Library, University of New Mexico, Albuquerque, New Mexico, USA, and
Leila Belle Sterman
Library, Montana State University, Bozeman, Montana, USA
Abstract
Purpose – The purpose of this paper is to present data that begin to detail the deficiencies of the log file analytics reporting methods that are commonly built into institutional repository (IR) platforms. The authors propose a new method for collecting and reporting IR item download metrics. This paper introduces a web service prototype that captures activity that current analytics methods are likely to either miss or over-report.
Design/methodology/approach – Data were extracted from the DSpace Solr logs of an IR and were cross-referenced with Google Analytics and Google Search Console data to directly compare the Citable Content Downloads recorded by each method (an illustrative sketch of this cross-referencing appears after the abstract).
Findings – This study provides evidence that log file analytics data appear to grossly over-report downloads due to traffic from robots that are difficult to identify and screen. The study also introduces a proof-of-concept prototype that makes the research method easily accessible to IR managers who seek accurate counts of Citable Content Downloads.
Research limitations/implications – The method described in this paper does not account for direct access to Citable Content Downloads that originate outside Google Search properties.
Originality/value – This paper proposes that IR managers adopt a new reporting framework that classifies IR page views and download activity into three categories that communicate metrics about user activity related to the research process. It also proposes that IR managers rely on a hybrid of existing Google Services to improve reporting of Citable Content Downloads, and it offers a prototype web service where IR managers can test results for their repositories.
Keywords Web analytics, Assessment, Google Analytics, Institutional repositories, Google Search Console, Log file analytics
Paper type Research paper
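
As context for the method summarized under Design/methodology/approach, the following minimal Python sketch shows one way such cross-referencing can be done: per-item download counts from a DSpace Solr statistics core are compared against Google Search Console clicks on the same file URLs. This is an editorial illustration, not the authors' RAMP implementation: the repository host, site URL, and item identifier are hypothetical, the Solr field names assume the legacy DSpace statistics schema, and the Search Console call assumes access via the google-api-python-client library.

# Illustrative sketch only, not the authors' RAMP implementation.
# Assumptions: a hypothetical repository at repo.example.edu, the legacy
# DSpace Solr statistics schema (type:0 = bitstream, statistics_type:view),
# and Search Console access via the google-api-python-client library.
import requests
from google.oauth2 import service_account
from googleapiclient.discovery import build

SOLR_URL = "https://repo.example.edu/solr/statistics/select"  # hypothetical
SITE_URL = "https://repo.example.edu/"                        # hypothetical

def solr_bitstream_downloads(item_id: str, start: str, end: str) -> int:
    """Count bitstream download events the IR's own logs recorded for one item."""
    query = (f"owningItem:{item_id} AND type:0 AND statistics_type:view "
             f"AND time:[{start}T00:00:00Z TO {end}T23:59:59Z]")
    resp = requests.get(SOLR_URL,
                        params={"q": query, "rows": 0, "wt": "json"},
                        timeout=30)
    resp.raise_for_status()
    return resp.json()["response"]["numFound"]

def gsc_bitstream_clicks(credentials_file: str, start: str, end: str) -> dict:
    """Sum Search Console clicks on bitstream URLs, i.e. downloads that
    arrived directly from Google Search results pages."""
    creds = service_account.Credentials.from_service_account_file(
        credentials_file,
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"])
    service = build("searchconsole", "v1", credentials=creds)
    body = {
        "startDate": start,
        "endDate": end,
        "dimensions": ["page"],
        "dimensionFilterGroups": [{"filters": [{
            "dimension": "page",
            "operator": "contains",
            "expression": "/bitstream/",  # DSpace file-URL path segment
        }]}],
    }
    result = service.searchanalytics().query(siteUrl=SITE_URL,
                                             body=body).execute()
    return {row["keys"][0]: row["clicks"] for row in result.get("rows", [])}

Comparing the two figures for the same items is what exposes the gap the paper measures: the Solr count includes robot traffic that evades user-agent filtering, while the Search Console count reflects only human clicks arriving from Google Search and therefore misses downloads from other referrers.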
Introduction
Institutional repositories (IRs) disseminate scholarly papers in an open access environment and have become a core function of the modern research library. IRs run on a variety of software platforms, with great diversity in installation, configuration, and support systems,
© Patrick OBrien, Kenning Arlitsch, Jeff Mixter (OCLC), Jonathan Wheeler, Leila Sterman, Susan
Borda. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone
may reproduce, distribute, translate and create derivative works of this article (for both commercial
and non-commercial purposes), subject to full attribution to the original publication and authors. The
full terms of this licence may be seen at: http://creativecommons.org/licences/by/4.0/legalcode
The authors wish to express their gratitude to the Institute of Museum and Library Services, which funded this research (Arlitsch et al., 2014). The authors would also like to thank Bruce Washburn, Consulting Software Engineer at OCLC Research, for his assistance in developing RAMP, and Susan Borda, Digital Technologies Librarian at Montana State University, for her help with data extraction.