“Stickiness”: gauging students’ attention to online learning activities

Publication date: 9 July 2018
DOI: https://doi.org/10.1108/ILS-03-2018-0014
Pages: 460-468
Authors: Ainslie Robinson, David Cook
Subjects: Library & information science; Librarianship/library management; Library & information services
Ainslie Robinson
Learning and Teaching Office, University of Notre Dame Australia, Fremantle, Australia, and
David Cook
Department of Marketing, CA Technologies, New York, New York, USA
Abstract
Purpose – Online content developers use the term "stickiness" to refer to the ability of their online service or game to attract and hold the attention of users and create a compelling and magnetic reason for them to return repeatedly (examples include virtual pets and social media). In business circles, the same term connotes the level of consumer loyalty to a particular brand. This paper aims to extend the concept of stickiness not only to describe repeat return and commitment to the learning product, but also as a measure of the extent to which students are engaged in online learning opportunities.
Design/methodology/approach – This paper explores the efficacy of several approaches to the monitoring and measuring of online learning environments, and proposes a framework for assessing the extent to which these environments are compelling, engaging and sticky.
Findings – In particular, the exploration so far has highlighted the difference between how lecturers have monitored the engagement of students in a face-to-face setting versus the online teaching environment.
Practical implications – In the higher education environment, where students are increasingly being asked to access learning in the online space, it is vital for teachers to be in a position to monitor and guide students in their engagement with online materials.
Originality/value – The mere presence of learning materials online is not sufficient evidence of engagement. This paper offers options for testing specific attention to online materials, allowing greater assurance around engagement with relevant and effective online learning activities.
Keywords Online learning, Student engagement, Monitoring, Analytics, Stickiness, Time to read
Paper type Conceptual paper
Participation versus engagement metrics
In order for a student to gain a measurable benefit from a learning experience, they need to be present within the learning environment. In the classroom, their attendance may be recorded, they may be required to prepare readings or other activities in advance of the class, they may be called upon or volunteer to offer their ideas, or they may, through their physical interactions with the lesson, show their interest in the proceedings. While presence in the learning environment is a vital factor in the learning experience, scholarship has shown that impressionistic grading around class attendance tells an incomplete story (Bean and Peterson, 1998). Bean and Peterson (1998) cited concerns that academics use participation marking as a "fudge factor" in computing final course grades, and contended that this phenomenon helps explain why assessment and measurement scholars almost universally advise against grading class participation. They concluded that participation grades, especially if impressionistic, are hard to justify if challenged (Bean and Peterson, 1998). They, however,
Received 19 March 2018; Revised 4 May 2018; Accepted 17 May 2018
Information and Learning Science, Vol. 119 No. 7/8, 2018, pp. 460-468
© Emerald Publishing Limited
ISSN 2398-5348
DOI 10.1108/ILS-03-2018-0014