Reliability and validity test of a Scoring Rubric for Information Literacy

Published: 13 March 2017
Pages: 305-316
DOI: https://doi.org/10.1108/JD-05-2016-0066
Authors: Jos van Helvoort, Saskia Brand-Gruwel, Frank Huysmans, Ellen Sjoer
Subject matter: Library & information science, Records management & preservation, Document management, Classification & cataloguing, Information behaviour & retrieval, Collection building & management, Scholarly communications/publishing, Information & knowledge management, Information management & governance, Information management, Information & communications technology, Internet
Reliability and validity test of a Scoring Rubric for Information Literacy
Jos van Helvoort
The Hague University of Applied Sciences, The Hague, The Netherlands
Saskia Brand-Gruwel
Open Universiteit of The Netherlands, Heerlen, The Netherlands
Frank Huysmans
University of Amsterdam, Amsterdam, The Netherlands, and
Ellen Sjoer
The Hague University of Applied Sciences, The Hague, The Netherlands
Abstract
Purpose – The purpose of this paper is to measure the reliability and validity of the Scoring Rubric for Information Literacy (Van Helvoort, 2010).
Design/methodology/approach – Percentages of agreement and intraclass correlation were used to describe interrater reliability. Construct validity was determined with factor analysis and reliability analysis. Criterion validity was calculated with Pearson correlations.
Findings – In the described case, the Scoring Rubric for Information Literacy appears to be a reliable and valid instrument for the assessment of information literate performance.
Originality/value – Reliability and validity are prerequisites for recommending a rubric for application. The results confirm that this Scoring Rubric for Information Literacy can be used in courses in higher education, not only for assessment purposes but also to foster learning.
Keywords – Reliability, Higher education, Information literacy, Validity, Scoring rubrics, Student performance measurement
Paper type – Research paper
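As a sketch of the interrater statistics named in the methodology, the percentage of agreement and an intraclass correlation coefficient can be computed as follows. The choice of the ICC(2,1) form (two-way random effects, absolute agreement, single rater) is an assumption made here for illustration, since this excerpt does not specify which ICC variant the authors used; the function names are likewise illustrative.

```python
import numpy as np

def percent_agreement(rater1, rater2):
    """Proportion of items on which two raters assign the identical score."""
    rater1, rater2 = np.asarray(rater1), np.asarray(rater2)
    return float(np.mean(rater1 == rater2))

def icc2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    scores: n_subjects x k_raters matrix of ratings.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-rater means
    # Partition the total sum of squares into subject, rater and residual parts.
    ss_total = ((scores - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)               # mean square, subjects
    msc = ss_cols / (k - 1)               # mean square, raters
    mse = ss_err / ((n - 1) * (k - 1))    # mean square, residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical example: four student products scored by two raters.
ratings = np.array([[4, 4], [3, 3], [5, 5], [2, 2]])
print(percent_agreement(ratings[:, 0], ratings[:, 1]))  # → 1.0
print(icc2_1(ratings))  # → 1.0 (perfect agreement)
```

In practice such coefficients would typically be obtained from a statistics package (e.g. SPSS or the Python `pingouin` library); the explicit formula above only serves to make the computation transparent.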
Introduction
A scoring rubric is a grading tool that is often used for the rating of authentic student work.
Jonsson and Svingby (2007) define it as criteria for rating important dimensions of
performance, as well as standards of attainment for those criteria. Angell (2015) and
Carbery and Leahy (2015) remark that rubrics are also popular in library and information
science literature and that they are often mentioned in the context of assessing
student assignments.
In the context of measuring information literacy skills, rubrics have the benefit of supporting the assessment of students' real performance on information problem-solving tasks, whereas other popular evaluation methods, such as multiple choice tests, are more appropriate for measuring knowledge and understanding (Cameron et al., 2007). Rubrics follow the trend in higher education towards authentic assessment which, as Knight (2006) remarked, is a process that measures how students apply their knowledge to real-life tasks. They are also supposed to combat subjectivity and unfairness during the grading process (Bresciani et al., 2009). Other benefits of scoring rubrics are their suitability for providing detailed feedback, their ability to inform students about their instructors' expectations, and their usefulness for peer- and self-assessment (Oakleaf, 2008, 2009; Reddy and Andrade, 2010; Belanger et al., 2015).
Keeping in mind the importance of information problem solving in today's higher education (Brand-Gruwel et al., 2005), Van Helvoort (2010) developed a Scoring Rubric for Information Literacy.
Journal of Documentation, Vol. 73 No. 2, 2017, pp. 305-316
© Emerald Publishing Limited
ISSN 0022-0418
DOI 10.1108/JD-05-2016-0066
Received 14 June 2016; Revised 29 September 2016; Accepted 2 October 2016