DOCUMENTATION NOTES. THE EFFECTIVENESS OF NON-USER RELEVANCE ASSESSMENTS

Pages: 146-151
Published: 01 February 1967
DOI: https://doi.org/10.1108/eb026427
Subject matter: Information & knowledge management; Library & information science
DOCUMENTATION NOTES
THE EFFECTIVENESS OF NON-USER RELEVANCE ASSESSMENTS
Purpose
In many information systems, intervening relevance assessments are an accepted operational practice; i.e. documents identified by the system as responses to a question are assessed as either relevant or non-relevant,[1, 2] and only those judged relevant are forwarded to the user (questioner). It is commonly assumed that subject experts can perform such intervening relevance assessments with a high degree of accuracy. However, much of the literature on relevance indicates the relevance-assessment process to be highly subjective,[3] and hence a task that should be performed only by the user.

This experiment was undertaken to test the performance of a non-user subject specialist in assessing the relevance of responses to questions posed to a pilot Educational Media Research Information Center.[4] An additional non-user, possessing considerably less subject expertise than the subject expert, was included for purposes of comparison.
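The intervening-assessment workflow described above amounts to a filter between the retrieval system and the questioner. A minimal sketch, with purely illustrative names (the original system is not specified at this level of detail):

```python
# Sketch of an intervening relevance assessment: an assessor other than
# the questioner screens system output, and only responses judged
# relevant are forwarded to the user. Names are illustrative.

def forward_relevant(responses, assess):
    """Return only the responses the intervening assessor judges relevant.

    responses: list of retrieved items (e.g. abstracts)
    assess: callable mapping a response to True (relevant) or False
    """
    return [r for r in responses if assess(r)]

# Example with a toy assessor that looks for a topic term.
hits = ["study of film in classroom teaching", "survey of library budgets"]
print(forward_relevant(hits, lambda r: "teaching" in r))
# ['study of film in classroom teaching']
```

The experiment's question is precisely how much the output of such a filter changes when `assess` is the user, a subject expert, or a system specialist.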
Procedures
Thirty-seven questions (all questions for which complete data were available) submitted by members of a pilot user group of educational researchers were searched over a file of approximately 4,500 documents. The questions were based on research projects being conducted by the pilot users. Search outputs for the questions varied widely, and because the investigator did not wish to overburden the evaluators, the following procedures were established. Search outputs of fifty or fewer responses were evaluated in toto. Where the number of responses exceeded fifty for any question, a random sample of fifty responses was obtained for evaluation.
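The sampling rule above can be sketched directly; the cap constant and function name are illustrative, not from the original study:

```python
import random

# Sketch of the evaluation-sampling procedure: search outputs of fifty
# or fewer responses are evaluated in toto; larger outputs are reduced
# to a random sample of fifty responses.
SAMPLE_CAP = 50

def responses_to_evaluate(search_output, rng=random):
    """Return the subset of one question's search output sent to evaluators."""
    if len(search_output) <= SAMPLE_CAP:
        return list(search_output)                 # evaluated in toto
    return rng.sample(search_output, SAMPLE_CAP)   # random sample of fifty
```

This keeps every evaluator's workload per question at fifty responses or fewer while preserving small outputs in full.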
Each set of responses was given three evaluations: by the user, by an expert, and by a system specialist, where these are defined as follows:

User: the individual posing the question.
Expert: a non-user subject specialist considered to be a peer of the user.
System specialist: a non-user, non-subject specialist possessing considerable system experience but only moderate subject knowledge.
User, expert, and system specialist were asked to indicate for each response (an abstract of 200-250 words) whether the response was relevant or non-relevant. A relevant response is defined as one which, on the basis of the information provided, describes research which appears to be directly
