Characterizing peer-judged answer quality on academic Q&A sites: a cross-disciplinary case study on ResearchGate

Lei Li, Department of Information Management, Nanjing University of Science and Technology, Nanjing, China
Daqing He, School of Information Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
Chengzhi Zhang, Department of Information Management, Nanjing University of Science and Technology, Nanjing, China
Li Geng, New York City College of Technology, City University of New York, New York, USA
Ke Zhang, School of Information Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
Abstract
Purpose – Academic social question and answer (Q&A) sites are now utilised by millions of scholars and researchers to seek and share discipline-specific information. However, little is known about the factors that affect peers' votes on the quality of an answer, nor about how the discipline might influence these factors. The paper aims to discuss this issue.
Design/methodology/approach – Using 1,021 answers collected across three disciplines (library and information services, history of art, and astrophysics) on ResearchGate, statistical analysis was performed to identify the characteristics of high-quality academic answers, and comparisons were made across the three disciplines. In particular, two major categories of characteristics, those of the answer provider and those of the answer content, were extracted and examined.
Findings – The results reveal that high-quality answers on academic social Q&A sites tend to possess two characteristics: first, they are provided by scholars with higher academic reputations (e.g. more followers); and second, they offer objective information (e.g. longer answers containing fewer subjective opinions). However, the impact of these factors varies across disciplines; for example, objectivity is more favoured in astrophysics than in the other disciplines.
Originality/value – The study is envisioned to help academic Q&A sites select and recommend high-quality answers across different disciplines, especially in the cold-start scenario where an answer has not yet received enough judgements from peers.
Keywords Social media, Academic social networking, ResearchGate, Academic social Q&A,
Answer quality, Peer judgment
Paper type Research paper
Aslib Journal of Information Management
Vol. 70 No. 3, 2018, pp. 269-287
© Emerald Publishing Limited
ISSN 2050-3806
DOI 10.1108/AJIM-11-2017-0246
Received 2 November 2017
Revised 20 April 2018
Accepted 8 May 2018
The authors gratefully acknowledge Wei Jeng for helpful suggestions. This work is supported by the Major Projects of the National Social Science Fund (No. 16ZAD224), the Fujian Provincial Key Laboratory of Information Processing and Intelligent Control (Minjiang University) (No. MJUKF201704) and the Qing Lan Project.
1. Introduction
In recent years, academic social networking sites (ASNSs), such as Academia.edu and ResearchGate, have been gaining popularity among scholars. ASNSs enable scholars to communicate with one another online, mainly to cooperate and to acquire relevant academic information (Nández and Borrego, 2013; Jordan, 2014; Jeng et al., 2015). One of their most important features is a social question and answer (Q&A) platform on which scholars can raise and answer academic questions (Jeng et al., 2017). As the examples in Figure 1 show, users can post questions related to specific disciplines or topics, such as information reliability in information science or shadow patterns in history of art; they can also answer others' questions based on their knowledge and expertise.
As on general social Q&A sites, such as Yahoo! Answers and Answers.com, the quality of user-generated content is a critical issue on ASNSs (Harper et al., 2008; Kim and Oh, 2009). A question often receives many answers from different users, and selecting the high-quality answers among these responses and recommending them to users with the same questions is a long-standing problem for social Q&A platforms. On most such platforms, the users' ratings of each answer serve as a clear indication of answer quality. Numerous studies have therefore examined answer quality on general social Q&A sites in terms of satisfying the information seeker (Harper et al., 2008; Liu et al., 2008; Shah and Pomerantz, 2010; Fichman, 2011; John et al., 2011). These studies confirm that certain features related to answer content and answerer reputation can effectively identify high-quality answers. More recently, researchers have also begun to explore the similarities and differences between the characteristics of high-quality answers on different topics (Fu et al., 2015; Le et al., 2016). However, there has been little examination of the characteristics of peer-judged high-quality answers on academic social Q&A sites, or of how these characteristics compare across disciplines.
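To make the two feature categories discussed above concrete, the following minimal Python sketch illustrates how answer-level features of this kind (answerer reputation proxies and answer content measures) might be computed. It is an illustration under stated assumptions, not the authors' actual pipeline: the Answer structure, the feature names and the tiny subjectivity lexicon are all hypothetical, and a real study would use a proper subjectivity resource rather than a hand-picked word list.

    # Minimal sketch of answer-feature extraction of the kind examined in
    # studies of answer quality. All names here are hypothetical, not the
    # authors' actual pipeline.
    from dataclasses import dataclass

    # Toy lexicon of opinion-bearing words used as a naive subjectivity
    # proxy; a real study would use a proper subjectivity resource.
    SUBJECTIVE_WORDS = {"think", "believe", "feel", "probably", "maybe", "seems"}

    @dataclass
    class Answer:
        text: str                 # answer content
        answerer_followers: int   # reputation proxy: answerer's followers
        upvotes: int              # peer judgement used as the quality signal

    def content_features(answer: Answer) -> dict:
        """Content-side features: length and a crude subjectivity ratio."""
        tokens = answer.text.lower().split()
        subjective = sum(1 for t in tokens if t in SUBJECTIVE_WORDS)
        return {
            "length": len(tokens),
            "subjectivity_ratio": subjective / len(tokens) if tokens else 0.0,
        }

    def provider_features(answer: Answer) -> dict:
        """Provider-side features: here just the follower count."""
        return {"followers": answer.answerer_followers}

    if __name__ == "__main__":
        a = Answer("I think the archive is probably reliable.", 120, 5)
        print({**content_features(a), **provider_features(a)})

Features of this sort could then be related to peer ratings (e.g. upvotes) by standard statistical analysis, which is the general approach this study takes.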
Figure 1. A question/answer interface on ResearchGate
