Published:
2017-10-27
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, Volume 5
Issue:
Vol. 5 (2017): Fifth AAAI Conference on Human Computation and Crowdsourcing
Track:
Full Papers
Abstract:
Literature reviews allow scientists to stand on the shoulders of giants, highlighting promising directions, summarizing progress, and pointing out open challenges in research. At the same time, conducting a systematic literature review is a laborious and consequently expensive process. In the last decade, there have been several studies on crowdsourcing in literature reviews. This paper explores the feasibility of crowdsourcing for facilitating the literature review process in terms of results, time, and effort, and identifies which crowdsourcing strategies provide the best results given the available budget. In particular, we focus on the screening phase of the literature review process, and we contribute and assess strategies for running crowdsourcing tasks that are efficient in terms of budget and classification error. Finally, we present our findings based on experiments run on CrowdFlower.
DOI:
10.1609/hcomp.v5i1.13302
ISBN:
978-1-57735-793-3