Published:
2013-11-10
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, Volume 1
Issue:
Vol. 1 (2013): First AAAI Conference on Human Computation and Crowdsourcing
Track:
Works in Progress
Abstract:
One common problem plaguing crowdsourcing tasks is tuning the set of worker responses: depending on task requirements, requesters may want a large set of rich and varied worker responses (typically in subjective evaluation tasks) or a more convergent response set (typically in more objective tasks such as fact-checking). This problem is especially salient in tasks that combine workers’ responses into a single output: divergence in these settings could add either richness and complexity to the unified answer, or noise. In this paper we present HiveMind, a system of methods that allows requesters to tune the level of convergence in worker participation for different tasks simply by adjusting the value of a single variable.
DOI:
10.1609/hcomp.v1i1.13130
ISBN 978-1-57735-607-3