Published:
2013-11-10
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 1
Issue:
Vol. 1 (2013): First AAAI Conference on Human Computation and Crowdsourcing
Track:
Works in Progress
Abstract:
Allocating tasks to workers so as to get the greatest amount of high-quality output with as few resources as possible is an overarching theme in crowdsourcing research. Among the factors that complicate this problem are the lack of information about the available workers' skills and the unknown difficulty of the tasks to be solved. Moreover, if a crowdsourcing platform customer is limited to a fixed-size worker pool for completing a large batch of jobs, such as identifying a particular object in a collection of images or comparing the quality of many pairs of artifacts in crowdsourcing workflows, she inevitably faces a tradeoff between getting a few of these tasks done well and getting many done poorly. In this paper, we propose a framework called JOCR (Joint Crowdsourcing, pronounced "Joker") for analyzing joint allocations of many tasks to a pool of workers. JOCR encompasses a broad class of common crowdsourcing scenarios, and we pose the challenge of developing efficient algorithms for it.
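The abstract does not specify JOCR's model, so the following Python sketch is a toy illustration only of the quality-versus-coverage tradeoff it describes: worker skills and task difficulties are treated here as known point estimates (in the paper's setting they are unknown), a Rasch-style logistic model gives each worker's chance of completing a task correctly, and a greedy heuristic assigns a per-worker budget of tasks to maximize the expected number of correct answers. The names (p_correct, greedy_allocate) and all modeling choices are hypothetical, not taken from the paper.

# Toy sketch of joint task allocation under a fixed worker pool.
# NOT the paper's JOCR formulation; skills/difficulties are assumed
# known here purely for illustration.
import math
import itertools

def p_correct(skill, difficulty):
    # Rasch-style logistic response model: the higher a worker's skill
    # relative to a task's difficulty, the likelier a correct answer.
    return 1.0 / (1.0 + math.exp(-(skill - difficulty)))

def greedy_allocate(skills, difficulties, per_worker_budget):
    # Rank all worker-task pairs by success probability, then assign
    # each task to the best available worker, respecting each worker's
    # budget. Returns {worker index: [task indices]}.
    pairs = sorted(
        ((p_correct(s, d), w, t)
         for (w, s), (t, d) in itertools.product(enumerate(skills),
                                                 enumerate(difficulties))),
        reverse=True)
    allocation = {w: [] for w in range(len(skills))}
    assigned = set()
    for p, w, t in pairs:
        if t in assigned or len(allocation[w]) >= per_worker_budget:
            continue
        allocation[w].append(t)
        assigned.add(t)
    return allocation

if __name__ == "__main__":
    skills = [1.5, 0.0]                   # one strong worker, one weak
    difficulties = [-1.0, 0.5, 2.0, 2.5]  # four tasks, easy to hard
    print(greedy_allocate(skills, difficulties, per_worker_budget=2))
    # -> {0: [0, 1], 1: [2, 3]}

In this run the greedy rule gives the easy tasks to the strong worker and leaves the hard tasks to the weak one: expected total quality is maximized, but the hardest tasks are almost certainly done poorly, which is exactly the few-done-well versus many-done-poorly tension the abstract highlights.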
DOI:
10.1609/hcomp.v1i1.13115
ISBN:
978-1-57735-607-3