Published:
2013-11-10
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, Volume 1
Issue:
Vol. 1 (2013): First AAAI Conference on Human Computation and Crowdsourcing
Track:
Full Papers
Abstract:
Paid and volunteer crowd work have emerged as a means for harnessing human intelligence for performing diverse tasks. However, little is known about the relative performance of volunteer versus paid crowd work, and how financial incentives influence the quality and efficiency of output. We study the performance of volunteers as well as workers paid with different monetary schemes on a difficult real-world crowdsourcing task. We observe that performance by unpaid and paid workers can be compared in carefully designed tasks, that financial incentives can be used to trade quality for speed, and that the compensation system on Amazon Mechanical Turk creates particular indirect incentives for workers. Our methodology and results have implications for the ideal choice of financial incentives and motivate further study of how monetary incentives influence worker behavior in crowdsourcing.
DOI:
10.1609/hcomp.v1i1.13075
ISBN:
978-1-57735-607-3