Published:
2014-11-05
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 2
Issue:
Vol. 2 (2014): Second AAAI Conference on Human Computation and Crowdsourcing
Track:
Research Papers
Abstract:
In online labor platforms such as Amazon Mechanical Turk, a good strategy for obtaining quality answers is to aggregate the answers submitted by multiple workers, exploiting the wisdom of crowds. However, human computation is susceptible to systematic biases that cannot be corrected by using multiple workers. We investigate a game-theoretic bonus scheme, called the peer truth serum (PTS), to overcome this problem. We report on the design and outcomes of a set of experiments to validate this scheme. The results show that the peer truth serum can indeed correct the biases and increase answer accuracy by up to 80%.
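The abstract does not spell out the payment rule, so the following is only a minimal illustrative sketch of one common formulation of a peer-truth-serum-style bonus: a worker is rewarded when a randomly chosen peer submits the same answer, scaled inversely by the public prior probability of that answer. The function name, prior values, and worker answers below are hypothetical and not taken from the paper.

```python
# Sketch of a peer-consistency bonus in the spirit of PTS (assumed formulation,
# not the paper's exact experimental parameters).
import random

def pts_bonus(answer, peer_answer, prior, scale=1.0):
    """Bonus for `answer` given one peer's answer and a public prior
    distribution over answers (dict: answer -> probability)."""
    if answer != peer_answer:
        return 0.0
    # Matching a rare (low-prior) answer earns a larger bonus, which
    # counteracts herding on the popular answer.
    return scale / prior[answer]

# Hypothetical example: three workers label one item from {"A", "B"}.
prior = {"A": 0.7, "B": 0.3}
answers = {"w1": "A", "w2": "B", "w3": "B"}

for worker, ans in answers.items():
    peer = random.choice([w for w in answers if w != worker])
    print(worker, "bonus:", pts_bonus(ans, answers[peer], prior))
```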
DOI:
10.1609/hcomp.v2i1.13145
HCOMP
ISBN 978-1-57735-682-0