Published:
2015-11-12
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 3
Issue:
Vol. 3 (2015): Third AAAI Conference on Human Computation and Crowdsourcing
Track:
Full Papers
Abstract:
Digital games are a viable alternative for accomplishing crowdsourcing tasks that would traditionally require paid online labor. This study compares the quality of crowdsourcing with games against paid crowdsourcing for simple and complex annotation tasks in a controlled experiment. While no quality difference was found for the simple task, paid contributors' response quality was substantially lower than players' quality for the complex task (78% vs. 92% average accuracy). The results suggest that crowdsourcing with games provides similar, and potentially even higher, response quality relative to paid crowdsourcing.
DOI:
10.1609/hcomp.v3i1.13226
ISBN 978-1-57735-740-7