Published:
2015-11-12
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, Volume 3
Issue:
Vol. 3 (2015): Third AAAI Conference on Human Computation and Crowdsourcing
Track:
Works in Progress
Abstract:
Current crowdsourcing platforms such as Amazon Mechanical Turk provide an attractive solution for processing high-volume tasks at low cost. However, quality control remains a major concern. We developed a private crowdsourcing system (PCSS) running on an intranet, which allows us to devise quality control methods. In the present work, we designed a novel task allocation method to improve the accuracy of task results in PCSS. PCSS analyzes relations between tasks from workers' behavior using a Bayesian network, and then creates learning tasks according to the analyzed relations. PCSS improves the quality of task results by allocating learning tasks to workers before they process difficult tasks. In our evaluation, PCSS automatically created 8 learning tasks for 2 target task categories and increased the accuracy of task results by 10.77 points on average. We found that creating learning tasks according to the analyzed relations is a practical method for improving worker quality.
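As a rough illustration of the kind of analysis the abstract describes (not the authors' implementation), the sketch below estimates relations between task categories from workers' response histories: for each ordered pair of categories, it computes the conditional probability that a worker answers the target category correctly given a correct answer on the source category. Such estimates could guide which learning tasks to allocate before difficult tasks. The data format, category names, and function names here are assumptions for illustration only.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical worker histories: worker_id -> {task_category: answered_correctly}
# (this schema is an assumption, not the paper's actual data format)
histories = {
    "w1": {"A": True,  "B": True,  "C": False},
    "w2": {"A": True,  "B": False, "C": False},
    "w3": {"A": False, "B": False, "C": False},
    "w4": {"A": True,  "B": True,  "C": True},
}

def pairwise_relations(histories):
    """Estimate P(correct on target | correct on source) for each ordered
    pair of categories, as a crude stand-in for a Bayesian-network analysis."""
    both_correct = defaultdict(int)    # (source, target) -> source and target both correct
    source_correct = defaultdict(int)  # (source, target) -> source correct (worker did both)
    for answers in histories.values():
        for a, b in combinations(sorted(answers), 2):
            for src, tgt in ((a, b), (b, a)):
                if answers[src]:
                    source_correct[(src, tgt)] += 1
                    if answers[tgt]:
                        both_correct[(src, tgt)] += 1
    return {pair: both_correct[pair] / n for pair, n in source_correct.items() if n}

if __name__ == "__main__":
    for (src, tgt), p in sorted(pairwise_relations(histories).items()):
        print(f"P(correct on {tgt} | correct on {src}) = {p:.2f}")
```

Pairs where this conditional probability is high suggest that mastering the source category helps with the target category, so tasks from the source category are natural candidates for learning tasks under this simplified view.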
DOI:
10.1609/hcomp.v3i1.13246
HCOMP
ISBN 978-1-57735-740-7