Published:
2014-11-05
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 2
Volume/Issue:
Vol. 2 (2014): Second AAAI Conference on Human Computation and Crowdsourcing
Track:
Research Papers
Abstract:
With a large number of tasks of various types, requesters on crowdsourcing platforms often bundle tasks of different types into a single working session. This creates a task switching setting, in which workers must shift between different cognitive tasks. We design and conduct an experiment on Amazon Mechanical Turk to study how occasionally presented performance-contingent monetary rewards, referred to as monetary interventions, affect worker performance in the task switching setting. We use two competing metrics to evaluate worker performance. When monetary interventions are placed on some tasks in a working session, our results show that worker performance on those tasks improves on both metrics. Moreover, worker performance on the other tasks, where no monetary interventions are placed, is also affected: workers perform better according to one metric but worse according to the other. This suggests that in addition to providing extrinsic monetary incentives for some tasks, monetary interventions implicitly set performance goals for all tasks. Furthermore, monetary interventions are most effective in improving worker performance when used at switch tasks (tasks that follow a task of a different type) in working sessions with a low task switching frequency.
DOI:
10.1609/hcomp.v2i1.13160
ISBN 978-1-57735-682-0