Published:
2015-11-12
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 3
Volume/Issue:
Vol. 3 (2015): Third AAAI Conference on Human Computation and Crowdsourcing
Track:
Crowdsourcing Breakthroughs for Language Technology Applications Workshop
Abstract:
This paper examines the importance of presenting simple, intuitive tasks when conducting microtasking on crowdsourcing platforms. Most crowdsourcing platforms allow the creator of a task to present instructions of any length to the crowd workers who participate in it. Our experiments show, however, that most workers who participate in crowdsourcing microtasks do not read the instructions, even when they are very brief. To facilitate success in microtask design, we highlight the importance of making simple, easy-to-grasp tasks that do not rely on instructions for explanation.
DOI:
10.1609/hcomp.v3i1.13265
HCOMP
ISBN 978-1-57735-740-7