Digital games are a viable alternative for accomplishing crowdsourcing tasks that would traditionally require paid online labor. This study compares the quality of crowdsourcing with games and paid crowdsourcing for simple and complex annotation tasks in a controlled experiment. While no difference in quality was found for the simple task, players achieved substantially higher response quality than paid contributors on the complex task (92% vs. 78% average accuracy). Results suggest that crowdsourcing with games provides similar and potentially even higher response quality relative to paid crowdsourcing.
Published Date: 2015-11-12
Registration: ISBN 978-1-57735-740-7