Published: 2017-10-27
Proceedings: Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 5
Issue: Vol. 5 (2017): Fifth AAAI Conference on Human Computation and Crowdsourcing
Track: Full Papers
Abstract:
Crowdsourcing through human-computing games is an increasingly popular practice for classifying and analyzing scientific data. Early contributions such as Phylo have now been running for several years. Analyzing the performance of these systems enables us to identify the patterns that contributed to their success, as well as possible pitfalls. In this paper, we review the results and user statistics collected since 2010 by our platform Phylo, which aims to engage citizens in comparative genome analysis through a casual tile matching computer game. We also identify features that allow us to predict the difficulty of a task, which is essential for channeling tasks to human players with the appropriate skill level. Finally, we show how our platform has been used to quickly improve a reference alignment of Ebola virus sequences.
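For readers unfamiliar with how a tile matching layout can encode alignment quality, the following is a minimal sketch of a sum-of-pairs column score of the kind commonly used to evaluate multiple sequence alignments. The point values and function names here are illustrative assumptions only; they are not the scoring scheme actually used by Phylo or described in the paper.

```python
# Minimal sketch of a sum-of-pairs score for a gapped multiple sequence
# alignment, as a rough analogue of scoring a tile matching layout.
# The point values below are illustrative assumptions, NOT Phylo's scoring.

from itertools import combinations

MATCH, MISMATCH, GAP = 1, -1, -2  # hypothetical point values


def column_score(column):
    """Score one alignment column (one character per aligned sequence)."""
    score = 0
    for a, b in combinations(column, 2):
        if a == '-' or b == '-':
            # penalize a residue paired with a gap; two paired gaps score nothing
            score += GAP if a != b else 0
        elif a == b:
            score += MATCH
        else:
            score += MISMATCH
    return score


def alignment_score(rows):
    """Score an alignment given as equal-length gapped sequences."""
    assert len(set(map(len, rows))) == 1, "rows must have equal length"
    return sum(column_score(col) for col in zip(*rows))


if __name__ == "__main__":
    # Two toy gapped DNA sequences; a player sliding tiles (nucleotides)
    # left or right would try to raise this score.
    print(alignment_score(["ACGT-A", "AC-TTA"]))
```

In a game setting, a score of this general form gives players immediate feedback as they rearrange tiles, which is what makes the alignment task amenable to casual play.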
DOI: 10.1609/hcomp.v5i1.13309
ISBN: 978-1-57735-793-3