Second AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2014)

Using Worker Quality Scores to Improve Stopping Rules
Ittai Abraham, Omar Alonso, Vasilis Kandylas, Rajesh Patel, Steven Shelford, Alex Slivkins

Abstract

We consider the crowdsourcing task of learning the answers to simple multiple-choice microtasks. To obtain statistically significant results, one often needs to ask multiple workers to answer the same microtask. A stopping rule is an algorithm that, given a microtask and the set of worker answers collected so far, decides whether the system should stop and output an answer or ask one more worker. A quality score for a worker reflects that worker’s historic performance. In this paper we investigate how to devise better stopping rules given such quality scores. We conduct a data analysis on a large-scale industrial crowdsourcing platform and use the observations from this analysis to design new stopping rules that use workers’ quality scores in a non-trivial manner. We then run a simulation based on a real-world workload, showing that our algorithm outperforms more naive approaches.
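
The abstract does not spell out the authors' algorithm; as a rough illustration only, the hypothetical Python sketch below shows one way a stopping rule can use quality scores: weight each answer by the answering worker's quality score and stop once the leading option holds a large enough share of the total weight, or once the worker budget is exhausted. All names here (should_stop, quality, confidence, max_workers) are our own assumptions, not the paper's method.

    from collections import defaultdict

    def should_stop(answers, quality, confidence=0.9, max_workers=10):
        """Hypothetical quality-weighted stopping rule (not the paper's algorithm).

        answers  -- list of (worker_id, option) pairs collected so far
        quality  -- dict mapping worker_id to a quality score in (0, 1]
        Returns (stop, best_option).
        """
        # Sum each option's votes, weighted by the answering worker's quality score.
        weight = defaultdict(float)
        for worker_id, option in answers:
            weight[option] += quality.get(worker_id, 0.5)  # neutral default for unknown workers

        if not weight:
            return False, None  # no answers yet: ask a worker

        total = sum(weight.values())
        best_option, best_weight = max(weight.items(), key=lambda kv: kv[1])

        # Stop if the leading option dominates, or if the budget is spent.
        if best_weight / total >= confidence or len(answers) >= max_workers:
            return True, best_option
        return False, best_option

    # Example: two historically reliable workers outvote one unreliable worker.
    answers = [("w1", "A"), ("w2", "A"), ("w3", "B")]
    quality = {"w1": 0.95, "w2": 0.9, "w3": 0.4}
    stop, winner = should_stop(answers, quality, confidence=0.8)
    # "A" holds ~82% of the total weight, so stop=True and winner="A".

A plain majority vote is the special case in which every worker has the same score; weighting by quality lets a few answers from reliable workers end a microtask early, which is one reason quality scores can reduce the number of workers asked per task.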

Keywords

stopping rules, crowdsourcing, quality control
