Published:
2021-11-14
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 9
Issue:
Vol. 9 (2021): Proceedings of the Ninth AAAI Conference on Human Computation and Crowdsourcing
Track:
Full Archival Papers
Abstract:
Recent research has demonstrated that cognitive biases such as the confirmation bias or the anchoring effect can negatively affect the quality of crowdsourced data. In practice, however, such biases go unnoticed unless specifically assessed or controlled for. Task requesters need to ensure that task workflow and design choices do not trigger workers’ cognitive biases. Moreover, to facilitate the reuse of crowdsourced data collections, practitioners can benefit from understanding whether and which cognitive biases may be associated with the data. To this end, we propose a 12-item checklist adapted from business psychology to combat cognitive biases in crowdsourcing. We demonstrate the practical application of this checklist in a case study on viewpoint annotations for search results. Through a retrospective analysis of relevant crowdsourcing research that has been published at HCOMP in 2018, 2019, and 2020, we show that cognitive biases may often affect crowd workers but are typically not considered as potential sources of poor data quality. The checklist we propose is a practical tool that requesters can use to improve their task designs and appropriately describe potential limitations of collected data. It contributes to a body of efforts towards making human-labeled data more reliable and reusable.
DOI:
10.1609/hcomp.v9i1.18939
ISBN 978-1-57735-872-5