Although crowd work is typically completed on desktop or laptop computers by workers in their homes, the literature has shown that crowdsourcing is feasible through a wide array of computing devices, including smartphones and digital voice assistants. An integrated crowdsourcing platform that operates across multiple devices could offer workers greater flexibility, but little is known about crowd workers’ perceptions of taking up crowd tasks across multiple contexts through such devices. Using a crowdsourcing survey task, we investigate workers’ willingness to accept different types of crowd tasks presented on three device types under scenarios of varying location, time, and social context. Through an analysis of over 25,000 responses from 329 crowd workers on Amazon Mechanical Turk, we show that when tasks are presented on different devices, the task acceptance rate is 80.5% on personal computers, 77.3% on smartphones, and 70.7% on digital voice assistants. Our results also show how contextual factors such as location, social context, and time influence workers’ decisions to accept a task on a given device. Our findings provide important insights towards the development of effective task assignment mechanisms for cross-device crowd platforms.
Published Date: 2020-10-09
Registration: ISBN 978-1-57735-848-0