Published:
2013-11-10
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, Volume 1
Issue:
Vol. 1 (2013): First AAAI Conference on Human Computation and Crowdsourcing
Track:
Works in Progress
Abstract:
Search engines can yield poor results for information retrieval tasks when they cannot interpret query predicates. Such predicates are better left for humans to evaluate. We propose an adaptive processing framework for deciding (a) which parts of a query should be processed by machines and (b) the order in which the crowd should process the remaining parts, optimizing for result quality and processing cost. We describe an algorithm and experimental results for the first framework component.
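The two framework decisions in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm; the function names, the machine-interpretability test, and the selectivity/cost estimates below are all hypothetical, and the ordering step uses a generic greedy selectivity-per-cost heuristic for illustration.

```python
# Hypothetical sketch of the two decisions from the abstract:
# (a) partition query predicates into machine- vs. human-evaluable,
# (b) order the human predicates for crowd processing.

def partition_predicates(predicates, machine_interpretable):
    """Split predicates into those a search engine can evaluate
    and those better left to the crowd."""
    machine = [p for p in predicates if machine_interpretable(p)]
    human = [p for p in predicates if not machine_interpretable(p)]
    return machine, human

def order_for_crowd(human_predicates, selectivity, cost):
    """Order crowd predicates greedily by selectivity per unit cost,
    so cheap, highly filtering predicates run first (an illustrative
    heuristic, not the paper's method)."""
    return sorted(human_predicates, key=lambda p: selectivity(p) / cost(p))

# Toy example: a structured predicate goes to the machine,
# subjective ones go to the crowd (all estimates are made up).
preds = ["price < 100", "photo shows a red car", "review sounds sarcastic"]
machine, human = partition_predicates(preds, lambda p: "<" in p)
ordered = order_for_crowd(
    human,
    selectivity=lambda p: {"photo shows a red car": 0.2,
                           "review sounds sarcastic": 0.6}[p],
    cost=lambda p: {"photo shows a red car": 1.0,
                    "review sounds sarcastic": 2.0}[p])
```

Here the "red car" predicate is asked first because it filters more results per unit of crowd cost (0.2/1.0 beats 0.6/2.0).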
DOI:
10.1609/hcomp.v1i1.13131
HCOMP
ISBN 978-1-57735-607-3