Proceedings:
Vol. 10 No. 1 (2016): Tenth International AAAI Conference on Web and Social Media
Track:
Full Papers
Abstract:
Recommender systems face several challenges, e.g., recommending novel and diverse items and generating helpful explanations. Where algorithms struggle, people may excel. We therefore designed CrowdLens to explore different workflows for incorporating people into the recommendation process. We ran an online experiment and found that: compared to a state-of-the-art algorithm, crowdsourcing workflows produced more diverse and novel recommendations favored by human judges; some crowdworkers produced high-quality explanations for their recommendations, and we created an accurate model for identifying high-quality explanations; volunteers from an online community generally performed better than paid crowdworkers, but appropriate algorithmic support erased this gap. We conclude by reflecting on lessons of our work for those considering a crowdsourcing approach and identifying several fundamental issues for future work.
DOI:
10.1609/icwsm.v10i1.14743