AAAI Publications, Third AAAI Conference on Human Computation and Crowdsourcing

It’s Not Just What You Say, But How You Say It: Multimodal Sentiment Analysis via Crowdsourcing
Ahmad Khamis Elshenawy, Steele Carter, Daniela Braga

Last modified: 2016-03-28

Abstract


This paper examines the effect of various modalities of expression on the reliability of crowdsourced sentiment polarity judgments. A novel corpus of YouTube video reviews was created, and sentiment judgments were obtained via Amazon Mechanical Turk. We created a system for isolating the text, video, and audio modalities of YouTube videos so that annotators could see only the particular modality or modalities being evaluated. Reliability of judgments was assessed using Fleiss' kappa inter-annotator agreement values. We found that the audio-only modality produced the most reliable judgments for video fragments, and that across modalities video fragments are less ambiguous than full videos.
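The abstract does not spell out how the agreement values were computed, but Fleiss' kappa itself is a standard measure for fixed-size groups of annotators rating items into nominal categories. A minimal sketch (the function name and input layout are illustrative, not taken from the paper):

```python
from typing import List

def fleiss_kappa(counts: List[List[int]]) -> float:
    """Fleiss' kappa for an items-by-categories matrix of rating counts.

    counts[i][j] = number of annotators who assigned item i to category j;
    every item is assumed to have the same total number of ratings.
    """
    N = len(counts)        # number of items
    n = sum(counts[0])     # ratings per item
    k = len(counts[0])     # number of categories

    # Mean per-item observed agreement P_i
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Chance agreement P_e from the category marginals
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)

    return (P_bar - P_e) / (1 - P_e)
```

For example, three annotators who all agree on every item (e.g. `[[3, 0], [0, 3]]` for two items and two polarity labels) yield a kappa of 1.0, while systematic 2-vs-1 splits drive the value toward or below zero.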

Keywords


Crowdsourcing; sentiment; multimodal
