AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

Multimodal Fusion of EEG and Musical Features in Music-Emotion Recognition
Nattapong Thammasan, Ken-ichi Fukui, Masayuki Numao


Abstract


Multimodality has recently been exploited to overcome the challenges of emotion recognition. In this paper, we present a study of decision-level fusion of electroencephalogram (EEG) features and musical features extracted from musical stimuli for recognizing time-varying binary classes of arousal and valence. Our empirical results demonstrate that the EEG modality suffered from the non-stationarity of EEG signals, yet fusing it with the music modality alleviated the issue and enhanced the performance of emotion recognition.
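The abstract does not specify the fusion rule or classifiers used. Below is a minimal sketch of decision-level fusion for binary arousal/valence classification, assuming one probabilistic classifier per modality (scikit-learn SVMs here) and a weighted average of class probabilities as the fusion rule; the feature dimensions, weights, and data are purely illustrative.

```python
# Hypothetical sketch of decision-level fusion of EEG and musical features.
# Classifier choice, fusion weights, and feature dimensions are assumptions,
# not taken from the paper.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-segment features and labels.
n_train, n_test = 200, 50
X_eeg_train = rng.normal(size=(n_train, 32))   # e.g., EEG band-power features
X_mus_train = rng.normal(size=(n_train, 20))   # e.g., musical features of the stimulus
y_train = rng.integers(0, 2, size=n_train)     # binary arousal (or valence) labels
X_eeg_test = rng.normal(size=(n_test, 32))
X_mus_test = rng.normal(size=(n_test, 20))

# Train one probabilistic classifier per modality.
clf_eeg = SVC(probability=True).fit(X_eeg_train, y_train)
clf_mus = SVC(probability=True).fit(X_mus_train, y_train)

# Decision-level fusion: combine per-segment class probabilities.
w_eeg, w_mus = 0.5, 0.5                        # hypothetical fusion weights
p_eeg = clf_eeg.predict_proba(X_eeg_test)[:, 1]
p_mus = clf_mus.predict_proba(X_mus_test)[:, 1]
p_fused = w_eeg * p_eeg + w_mus * p_mus
y_pred = (p_fused >= 0.5).astype(int)
print(y_pred[:10])
```

Fusing at the decision level, rather than concatenating features, keeps each modality's classifier independent, so a weaker or noisier modality (here, EEG) can be down-weighted without retraining the other.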

Keywords


emotion recognition; affective computing; brain-computer interface
