Publication | Closed Access
Classification of Five Emotions from EEG and Eye Movement Signals: Complementary Representation Properties
Citations: 51
References: 13
Year: 2019
Venue: Unknown
Keywords: Engineering, Affective Neuroscience, Eye Movement Signals, Multimodal Learning, Multimodal Sentiment Analysis, Single Modality, Social Sciences, Psychology, Emotional Response, Disgust Emotions, Pattern Recognition, Affective Computing, Cognitive Electrophysiology, Cognitive Neuroscience, Complementary Representation Properties, Five Emotions, Cognitive Science, Modality Fusion, Multimodal Signal Processing, Facial Expression Recognition, EEG Signal Processing, Eye Tracking, Neuroscience, Brain-Computer Interface, Emotion, Emotion Recognition
Recently, various multimodal approaches have been developed to enhance the performance of affective models. In this paper, we investigate the complementary representation properties of EEG and eye movement signals for classifying five human emotions: happy, sad, fear, disgust, and neutral. We compare the performance of each single modality with two different modality fusion approaches. The results indicate that EEG is superior to eye movements in classifying the happy, sad, and disgust emotions, whereas eye movements outperform EEG in recognizing the fear and neutral emotions. Overall, EEG classifies the five emotions more accurately than eye movements, with mean accuracies of 69.50% and 59.81%, respectively. Owing to these complementary representation properties, modality fusion with a bimodal deep auto-encoder significantly improves the classification accuracy, to 79.71%. Furthermore, we study the neural patterns of the five emotion states and the recognition performance of different eye movement features. The results reveal that the five emotions have distinguishable neural patterns and that pupil diameter has higher discrimination ability than the other eye movement features.
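The bimodal deep auto-encoder mentioned above fuses the two modalities by encoding each into its own hidden layer and then merging the hidden layers into a shared code. The following minimal, untrained forward-pass sketch illustrates that data flow only; all layer dimensions, the batch size, and the random weights are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions for illustration only).
D_EEG, D_EYE, H, Z = 310, 33, 64, 32
X_eeg = rng.standard_normal((8, D_EEG))   # batch of 8 EEG feature vectors
X_eye = rng.standard_normal((8, D_EYE))   # matching eye-movement features

def dense(x, d_out):
    """Linear layer with tanh activation and random (untrained) weights."""
    w = rng.standard_normal((x.shape[1], d_out)) * 0.1
    return np.tanh(x @ w)

# Encode each modality separately, then fuse into a shared code.
h_eeg = dense(X_eeg, H)
h_eye = dense(X_eye, H)
z = dense(np.concatenate([h_eeg, h_eye], axis=1), Z)  # fused representation

# Decode the shared code back toward both modalities (reconstruction path).
x_eeg_hat = dense(dense(z, H), D_EEG)
x_eye_hat = dense(dense(z, H), D_EYE)

print(z.shape, x_eeg_hat.shape, x_eye_hat.shape)
```

In training, the reconstruction error against both inputs would be minimized jointly, and the shared code `z` would serve as the fused feature for the emotion classifier.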