Publication | Closed Access
Multimodal Emotion Recognition from Eye Image, Eye Movement and EEG Using Deep Neural Networks
Citations: 62
References: 13
Year: 2019
Venue: Unknown
Keywords: Engineering, Biometrics, Affective Neuroscience, Multimodal Sentiment Analysis, New Features, Social Sciences, Multilevel Fusion, Image Analysis, Data Science, Pattern Recognition, Fusion Learning, Affective Computing, Multimodal Emotion Recognition, Eye Image, Cognitive Science, Neuroimaging, Multimodal Signal Processing, Feature Fusion, Computer Vision, Eye Movement, Facial Expression Recognition, EEG Signal Processing, Eye Tracking, Neuroscience, Emotion, Emotion Recognition
Given the complexity of recording electroencephalography (EEG), researchers have been seeking new features for emotion recognition. To investigate the potential of eye-tracking glasses for multimodal emotion recognition, we collect eye images and use them, together with eye movements and EEG, to classify five emotions. We compare four combinations of the three data types and two fusion methods: feature-level fusion and the Bimodal Deep AutoEncoder (BDAE). With the three-modality fusion features generated by the BDAE, the best mean accuracy of 79.63% is achieved. Analysis of the confusion matrices shows that the three modalities provide complementary information for recognizing the five emotions. Moreover, the experimental results indicate that classifiers using fused eye-image and eye-movement features achieve a comparable classification accuracy of 71.99%.
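The feature-level fusion mentioned in the abstract simply concatenates per-sample feature vectors from the individual modalities before classification. A minimal sketch follows; the feature dimensions and random data are placeholders for illustration, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature matrices for n samples
# (the dimensions here are illustrative, not from the paper).
n = 100
eye_image_feat = rng.normal(size=(n, 50))   # features extracted from eye images
eye_move_feat = rng.normal(size=(n, 33))    # eye-movement features
eeg_feat = rng.normal(size=(n, 62))         # EEG features

# Feature-level fusion: concatenate the modality features per sample,
# then feed the fused vector to a single five-class classifier.
fused = np.concatenate([eye_image_feat, eye_move_feat, eeg_feat], axis=1)
labels = rng.integers(0, 5, size=n)         # five emotion classes

print(fused.shape)  # (100, 145): dimensions add up across modalities
```

The BDAE approach differs in that each modality (or pair of modalities) is first encoded by its own network branch and a shared hidden layer learns a joint representation, which then serves as the fused feature for the classifier.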