Concepedia

Publication | Closed Access

Learning Perceived Emotion Using Affective and Deep Features for Mental Health Applications

Citations: 20
References: 27
Year: 2019

Abstract

Virtual agents are being increasingly used in the areas of healthcare, treatment, and therapy. Virtual agents with emotional intelligence have shown the potential to deliver mental health interventions and facilitate therapies for children on the autism spectrum. To build an emotionally intelligent agent, automatic recognition of emotions is an essential building block. For mental health therapies and treatments, detecting the rapidly changing emotions of individuals can be useful for knowing whether patients are showing an appropriate emotional response. To this end, we present a new data-driven approach to identify the perceived emotions of individuals based on their walking styles. We extract an individual's walking gait in the form of a sequence of 3D poses given an RGB video of him/her walking. We leverage the gait features to classify the perceived emotional state of the individual into one of four categories: happy, sad, angry, or neutral. First, we use an LSTM network to extract deep features of the gait using labeled emotion datasets. Next, we compute the affective features of the gaits using posture and movement cues. We combine these affective features with the deep features and classify them using a Random Forest Classifier. We observe that this approach provides an accuracy of 80.07% in identifying the perceived emotions. Additionally, we present a new dataset consisting of videos of walking individuals, their extracted gaits, and perceived emotion labels associated with each gait. We refer to this dataset as the "EWalk (Emotion Walk)" dataset.
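The final classification stage described in the abstract (concatenating deep gait features with hand-crafted affective features and feeding them to a Random Forest) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, random stand-in features, and emotion label encoding are all assumptions, since the real pipeline would produce deep features from an LSTM over 3D pose sequences and affective features from posture/movement cues.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical dimensions -- in the paper, deep features come from an LSTM
# over 3D pose sequences and affective features from posture/movement cues.
N_SAMPLES, DEEP_DIM, AFFECTIVE_DIM = 200, 64, 13
EMOTIONS = ["happy", "sad", "angry", "neutral"]

rng = np.random.default_rng(0)
deep_features = rng.normal(size=(N_SAMPLES, DEEP_DIM))            # stand-in for LSTM output
affective_features = rng.normal(size=(N_SAMPLES, AFFECTIVE_DIM))  # stand-in for posture/movement cues
labels = rng.integers(0, len(EMOTIONS), size=N_SAMPLES)           # one label per gait

# Combine the two feature sets by concatenation and classify with a Random Forest.
combined = np.concatenate([deep_features, affective_features], axis=1)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(combined, labels)

# Predict the perceived emotion for one gait sample.
predicted = EMOTIONS[clf.predict(combined[:1])[0]]
```

With real extracted features in place of the random arrays, the same concatenate-fit-predict structure applies; a held-out test split would be used to measure accuracy.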
