Publication | Closed Access
VEmotion: Using Driving Context for Indirect Emotion Prediction in Real-Time
27 Citations · 46 References · Year: 2021 · Venue: Unknown
Keywords: Engineering, Machine Learning, Affective Design, Affective Neuroscience, Wearable Technology, Intelligent Systems, Multimodal Sentiment Analysis, Psychology, Social Sciences, Emotional Response, Indirect Emotion Prediction, Data Science, Driver Behavior, Affective Computing, Cognitive Science, Predictive Analytics, Adaptive Emotion, Traffic Dynamics, Computer Science, Driver Performance, Facial Expression Recognition, Driver Emotions, Emotion Prediction Performance, Human-Computer Interaction, Emotion, Emotion Recognition
Detecting emotions while driving remains a challenge in Human-Computer Interaction. Current methods to estimate the driver’s experienced emotions use physiological sensing (e.g., skin conductance, electroencephalography), speech, or facial expressions. However, these approaches require drivers to wear devices, perform explicit voice interaction, or show robust facial expressiveness. We present VEmotion (Virtual Emotion Sensor), a novel method to predict driver emotions in an unobtrusive way using contextual smartphone data. VEmotion analyzes information including traffic dynamics, environmental factors, in-vehicle context, and road characteristics to implicitly classify driver emotions. We demonstrate the applicability in a real-world driving study (N = 12) to evaluate the emotion prediction performance. Our results show that VEmotion outperforms facial expressions by 29% in a person-dependent classification and by 8.5% in a person-independent classification. We discuss how VEmotion enables empathic car interfaces to sense the driver’s emotions and provide in-situ interface adaptations on the go.
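The abstract distinguishes person-dependent classification (train and test on the same driver) from person-independent classification (test on a driver unseen during training). The sketch below illustrates the person-independent setup as a leave-one-driver-out evaluation over contextual features; the feature names, toy data, and nearest-centroid classifier are illustrative assumptions, not the paper's actual model or dataset.

```python
# Hypothetical sketch of person-independent (leave-one-driver-out)
# emotion classification from contextual driving features.
# Features [traffic_density, speed, road_curvature] and all data
# values are made up for illustration.
from collections import defaultdict
import math

# toy samples: (driver_id, feature_vector, emotion_label)
DATA = [
    ("d1", [0.90, 20, 0.1], "anger"),
    ("d1", [0.20, 90, 0.0], "happiness"),
    ("d2", [0.80, 25, 0.2], "anger"),
    ("d2", [0.10, 100, 0.1], "happiness"),
    ("d3", [0.85, 15, 0.1], "anger"),
    ("d3", [0.15, 95, 0.0], "happiness"),
]

def nearest_centroid_predict(train, x):
    # Compute one centroid per emotion class from the training samples,
    # then return the label of the centroid closest to sample x.
    sums = defaultdict(lambda: [0.0] * len(x))
    counts = defaultdict(int)
    for _, feats, label in train:
        counts[label] += 1
        for i, v in enumerate(feats):
            sums[label][i] += v
    def dist(label):
        centroid = [s / counts[label] for s in sums[label]]
        return math.dist(centroid, x)
    return min(sums, key=dist)

def leave_one_driver_out_accuracy(data):
    # Person-independent evaluation: hold out all samples of one driver,
    # train on the remaining drivers, and average accuracy over folds.
    drivers = {d for d, _, _ in data}
    correct = total = 0
    for held_out in drivers:
        train = [s for s in data if s[0] != held_out]
        test = [s for s in data if s[0] == held_out]
        for _, feats, label in test:
            correct += nearest_centroid_predict(train, feats) == label
            total += 1
    return correct / total

print(leave_one_driver_out_accuracy(DATA))
```

A person-dependent variant would instead split each driver's own samples into train and test sets, which generally yields higher accuracy since driver-specific patterns are available at training time.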