Publication | Closed Access
Design and evaluation of expressive gesture synthesis for embodied conversational agents
Citations: 75
References: 4
Year: 2005
Venue: Unknown
Unknown Venue
Avatar Animation, Engineering, Expressive Gesture Synthesis, Intelligent Systems, Communication, Embodied Agent, Virtual Reality, Expressive Behavior, Affective Computing, Conversation Analysis, Embodied Robotics, American Sign Language, Dance, Expressive Gesturing, Human Agent Interaction, Animation, Design, Gesture Synthesis, User Experience, Embodied Conversational Agents, Speech Communication, Gesture Recognition, Interpersonal Communication, Behavior Synthesis Technique, Human-computer Interaction, Arts, Robotics, Character Animation
The study introduces a behavior synthesis technique to generate expressive gestures for ECAs, aiming to enhance their believability and life-likeness, and calls for further investigation of parameter interactions. The technique models individual movement variability using a small set of expressivity dimensions and is empirically evaluated in two user studies. Results indicate the approach performs well for certain expressive behaviors, though animation fidelity is insufficient to capture subtle changes.
To increase the believability and life-likeness of Embodied Conversational Agents (ECAs), we introduce a behavior synthesis technique for the generation of expressive gesturing. A small set of dimensions of expressivity is used to characterize individual variability of movement. We empirically evaluate our implementation in two separate user studies. The results suggest that our approach works well for a subset of expressive behavior. However, animation fidelity is not high enough to realize subtle changes. Interaction effects between different parameters need to be studied further.