Concepedia

TLDR

The study introduces a behavior synthesis technique that generates expressive gestures for Embodied Conversational Agents (ECAs), aiming to enhance their believability and life-likeness. The technique models individual movement variability using a small set of expressivity dimensions and is evaluated empirically in two user studies. Results indicate the approach works well for certain expressive behaviors, though animation fidelity is insufficient to capture subtle changes, and interaction effects between parameters call for further investigation.

Abstract

To increase the believability and life-likeness of Embodied Conversational Agents (ECAs), we introduce a behavior synthesis technique for the generation of expressive gesturing. A small set of dimensions of expressivity is used to characterize individual variability of movement. We empirically evaluate our implementation in two separate user studies. The results suggest that our approach works well for a subset of expressive behavior. However, animation fidelity is not high enough to realize subtle changes. Interaction effects between different parameters need to be studied further.
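The abstract describes characterizing individual movement variability through a small set of expressivity dimensions that modulate gesture animation. A minimal sketch of this idea is shown below; the specific dimension names (`spatial_extent`, `temporal_extent`) and the keyframe representation are illustrative assumptions, not the authors' actual parameter set or implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Expressivity:
    # Hypothetical expressivity dimensions: the paper only states that a
    # "small set" is used, so these two are illustrative placeholders.
    spatial_extent: float = 1.0   # scales gesture amplitude around the rest pose
    temporal_extent: float = 1.0  # stretches or compresses gesture timing

def apply_expressivity(keyframes: List[Tuple[float, float]],
                       ex: Expressivity) -> List[Tuple[float, float]]:
    """Modulate (time, position) keyframes of a single gesture stroke.

    Positions are scaled around the rest pose (0.0) by spatial_extent;
    timestamps are stretched by temporal_extent, slowing the gesture.
    """
    return [(t * ex.temporal_extent, p * ex.spatial_extent)
            for t, p in keyframes]

# A neutral stroke, then the same stroke made wider and slower.
stroke = [(0.0, 0.0), (0.2, 0.5), (0.4, 0.0)]
wide_slow = apply_expressivity(stroke, Expressivity(spatial_extent=1.5,
                                                    temporal_extent=2.0))
```

Under this sketch, a single gesture specification can yield many individual variants by varying only the expressivity parameters, which matches the paper's goal of capturing individual variability without authoring new animations.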
