Conference Paper, Year: 2018

Automatic Nonverbal Behavior Generation from Image Schemas


One of the main challenges in developing Embodied Conversational Agents is giving them the ability to autonomously produce meaningful and coordinated verbal and nonverbal behaviors. The relation between these means of communication is more complex than the direct mapping often applied in previous models. In this paper, we propose an intermediate mapping approach that we apply first to metaphoric gestures but that could be extended to other representational gestures. Building on previous work in text analysis, embodied cognition and co-verbal behavior production, we introduce a framework that articulates speech and metaphoric gesture invariants around a common mental representation: Image Schemas. We describe the components of our framework, detailing the steps leading to the production of metaphoric gestures, and we present preliminary results and demonstrations. We conclude by laying out perspectives for integrating, evaluating and improving our model.


Dates and versions

hal-02287759 , version 1 (13-09-2019)


Brian Ravenet, Chloé Clavel, Catherine Pelachaud. Automatic Nonverbal Behavior Generation from Image Schemas. International Conference on Autonomous Agents and Multiagent Systems, Jul 2018, Stockholm, Sweden. ⟨hal-02287759⟩

