A multimodal dialogue control system makes androids more human-like: meet Erica and Ibuki

Within the ERATO Ishiguro project on human-robot interaction, a team from ATR, Osaka University, and Kyoto University developed a multimodal dialogue control system that makes full use of cameras and microphones to give the android "ERICA" a more natural presence in everyday situations, and to enhance the sense of dialogue with the social dialogue robot "CommU". In addition, the team developed a child-type android, "ibuki", which can move freely on wheels.

The multimodal recognition system combines several sensors: a distance sensor (recognition of human position and head movement), a camera (recognition of human facial expressions), and a microphone array (localization of the speaker and voice recognition). On top of this, a dialogue control system conveys a human-like presence by controlling speech, movement, gaze, and emotion based on intention and desire.
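The sensing-to-behavior pipeline described above can be sketched as a simple fusion step: percepts from the three sensor streams are combined into one multimodal action (speech, gaze, emotion). Everything below — the names, thresholds, and behaviors — is a hypothetical illustration, not the project's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical percepts from the three sensor streams described in the article.
@dataclass
class Percept:
    human_distance_m: float   # distance sensor: how close the person is
    facial_expression: str    # camera: e.g. "smile", "neutral"
    speech_detected: bool     # microphone array: is the person speaking?

def select_behavior(p: Percept) -> dict:
    """Fuse percepts into one multimodal action (gaze, emotion, speech)."""
    action = {"gaze": "ahead", "emotion": "neutral", "utterance": None}
    if p.human_distance_m < 1.5:          # someone is close: attend to them
        action["gaze"] = "at_human"
        if p.speech_detected:             # they are talking: backchannel
            action["utterance"] = "listening_backchannel"
        if p.facial_expression == "smile":
            action["emotion"] = "happy"   # mirror positive affect
    return action

print(select_behavior(Percept(1.0, "smile", True)))
```

In a real android such a loop would run continuously, with each modality updated asynchronously; this sketch only shows the fusion of one snapshot of sensor data into a coordinated response.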

https://www.youtube.com/watch?v=kE_kZLieeV0&list=PLrOJ4swF2nHTJfqNAPgHSkEB0F1yU7Vt3

JST news release, July 31, 2018
