Turn-Taking and Coordination in Human-Machine Interaction
Papers from the 2015 AAAI Spring Symposium
Sean Andrist, Dan Bohus, Eric Horvitz, Bilge Mutlu, David Schlangen, Program Chairs
Technical Report SS-15-07
Published by The AAAI Press, Palo Alto, California.
The goal of this symposium is to bring together researchers across multiple disciplines — including multimodal systems, human-robot interaction, embodied conversational agents, and spoken dialogue systems — to address a topic of common interest: the modeling, realization, and evaluation of turn-taking and real-time action coordination between humans and artificial interactive systems. This symposium will serve to build common ground for researchers from these disparate backgrounds to share their perspectives, methodologies, and results from their own investigations into the problem of multimodal coordination.
Regulating human-computer coordination hinges critically on multimodal sensing, on making decisions under uncertainty and time constraints, and on synchronizing behaviors across different output modalities. On the sensing side, numerous challenges arise in tracking conversational dynamics from multimodal data. Making coordination decisions often requires reasoning under uncertainty and under strict time constraints. Designing and rendering coordination behaviors (for example, floor-taking actions, floor-releasing actions, and back-channels) appropriate for the affordances of a system's embodiment raises additional challenges.