AAAI Publications, 2015 AAAI Fall Symposium Series

Represent and Infer Human Theory of Mind for Human-Robot Interaction
Yibiao Zhao, Steven Holtzen, Tao Gao, Song-Chun Zhu


Abstract


This abstract proposes a challenging problem: inferring a human's mental state, namely intent and belief, from an observed RGBD video for human-robot interaction. The task is to integrate symbolic reasoning, a field well studied within AI, with the uncertainty native to computer vision. Traditional AI approaches to plan inference typically rely on first-order logic and closed-world assumptions, which struggle to account for the inherent uncertainty of noisy observations of a scene. Computer vision relies on pattern-recognition methods that have difficulty capturing higher-level reasoning and abstract representations of world knowledge. By combining these two approaches in a principled way under a probabilistic programming framework, we define new computer vision tasks such as actor intent prediction and belief inference from an observed video sequence. By inferring a human's theory of mind, a robotic agent can automatically determine a human's goals in order to collaborate with them.
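To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of how symbolic goal hypotheses can be combined with noisy visual evidence via Bayesian inference in ordinary Python. The goal set, the observation model, and all names (e.g. `infer_intent`, `ACTION_LIKELIHOOD`) are illustrative assumptions standing in for the paper's richer probabilistic-programming formulation.

```python
# Illustrative sketch: posterior over an actor's goal given noisy action
# detections from an RGBD video. Goals and likelihoods are hypothetical.
GOALS = ["fetch_cup", "open_door", "hand_over_object"]

# Assumed likelihoods P(detected action label | goal), standing in for the
# uncertain output of a vision-based action detector.
ACTION_LIKELIHOOD = {
    "fetch_cup":        {"reach": 0.6, "grasp": 0.3, "walk": 0.1},
    "open_door":        {"reach": 0.3, "grasp": 0.2, "walk": 0.5},
    "hand_over_object": {"reach": 0.4, "grasp": 0.4, "walk": 0.2},
}

def infer_intent(observed_actions, prior=None):
    """Return a posterior over goals given a sequence of noisy detections."""
    prior = prior or {g: 1.0 / len(GOALS) for g in GOALS}
    posterior = dict(prior)
    for action in observed_actions:
        for goal in GOALS:
            # Multiply in the evidence; unseen labels get a small floor.
            posterior[goal] *= ACTION_LIKELIHOOD[goal].get(action, 1e-6)
    z = sum(posterior.values())
    return {g: p / z for g, p in posterior.items()}

if __name__ == "__main__":
    # Action labels detected (with noise) from an observed video sequence.
    print(infer_intent(["walk", "reach", "grasp"]))
```

A robotic collaborator could act on the most probable goal in this posterior, which is the intuition behind the intent-prediction task the abstract describes; belief inference would extend the same machinery to latent world states the human assumes.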

Keywords


Human-robot interaction; Theory of Mind; Hierarchical Task Network
