Proceedings:
Proceedings of the AAAI Conference on Artificial Intelligence, 16
Track:
Robotics
Abstract:
Unlike conventional robots, a pet robot is autonomous and capable of exhibiting animal-like behaviors, including emotional ones, as it interacts with the people and objects around it. As pet robots become more integrated into people's everyday lives, a more natural way of communicating with them will be necessary. In particular, for a pet robot to perceive human intentions and communicate with people more effectively, it needs to understand human gestures and body language. In this paper, we present an extensible, real-time, vision-based communication system that interprets 2D dynamic hand gestures in complex environments. The system uses both motion and color information to segment the hand from the cluttered background, without the use of special hand markers. The location of the hand is then tracked in real time as the human makes the gesture. Based on the hand's trajectory, the gesture is interpreted and translated into a command for the robot. We implemented our system on Yuppy, a pet robot prototype, and can currently navigate Yuppy through unstructured environments using hand gestures.
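For illustration only, the sketch below shows one way the motion-plus-color segmentation and trajectory-based interpretation described in the abstract could be realized. It is written in Python with OpenCV; the thresholds, skin-color range, function names, and command labels are assumptions for the sake of a runnable example and are not taken from the paper.

```python
import cv2
import numpy as np

# Hypothetical skin-tone range in HSV; values are illustrative, not from the paper.
SKIN_LOWER = np.array([0, 40, 60], dtype=np.uint8)
SKIN_UPPER = np.array([25, 180, 255], dtype=np.uint8)

def segment_hand(prev_gray, frame):
    """Combine a motion mask (frame differencing) with a skin-color mask and
    return (current gray frame, centroid of the largest hand blob or None)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Motion cue: pixels that changed since the previous frame.
    motion = cv2.absdiff(prev_gray, gray)
    _, motion_mask = cv2.threshold(motion, 15, 255, cv2.THRESH_BINARY)
    # Color cue: pixels falling inside the assumed skin-tone range.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    color_mask = cv2.inRange(hsv, SKIN_LOWER, SKIN_UPPER)
    # A hand candidate must satisfy both cues; clean up with a small opening.
    hand_mask = cv2.bitwise_and(motion_mask, color_mask)
    hand_mask = cv2.morphologyEx(hand_mask, cv2.MORPH_OPEN,
                                 np.ones((5, 5), np.uint8))
    # OpenCV 4 signature: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return gray, None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return gray, None
    centroid = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
    return gray, centroid

def classify_trajectory(points, min_span=40):
    """Map a tracked centroid trajectory to a crude navigation command.
    The command names and net-displacement heuristic are illustrative only."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_span:
        return "stop"
    if abs(dx) >= abs(dy):
        return "turn_right" if dx > 0 else "turn_left"
    return "forward" if dy < 0 else "backward"
```

In use, a caller would feed each returned gray frame back in as prev_gray for the next frame and append each centroid to a list, then call classify_trajectory once the gesture ends. The paper's actual segmentation, tracking, and gesture vocabulary may differ from this simple net-displacement heuristic.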
ISBN 978-0-262-51106-3
July 18–22, 1999, Orlando, Florida. Published by The AAAI Press, Menlo Park, California.