Learning Planning Operators in Real-World, Partially Observable Environments

Matthew D. Schmill, Tim Oates, and Paul R. Cohen

We are interested in the development of activities in situated, embodied agents such as mobile robots. Central to our theory of development is means-ends analysis planning, which requires operator models that can express the effects of a robot's actions in a dynamic, partially observable environment. This paper presents a two-step process that employs clustering and decision tree induction to perform unsupervised learning of operator models from simple interactions between an agent and its environment. We report our findings from an implementation of this system on a Pioneer-1 mobile robot.
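The two-step process described above can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the data, the 1-D k-means routine, and the one-level decision stump (a stand-in for full decision tree induction) are all illustrative assumptions. Step 1 clusters action outcomes (e.g., sensor deltas) into qualitatively distinct effects; step 2 induces a rule predicting which outcome cluster will occur from the pre-action context.

```python
import random

def kmeans_1d(values, k=2, iters=20):
    """Tiny 1-D k-means: group scalar action outcomes into k clusters."""
    centers = sorted(random.sample(values, k))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            groups[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    assign = [min(range(k), key=lambda i: abs(v - centers[i])) for v in values]
    return centers, assign

def best_stump(features, labels):
    """One-level decision tree: find the threshold on a scalar context
    feature that best separates the outcome clusters (an illustrative
    stand-in for full decision tree induction)."""
    best = None
    for t in sorted(set(features)):
        preds = [1 if f > t else 0 for f in features]
        acc = sum(p == l for p, l in zip(preds, labels)) / len(labels)
        acc = max(acc, 1 - acc)  # cluster labels are arbitrary; allow inversion
        if best is None or acc > best[1]:
            best = (t, acc)
    return best

# Synthetic "move-forward" experiences (hypothetical data): when an
# obstacle is near (distance < 0.5), translation hardly changes;
# otherwise the robot moves roughly one unit.
random.seed(0)
dist = [random.random() for _ in range(40)]
outcome = [0.05 * random.random() if d < 0.5 else 1 + 0.05 * random.random()
           for d in dist]

# Step 1: cluster outcomes into qualitative effect classes.
centers, clusters = kmeans_1d(outcome, k=2)
# Step 2: induce a predictive rule from pre-action context to effect class.
threshold, accuracy = best_stump(dist, clusters)
print(threshold, accuracy)
```

Under these assumptions the induced rule recovers the hidden regularity: the action's effect depends on whether the obstacle distance exceeds a learned threshold, which is the kind of context-dependent operator model a means-ends planner can use.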

Copyright © AAAI. All rights reserved.