A Bayesian Language for Cumulative Learning

Avi Pfeffer

A cumulative learning agent is one that learns and reasons as it interacts with the world. The Bayesian paradigm provides a natural framework for cumulative learning because an agent can use its observations and prior models to reason about a particular situation, and also learn posterior models. Cumulative learning requires a rich, first-order representation language in order to handle the variety of situations an agent may encounter. In this paper, I present a new Bayesian language for cumulative learning, called IBAL. This language builds on previous work on probabilistic relational models, and introduces the novel feature of observations as an integral part of the language. The key semantic concept is a scope, which consists of both models and observations. The meaning of a scope is the knowledge state of an agent. The language is declarative, and knowledge states can be composed in a natural way. In addition to presenting the language, this paper also presents an EM-based learning algorithm, called functional EM, for learning IBAL models.
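The prior-to-posterior cycle the abstract describes can be illustrated with a minimal sketch. This is not IBAL itself (the paper's language and semantics are not reproduced here); it only shows, under the assumption of a simple conjugate Beta-Bernoulli model, how each observation updates an agent's belief, with the resulting posterior serving as the prior for the next interaction:

```python
# A minimal sketch of cumulative Bayesian learning (NOT the IBAL language):
# a Beta(a, b) belief over a coin's bias, updated observation by observation.
from fractions import Fraction

class BetaBernoulli:
    """Conjugate Beta(a, b) belief over the probability of success."""
    def __init__(self, a=1, b=1):
        self.a, self.b = Fraction(a), Fraction(b)

    def observe(self, outcome):
        # The posterior after this observation becomes the prior for the
        # next one -- the agent's knowledge accumulates across interactions.
        if outcome:
            self.a += 1
        else:
            self.b += 1

    def mean(self):
        # Posterior mean of the success probability.
        return self.a / (self.a + self.b)

agent = BetaBernoulli()           # uniform prior Beta(1, 1)
for outcome in [1, 1, 0, 1]:      # a hypothetical observation stream
    agent.observe(outcome)
print(agent.mean())               # posterior is Beta(4, 2); mean = 2/3
```

Exact rational arithmetic is used only so the posterior mean is easy to check by hand; a real system would work with distributions over first-order models, as the paper proposes.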

This page is copyrighted by AAAI. All rights reserved.